AI, Existential Risks, and Societal Collapse, or a Brief Guide to the End of the World


Global nuclear annihilation, biological warfare, climate change, asteroids, supervolcanoes, engineered viruses, emerging technologies, or human-like artificial intelligence: the omnicidal end may be closer than you think.

Human-like AI (HL/AI) looks like both the cause of and the solution to all existential risk. As Hawking wrote in his posthumously published book:

“The advent of super-intelligent A.I. would be either the best or the worst thing ever to happen to humanity.”

Existential risks, global catastrophic risks, or doomsday scenarios

A global catastrophic risk is one with the potential to wreak death and destruction on a global scale. In human history, wars and plagues have done so on more than one occasion, and misguided ideologies and totalitarian regimes have darkened an entire era or a region. 

Potential global catastrophic risks include anthropogenic risks, caused by humans (technology, governance, climate change), and non-anthropogenic or natural risks.

Technological risks include the creation of destructive artificial intelligence, biotechnology, or nanotechnology. Insufficient or malign global governance creates risks in the social and political domain, such as global war (including nuclear holocaust), bioterrorism using genetically modified organisms, cyberterrorism destroying critical infrastructure like the electrical grid, or the failure to manage a natural pandemic. Problems and risks in the domain of earth system governance include global warming, environmental degradation (including extinction of species), famine as a result of non-equitable resource distribution, human overpopulation, crop failures, and non-sustainable agriculture.

Examples of non-anthropogenic risks are an asteroid impact event, a supervolcanic eruption, a lethal gamma-ray burst, a geomagnetic storm destroying electronic equipment, natural long-term climate change, hostile extraterrestrial life, or the Sun predictably transforming into a red giant star and engulfing the Earth.

Societal collapse

Societal collapse (also known as civilizational collapse) is the fall of a complex human society characterized by the loss of cultural identity and of socioeconomic complexity, the downfall of government, and the rise of violence. 

Possible causes of a societal collapse include natural catastrophe, war, pestilence, famine, economic collapse, population decline, and mass migration. A collapsed society may revert to a more primitive state, be absorbed into a stronger society, or completely disappear.

Virtually all civilizations have suffered such a fate, regardless of their size or complexity, but some of them later revived and transformed, such as China, India, and Egypt. Others never recovered, such as the Western and Eastern Roman Empires, the Mayan civilization, and the Easter Island civilization. Societal collapse is generally quick but rarely abrupt. Some cases, however, involve not a collapse but only a gradual fading away with little negative effect on the civilization in question, such as the British Empire since 1918 or the USSR since 1991.

Anthropologists, (quantitative) historians, and sociologists have proposed a variety of explanations for the collapse of civilizations involving causative factors such as environmental change, depletion of resources, unsustainable complexity, invasion, disease, decay of social cohesion, rising inequality, secular decline of cognitive abilities, loss of creativity, and misfortune. However, complete extinction of a culture is not inevitable, and in some cases, the new societies that arise from the ashes of the old one are evidently its offspring, despite a dramatic reduction in sophistication. Moreover, the influence of a collapsed society, such as the Western Roman Empire, may linger on long after its death. 

A.I. Is the Cause Of — And Solution To — the End of the World

The development of artificial general intelligence promises tremendous benefits and poses terrible risks for future humanity

A.I. is the ultimate existential risk, because our destruction would come at the hands of a creation that would represent the summation of human intelligence. But A.I. is also the ultimate source of what some call “existential hope,” the flip side of existential risk.

Our vulnerability to existential threats, natural or man-made, largely comes down to a matter of intelligence. We may not be smart enough to figure out how to deflect a massive asteroid, and we don’t yet know how to prevent a supereruption. We know how to prevent nuclear war, but we aren’t wise enough to ensure that those missiles will never be fired. We aren’t intelligent enough yet to develop clean and ultra-cheap sources of energy that could eliminate the threat of climate change while guaranteeing that every person on this planet could enjoy the life that they deserve. We’re not smart enough to eradicate the threat of infectious disease, or to design biological defenses that could neutralize any engineered pathogens. We’re not smart enough to outsmart death — of ourselves, or of our species.

But if A.I. becomes what its most fervent evangelists believe it could be — not merely artificial intelligence, but superintelligence — then nothing will be impossible. We could colonize the stars, live forever by uploading our consciousness into a virtual heaven, eliminate all the pain and ills that are part of being human. Instead of an existential catastrophe, we could create what is called existential “eucatastrophe” — a sudden explosion of value. The only obstacle is intelligence — an obstacle put in place by our own biology and evolution. But our silicon creations, which have no such limits, just might pull it off — and they could bring us along. 

Resources 

A new artificial intelligence system can independently design hypersonic weapons

HOW WE ARE CREATING NIII [REAL AND TRUE ARTIFICIAL INTELLIGENCE]
