WAAS/EXTRA Working Group, Michael Marien
Director of Research, July 15, 2025
This brief overview is designed for (1) initial learners; (2) those who know something about X-risk and want a deeper view; and (3) specialists who know a lot about some risks but seek a broader view of a multi-disciplinary global challenge that involves many definitions and rapidly evolving outlooks. Updates to this overview will be made, and readers are invited to suggest additions, deletions, and other revisions.
Existential Threats and Risks are Growing
Concern with threats and risks (hereafter X-risks) is not new. Still, it is growing in all major categories (nuclear war, climate change, pollution, health, AI, etc.), due to four recent global shocks: the COVID-19 pandemic, Russia’s invasion of Ukraine, the Israel-Hamas war, and the re-election of Donald Trump as US President. To cite three very different sources: the “Doomsday Clock” of The Bulletin of the Atomic Scientists (Jan 2025) is now at 89 seconds to midnight—the closest it has ever been. The annual Global Risks Report of the World Economic Forum in Davos ranks 32 risks over two-year and ten-year horizons and warns of “a deteriorating global outlook.” The Annual Threat Assessment of the US Intelligence Community (Feb 2024) sees “an increasingly fragile global order.” (But the shorter 2025 version, reflecting Trump 2.0, ignores much of the 2024 report.)
Two Elephants in the Room—and More?
Nuclear war has been at the head of many lists since weapons were detonated over Japan in 1945. Risks are growing with more nuclear nations, more powerful weapons and means of delivery, and increasing tensions between nations. But climate change is now the more obvious “elephant in the room,” due to more frequent heat waves, storms, floods, droughts, and wildfires in recent years, with 2024 as the warmest year on record. Artificial intelligence, which burst into public prominence in 2022, for better and for worse, could become the largest elephant in the room within only a few years, as big US technology firms and several nations (notably China) are investing hundreds of billions of dollars in a race to develop Artificial General Intelligence (AGI), in which machines think for themselves. Many experts fear that AGI without sufficient guardrails will be available in a few years. There are also fears about biotechnology enabled by AI, pandemics (notably from mutation of H5N1 bird flu), pollution, climate tipping points, and the fact that six of the nine planetary boundaries have already been surpassed.
Doom for Whom? Inequitable Risks in Space and Time
Limiting the study of X-risks to doom or catastrophe for all of humanity ignores whose existence will be affected first. Nuclear weapons, of course, can affect everyone. But climate change has graver consequences for some countries than others, and for women, children, poor people, outdoor workers, the disabled, indigenous people, and those living in countries that are already hot and getting hotter. In addition to these spatial distinctions, there are temporal ones: young people should be far more concerned than older people, because their expected multi-decade lifespans are threatened far more than those of seniors with only a decade or so ahead.
Different Terminology
“Risks” are cited far more often than “threats” (a concern of military and intelligence communities), but the two terms overlap in some reports. They are also expressed as “challenges,” “catastrophes,” “global catastrophes,” “global shocks,” and “disruptions on the horizon.” Threats and risks are future-oriented, in contrast to the present-oriented “polycrisis.” All are compound concepts, in contrast to the more frequent thinking on single X-risk issues such as climate, biodiversity, AI, etc.
Different Timelines
Nuclear extinction is quick, near-total, and well understood. Climate impacts are scattered, varied, and not total. Many adverse effects of AI have been suggested, and they may unfold slowly or quickly, depending in part on whether AI is allowed to drive military decisions. Climate tipping points could take decades to appear, and their impacts could lead to local or regional extinction. A pandemic may take a year to travel worldwide and several years to subside. Natural catastrophes such as an asteroid strike or a major volcanic eruption are instant but rare; asteroids can be anticipated, but eruptions generally cannot.
Different Probabilities
A nuclear attack or accident could happen at any time; the policy of Mutually Assured Destruction has lasted for nearly 80 years, but appears less trustworthy in the decades ahead. Climate change is inevitable; the questions are how much and how soon. Tipping points also appear certain, again with the question of how soon. Major natural calamities (earthquakes, supervolcano eruptions, solar flares, asteroid impacts) are possible but cannot be predicted. The sixth extinction of species is underway, but how many species and when is unclear, as are the impacts on humans.
Different Remedies—and Headwinds
Many actions are being taken or could be taken, ranging from broad normative agendas such as the UN’s Sustainable Development Goals to many specific actions to mitigate or adapt to climate change, preserve species, ensure adequate food and water, etc. Unfortunately, there are headwinds: vested interests, disinformation, rising costs, denial, infoglut, lack of funding, immediate national concerns, fear of global government, and hopelessness. Adding to this fragmentation are the more than 80 organizations involved with EXTRA thinking in some way.
Combine X-Risk Negativity with Positive Trends and Proposals
In a world where many already suffer from “eco-anxiety,” a focus on X-risks can be depressing and prone to dismissal as “doom-mongering.” Because X-risks concern future years and decades, they are also easily dismissed as “speculative.” To offset this necessary gloom, X-risks should be paired with positive thinking about the SDGs and/or Human Security for All (HS4A), a potential complementary or successor focus to the SDG Agenda 2030, which will not be met. Emphasis should also be given to cost/benefit analysis and to redirecting finance toward high-priority action on risk reduction and adaptation.
Engage Sustainability Programs in Universities and Schools
Focusing on X-risks promotes sustainability thinking and action. There are numerous organizations and programs to promote sustainability in education, including some 53 PhD programs in 40 institutions (see SSG QuickLook). These programs are likely to expand their horizons to consider the urgency of X-risks.
Lifelong Learning for Needed Transformations
Any transformation to sustainability and greater security will require learning new ideas over the next decades—with AI, or despite it—a challenge for all age groups. Youth groups, particularly when allied, can press for more relevant academic programs and change the minds of adults who resist 21st-century realities. A monthly EXTRA Update Newsletter covering critical new books, articles, and (mainly) reports will be distributed to various organizations and interested individuals.
Human extinction or a major global catastrophe is neither inevitable nor impossible. If it occurs in the next 15 years, it will most likely be due to AGI without adequate guardrails. Without major positive transformations, it becomes probable in the following 30 years (2040-2070), most likely due to climate tipping points, exceeding planetary boundaries, major wars, and/or unbound AGI.