Key Points
- Events that could cause human extinction or permanent civilizational collapse
- Natural risks: asteroids, supervolcanoes, gamma-ray bursts
- Anthropogenic risks: nuclear war, engineered pandemics, AI
- Toby Ord estimates ~1/6 chance of existential catastrophe this century
- AI increasingly seen as the most significant near-term x-risk
Defining Existential Risk
An existential risk (x-risk) is a threat that could cause human extinction or permanently and drastically curtail humanity's potential. This includes not just events that kill everyone, but those that lock us into a permanently diminished state—a global totalitarian regime that lasts forever, or a catastrophe that prevents us from ever reaching the stars.
The field of existential risk studies was largely founded by philosophers Nick Bostrom and Toby Ord, who argue that preventing extinction should be a top global priority given the stakes involved.
Categories of Risk
Natural risks:
- Asteroid impacts (like the Chicxulub impact that killed the dinosaurs)
- Supervolcanic eruptions that could trigger volcanic winter
- Gamma-ray bursts from nearby stellar events
- Pandemics from natural pathogen evolution
Anthropogenic risks:
- Nuclear war, especially nuclear winter scenarios
- Engineered pandemics (bioweapons or lab accidents)
- Artificial intelligence that pursues goals harmful to humanity
- Climate change severe enough to cause civilizational collapse
- Nanotechnology-enabled weapons
The Current Risk Landscape
Toby Ord, in his 2020 book The Precipice, estimated roughly a 1 in 6 chance of existential catastrophe this century. His breakdown:
- Unaligned AI: ~10%
- Engineered pandemics: ~3%
- Unforeseen anthropogenic risks: ~3%
- Nuclear war: ~0.1%
- Climate change: ~0.1%
- Natural risks: <0.1%
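Taken at face value, these per-risk figures roughly add up to the headline 1 in 6. As a quick sanity check, here is a minimal sketch; treating the categories as simply additive and disjoint is an assumption made for illustration, and Ord's own aggregate is not a naive sum.

```python
# Rough sanity check on Ord's per-risk estimates (The Precipice, 2020).
# Treating the categories as additive and disjoint is a simplification.
ord_estimates = {
    "unaligned_ai": 0.10,               # ~1 in 10
    "engineered_pandemics": 0.033,      # ~1 in 30
    "unforeseen_anthropogenic": 0.033,  # ~1 in 30
    "nuclear_war": 0.001,               # ~1 in 1,000
    "climate_change": 0.001,            # ~1 in 1,000
    "natural_risks": 0.0001,            # < 0.1% combined
}

total = sum(ord_estimates.values())
print(f"Naive sum: {total:.3f} (~1 in {1 / total:.0f})")
# Naive sum: 0.168 (~1 in 6)
```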
These estimates predate the rapid AI advances of 2023-2026. Many researchers have since revised their AI-specific estimates upward as frontier model capabilities advanced faster than expected, and AI's dominance in existential risk assessments has only grown.
Why X-Risk Matters Disproportionately
If humanity survives, our potential future is vast—trillions of people living across billions of years, perhaps spreading throughout the galaxy. From this perspective, even a small reduction in extinction risk has enormous expected value.
This is why researchers argue existential risk reduction should be a priority even if the probability of any particular risk is low. The stakes are simply too high.
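To make the expected-value argument concrete, here is a minimal sketch with purely illustrative numbers; both the future-lives figure and the size of the risk reduction are assumptions, not estimates from the literature.

```python
# Illustrative expected-value calculation for reducing extinction risk.
# Both inputs are hypothetical, chosen only to show the structure of the argument.
future_lives = 1e16      # assumed number of future lives if humanity survives
risk_reduction = 1e-4    # assumed 0.01 percentage-point cut in extinction risk

# Expected lives saved = probability of averting catastrophe * value at stake
expected_lives_saved = risk_reduction * future_lives
print(f"Expected lives saved: {expected_lives_saved:.1e}")  # 1.0e+12
```

Even with far more conservative inputs, the product remains enormous, which is what drives the prioritization argument.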
What Can Be Done
Reducing existential risk involves:
- Technical research on AI alignment and biosecurity
- Improving international coordination and governance
- Building resilient systems that can survive catastrophes
- Preserving knowledge and capabilities for civilizational recovery
- Monitoring for emerging risks and responding quickly