August 16, 2020

The Precipice by Toby Ord

The Big Idea: The chance of an existential catastrophe in the next hundred years is about 1 in 6. Man-made risks far outweigh natural risks. We need global cooperation, continued work, and deep reflection to prevent the end of our species and to reach our true potential.

INTRODUCTION

Safeguarding humanity’s future is the defining challenge of our time.

CH 1 STANDING AT THE PRECIPICE

Given everything I know, I put the existential risk this century at around one in six.

CH 2 EXISTENTIAL RISK

The world is just waking up to the importance of existential risk.

Management of existential risk is best done at the global level. But the absence of effective global institutions for doing so makes it extremely difficult.

CH 3 NATURAL RISKS

ASTEROIDS & COMETS

The existential risk from an asteroid impact has been studied in great detail and shown to be vanishingly low. Astronomers have succeeded so well in tracking asteroids that it may be time to switch some of their attention to comets.

SUPERVOLCANIC ERUPTIONS

Supervolcanic eruptions have occurred in the past, but they are extremely difficult to predict. There is very little known about how to prevent or delay an impending supereruption. Even the largest eruptions would be very unlikely to lead to extinction or unrecoverable collapse.

STELLAR EXPLOSIONS

Existential risk from a supernova is very small.

Another ice age would cause significant difficulties for humanity, but is effectively ruled out over the next thousand years.

CH 4 ANTHROPOGENIC RISKS

We face about a thousand times more anthropogenic risk over the next century than natural risk.

NUCLEAR WEAPONS

The world’s major crops would fail, and billions could face starvation in a nuclear winter.

CLIMATE CHANGE

The most extreme climate possibility is known as a “runaway greenhouse effect,” where warming continues until the oceans have mostly boiled off.

The best I can say is that when accounting for all the uncertainties, we could plausibly end up with anywhere up to 13°C of warming by 2300.

Warming at such levels would be a global calamity of unprecedented scale. Major effects of climate change include reduced agricultural yields, sea level rise, water scarcity, increased tropical disease, ocean acidification, and the collapse of the Gulf Stream.

None of these threaten extinction or unrecoverable collapse. The direct existential risk from climate change appears very small, but cannot yet be ruled out.

ENVIRONMENTAL DAMAGE

While possibly calamitous and painful, loss of biodiversity and resource scarcity don’t appear to pose any direct risk of destroying our potential.

However, nuclear war, climate change and environmental damage each probably pose an existential risk that is higher than that of all natural existential risks put together.

CH 5 FUTURE RISKS

PANDEMICS

During the twentieth century, fifteen countries are known to have developed bioweapons programs.

Since biotechnology (CRISPR, gene drives) can be misused to lethal effect, its democratization also means proliferation.

The escape of a pandemic pathogen from a laboratory is only a matter of time.

Most of the remaining existential risk would come from the threat of permanent collapse: a pandemic severe enough to collapse civilization globally.

UNALIGNED ARTIFICIAL INTELLIGENCE

Asked when an AI system would be “able to accomplish every task better and more cheaply than human workers,” on average experts estimated a 50 percent chance of this happening by 2061 and a 10 percent chance of it happening as soon as 2025.

There is good reason to expect a sufficiently intelligent system to resist our attempts to shut it down: being switched off would stop it from achieving its goals.

AI may also help make our long-term future brighter than anything that could be achieved without it.

AI progress may come very suddenly: through unpredictable research breakthroughs, or by rapid scaling-up of the first intelligent systems.

DYSTOPIAN SCENARIOS

Unrecoverable dystopia means a world with civilization intact, but locked into a terrible form.

Examples: a totalitarian state, environmental degradation, a world that completely renounces technological progress, a world ruled by a single fundamentalist religion.

OTHER RISKS

Nanotechnology that allows small-scale production of new weapons of mass destruction.

Space exploration that results in back contamination of Earth.

Malevolent alien civilizations visiting Earth.

Radical scientific experiments with consequences we can’t predict.

CH 6 THE RISK LANDSCAPE (EXISTENTIAL CATASTROPHE)

Chance of existential catastrophe within the next 100 years:

Asteroid or comet impact: 1 in 1,000,000

Supervolcanic eruption: 1 in 10,000

Stellar explosion: 1 in 1,000,000,000

Nuclear war: 1 in 1,000

Climate change: 1 in 1,000

Other environmental damage: 1 in 1,000

Naturally arising pandemics: 1 in 10,000

Engineered pandemics: 1 in 30

Unaligned artificial intelligence: 1 in 10

Unforeseen anthropogenic risks: 1 in 30

Other anthropogenic risks: 1 in 50

I think the chance of an existential catastrophe striking humanity in the next hundred years is about one in six.
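These estimates are order-of-magnitude judgments, and they do not simply sum to the one-in-six total: Ord treats the risks as overlapping rather than independent. As a minimal sketch of the arithmetic (Python; grouping asteroid, supervolcano, and stellar explosion as the "natural" risks is my reading of Chapter 3, not Ord's code), here is how the figures would combine under a naive independence assumption:

```python
from functools import reduce

# Ord's per-risk estimates of existential catastrophe
# within the next 100 years, as listed above.
risks = {
    "asteroid or comet impact": 1 / 1_000_000,
    "supervolcanic eruption": 1 / 10_000,
    "stellar explosion": 1 / 1_000_000_000,
    "nuclear war": 1 / 1_000,
    "climate change": 1 / 1_000,
    "other environmental damage": 1 / 1_000,
    "naturally arising pandemics": 1 / 10_000,
    "engineered pandemics": 1 / 30,
    "unaligned artificial intelligence": 1 / 10,
    "unforeseen anthropogenic risks": 1 / 30,
    "other anthropogenic risks": 1 / 50,
}

# If the risks were independent, the chance of surviving all of them
# would be the product of the individual survival probabilities.
p_survive = reduce(lambda acc, p: acc * (1 - p), risks.values(), 1.0)
p_total = 1 - p_survive
print(f"Total risk assuming independence: {p_total:.3f}")  # ~0.178, about 1 in 5.6

# Natural risks alone (assumed grouping, per the Chapter 3 headings).
natural = ("asteroid or comet impact", "supervolcanic eruption", "stellar explosion")
p_natural = 1 - reduce(lambda acc, k: acc * (1 - risks[k]), natural, 1.0)
print(f"Total vs natural risk: ~{p_total / p_natural:,.0f}x")  # on the order of 1,000x
```

The naive combination lands near 1 in 5.6, slightly above Ord's one in six precisely because he does not assume independence, and the total-to-natural ratio comes out on the order of a thousand, consistent with the thousand-fold claim in Chapter 4.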

COMBINING AND COMPARING RISKS

Spend the resources allocated to existential risk in such a way as to reduce total risk by the greatest amount.

Prioritize risks that will not come with a warning shot, where little effort has already been made, and where we can reduce the risk the most.

Reduce risk indirectly by preventing wars, by promoting global cooperation, by building new institutions, by improving education.

Start early. Focus resources.

CH 7 SAFEGUARDING HUMANITY

We can deliberately choose to have no catastrophes at all. — Isaac Asimov

GRAND STRATEGY FOR HUMANITY

  1. Reaching Existential Security: prevent fires
  2. The Long Reflection: decide what future we want for humanity
  3. Achieving Our Potential: perhaps includes space exploration and settlement

RISKS WITHOUT PRECEDENT

We can’t rely on our current intuitions. We cannot afford to fail even once.

INTERNATIONAL COORDINATION

Safeguarding humanity is a global public good. We need international coordination and everyone needs to share the costs.

TECHNOLOGICAL PROGRESS

Our technology has progressed faster than our collective wisdom. Our civilization needs to mature and grow wiser.

WHAT YOU CAN DO

Help organizations working on existential risk, donate money to the cause, and participate in the public conversation about the future of humanity.

CH 8 OUR POTENTIAL

DURATION

Human history so far has seen 200,000 years of Homo sapiens and 10,000 years of civilization.

On average, mammalian species last about one million years.

Within 100,000 years, the Earth should be almost fully recovered from the climate damage.

Eventually, our species will be succeeded by another. Perhaps that evolution will involve deeply implanted technology.

Without the carbon dioxide from volcanoes, scientists estimate that in about 800 million years photosynthesis will become impossible in 97 percent of plants, causing an extreme mass extinction.

In 8 billion years our Sun itself will die.

SCALE

The best reason to settle other planets in other solar systems is to achieve some additional protection from existential risks.

The biggest challenge will be surviving on Earth for the century or two until such settlement becomes technologically feasible.

If we could travel just six light years at a time, then almost all the stars of our galaxy would be reachable.

QUALITY

Human life is on the whole much better today than ever before.

Human civilization has probed only a tiny fraction of what is possible.

CHOICES

It’s time to reflect on our future.

Imagine people 10,000 years ago, sowing their first seeds and reflecting upon what opportunities agriculture might enable.