Most new technologies yield huge benefits. But there’s also a chance that one day we discover a technology with more destructive power than we have the wisdom to use safely.
The chance of a civilization-ending nuclear war in the next century is very hard to estimate, but it’s difficult to conclude that it is below 0.3%.
So long as civilization continues to exist, we’ll have the chance to solve all our other problems and have a far better future. But if we go extinct, that’s it.
The most likely outcome is 2-4 degrees of warming, which would be bad, but survivable.
However, these estimates give a 10% chance of warming over 6 degrees, and perhaps a 1% chance of warming of 9 degrees. That would render large fractions of the Earth functionally uninhabitable, requiring at least a massive reorganization of society.
If there’s a 75% chance that high-level machine intelligence is developed in the next century, and a 5% chance of a major disaster given that development, then the overall chance of a major AI disaster is 5% of 75%, which is about 4%.
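The estimate above is a simple chained probability: multiply the chance the technology is developed by the chance of disaster conditional on its development. A minimal sketch, using the two figures from the text (the variable names are illustrative, not from the source):

```python
# Chance that high-level machine intelligence (HLMI) is developed this century.
p_hlmi = 0.75

# Chance of a major AI disaster, given that HLMI is developed.
p_disaster_given_hlmi = 0.05

# Unconditional chance of a major AI disaster this century.
p_disaster = p_hlmi * p_disaster_given_hlmi

print(f"{p_disaster:.4f}")  # 0.0375, i.e. roughly 4%
```

The same structure applies to any of the risks discussed: a probability the scenario arises, times a probability it goes badly if it does.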
We’ll sketch out some ways to reduce these risks, divided into three broad categories:
- Targeted efforts to reduce specific risks
- Broad efforts to reduce risks
- Learning more and building capacity
Read the full article on human extinction at 80,000 Hours.