
The Big Risk

The Geological Society of London reported last month that a super-volcanic eruption is five to ten times more likely to occur than a major meteorite impact, and could be just as disastrous.

An area the size of North America could be devastated, and pronounced deterioration of global climate would be expected for a few years following the eruption. Such an eruption could result in the devastation of world agriculture, severe disruption of food supplies, and mass starvation. These effects could be sufficiently severe to threaten the fabric of civilization.

Phil and I recently examined the ten threats to civilization reported by The Guardian. To review, they are:

1. Climate Change
2. Telomere Erosion
3. Viral Pandemic
4. Terrorism
5. Nuclear War
6. Meteorite Impact
7. Robot Takeover
8. Cosmic Ray Blast
9. Super-Volcano
10. Artificial Black Hole

One question that The Guardian left unanswered was, "Which of these risks is the biggest?" Risk equals the probability that a disaster will occur multiplied by the damage it would do if it did; or:

Risk = Probability x Consequences
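
To make that concrete, here is a minimal back-of-the-envelope sketch in Python. The probability and consequence numbers are invented placeholders of my own, not figures from The Guardian or the Geological Society; the point is only to show how the formula would rank the threats.

    # Hypothetical risk ranking: Risk = Probability x Consequences.
    # The probabilities (chance per century) and consequence scores (0-10)
    # are made-up placeholders, purely for illustration.
    threats = {
        "Super-Volcano":    (0.001,     9),
        "Meteorite Impact": (0.0002,    9),
        "Robot Takeover":   (0.01,     10),
        "Cosmic Ray Blast": (0.0000001, 8),
    }

    risks = {name: p * c for name, (p, c) in threats.items()}

    # Print the threats from biggest risk to smallest.
    for name, risk in sorted(risks.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name}: {risk:.2e}")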

While any of these disasters might be sufficient for the task of doing us in, some of these scenarios are more worrisome than others. I completely discount telomere erosion (manageable consequences, and I think zero probability). And the risks of climate change, viral pandemic, terrorism (which is a viral meme), and nuclear war are not likely to result in human extinction were they to occur.

We are a uniquely adaptable species, so climate change is not likely to kill us all. There's no benefit to a virus or to terrorists to wipe out all of humanity. And the risk of global nuclear war seems to have faded as the risk of regional nuclear war has increased.

Physicists have reassured us that the artificial black hole risk is nothing to worry about - zero probability. I don't know whether to be reassured by recent experiments or not. Obviously scientists didn't create a black hole, but they got something totally unexpected - a perfect liquid at a trillion degrees. How certain can these guys be that there is no risk of disaster if they are capable of being completely surprised?

That said, there's no point in arguing the level of this risk with people who know much more about the probability than I do. That leaves "meteorite impact," "robot takeover," "cosmic ray blast," and "super-volcano." If we accept the Geological Society of London assessment (and I've got no reason to doubt it), then the risk of meteorite impact is significantly lower than that of a super-volcano. So we're down to three.

The risk of extinction from a supernova was downgraded in 2003.

Scientists at NASA and the University of Kansas have determined that a supernova would need to be within 26 light-years of Earth to significantly damage the ozone layer and allow cancer-causing ultraviolet radiation to saturate the Earth's surface.

An encounter with a supernova that close happens only about once every 670 million years.

That risk is pretty remote. So, we're down to two great existential risks: robot/AI takeover and super-volcano. Which takes the "Big Risk" title?

Each has much to recommend it.

Ray Kurzweil tells us that the state of the art in A.I. and computer science is perpetually ahead of what the public realizes. We could wake up one day in a world dominated by super-intelligent machines before we could mobilize to stop it.

Not that we could stop it if we tried. If we legislated against super-A.I. here in the U.S., it would simply push its development elsewhere. The potential payoff is too valuable to ignore, and we humans tend to push the envelope regardless of risk or benefit. If it's possible to do, it will be done.

There is no question of "if" with a super-volcano - only "when."

The Earth tends to have super-volcanic eruptions every 100,000 years. The last super-volcano to blow was 74,000 years ago. This doesn't mean we necessarily have 26,000 years until the next huge eruption. The Yellowstone super-volcano tends to blow every 600,000 years. The last eruption was 640,000 years ago. We're overdue.
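
For what it's worth, the "overdue" arithmetic is just elapsed time versus the average repeat interval. A tiny sketch, using only the round numbers quoted above (averages, not predictions):

    # Years past the average repeat interval (negative means not yet "due").
    def years_overdue(average_interval, years_since_last):
        return years_since_last - average_interval

    print(years_overdue(100_000, 74_000))    # global super-eruptions: -26,000
    print(years_overdue(600_000, 640_000))   # Yellowstone: 40,000 past the average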

Time to pick the "Big Risk" champion. Drum roll please.....

And the winner is...

ROBOTIC TAKEOVER!

Why? Both scenarios are pretty grim. But the human race survived the last super-volcano, and I have hope that we can survive the next one - if not on Earth, perhaps in a self-sustaining off-world colony.

But the risk of robotic takeover will be with us wherever we go (see the DUNE novels and Battlestar Galactica). I agree with Kurzweil that our best bet is to be part of the leap.

Comments

Robotic takeover is a "risk" for humanity the way coming down from the trees was a risk for our chimplike ancestors. It's true, those creatures no longer exist. But here we are!

Coming down from the trees was a good risk to take. So, I think, will be joining up with our electronic progeny.

The whole "coming down from the trees" thing looks great from our perspective. I'm sure it was frightening for those early hominids.

Not everyone will be willing to join with AI (whatever that will mean). Unless we are talking about Borg bent on mandatory assimilation, many will choose to be left behind.
