Irregular Webcomic!

No. 3661 | 2017-05-18

Comic #3661

1 Simon: Actually, the space mites are a major existential threat. And so is Cthulhu.
2 Simon: And we all know the best way to fight fire is with fire, so...
3 Terry: This is what you learn at the Centre for the Study of Existential Risk? Playing one risk against another and hoping they cancel?
4 Simon: Well, normally we do it on a game board, but it's much more fun in reality.

This strip's permanent URL: http://www.irregularwebcomic.net/3661.html

Now that I mention it... Existential Risk would be a great name for a board game. I wonder if I can sell the idea to Hasbro for another Risk variant, officially licensed by the Centre for the Study of Existential Risk at Cambridge University.

Existential risks are defined as hypothetical events which could potentially cause the extinction of humanity.

There are surprisingly many of them.

Let's go roughly from largest to smallest. Firstly, according to some models of the universe, it's possible that our visible universe - everything we can see, out to a light travel distance of almost 14 billion light years - is merely a metastable bubble within a larger universe, in which the apparent vacuum of space is in fact what is called a false vacuum. This is essentially a region where the energy level of the vacuum is a local minimum (meaning that if you change the vacuum in any way, you need to supply external energy, so it's difficult to do), but not as low as it could possibly be. It could exist in a lower energy state, but it can't easily get there. One way it could get there, though, is by quantum tunnelling, a quantum mechanical effect in which things can jump from one energy state to another without needing to pass through the states in between. This is highly improbable, but not zero probability, so it could happen. If it's true that our universe is in such a state, and such a quantum tunnelling event happens, then the results would be ... well, "catastrophic" is far too mild a term. Basically, our entire visible universe would be destroyed as it collapsed to a lower energy state.

The good news is the odds of this happening are pretty low. If anything would be likely to trigger such a quantum tunnelling event, it might be ultra-high energy particle events, as these provide highly localised concentrations of energy that can serve to lower the energy barrier for a tunnelling event. We know that cosmic events such as black hole collapses and collisions can produce particles with energies around 10^18 electronvolts (eV), so presumably anything up to that energy is safe, because it hasn't destroyed the universe yet. The biggest particle accelerators on Earth, such as the Large Hadron Collider, can only produce particles with energies up to 10^12 eV, so we're a factor of a million below that. So far.[1]
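
For the numerically inclined, here's a back-of-the-envelope check of that "factor of a million" in Python, using only the rounded figures quoted above (a sketch for illustration, not precise measured values):

    import math

    # Figures as quoted above (rounded, for illustration only).
    cosmic_ray_energy_eV = 1e18   # most energetic particles from astrophysical events
    collider_energy_eV = 1e12     # rough scale of the biggest human-made accelerators

    # How far below the "known safe" energy are our machines?
    margin = cosmic_ray_energy_eV / collider_energy_eV
    print(f"Safety margin: a factor of {margin:.0e}, "
          f"or about {math.log10(margin):.0f} orders of magnitude")
    # Safety margin: a factor of 1e+06, or about 6 orders of magnitude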

Moving down in scale from the entire universe, there are many astronomical events that produce enormous amounts of radiation and high energy particles. The aforementioned black hole formations and collisions, stars blowing up in supernova explosions, neutron star collisions, gamma ray bursts, and so on. If any of these were to happen close enough to Earth - and by close enough I mean within a few hundred light years - the radiation could well be intense enough to have major effects on life on Earth: damaging DNA, causing cancers and fatal mutations, and causing extinctions of some or many species. We don't expect any such events within the danger zone any time in the foreseeable future, but there's always a chance for something surprising.

Closer to home still, the next big potential threat is contact with an advanced alien civilisation. From our own history as a species, we know that contact between civilisations with different technology levels usually goes pretty poorly for the lower one. In the case of alien contact, that would be us. We can only hope that if aliens have the technology to travel to Earth, they also have the philosophical ideal of benevolence - plus the wisdom to apply it without accidentally destroying us. Otherwise we're toast.

Moving to within our own solar system, we come to a threat that ranks as one of the more likely ones in this list. We know that large asteroids have hit Earth in the past, causing major extinction events, and it's virtually inevitable that they will in the future. The only question is: "How soon?" The answer is: We don't know. Asteroids a kilometre in size hit Earth roughly once every half million years. One could hit this year, with virtually no warning. We'd probably spot an approaching asteroid of that size a few days or weeks ahead of the impact, but that's nowhere near enough time to launch any sort of means of dealing with it.
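
To put "once every half million years" into perspective, here's a minimal Python sketch that converts the quoted rate into a rough chance per time window. It assumes impacts arrive randomly and independently (a Poisson process), which is an illustrative assumption rather than a statement about the actual impact record:

    import math

    mean_interval_years = 500_000   # quoted average interval between ~1 km impacts

    def impact_probability(window_years):
        # Probability of at least one impact within the given window,
        # assuming impacts arrive as a Poisson process at the quoted rate.
        rate = 1.0 / mean_interval_years   # expected impacts per year
        return 1.0 - math.exp(-rate * window_years)

    for window in (1, 100, 10_000):
        print(f"Chance of a ~1 km impact in the next {window} years: "
              f"{impact_probability(window):.4%}")
    # Roughly 0.0002% this year, 0.02% this century, and about 2% over the next ten millennia.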

Fortunately, an asteroid a kilometre across will probably only destroy whatever country or continent it hits, or cause tsunamis that kill millions of people if it lands in an ocean. It probably won't render humanity extinct. For that you need an asteroid about 10 kilometres across, like the one that struck 66 million years ago, at the end of the age of the dinosaurs. That's much less likely to happen any time soon, and we'd probably detect the object years in advance... and might be able to do something about it. Maybe.

Contracting down to Earth itself, we have a geologically active planet under our feet, and there are possibilities for enormous volcanic eruptions, or giant earthquakes that cause huge tsunamis. One hypothesis even suggests that the Toba eruption 75,000 years ago already came very close to wiping out humanity. A supervolcano eruption like this, possibly in the Yellowstone caldera, would not kill everyone immediately, but would eject so much matter into the atmosphere that it would cool the world for a decade or more, causing crop failures and mass famine which could destroy global civilisation.

Speaking of cooling the planet, we come to our first human-caused existential risk, and possibly the most likely one on this list: global warming. At our current rate of adding carbon dioxide to the atmosphere, we are on track for disastrous levels of climatic warming within the next hundred years or so. A few degrees doesn't sound like much, but it's enough to disrupt crops and ecosystems, and raise sea levels significantly, with terrible knock-on effects: famine, human displacement, geopolitical instability, wars. And then there's the chance that Earth will reach a tipping point and enter a positive feedback loop, with the increased warmth releasing even more greenhouse gases, such as methane from thawing clathrates, leading to a runaway greenhouse effect which would almost surely see most species on Earth wiped out.

Another human-caused risk which seems frighteningly plausible given the current developments with North Korea is nuclear war. Even a so-called limited exchange of nuclear weapons could create large-scale firestorms that trigger a nuclear winter, with similar crop failure and famine outcomes to a supervolcano.

Next come environmental disasters. These could be triggered by any number of factors. Overuse of pesticides, or indeed infestations of mites, might wipe out bees, and thus the crops they pollinate. Overfishing could disrupt oceanic food chains, wiping out the sea as a food resource. Deforestation or water pollution could affect some species so badly that extinctions cascade through the ecosystem as predators lose food sources. Desertification and soil salinity could again cause crop failures.

High technology could also be dangerous. Artificial intelligence, if it reaches a point rivalling or exceeding the intelligence of humanity, could decide it's better off without us and, evolving more rapidly than we can, quickly outpace any means we have of fighting it. Nanotechnology could go wild if a self-replicating form is ever produced, reducing the entire Earth to grey goo. And biotechnology could generate frightening new diseases and toxins that could escape into the wild and spread around the planet.

Not that we have to engineer new diseases. Plenty of old fashioned natural diseases have the potential to mutate into forms that we can't fight, unleashing a global pandemic that kills virtually everyone. Especially now that we've been over-reliant on antibiotics for so long, and many pathogens have evolved resistance to virtually every antibiotic we have.

This may seem a depressing list (and I've skipped some other things which could have gone on it), but we are a resourceful species and we've survived this long. And there are some very clever people working on solving all of these potential problems and their risks, both individually in their respective fields of expertise, and by considering existential risk itself as a field of study. The latter include the Centre for the Study of Existential Risk at Cambridge University, the Future of Humanity Institute at Oxford University, and the Future of Life Institute in Boston.

It's only in the last hundred years or so that we've really been able to understand and appreciate the nature of risks to our planet and species. So we're in the early stages of studying existential risk, and presumably have a lot to learn. With some careful thinking and a bit of luck, hopefully we'll pull through this early stage of our growth and emerge a wiser and more careful species.

But what is it about existential risk that is simultaneously so fascinating and so frightening? These are just my own thoughts, but I suspect it's related to reasons we find danger and thrills and horror so interesting. We have an adrenal system that primes our bodies for the classic fight-or-flight response when faced with potential danger. But many of us in the modern world don't really face realistic danger much. I can't remember the last time I faced anything genuinely dangerous - maybe that time five or so years ago when a taxi carelessly reversed into my car, and even that was more danger to my property than to me personally.

Many of us make up for this relative lack of excitement in our lives by seeking out thrilling experiences. Roller coasters and other thrill rides. Bungee jumping or skydiving. Rock climbing. Playing sports. Watching horror movies. All of these things are relatively safe, but can still give us that adrenaline high because our bodies react to rapid, sudden, or unusual stimuli even when our minds know we're safe.

The existential risks mentioned above, as well as others we might think about, are not that likely to kill us in the immediate future. The odds are non-zero, but close enough to zero for us to be able to think about them relatively dispassionately, without immediately panicking. Still, there's a bit of an odd thrill that comes from thinking that an asteroid might smash into Earth tomorrow, or a giant volcano might erupt. It's interesting to think about as a hypothetical. What would you do? Where would you go?

Another reaction to the thought of these scenarios is humour. Humour is a refuge many people take to defuse nervousness or to make the unbearable bearable. How can I get away with making jokes about existential risk? Because although serious, these risks are not so immediately threatening that they have to be treated seriously at all times. They also provide complex intellectual scenarios that can form the scaffolding for the clever references and surprising links that are the backbone of humour: things that are unexpected yet somehow relevant. In other words, existential risks are almost perfect joke material.

Existential risk scenarios can also come across as slightly outlandish. So a possible reaction might go along the lines of: "Heh, that's a bit of a silly thing to think about! As if... oh wait... You're serious?" It's a short step from here to having people genuinely laugh about them - perhaps even the researchers who consider them more seriously than anyone else. Whenever people get together with shared experiences, jokes are sure to follow. Just think about how many jokes you make with your co-workers about the office, or computer programming or marketing or whatever your job is about - or with your fellow students about the topics you're studying. Now imagine the people working at the Centre for the Study of Existential Risk making the same sorts of jokes about humanity being wiped out by an asteroid or an unstoppable pandemic. Yeah, you bet they do!

At least, I hope they do. Their field of study is serious enough that they could use the comic relief.


[1] This was the source of part of the scare you might have seen in the media just before the LHC began operating, about it potentially having the ability to destroy the universe. In fact, if we continue building bigger and more powerful particle colliders at the same rate as we've been improving them, we should reach energies around 10^18 eV around the year 2150. The scientists alive at that time will need to make some careful decisions about whether they want to generate energy densities higher than any we have observed in the universe around us...
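
For what it's worth, here's a small Python sketch of that extrapolation. The growth rate is reverse-engineered from the figures in this footnote (roughly a factor of ten in energy every couple of decades), so treat it as an illustration of the reasoning rather than a forecast:

    import math

    start_year = 2017
    start_energy_eV = 1e12    # rough scale of today's biggest colliders (as quoted above)
    target_energy_eV = 1e18   # scale of the most energetic natural particle events

    # Assumed growth: one factor of ten in energy roughly every 22 years, which is
    # what it takes to cover the remaining factor of a million by around 2150.
    years_per_factor_of_ten = 22

    orders_remaining = math.log10(target_energy_eV / start_energy_eV)
    year_reached = start_year + orders_remaining * years_per_factor_of_ten
    print(f"Colliders would reach {target_energy_eV:.0e} eV around {year_reached:.0f}")
    # Colliders would reach 1e+18 eV around 2149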


Note: This annotation was inspired by reader Dr Simon Beard (major Kickstarter backer and inspiration for the character of the same name in this very comic!), who requested an essay on the topic of existential risk as part of his Patreon supporter reward. If you'd like me to write an extended annotation on any topic you care to name, or if you just want to show some support for the comics and other creative work I share, please consider becoming a patron.

© 2002-2024 David Morgan-Mar. This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 International Licence. dmm@irregularwebcomic.net