Is the world actually that vulnerable?
Slight update toward coziness from a week of sporadic work
Recently I've been going over the main risks that could plausibly catastrophically damage humanity. In a lot of cases, we turn out to be surprisingly resilient:
Pandemics, as Dynomight notes, are a battle humanity is expected to win, because viruses are surprisingly defense-dominant threats:
[…] while engineered infectious diseases loom ever-larger as a potential very big problem, we also have lots of crazier tricks we could pull out like panopticon viral screening or toilet monitors or daily individualized saliva sampling or engineered microbe-resistant surfaces or even dividing society into cells with rotating interlocks or having people walk around in little personal spacesuits, and while admittedly most of this doesn’t sound awesome, I see no reason this shouldn’t be a battle that we would win.
There are in fact masks out there that are almost invulnerable to any pathogen, and indoor spaces can be largely protected with far-UV lights and fog machines (1). Pandemics are also a rare occurrence that would likely not get anywhere near the level of lethality required to bring humanity to its knees, especially as pandemics tend to run out of steam on their own, with competing strains undercutting one another.
Coronal mass ejections appeared just as catastrophic as pandemics when I first looked into them, but, as with pandemics, Earth's vulnerability is less than I assumed. A Carrington-scale event in 2026, left unimpeded, would likely blow most major transformers in the world's grids, sending all of civilization into a blackout. That blackout could last from months to years, because building and replacing transformers is slow in 2026. No one knows what blackouts lasting for MONTHS would do to our infrastructure, societies and well-being, but it would likely be very bad.
However, we WOULD impede a Carrington event if it came! We have two satellites able to warn us if one were headed our way, which would give us about 20 minutes to shut our transformers down in a controlled blackout that wouldn't overheat the grids.
New Zealand has shown it could do this in an exquisitely organized manner. Québec may not even need to, because its transformers are plausibly shielded well enough to be quickly repaired if they're fried. No other country is as well prepared, but they all have a fighting chance. And as Operation Warp Speed, America in WW2, and China's industrial rise have shown, it is possible to drastically increase the speed at which a crisis' silver bullet is mass manufactured. We may be able to build and ship new transformers fast enough to severely diminish the time we spend in blackout.
The moral of these stories is that civilization is often more resilient than I had assumed. For most tail risks which are too big for markets to insure against, the answer to “what would happen if it came about” is less “oh we are so deeply unprepared it would be a knockout for humanity” and more “yeah we're 30% prepared, and in a pinch we can whip up another 50% via the indomitable human spirit”.
Even artificial intelligence, which I still think poses by far the greatest risk of deep catastrophe, doesn't slot neatly into “we are so deeply unprepared”. Claude’s Constitution doesn't strike me as the work of a profoundly incompetent civilization. Alignment is going vastly better than expected, according to some researchers who've earned a fair number of Bayes points. The demons we have spun from the ether are far stranger, and less straightforwardly bad and dangerous, than we had assumed in 2014.
My immediate reaction to this is mild frustration. I've been paid to figure out how to avert or mitigate these risks with a tiny budget and good foresight, and it turns out the market is either efficient or, in rarer cases than predicted, inadequate in ways where a small budget budges almost nothing. Overall, though, this is of course good news. As Emmett Shear put it:
There is an internal sensation—like a bat—of safeguarding the world against risks insurance and governments don't cover. It's hard, and it necessarily looks a little apocalyptic from the inside because it requires waking up every day with a giant unsolved high stakes problem hanging above your head for hundreds or thousands of days in a row. But so much of this internal sensation has happened in the past in the subjective experiences of Heroes that we are actually not in a terribly catastrophic situation.
My mom would give a strangled gasp whenever I described what I was working on, and she asked multiple times whether I felt OK thinking about all these horrible catastrophes. The answer is that this has felt surprisingly warm and reassuring.
To take the typical doomer narrative, extend it even as far as 2026, and see how things are going for it: