A big blast
Abstract: Scientific and technological progress might change people’s capabilities or incentives in ways that would destabilize civilization. For example, advances in DIY biohacking tools might make it easy for anybody with basic training in biology to kill millions; novel military technologies could trigger arms races in which whoever strikes first has a decisive advantage; or some economically advantageous process may be invented that produces disastrous negative global externalities that are hard to regulate. This paper introduces the concept of a vulnerable world: roughly, one in which there is some level of technological development at which civilization almost certainly gets devastated by default, i.e. unless it has exited the ‘semi-anarchic default condition’. [...] A general ability to stabilize a vulnerable world would require greatly amplified capacities for preventive policing and global governance. The vulnerable world hypothesis thus offers a new perspective from which to evaluate the risk-benefit balance of developments towards ubiquitous surveillance or a unipolar world order. -- Nick Bostrom, The Vulnerable World Hypothesis, 2019
Modern industrial civilization rests on a tightly coupled global system that is far more fragile than its everyday normality, and our comfortable beliefs about it, suggest. Bomb shelters would offer only temporary refuge: once supplies ran out, the ugly reality of civilizational collapse would start to bite. The Vulnerable World Hypothesis of existential risk scholar Nick Bostrom (Future of Humanity Institute, University of Oxford) describes this situation and its global vulnerabilities, such as famine. Continuing technological progress in a system with weak global governance creates conditions in which civilization is probably destroyed within a few years once certain destructive human forces spin out of control. Bostrom reasonably argues that preventing potential extinction is a moral priority now.
Fragility of modern civilization
Bostrom’s “Vulnerable World Hypothesis” imagines technological progress as drawing balls from an urn, where a single “black ball” technology (a cheap, widely accessible means of mass destruction) can render ordinary levels of social control insufficient to prevent civilizational breakdown. In such a world, the combination of high complexity, global interdependence, and increasing offensive capability (nuclear and otherwise) means that the reasonable default expectation is systemic failure, unless unprecedented forms of global coordination or control are achieved. Even short‑term survivors of an all‑out nuclear war could be pushed back to “stone‑age conditions,” with no guarantee that recovery to an advanced state is possible. Bomb shelters become useless the moment their occupants are forced to leave them to keep surviving.
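The urn metaphor can be made concrete with a toy probability sketch. The per-draw risk p below is an arbitrary illustrative assumption, not a figure from Bostrom's paper; the point is only that even a tiny independent chance of a black ball per invention compounds toward near-certainty over enough draws.

```python
# Toy sketch of Bostrom's urn-of-inventions metaphor.
# Assumption (illustrative, not from the paper): each new technology
# is independently a "black ball" with small probability p.

def prob_no_black_ball(p: float, n: int) -> float:
    """Probability that n draws from the urn contain no black ball."""
    return (1.0 - p) ** n

# Even a tiny per-invention risk compounds over many draws.
for n in (10, 100, 1000):
    print(f"after {n:4d} draws, chance of no black ball: "
          f"{prob_no_black_ball(0.001, n):.3f}")
```

With p = 0.001 the chance of having avoided a black ball after a thousand inventions has already fallen to roughly a third, which is the intuition behind treating continued drawing as dangerous by default.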
Psychological reluctance to face harsh outcomes
Many existential risk assessments are flawed because humans are flawed: we find it hard to face harsh realities like civilizational collapse. Some of the clearest accounts of why academics and policymakers understate horrible possibilities come from scholars studying global catastrophic risks. In the opening chapter of Global Catastrophic Risks, Bostrom and Ćirković summarize a large literature on cognitive biases, such as availability, scope neglect, overconfidence, confirmation bias, and the affect heuristic, all of which systematically distort expert judgment about low‑frequency, high‑impact events. They note that when cataclysmic endings are at stake, a distinctive apocalyptic psychocultural mindset tends to appear: experts evince either irrational enthusiasm for mass‑scale catastrophe or, more commonly in respectable institutions, denial, fragmentation, and a refusal to follow scenarios all the way to their logical endpoints. This dovetails with empirical work on nuclear‑war psychology finding widespread avoidance and numbing in the face of annihilation‑scale threats. Link 1, link 2, link 3
Likely death tolls from collapse
Recent climate–crop models give a concrete sense of the stakes. A 2022 study led by Rutgers researchers, modeling a full‑scale U.S.–Russia nuclear war, found that soot‑driven cooling would reduce global average caloric production by about 90% within three to four years. Their results suggest that, on top of hundreds of millions of direct casualties, over 75% of humanity would be starving within two years and more than 5 billion people would ultimately die of starvation. Even the smallest scenarios modeled, e.g. regional nuclear exchanges, produce food shocks exceeding any anomaly recorded in United Nations Food and Agriculture Organization data, with catastrophic disruption of global food markets. Bostrom classifies such events as candidates for existential risk not only because of their immediate death toll, but because the survivors may be trapped at permanently lower levels of complexity and productivity, unable to recreate the scientific and industrial base required for long‑term flourishing. Link 4, link 5, link 6, link 7, link 8
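A back-of-envelope check shows how those figures relate. The ~8 billion world population used below is a round assumption for illustration, not a number from the study:

```python
# Rough sanity check on the 2022 nuclear-winter famine figures.
# Assumption: world population of ~8 billion (round figure, not from the study).
population = 8.0e9
starvation_deaths = 5.0e9          # "more than 5 billion" starvation deaths
starving = 0.75 * population       # "over 75% starving within two years"

print(f"fraction dead from starvation alone: {starvation_deaths / population:.1%}")
print(f"people starving within two years:    {starving / 1e9:.0f} billion")
```

On these assumptions, starvation deaths alone account for over 60% of humanity, with the larger "starving" share (6 billion people) explaining how the eventual toll climbs well past that.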
In other words, once the complex systems of modern civilization shut down, survivors will not be able simply to restart them. They will have to rebuild civilization, pretty much from scratch, and that could take decades, maybe a century or two.
A civilizational collapse in which on the order of 90–95% of humans die within months to a few years is not an extravagant outlier estimate. It is close to the median of serious model‑based scenarios for large‑scale nuclear war and related global shocks. The real outlier, as Bostrom and others imply, may be our collective insistence on treating such outcomes as too extreme to discuss clearly and honestly.
Q: Are we morally obligated to try to avoid polluting, poisoning or blowing ourselves and civilization to smithereens, or does the great philosopher Alfred E. Neuman have it about right?