OUR INNER NEED TO FEAR
Posted on: September 20, 2021
THIS POST IS A HISTORICAL REVIEW OF THE NEED OF THE HUMANS TO FEAR IMAGINARY HORRORS
RELATED POST ON CATASTROPHISM: LINK: https://tambonthongchai.com/2021/11/08/the-catastrophism-of-the-humans/
HISTORICAL BACKGROUND:
WE HUMANS ARE SUPERSTITIOUS CREATURES. IN MODERN TIMES THE ROLE OF SUPERSTITION HAS BEEN MODERATED BY SCIENTIFIC ADVANCES, BUT IT STILL PLAYS A ROLE IN OUR UNDERSTANDING OF OUR SURROUNDINGS, PARTICULARLY WHEN RISK AND DANGER ARE INVOLVED. IN JUST THE LAST TWO CENTURIES, WE HUMANS HAVE PROGRESSED FROM A RATE OF INFANT MORTALITY WHERE FEWER THAN HALF OF ALL CHILDREN BORN MADE IT TO ADULTHOOD TO A STATE WHERE INFANT MORTALITY IS UNKNOWN EXCEPT IN RARE AND UNUSUAL CIRCUMSTANCES. AT THE SAME TIME, AVERAGE LIFE EXPECTANCY ROSE FROM 30 TO 70 YEARS, AND OUR VIEW OF THE WORLD AND OUR PLACE IN IT EXPANDED SIGNIFICANTLY FROM SUPERSTITION, WITCHCRAFT, AND RELIGION TO DATA COLLECTION, SCIENTIFIC INVESTIGATION OF NATURAL PHENOMENA, AND MEDICAL CARE. AS RECENTLY AS TWO CENTURIES AGO WE LIVED IN FEAR OF NATURE'S WRATH, UNABLE TO RESPOND TO THE EXTREMES OF NATURE THAT INCLUDED BOTH CLIMATE AND WILDLIFE. THE IMPLICATION OF THIS HISTORY IS THAT WE HAVE ONLY RECENTLY EMERGED FROM LIFE IN THE UNKNOWN, FACING UNCERTAINTY, RISK, DANGER, AND THE CRISES OF NATURE THAT WE COULD NOT COMPREHEND, NOT FORECAST, AND NOT COPE WITH, AND THAT COULD WIPE OUT ENTIRE FAMILIES AND ENTIRE COMMUNITIES. THIS PROGRESS IN SCIENCE, MEDICAL CARE, AND EDUCATION HAS BEEN SO RAPID THAT OUR INNER SENSE OF THE WORLD AROUND US, ROOTED IN SUPERSTITION AND RELIGION, HAS NOT COMPLETELY CAUGHT UP WITH THAT REALITY, AND OUR SUPERSTITIONS ABOUT NATURE AND OUR FEAR OF NATURE HAVE NOT CAUGHT UP WITH OUR NEW SCIENCE AND KNOWLEDGE ABOUT NATURE.
THIS MAKES IT POSSIBLE THAT EVEN TODAY, IN THE AGE OF SCIENCE AND KNOWLEDGE, OUR INNER FEAR OF THE UNKNOWN STILL PLAYS A ROLE IN HOW WE INTERACT WITH NATURE, SUCH THAT WE OFTEN FIND OURSELVES IN FEARFUL SITUATIONS THAT ARE CREATIONS OF OUR IMAGINATION OR OF SUGGESTIONS OF DANGER LURKING IN OUR FUTURE. AS WE DEMONSTRATE WITH THE CITATIONS BELOW, THIS RECENT CONDITION OF THE HUMAN RACE MAKES IT EASY TO CREATE FEAR OF WHAT IS TO COME IN THE FUTURE WITH A WELL KNOWN ACTIVISM TOOL CALLED FEAR APPEAL. LINK TO FEAR APPEAL: https://tambonthongchai.com/2021/07/22/fear-appeal-in-climate-change/
CITATION#1: SOURCE: A HISTORY OF CATASTROPHIC THINKING BY DAVID SEPKOSKI: LINK: https://las.illinois.edu/news/2020-11-23/history-catastrophic-thinking
A global pandemic, wildfires, and hurricanes have made 2020 a year for catastrophic thinking, so a new book with that title seems appropriate. But don’t mistake David Sepkoski’s “Catastrophic Thinking” as a doomsday warning about the future. It’s focused instead on how we got here – how our current-day concerns regarding threats to both the planet and the human race came to be. Sepkoski travels through two centuries of history, “from Darwin to the Anthropocene,” to examine how science and culture have interacted to shape those views, especially on extinction and the value of diversity. One example came in the 1980s, when a new scientific theory that the dinosaurs’ extinction resulted from a massive meteor strike coincided with rising public anxiety over nuclear annihilation.
Sepkoski is the Thomas M. Siebel Chair in History of Science at the University of Illinois Urbana-Champaign, and focuses much of his research on how science influences, and is influenced by, the larger society. Contrary to some perceptions that science stands apart, he said, “Scientific developments are interwoven into the culture, politics, and ideologies of the time.” That’s illustrated early in Sepkoski’s book in describing the contention over two theories of extinction during the 1800s. One, championed by Charles Darwin, an Englishman, said species died out slowly and continuously over time. The other, promoted by Georges Cuvier, a Frenchman, theorized that species came to an end primarily through large catastrophic events. Each drew on scientific evidence, but each also was influenced by politics and their support or distrust of political revolution. Likewise, the science and culture of the 1800s had little notion of the inherent value of biological or cultural diversity, Sepkoski said. People worried about the loss of a given “charismatic” species, but not about its effect on the stability of ecosystems or the larger environment. The end of a species or culture was more likely to be viewed as part of a natural process. “The notion that diversity has inherent value is treated self-evidently by many scientists today, as is cultural diversity by many others,” Sepkoski said. “But the emergence of that idea has been very complex, involving interactions over time between people in diverse realms of human inquiry, from genetics and ecology to the humanities, anthropology and other disciplines. “In other words, it didn’t just materialize out of thin air. 
Nor was it an inevitable way, or even a natural way, of thinking for many people.” Sepkoski traces the evolution of these ideas through the 20th century in the context of trends in society at large – from the onset of a modernist pessimism in Europe early in the century and through the world wars, to Cold War anxieties around nuclear proliferation and what has come since. He makes it personal by noting the influence of his paleontologist father, who was involved in the new theory on dinosaur extinction, and describes the impact on him as a child watching the TV film “The Day After,” depicting the aftermath of a nuclear war. “What happens in the second half of the 20th century is that the pervasive sense of doom that early 20th-century Europeans felt gets amped up to a much more immediate sense that a catastrophe could visit us literally at any moment,” Sepkoski said. “Those sorts of things conditioned a broader public to accept a model of crisis we’re now absorbed in – this rhetoric of biodiversity loss, or of a sixth mass extinction. We cannot understand that particular model of extinction and threat looking only at the science. We have to understand also the huge contribution of changing cultural attitudes and values and concerns.” Sepkoski also noted that the catastrophic thinking of today is different from the Cold War fears about nuclear war. “Rather than being terrified of a bomb that’s going to wipe us out in an eyeblink, we have learned to live in a state of constant slow-motion crisis or catastrophe. That’s the way we’ve come to view the environmental crisis that we’re facing. That’s the way we’re viewing the pandemic. It is simultaneously a state of hope and despair. “The silver lining to this slow-motion view of apocalypse is that there’s some hope we might still have a chance to remediate it in some way. We might have some possibility of addressing it.”
CITATION#2: AEON: END TIMES FOR HUMANITY: LINK: https://aeon.co/essays/the-human-world-is-not-more-fragile-now-it-always-has-been
The end of the world is a growth industry. You can almost feel Armageddon in the air: from survivalist and ‘prepper’ websites (survivopedia.com, doomandbloom.net, prepforshtf.com) to new academic disciplines (‘disaster studies’, ‘Anthropocene studies’, ‘extinction studies’), human vulnerability is in vogue. The panic isn’t merely about civilisational threats, but existential ones. Beyond doomsday proclamations about mass extinction, climate change, viral pandemics, global systemic collapse and resource depletion, we seem to be seized by an anxiety about losing the qualities that make us human. Social media, we’re told, threatens our capacity for empathy and genuine connection. Then there’s the disaster porn and apocalyptic cinema, in which zombies, vampires, genetic mutants, artificial intelligence and alien invaders are oh-so-nearly human that they cast doubt on the value and essence of the category itself. How did we arrive at this moment in history, in which humanity is more technologically powerful than ever before, and yet we feel ourselves to be increasingly fragile? The answer lies in the long history of how we’ve understood the quintessence of ‘the human’, and the way this category has fortified itself by feeding on the fantasy of its own collapse. Fears about the frailty of human wisdom go back at least as far as Ancient Greece and the fable of Plato’s cave, in which humans are held captive and can only glimpse the shadows of true forms flickering on the stone walls. We prisoners struggle to turn towards the light and see the source (or truth) of images, and we resist doing so. In another Platonic dialogue, the Phaedrus, Socrates worries that the very medium of knowledge – writing – might discourage us from memorising and thinking for ourselves. It’s as though the faculty of reason that defines us is also something we’re constantly in danger of losing, and even tend to avoid. 
This paradoxical logic of loss – in which we value that which we’re at the greatest risk of forsaking – is at work in how we’re dealing with our current predicament. It’s only by confronting how close we are to destruction that we might finally do something; it’s only by embracing the vulnerability of humanity itself that we have any hope of establishing a just future. Or so say the sages of pop culture, political theory and contemporary philosophy. Ecological destruction is what will finally force us to act on the violence of capitalism, according to Naomi Klein in This Changes Everything: Capitalism vs the Climate (2014). The philosopher Martha Nussbaum has long argued that an attempt to secure humans from fragility and vulnerability explains the origins of political hierarchies from Plato to the present; it is only if we appreciate our own precarious bodily life, and the emotions and fears that attach to being human animals, that we can understand and overcome racism, sexism and other irrational hatreds. Disorder and potential destruction are actually opportunities to become more robust, argues Nassim Nicholas Taleb in Antifragile (2012) – and in Thank You for Being Late (2016), the New York Times’ columnist Thomas Friedman claims that the current, overwhelming ‘age of accelerations’ is an opportunity to take a pause. Meanwhile, Oxford University’s Future of Humanity Institute pursues research focused on avoiding existential catastrophes, at the same time as working on technological maturity and ‘superintelligence’. It’s here that one can discern a tight knit between fragility and virility. ‘Humanity’ is a hardened concept, but a brittle one. History suggests that the more we define ‘the human’ as a subject of intellect, mastery and progress – the more ‘we’ insist on global unity under the umbrella of a supposedly universal kinship – the less possible it becomes to imagine any other mode of existence as human. 
The apocalypse is typically depicted as humanity reduced to mere life, fragile, exposed to all forms of exploitation and the arbitrary exercise of power. But these dystopian future scenarios are nothing worse than the conditions in which most humans live as their day-to-day reality. By ‘end of the world’, we usually mean the end of our world. What we don’t tend to ask is who gets included in the ‘we’, what it cost to attain our world, and whether we were entitled to such a world in the first place. Stories about the end of time have a long history, from biblical eschatology to medieval plague narratives. But our fear of a peculiarly ‘human’ apocalypse really begins with the 18th-century Enlightenment. This was the intellectual birthplace of the modern notion of ‘humanity’, a community of fellow beings united by shared endowments of reason and rights. This humanist ideal continues to inform progressive activism and democratic discourse to this day. However, it’s worth taking a moment to go back to René Descartes’s earlier declaration of ‘I think, therefore I am’, and ask how it was possible for an isolated self to detach their person from the world, and devote writing, reading and persuasion to the task of defending an isolated and pure ego. Or fast-forward a few centuries to 1792, and consider how Mary Wollstonecraft had the time to read about the rights of man, and then demand the rights of woman. The novelist Amitav Ghosh provides a compelling answer in his study of global warming, The Great Derangement (2017). Colonisation, empire and climate change are inextricably intertwined as practices, he says. 
The resources of what would become the Third World were crucial in creating the comfortable middle-class existences of the modern era, but those resources could not be made available to all: ‘the patterns of life that modernity engenders can only be practised by a small minority … Every family in the world cannot have two cars, a washing machine and a refrigerator – not because of technical or economic limitations but because humanity would asphyxiate in the process.’ Ghosh disputes one crucial aspect of the story of humanity: that it should involve increasing progress and inclusion until we all reap the benefits. But I’d add a further strand to this dissenting narrative: the Enlightenment conception of rights, freedom and the pursuit of happiness simply wouldn’t have been imaginable if the West had not enjoyed a leisured ease and technological sophistication that allowed for an increasingly liberal middle class. The affirmation of basic human freedoms could become widespread moral concerns only because modern humans were increasingly comfortable at a material level – in large part thanks to the economic benefits afforded by the conquest, colonisation and enslavement of others. So it wasn’t possible to be against slavery and servitude (in the literal and immediate sense) until large portions of the globe had been subjected to the industries of energy-extraction. The rights due to ‘us all’, then, relied on ignoring the fact that these favourable conditions had been purchased at the expense of the lives of other humans and non-humans. A truly universal entitlement to security, dignity and rights came about only because the beneficiaries of ‘humanity’ had secured their own comfort and status by rendering those they deemed less than human even more fragile. What’s interesting about the emergence of this 18th-century humanism isn’t only that it required a prior history of the abjection it later rejected. 
It’s also that the idea of ‘humanity’ continued to have an ongoing relation to that same abjection. After living off the wealth extracted from the bodies and territories of ‘others’, Western thought began to extend the category of ‘humanity’ to capture more and more of these once-excluded individuals, via abolitionism, women’s suffrage and movements to expand the franchise. In a strange way these shifts resemble the pronouncements of today’s tech billionaires, who, having extracted unimaginable amounts of value from the mechanics of global capitalism, are now calling for Universal Basic Income to offset the impacts of automation and artificial intelligence. Mastery can afford to correct itself only from a position of leisured ease, after all. But there’s a twist. While everyone’s ‘humanity’ had become inherent and unalienable, certain people still got to be more fully ‘realised’ as humans than others. As the circle of humanity grew to capture the vulnerable, the risk that ‘we’ would slip back into a semi-human or non-human state seemed more present than before – and so justified demands for an ever more elevated and robust conception of ‘the human’. One can see this dynamic at work in the 18th-century discussions about slavery. By then the practice itself had become morally repugnant, not only because it dehumanised slaves, but because the very possibility of enslavement – of some humans not realising their potential as rational subjects – was considered pernicious for humanity as a whole. In A Vindication of the Rights of Woman (1792), for example, Wollstonecraft compared women to slaves, but insisted that slavery would allow no one to be a true master. ‘We’ are all rendered more brutal and base by enslaving others, she said.
‘[Women] may be convenient slaves,’ Wollstonecraft wrote, ‘but slavery will have its constant effect, degrading the master and the abject dependent.’ These statements assumed that an entitlement to freedom was the natural condition of the ‘human’, and that real slavery and servitude were no longer genuine threats to ‘us’. When Jean-Jacques Rousseau argued in The Social Contract (1762) that ‘man is born free, and everywhere he is in chains’, he was certainly not most concerned about those who were literally in chains; likewise William Blake’s notion of ‘mind-forg’d manacles’ implies that the true horror is not physical entrapment but a capacity to enslave oneself by failing to think. It’s thus at the very moment of abolition, when slavery is reduced to a mere symbol of fragility, that it becomes a condition that imperils the potency of humanity from within. I’m certainly not suggesting that there is something natural or inevitable about slavery. What I’m arguing is that the very writers who argued against slavery, who argued that slavery was not fitting for humans in their very nature, nevertheless saw the unnatural and monstrous potential for slavery as far too proximate to humans in their proper state. Yet rather than adopt a benevolence towards the world in light of this vulnerability in oneself, the opposite has tended to be the case. It is because humans can fail to reach their rational potential and be ‘everywhere in chains’ that they must ever more vigilantly secure their future. ‘Humanity’ was to be cherished and protected precisely because it was so precariously elevated above mere life. The risk of debasement to ‘the human’ turned into a force that solidified and extended the category itself. And so slavery was not conceived as a historical condition for some humans, subjected by ruthless, inhuman and overpowering others; it was an ongoing insider threat, a spectre of fragility that has justified the drive for power.
How different are the stories we tell ourselves today? Movies are an interesting barometer of the cultural mood. In the 1970s, cinematic disaster tales routinely featured parochial horrors such as shipwrecks (The Poseidon Adventure, 1972), burning skyscrapers (The Towering Inferno, 1974), and man-eating sharks (Jaws, 1975). Now, they concern the whole of humanity. What threatens us today are not localised incidents, but humans. The wasteland of Interstellar (2014) is one of resource depletion following human over-consumption; the world reduced to enslaved existence in Elysium (2013) is a result of species-bifurcation, as some humans seize the only resources left, while those left on Earth enjoy a life of indentured labour. That the world will end (soon) seems to be so much a part of the cultural imagination that we entertain ourselves by imagining how, not whether, it will play out. But if you look closely, you’ll see that most ‘end of the world’ narratives end up becoming ‘save the world’ narratives. Popular culture might heighten the scale and intensity of catastrophe, but it does so with the payoff of a more robust and final triumph. Interstellar pits the frontier spirit of space exploration over a miserly and merely survivalist bureaucracy, culminating with a retired astronaut risking it all to save the world. Even the desolate cinematic version (2009) of Cormac McCarthy’s novel The Road (2006) concludes with a young boy joining a family. The most reduced, enslaved, depleted and lifeless terrains are still opportunities for ‘humanity’ to confront the possibility of non-existence in order to achieve a more resilient future. Such films hint at a desire for new ways of being. In Avatar (2009), a militaristic and plundering West invades the moon Pandora in order to mine ‘unobtanium’; they are ultimately thwarted by the indigenous Na’vi, whose attitude to nature is not one of acquisition but of symbiotic harmony. 
Native ecological wisdom and attunement is what ultimately leads to victory over the instrumental reason of the self-interested invaders. In Mad Max: Fury Road (2015), a resource-depleted future world is controlled by a rapacious, parasitic, and wasteful elite. But salvation comes from the revolutionary return of a group of ecologically attuned and other-directed women, all blessed with a mythic wisdom that enables ultimate triumph over the violent self-interest of the literally blood-sucking tyrant family. These stories rely on quasi-indigenous and feminist images of community to offer alternatives to Western hyper-extraction; both resolve their disaster narratives with the triumph of intuitive and holistic modes of existence over imperialism and militarism. They not only depict the post-post-apocalyptic future in joyous terms, but do so by appealing to a more benevolent and ecologically attuned humanity. These films whisper: take a second glance at the present, and what looks like a desperate situation might actually be an occasion for enhancement. The very world that appears to be at the brink of destruction is really a world of opportunity. Once again, the self-declared universal humanity of the Enlightenment – that same humanity that enslaved and colonised on the grounds that ‘we’ would all benefit from the march of reason and progress – has started to appear as both fragile and capable of ethical redemption. It’s our own weakness, we seem to say, that endows humanity with a right to ultimate mastery. What contemporary post-apocalyptic culture fears isn’t the end of ‘the world’ so much as the end of ‘a world’ – the rich, white, leisured, affluent one.
Western lifestyles are reliant on what the French philosopher Bruno Latour has referred to as a ‘slowly built set of irreversibilities’, requiring the rest of the world to live in conditions that ‘humanity’ regards as unliveable. And nothing could be more precarious than a species that contracts itself to a small portion of the Earth, draws its resources from elsewhere, transfers its waste and violence, and then declares that its mode of existence is humanity as such. To define humanity as such by this specific form of humanity is to see the end of that humanity as the end of the world. If everything that defines ‘us’ relies upon such a complex, exploitative and appropriative mode of existence, then of course any diminution of this hyper-humanity is deemed to be an apocalyptic event. ‘We’ have lost our world of security, we seem to be telling ourselves, and will soon be living like all those peoples on whom we have relied to bear the true cost of what it means for ‘us’ to be ‘human’. The lesson that I take from this analysis is that the ethical direction of fragility must be reversed. The more invulnerable and resilient humanity insists on trying to become, the more vulnerable it must necessarily be. But rather than looking at the apocalypse as an inhuman horror show that might befall ‘us’, we should recognise that what presents itself as ‘humanity’ has always outsourced its fragility to others. ‘We’ have experienced an epoch of universal ‘human’ benevolence, a globe of justice and security as an aspiration for all, only by intensifying and generating utterly fragile modes of life for other humans. So the supposedly galvanising catastrophes that should prompt ‘us’ to secure our stability are not only things that many humans have already lived through, but perhaps shouldn’t be excluded from how we imagine our own future. 
This is why contemporary disaster scenarios still depict a world and humans, but this world is not ‘the world’, and the humans who are left are not ‘humanity’. The ‘we’ of humanity, the ‘we’ that imagines itself to be blessed with favourable conditions that ought to extend to all, is actually the most fragile of historical events. If today ‘humanity’ has started to express a sense of unprecedented fragility, this is not because a life of precarious, exposed and vulnerable existence has suddenly and accidentally interrupted a history of stability. Rather, it reveals that the thing calling itself ‘humanity’ is better seen as a hiatus and an intensification of an essential and transcendental fragility.
CITATION#3: SOURCE: THE CONVERSATION: THE END OF THE WORLD: A HISTORY OF HOW A SILENT COSMOS LED HUMANS TO FEAR THE WORST: LINK: https://theconversation.com/the-end-of-the-world-a-history-of-how-a-silent-cosmos-led-humans-to-fear-the-worst-120193
The Fermi Paradox contrasts the widespread belief in extra-terrestrials and flying saucers with the complete absence of evidence, but the greater issue is that humanity still hasn’t found any evidence of intelligent activity outside of us humans. Not a single feat of “astro-engineering”, no visible superstructures, not one space-faring empire, not even a radio transmission. But wait, maybe the eerie silence from the sky above is telling us something ominous about the future course of our own civilisation. OMG OMG. This fear actually exists. Last year, astrophysicist Adam Frank implored an audience at Google to see climate change – and the newly baptised geological age of the Anthropocene – against this cosmological backdrop. The Anthropocene refers to human control of the planet. Could it be that we do not see evidence of space-faring galactic civilisations because they too went through the technological advances that brought them to a kind of Anthropocene in which they destroyed life on their planets? That would explain why all we see are dead planets. OMG OMG. In 2018 the IPCC predicted a sombre future if we do not decarbonise, and in the wake of Extinction Rebellion’s protests a new climate report upped the ante, warning: “Human life on earth may be on the way to extinction.” Meanwhile, NASA has been publishing press releases about an asteroid set to hit New York within a month, perhaps some kind of stress test to simulate responses to catastrophe. Elon Musk has been relaying his fears about artificial intelligence. He worries that the ability of AI systems to rewrite and self-improve themselves may trigger a sudden runaway “intelligence explosion” that will leave us far behind, such that an artificial superintelligence created by the humans could wipe out the humans and take over the planet. OMG OMG.
Meanwhile, the Nick Bostrom Institute (LINK: https://www.fhi.ox.ac.uk/) scrutinises the long-term fate of humanity and the perils we face at a truly cosmic scale, examining the risks of things such as climate change, asteroids, and artificial intelligence. It also looks into what it calls “universe-destroying physics experiments, gamma-ray bursts, planet-consuming nanotechnology and exploding supernovae”, any one of which could end life on earth. OMG OMG. This fearology trend in research is not a creation of the post-atomic age but of our long history of a growing obsession with our extinction. Back in 1816 we were worried about a 100-megaton sulfate aerosol layer in the stratosphere created by the eruption of Mount Tambora in 1815. In the wake of the Tambora eruption there was a global cascade of harvest collapse, mass famine, cholera outbreak, and geopolitical instability. And it provoked a widely popular fictional depiction of human extinction: the now famous Byron poem “Darkness”, which imagines what would happen if the sun died. OMG OMG. Later in our history, in the age of our nuclear war fearology, Byron’s poem flashed back into the spotlight during our obsession with the fear of nuclear winter. Soon thereafter we had the book “Frankenstein”, where creatures created in the lab by the humans out-bred and exterminated the humans. OMG OMG. Prior to that we were already worried about our planet being scorched by a comet, and about the William Godwin hypothesis that our species could not continue forever. OMG OMG. And then there was Immanuel Kant, who gave us sleepless nights worrying about the emergence of a new species smarter than humans that would wipe out the humans. OMG OMG. And let’s not forget David Hume, who told us that the rise-and-extinction cycle of nature will catch up with us, a fear that made headlines after Tambora, when the weirdness of the sunsets became the focal point of the fearology. OMG OMG!
Fear, Apocalypse, Extinction, and Superstition are part of who and what we are. Our never-ending need to fear with or without evidence, or even with just a suggestion, may be understood in that context. An extension of these imaginations is the Copernicus and Kepler idea of a plurality of worlds, an imagination of a universe of humanoid lifeforms that can create yet more layers of imagined events for us to fear. The Copernicus/Kepler fearology is inverted in the Edmond Halley fearology, where we fear not aliens in outer space but aliens that live in the immense heat of the earth’s core and mantle. OMG OMG. The bottom line is that one way or another we live with an imagination of a future riddled with risk.
CITATION#4: SOURCE: BOOK REVIEW OF THE NIALL FERGUSON BOOK “DOOM: THE POLITICS OF CATASTROPHE”
Writing about the past, like every human endeavor, has a history, with its own traditions, fads and shifts in scope and method. It was once common for historians to think big, scanning the decades, centuries and even millenniums for grand patterns and enduring lessons amid the rise and fall of states, empires, economic structures, intellectual systems and world religions. Today such sweeping ambition is out of fashion among academic historians. Instead of attempting to make sense of the big picture by examining the elite layer of societies over long stretches of time, most of our professional historians tend to focus more narrowly and then dive deep, studying a cross-section of a society from top to bottom and advancing broader claims from what they unearth in the excavation. Niall Ferguson is, in many ways, a historian of the old school. He was trained in the history of business and finance, but over the past two decades his interests have broadened. In a long list of books written mostly for popular audiences, he has tackled (among other topics) the mistaken decisions that led to World War I, the rise and fall of global empires (he regrets their passing), the distinctive advantages of Western civilization and the life of Henry Kissinger. (The admiring first part of a projected two-volume biography appeared in 2015.) Along the way, Ferguson has also produced numerous historical documentaries and written for Bloomberg and other publications. (I edited his columns and features for Newsweek during 2011 and 2012.) Ferguson’s latest book, “Doom: The Politics of Catastrophe,” aims to place the continuing Covid-19 pandemic in the broadest possible context in order to gain a “proper perspective” on it. That context includes the history of pandemics, but also many other types of disaster, including earthquakes, tsunamis, volcanic eruptions, asteroid strikes, famines, wars and numerous catastrophic accidents. 
The result is a book that hopscotches breezily across continents and centuries while also displaying an impressive command of the latest research in a large number of specialized fields, among them medical history, epidemiology, probability theory, cliodynamics and network theory. If the book’s vast temporal scope leads it to resemble histories written in earlier times, its drive to pronounce on events in cultures spanning the globe and its heavy reliance on cutting-edge theories makes “Doom” very much a product of our moment. It belongs on the shelf next to recent ambitious and eclectic books by authors like Jared Diamond, Nassim Nicholas Taleb and Steven Pinker. What unites these writers is their disregard for traditional disciplinary boundaries and a determination to reach for synoptic knowledge of stupefyingly complex subjects. The result, in Ferguson’s case, is a book containing some genuine wisdom, but also some perplexing lacunae. One of its concluding lessons for the current pandemic, for example, is that lockdowns, which do great economic damage, should be avoided in favor of more precisely targeted measures, among them the quarantining of superspreaders — people who interact with far more people than most and therefore play an outsize role in spreading disease. That sounds reasonable. Yet 300 or so pages earlier, in a section of the book’s introduction titled “Confessions of a Superspreader,” Ferguson tells us he “first spoke and wrote publicly about the rising probability of a global pandemic” long before most Western journalists, in late January 2020, while he was in the midst of a round of travel that took him from the United States to Asia, Europe and then back to North America. 
His travels continued over the following weeks, despite his awareness of the risk and the fact that he was “ill for most of February, with a painful cough I could not shake off.” The globe-trotting finally came to an end on March 15, when Ferguson flew with his wife and two youngest children to Montana, where they would ride out the pandemic in rural isolation.

Since one of the central purposes of the book is to show his readers that “all disasters are at some level man-made,” one might have expected Ferguson to reflect, beyond a cleverly self-deprecating section title, on his own possible role in spreading Covid around the world. This is a book, after all, containing a chapter titled “The Fractal Geometry of Disaster,” about how “nested within a massive event like the collapse of an empire are multiple smaller but similar disasters, each one, at each scale, a microcosm of the whole.” Yet Ferguson’s own arguably irresponsible actions do not inform his analysis in any notable way.

This is probably a function of Ferguson’s preference for highlighting systemic, as opposed to individual, failures. Eschewing great man theories of history, Ferguson treats political leaders as “hubs” within complex networks of information. When those hubs communicate efficiently with one another, the results are good. But when communication breaks down or information is less than accurate, a cascade of failures ensues that makes a disaster far worse. Superspreaders are hubs, too, within social networks, though in their case the more connections they have with others, the worse, since those connections spread disease far and wide. Hence, Ferguson says, the need to build an institutional infrastructure that can disrupt social networks in times of emergency to halt contagion. Sensible advice. But what guarantees that a well-placed information hub (like a government official in charge of public health) will take such advice and act on it?
Reading “Doom,” it’s hard to escape the impression that responding intelligently to pandemics depends on people in high office being smart enough to listen to Niall Ferguson so they will do a better job of disrupting the behavior of people like Niall Ferguson. In Ferguson’s view, our response to the pandemic shows that we’ve mostly fallen short of that exacting standard. The reasons are enumerated throughout the book. Human beings continually misjudge risk. They dismiss the cries of Cassandras warning of impending doom. They stumble in their attempts to organize intelligent responses under pressure. And they spread disinformation (as well as contagion) along social networks, making disasters far worse than they might otherwise be. These are worthwhile points that promise to make a contribution to improving our management of future disasters.

Unfortunately, Ferguson raises doubts about his own judgment by seeming to wave away concerns about climate change — the most widely understood and anticipated catastrophe looming on the horizon. To cite just a few of the disasters likely to spin off from this global calamity in the making: a proliferation of floods, fires, storms and famines; increased numbers of diseases and pandemics; and a sharp rise in temperatures rendering large parts of the globe uninhabitable, an eventuality that could prompt refugee flows on a scale without precedent in human history, destabilizing governments around the world.
CITATION#5: SOURCE: DAEDALUS: LINK: https://www.amacad.org/publication/slow-disaster-anthropocene-historian-climate-change-korean-peninsula
THIS CITATION IS A DEMONSTRATION OF HOW OUR SUPERSTITION IS USED TO CREATE FEAR OF THE IMAGINED DANGERS OF THINGS LIKE CLIMATE CHANGE AND THE ANTHROPOCENE.
The Anthropocene is the time in which human activity is the dominant force of change on the planet. The terminology is in the strict sense geological, coined by atmospheric chemist Paul Crutzen in 2000. Geological ages are named for the organisms or processes that define the earth in their time. The Anthropocene is our time, an age marked by the increasingly obvious cumulative impacts of humanity on Earth systems, and more so by the cascading effects of human-crafted systems.

If you are looking for the material evidence that scientific advocates of the Anthropocene collect and analyze, you should watch for concentrations out of place: too much phosphorus at the mouths of rivers and acid in the oceans, too much carbon in the atmosphere, radioactive particles and plastics everywhere. There are also absences: ice melt, vegetative loss, biodiversity loss, aridity.

The evidence for the Anthropocene as a stratigraphic layer of the earth with a clear starting point is still a matter of fierce debate among scientists, divided into roughly four camps: those who reject the concept out of hand; those who date the start of the Anthropocene to the advent of agriculture approximately ten thousand years ago; those who date it to the rise of industrialization roughly 250 years ago; and those who insist that the entry into the nuclear age marks the moment of the Anthropocene, beginning in 1945. Start date aside, there is broad consensus that a so-called great acceleration of Anthropocenic growth processes, from globalized industrial production, to GDP, to global population, to oceanic surface temperatures, is obvious from the 1950s onward.

The Anthropocene is by no means the first time humans have contemplated suffering, or even the complete end of humanity: apocalyptic eschatology is quite nearly a universal feature of world religions.
It’s not even the first time in which humans have contemplated their end brought by their own hand; that would be the Cold War’s “mutually assured destruction”. But it is the first time that a mass extinction, including the Anthropos, is contemplated by us as a creeping process producing a slow disaster of global proportions, toxicity and global warming driving us from every corner of the globe to the same fate. Climate scientist Will Steffen has also described the Anthropocene as a challenge of temporal imagination: “the concatenation of both slow- and quick-onset events . . . can lead to some unexpected global crises. . . . The Earth System scale adds another twist to the concept of speed of change. . . . Humanity . . . has no experience of dealing with such combinations of scale and speed of environmental change.” How long will it take? Is it too late? Is it reversible? Who will be the first to suffer, and how can their suffering be lessened? Are the same forces of industrialization that created the Anthropocene capable of being turned toward solutions? These are the existential questions of the Anthropocene, and they go well beyond geology. {Translation: the less we know the greater the role of imagination and the scarier it gets}.

Historians of disaster have a role to play in grounding these free-floating questions in local contexts: the Anthropocene is a global process playing out in human lives and communities every day. And in every one of those lives and places, there are historical trajectories, inheritances of place and politics that will shape who suffers more and who suffers less. Understanding the everyday politics of the Anthropocene requires the work of historians. Climate change is a product of industrialization, but its effects are known at different geographical and temporal scales. This realization came home to me when I was researching the Twin Towers, but also Hurricane Katrina, Fukushima, and many other disasters of the past two decades.
In each case, our naming conventions emphasize the event of the disaster over the process that made the disaster. The rush to name the disaster, investigate the cause, and get back to normal defines the work of the modern disaster preparedness state. I have struggled in my career with the temporal limitations of the term “disaster” in its general (and I believe quite misleading) usage. What, I have wondered, if we named disasters by the processes that made them? The September 11 Terror Attacks, Fires, and Engineering Failures; the New Orleans Flood and Levee Failures of 2005; the Great East Japan Earthquake, Tsunami, and Failure of Nuclear Safety. This thought experiment takes us into useful conceptual terrain if we care to actually understand the social, economic, and political actors who establish so-called acceptable levels of risk, and why publics accept (or don’t accept!) such levels. Following this path demands a history of disaster that is decidedly more complicated than a presidential “disaster declaration.”

War is an example of an anthropogenic disaster that we can apprehend as an “event in the now.” In terms of definitions, war fulfills the requirements of what we generally mean by disaster: it overcomes society’s ability to cope with stress. That is what war is for, after all: it is a human-induced disaster aimed at achieving political ends. As such, warfare cannot last beyond the time frame in which it is useful for the combatants. The time frame of war is short: it may be repetitive, but it is an imminent way of destroying, killing, and dying. War and other so-called rapid-onset disasters fit the definition of “events,” and, in fact, the classic social scientific definitions of disaster were framed in the early years of the Cold War, when governments (especially the United States) were funding research to model the societal impact of nuclear war.
The model of disaster that emerged by 1960 in the writings coming from the Disaster Research Center was of something that arrives rapidly, with little or no warning, and is then over. The aftermath phase is what the government planners were keen to predict: would society return to some sort of normalcy, or would society fall apart at the seams? Their conclusions weren’t optimistic, but they are slightly beside the point here. What’s important is to note their framing of disaster as an event, the result of a shock from outside, overwhelming a particular community at a particular time.

The Anthropocene is also a disaster, but a slow one, moving according to a different temporal logic. The traditional definition of disaster describes an overwhelming event delimited by spatiotemporal limits that are tightly bounded, with clear cause-and-effect relationships. “Slow disaster” is a way to think about disasters not as discrete events but as long-term processes linked across time. The slow disaster stretches both back in time and forward across generations to indeterminate points, punctuated by moments we have traditionally conceptualized as “disaster,” but it in fact claims much more life, health, and wealth across time than is generally calculated. The slow disaster is the time scale at which technological systems decay and posttraumatic stress grinds its victims; this is the scale at which deferred maintenance of infrastructure takes its steady toll, often in ways hard to sense or monetize until a disaster occurs in “event time.” The experience of war victims fits the concept well, as does the process of climate change, sea level rise, the intensification of coastal flooding, and heat waves.

Yet the old false binaries confront us at every turn. For example, in the aftermath of a disaster – like Hurricane Katrina, or the sinking of the Sewol Ferry, or the Fukushima Daiichi nuclear disaster – the event is often presented as a laboratory of sorts.
After each of these crises, we hear a great deal from policy-makers and experts about the opportunity to “learn from disaster.” But we should be aware that this learning exercise is trapped in a dynamic that splits the technical from the social.

SLOW-MOVING DISASTERS LIKE CLIMATE CHANGE AND THE ANTHROPOCENE ARE THE REAL DISASTERS, BUT THEY MOVE TOO SLOWLY FOR US TO RECOGNIZE THEM AS DISASTERS, AND THAT FAILURE OF RECOGNITION IS WHAT WILL END THE RISE OF THE HUMANS.