The way we talk about climate change and our effect on the planet is all wrong—and increasingly dangerous.

By David Sepkoski, April 16, 2021

Welcome to the age of humans—the Anthropocene. Scientists, academics, public intellectuals, and policymakers have been using this term to describe a new geological epoch marking an unprecedented era of human impact on the natural environment. Beginning with the Industrial Revolution in the late 18th century, carrying through the development and testing of nuclear weapons, and peaking in recent decades with rapid global warming and the catastrophic depletion of the Earth’s biodiversity, the Anthropocene is often framed as an existential threat to the survival of the human species. Like some of the great environmental catastrophes of the past—such as the mass extinction that wiped out the dinosaurs 66 million years ago—the footprint of human activity will be present in the geological record for millions of years to come. Or so the reasoning goes.

This article is adapted from Catastrophic Thinking: Extinction and the Value of Diversity From Darwin to the Anthropocene by David Sepkoski (University of Chicago Press, 360 pp., $35, September 2020).

There is every reason to be alarmed about anthropogenic climate change, pollution, and biodiversity loss, all of which have been accelerating in recent decades and do pose existential threats. Warming trends could cause the collapse of the Greenland and Antarctic ice sheets, which would raise sea levels by dozens of feet by the end of this century. If that happens, say goodbye to New York City, San Francisco, Seattle, Mumbai, London, Istanbul, Dubai, St. Petersburg, and Beijing, to name just some of the most populous cities that would be drowned. Left unchecked, climate change would also involve ocean acidification (as the oceans absorb atmospheric carbon), terrible droughts and heat waves (with equatorial regions reaching unlivable temperatures for much of the year), air pollution at unbreathable levels in many major cities, and mass extinctions of plants and animals at levels not seen since some of the greatest geological catastrophes in the Earth’s history—perhaps as severe as the “great dying” at the end of the Permian period some 250 million years ago, when as many as 96 percent of all living species may have died out. The Earth that emerged from such a catastrophe might be devoid not only of humans but perhaps of most complex life on land and in the seas.

Scientists have been aware of these threats for decades, but it is only recently that we’ve begun to talk about human damage to the Earth as a potential geological transformation. In the early 2000s, the Nobel Prize-winning atmospheric chemist Paul Crutzen proposed formally amending the established geological timescale to acknowledge the irreversible changes that humans had wrought. These changes, Crutzen and others have argued, will be permanently recorded in the layers of the Earth: a spike in radioactivity from atomic testing, human-made microfossils of plastic and other industrial compounds that will take millions of years to decompose, and of course drastic changes to the composition of life.

The Anthropocene is a somewhat controversial notion in geology. Crutzen’s proposal has been taken up by various professional bodies responsible for ratifying changes to the geological timescale, including the International Commission on Stratigraphy and the International Union of Geological Sciences, although no formal action has yet been taken.

As a matter of dating and stratigraphic nomenclature, this question can and will be decided on empirical grounds. But over the past decade, the Anthropocene has taken on a much broader cultural significance: Championed by observers in fields including climate science, history, and the arts, it now signifies not just a proposal about how we date the Earth’s history but an existential crisis for late modern human society and a diagnosis of its failures. As Crutzen and collaborators argued in an influential 2007 article, the Anthropocene embodies the recognition of a “profound shift in the relationship between humans and the rest of nature,” in which “[h]umankind will remain a major geological force for many millennia, maybe millions of years, to come.” From this perspective, the Anthropocene is not merely a proposal for renaming a geological epoch but a new state of awareness about the permanence of human intervention in the natural world. It crystallizes a host of new and preexisting anxieties and ambitions relating to climate change, biodiversity preservation, geoengineering, biotechnology, human population expansion, environmental and economic justice, and the future of humankind on, or even beyond, the planet Earth.

The Anthropocene’s relevance as a cultural touchstone is indisputable. Its usefulness as a guide for how to act and feel at a time of crisis is another matter. The Anthropocene concept is part of a long history in the West of projecting current anxieties onto imagined catastrophic futures. For nearly 2,000 years, the apocalyptic theology of the Book of Revelation has influenced Western Christian theology and culture. More recently—since the later 19th century—European and American societies have experienced waves of catastrophic thinking connected to, successively, the collapse of imperial economic systems, anxieties about globalization, the specter of nuclear war, environmental degradation, and, most recently, global warming and a biodiversity crisis poised to produce a sixth mass extinction.

Thinking catastrophically can have real value if it encourages people and policymakers to address problems of momentous import. Heightened levels of anxiety about nuclear proliferation—captured by Carl Sagan’s famous “nuclear winter” hypothesis—contributed directly to major reductions in the world’s nuclear arsenals in the 1980s and 1990s. The rallying cry presented in Rachel Carson’s 1962 bestseller, Silent Spring, helped curb the use of industrial pesticides and raised awareness about environmental threats posed by pollution. More recent calls to action about global warming (such as Davis Guggenheim and Al Gore’s 2006 documentary, An Inconvenient Truth) and biodiversity conservation (E.O. Wilson’s 1992 book, The Diversity of Life, or Elizabeth Kolbert’s 2014 book, The Sixth Extinction) have undoubtedly raised consciousness of these issues and have had positive effects on the policies of many governments around the world.

But what do we do when the scope of the crisis is presented as so permanent, all-encompassing, and perhaps unavoidable that it will be written into the very strata of the Earth? It is not idle to wonder whether the rhetoric around the Anthropocene is so extreme, so dispiriting, and so fatalistic that it could simply paralyze us. That has certainly been the case with some recent and notable responses: In 2015, the literary scholar Roy Scranton published a book with the cheery title Learning to Die in the Anthropocene: Reflections on the End of a Civilization, while David Wallace-Wells’s 2019 book, The Uninhabitable Earth, documents a litany of catastrophes that are terrifying, and paralyzing, to contemplate. And these are just two of the more prominent and popular accounts of the consequences of the Anthropocene.

I’m not for a moment questioning either the reality of the crisis these authors describe or the sincerity of their responses. But it’s fair to wonder whether the way the Anthropocene has come to dominate Westerners’ imagination of the future is even accurate or helpful.

At a very basic level, the idea of naming a geological epoch after our own species deserves more scrutiny than it has received. Geologists normally recognize geological changes after the fact, rather than in advance. The science of stratigraphy (as the study of the Earth’s layers is known) breaks the geological record into a series of units of time demarcated by observable changes in the composition of rock layers, called signals, and by the distinctive types of plant and animal fossils that characterize particular layers.

It turns out that a number of the stratigraphic breaks that distinguish one period from another—take the famous boundary between the Cretaceous and Paleogene periods, when the dinosaurs died out—do correspond to major environmental upheavals or mass extinctions. In the case of the Cretaceous-Paleogene boundary, one of the most significant signals is an anomalous layer of the element iridium, which is quite rare on Earth but common elsewhere in the solar system. The discovery of this layer in the 1980s led scientists to propose (and eventually confirm) the hypothesis that a collision with a massive asteroid triggered a calamity that blanketed the Earth in dust and ash for more than a year, wiping out not only the dinosaurs but a host of other species on land and in the seas. More recent research has detected the signal of the wildfires and volcanic eruptions that contributed to this extinction, and it’s not unreasonable to expect that future geologists (perhaps sentient cockroaches or extraterrestrial visitors?) might detect a similar signature from our own era. But whether or not that’s the case, it will be the job of other, far-future observers to document it.

More fundamentally, isn’t it a little grandiose to project ourselves onto geological history in the way that the Anthropocene supposes? The human species has been around for a few hundred thousand years (or a few million if you count our direct hominid ancestors) and has been dominant on a global scale only during the last few thousand. That’s a vanishingly small fraction of the 4.5 billion years that the Earth has existed, or even of the roughly 3.5 billion years during which life has existed. The typical species longevity in the fossil record is about a million years, so we’re still well short of—and quite possibly will not achieve—even an average duration. By contrast, the dinosaurs (a group, of course, not a single species) dominated for some 165 million years, and the humble cockroaches have been around for a staggering 280 million years.

Beyond that, both the formal geological proposal and some of the more superheated cultural discussions of the Anthropocene seem more than a little anthropocentric. Even if the worst does come to pass and we wipe ourselves out through our actions (combined with inaction), I’m not convinced that the Earth will remember us much at all. One thing paleontologists who study earlier eras of environmental crisis have discovered is that the Earth and its inhabitants tend to rebound fairly quickly: Even the great extinction event at the end of the Permian period saw a fairly rapid return of life’s diversity, and of course the Cretaceous-Paleogene event that ushered out the dinosaurs simultaneously ushered in our distant mammalian ancestors. Mass extinctions, it turns out, can actually be a source of new evolutionary pathways and greater levels of species diversity.

In many cultural discussions of the Anthropocene, it’s often argued, as justification for the label, that no species has ever had such a profound impact on the Earth as a whole. That’s simply not true. Photosynthesizing cyanobacteria some 2.4 billion years ago produced perhaps the greatest environmental revolution in the Earth’s history, when in a relatively short period they drastically reduced atmospheric and marine carbon dioxide and dramatically increased levels of oxygen in what is known as the Great Oxygenation Event. This set the stage for the evolution of all complex life. Nothing we could possibly do as a species will ever rival that, but the humble blue-green algae still don’t have an epoch named after them.

There’s also a certain conflation of victimhood and hubris in some of the Anthropocene rhetoric. Anthropogenic climate change and mass extinctions are often compared to the impersonal geological triggers, like asteroids or volcanoes, of past crises. At the same time, we like to compare our fate to that of long-dead prehistoric groups. Which is it—are we the asteroid or the dinosaur? As it turns out, the dinosaurs did nothing to deserve their fate; they simply had the misfortune to have a giant rock fall on their heads, rendering inhospitable the environment to which millions of years of natural selection and adaptation had suited them. Humans, on the other hand, have been making a concerted effort to transform their own environment, and on some level, proponents of the Anthropocene seem to want to give them credit for that.

One could reasonably argue that despite their relatively short presence, humans have had an outsized impact, and that’s certainly true. But the Anthropocene concept also reflects the tendency for humans to put their names on everything they touch: from prehistoric megaliths to sports stadiums to office towers. It may be appropriate to memorialize our impact with a geological epoch, or it may not be, but it’s hard to see what the rush is to do so. As the evolutionary biologist Stephen Jay Gould put it in a 1990 essay, from a geologist’s perspective “our planet will take good care of itself and let time clear the impact of any human malfeasance.” 

The term Anthropocene is derived from the Greek word for human, anthropos. The cultural concept, accordingly, addresses humanity as a whole, both in assigning blame for the coming catastrophe and in imagining solutions (or the lack thereof). The implication is that people, as a whole, are a problem.

Proponents of this view argue with some justification that whatever their source, the technological innovations that have produced major changes to the Earth’s climate and environment—carbon dioxide emissions, industrial pollution, artificial radioactivity, deforestation, etc.—have been global in their impact. That is certainly true. But it’s also worth asking whether the responsibility for these consequences—and, perhaps more importantly, the agency in responding to them—is distributed fairly in Anthropocene commentaries.

China and India, for example, are among the leaders in global carbon dioxide production (No. 1 and 3 respectively, sandwiching the United States at No. 2). But Europe and the United States have been releasing carbon into the atmosphere for much longer and have reaped industrialization’s social, political, and economic benefits for two centuries. Is it fair for Western observers to demand the same level of accountability from developing economies in the global south?

Moreover, as a number of recent critics have noted, the Anthropocene is tied very closely to a specific form of economic and industrial development—to capitalism, in other words. For that reason, some authors have suggested replacing “Anthropocene” with “Capitalocene,” or even “Plantationocene,” to acknowledge the roles that Western economic development and, in particular, the system of industrialized agriculture that has dominated since the late 18th century have played in climate and environmental change.

A mountain of tires in the Spanish countryside near Madrid on Sept. 24, 2014. (Pablo Blazquez Dominguez/Getty Images)

There are very good arguments in favor of naming and shaming the real perpetrators responsible for initiating these trends, but these alternative proposals have problems as well. In the first place, if what we’re really describing is a recent historical trend in economic policy and industrial technology, this starts to sound less and less like a genuine geological epoch. One of the signature features of the Anthropocene is its insistence on merging the scales of human and natural history and forcing humans to think about their role as agents in shaping their natural environment (something biologists call “niche construction” when discussing nonhuman species). Taken at face value, the Anthropocene involves humans, but it also involves a wide array of nonhuman actors and agents: the crops that make up today’s agricultural monocultures, the cows and pigs that produce atmospheric methane and other pollutants, the toxic cyanobacteria that thrive in acidifying oceans. These agents know nothing of capitalism or plantations, or in some cases even of humans themselves.

On the human front, one can have real concerns about whether proposed solutions to the climate and environmental crisis take into account issues of global social justice, self-determination, and agency. I am adamantly not arguing that unchecked economic development should take precedence over combating climate change, but we should be worried about who stands to benefit—and who stands to lose—from the various solutions that have been proposed.

Among those authors who have been most fatalistic about the Anthropocene, pessimistic scenarios seem to apply equally to everyone, everywhere. But as any resident of Mumbai or São Paulo will tell you, conditions are already catastrophic, with dangerous levels of air pollution and extreme heat. Among the areas projected to suffer most from sea-level rise by 2050, the vast majority are in the global south. Sure, New York and London and Amsterdam are also threatened, but they are part of societies with vastly greater economic and political resources. For residents of the global north, the effects of climate change have been—and will likely continue to be—more incremental. As the Anthropocene critic Jedediah Purdy puts it, “For all the talk of crisis that swirls around the Anthropocene, it is unlikely that a changing Earth will feel catastrophic or apocalyptic. … Indeed, the Anthropocene will be like today, only more so.” The sense of urgency for immediate solutions to these problems, then, is hardly distributed equally among those likely to be affected.

This concern applies to potential solutions as well. A variety of proposals have been floated, ranging from fairly uncontroversial steps like carbon neutrality and green architecture to the more fantastical, including broad geoengineering initiatives like carbon sequestration and giant orbital mirrors to block sunlight—and even colonies on Mars or elsewhere to escape this planet. These proposals raise the obvious concern about unintended consequences: We simply have no idea what cascading environmental effects such interventions may have, nor have most of these technologies even been invented. They also carry the hubristic sentiments present in the initial Anthropocene proposal (Crutzen and other central proponents have advocated these steps from the start) to a potentially frightening level. Blithely arguing that what technology has broken can be fixed by more technology seems dangerously oblivious to what got us into this mess in the first place.

And such steps would simply underscore the inequalities that are already growing exponentially today. The vast sums of money and resources required to carry out these fanciful initiatives are clearly available only to the richest and most developed economies—those societies, by the way, that have already benefited from decades and centuries of unchecked industrialization. What guarantee do we have that the societies that pay for these solutions wouldn’t expect to benefit most from them, or that they would be particularly concerned about collateral damage to the economies and environments of societies that can’t? Again, Purdy sounds a necessary warning here, predicting that the “disasters of the Anthropocene in our near future will seem to confirm the rich countries’ resilience, flexibility, entrepreneurial capacity, and that everlasting mark of being touched by the gods, good luck, [while] amplifying existing inequality.”

To be clear, global society does face potentially catastrophic risks from anthropogenic climate change and other threats. We must act to address these problems, and we must act now. We must focus on the parts of the globe where human suffering is already extreme. But however you look at it—as a geological proposal, as a cultural touchstone, or as a set of policy solutions—the Anthropocene is overrated. It may even be dangerous.

David Sepkoski is a professor of history at the University of Illinois at Urbana-Champaign. 

https://foreignpolicy.com/2021/04/16/climate-change-anthropocene-overrated-humans/
