“It’s the Climate, Stupid.”

Geoffrey Parker

Once upon a time, climate change was a hot topic. In 1979 the World Meteorological Organization (WMO), the United Nations Environment Programme, the National Science Foundation, the Ford Foundation and the Rockefeller Foundation paid for 250 historians, geographers, archaeologists and climatologists from thirty countries to share their expertise at the first International Conference on Climate and History, hosted by the Climatic Research Unit at the University of East Anglia (England)–a unit sponsored by (among others) British Petroleum and Royal Dutch Shell. That same year, the WMO created the World Climate Program, with a mandate to “insert climatic considerations into the formulation of rational policy alternatives”; President Jimmy Carter created the Federal Emergency Management Agency (FEMA) to consolidate federal policies related to the management of civil emergencies, including climate-induced disasters; and the United States Congress invited a committee of scientists “to assess the scientific basis for projection of possible future climatic changes resulting from man-made releases of carbon dioxide into the atmosphere.” The committee predicted that if airborne concentrations of carbon dioxide (CO2) continued to increase, during the first half of the twenty-first century “changes in global temperature of the order of 3°C will occur and… will be accompanied by significant changes in regional climatic patterns.” It also warned that “A wait-and-see policy might mean waiting until it is too late.” In response, Congress passed the Energy Security Act, which (among other things) ordered a Carbon Dioxide Assessment Committee (CDAC) to prepare a comprehensive survey of the “projected impact, on the level of carbon dioxide in the atmosphere, of fossil fuel combustion.”

These initiatives took place in the shadow of a world food crisis. The price of wheat tripled and that of rice quintupled between 1972 and 1974, a reflection of harvest failures in South Asia, North America, the Sahel and the Soviet Union, leading the United Nations to convene a World Food Conference that called on all countries to cooperate in the establishment of “a world food security system which would ensure adequate availability of, and reasonable prices for, food at all times, irrespective of periodic fluctuations and vagaries of weather.” Then, thanks to the Green Revolution (new high-yielding varieties of wheat, maize and rice, combined with increased investment in irrigation, fertilizers, pesticides and herbicides), food production dramatically increased. Famines virtually disappeared from the headlines, and concern about the vagaries of weather waned. The CDAC’s report to Congress in 1983, entitled Climate change, categorically denied that “the evidence at hand about CO2-induced climate change would support steps to change current fuel-use patterns away from fossil fuels,” and instead asserted that “The direct effects of more CO2 in the air are beneficial.”

Climate change nevertheless included disturbing data. If the amount of CO2 in the atmosphere continued to increase at the same rate, as the report predicted, then global temperatures would rise “by about 1°C” as early as the year 2000, and “in polar latitudes a doubling of the atmospheric CO2 concentration would cause a 5 to 10°C warming.” This would increase droughts, and decrease “yields of the three great American food crops [wheat, maize and soybeans] over the entire grain belt by 5 to 10%.” It would also cause sea levels worldwide to rise by 5 or 6 meters, so that even “The old dream of a ‘Northwest passage’ might become a reality.” To avert alarm concerning these eerily accurate predictions, Climate change cited a few historical precedents to demonstrate how easily humans adapt to abrupt climate change: Europe in the fourteenth century (based entirely on Barbara Tuchman’s A distant mirror); Dakota in the 1880s (citing The Bad Lands cow boy newspaper); and the Dust Bowl of the 1930s (“a natural experiment with results dramatized in John Steinbeck’s Grapes of Wrath”). Therefore, the report concluded cheerily: “The safest prediction of any we shall make is: Farmers will adapt to a change in climate, exploiting it and making our preceding predictions too pessimistic.” Moreover, should local adaptation fall short, the CDAC argued that migration would solve all problems. Once again the report cited apparently reassuring precedents:

People have moved from the seacoast to the prairie, from the snows to the Sun Belt. Not only have people moved, but they have taken with them their horses, dogs, children, technologies, crops, livestock, and hobbies. It is extraordinary how adaptable people can be in moving to drastically different climates.

In the unlikely event of a future climate-induced crisis, farmers would “move as promptly as the Okies, saving themselves while abandoning the cropland to some other use.” Despite these spurious historical parallels, the CDAC report immediately became a foundational text for those who sought to deny global warming.

Climate change failed to mention another precedent: the events of 1816, a year without a summer (the first since 1675, and the most recent). It occurred in the middle of a prolonged sunspot minimum amid major volcanic activity, which (as in the seventeenth century) both reduced average global temperatures by between 1°C and 2°C and caused extreme weather events. Intense cold prevailed from Finland to Morocco for most of the summer; rain fell on Ireland for 142 out of 153 days between May and September; grapes in French and Swiss vineyards ripened later than in any other year since continuous records began in 1437; the monsoon failed in India; and snow fell in Jiangnan and Taiwan. In America, north of a diagonal line stretching from British Columbia to Georgia, fronts of Arctic air produced temperature oscillations throughout the summer from 35°C to freezing in a single day, killing the crops: the price of wheat in New York City in 1816 would not be surpassed until 1973. The “Yankee Chills,” as survivors in North America called their miserable summer, produced massive migration from New England to the Midwest. “The lands to the westward are luxuriant, and the climate mild and salubrious,” crooned a land promoter, and from 1817 to 1820 the population of the State of Ohio rose by 50 per cent. Most newcomers were New Englanders fleeing the sudden climate change.

If, two centuries later, the Yankee Chills (or any other natural disaster) should strike New England, flight to Ohio would bring little relief. As the 2011 version of the State of Ohio homeland security strategic plan points out: “Getting food from farms to dinner tables involves a complex chain of events that could be interrupted at many different stages. Because food and agriculture are such vital industries to our state, Ohio must vigilantly protect animal, plant, and food supply chains”–but with over 11 million Ohioans, it is hard to see how the state could feed an additional 50 per cent in an emergency. Admittedly, if the Chills killed only corn, or only affected New England, the transport and distribution infrastructure developed since 1816 could probably import sufficient emergency food rations to Ohio from unaffected (or less affected) areas; but this might prove impossible in the wake of a large-scale natural disaster, not least because, in the words of the current State of Ohio emergency operations plan, “Manufacturing agencies within the United States employ just-in-time inventory systems and do not stock large inventories, thus there may be a supply shortage nation-wide for critical items.” This is an understatement. As Stephen Carmel of Maersk Lines pointed out, “We were self-sufficient in some but not all of what we needed, and we could trade the excess of what we made to fill the gaps… Now we are self-sufficient in nothing.” Therefore, “As in any conveyor belt linking assembly lines, a disruption to any part of the system becomes a disruption to the whole system.” In addition, we are “completely dependent on the uninterrupted flow of accurate information. Without it, trade simply will not happen”–and neither will relief efforts.

The tragic experience of the Gulf Coast region after Hurricane Katrina struck in August 2005 highlighted the consequences of extreme weather when a society relies on just-in-time inventory systems and the uninterrupted flow of accurate information: almost 2,000 people killed; tens of thousands left without basic essentials for almost a week; over 1 million displaced; 92,000 square miles of land laid waste; and 200,000 homes destroyed. It was the largest and costliest natural disaster in the history of the United States, and it led to the largest domestic deployment of military forces since the Civil War. Yet, as a House of Representatives investigative committee reported, “None of this had to happen. The potential effects of a Category 4 or 5 storm were predictable and were in fact predicted”; but “despite years of recognition of the threat that was to materialize in Hurricane Katrina, no one–not the federal government, not the state government, and not the local government–seems to have planned for an evacuation of the city from flooding through breached levees.” The committee’s report, entitled A failure of initiative, was especially scathing about the inability of the numerous teams of responders to contact one another: “Catastrophic disasters may have some unpredictable consequences,” they noted, “but losing power and the dependent communications systems after a hurricane should not be one of them.” They cited the lament of the adjutant-general of the Mississippi National Guard that “We’ve got runners running from commander to commander. In other words, we’re going to the sound of gunfire, as we used to say in the Revolutionary War” (and also in the seventeenth century). The report concluded: “We are left scratching our heads at the range of inefficiency and ineffectiveness that characterized government behavior right before and after this storm. But passivity did the most damage.” Its authors therefore wondered “How can we set up a system to protect against passivity? Why do we repeatedly seem out of synch during disasters? Why do we continually seem to be one disaster behind?”

A few months later, a United States Senate investigative committee reached similar conclusions in a report entitled Hurricane Katrina: a nation still unprepared. The “still” referred to the inadequate responses to another catastrophic event revealed by the independent commission that studied the 9/11 attacks just four years earlier, and several members of Congress called for a similar independent commission into Katrina in order to learn from mistakes made–but the White House thwarted them. So the United States remained “one disaster behind”: although the federal government improved its ability to mobilize and deliver massive quantities of supplies to assist state and local government in the days and weeks after a disaster (search and rescue, law enforcement, temporary shelter, emergency distribution of food, water and medicine), it did far less to help vulnerable localities prepare for long-term recovery (how to keep local government operating, rehouse displaced people, and prepare for the inevitable health problems, mental as well as physical).

When Hurricane Sandy struck New York and New Jersey in October 2012, therefore, entirely predictable infrastructure failures (loss of electrical power, closure of transportation systems, shortage of gasoline) once again afflicted tens of millions of people for days and sometimes weeks, and a smaller (but still substantial) number for months and sometimes years. For Adam Sobel, an earth scientist, the primary culprit for these defects was what A failure of initiative had termed “passivity”:

Most of the damage could not have been prevented by any decision made or action taken in the days leading up to the storm. Most of it resulted from decisions made in the years, decades, and even centuries prior. Most was caused, perhaps unconsciously, by decisions to do nothing, or the absence of a decision to do something. These nondecisions were made not in response to scientific predictions, but despite them.

The same passivity had characterized responses to many seventeenth-century disasters. To take one glaring example, plague epidemics in 1603, 1625, and 1636 had killed tens of thousands of Londoners, and when a new epidemic ravaged continental Europe in 1664-65 it was easy to anticipate the consequences if it reached the English capital. Nevertheless, neither local nor national government took appropriate action. Instead, when plague struck, the king and his court, many magistrates and almost all the rich fled. Parliament assembled in Oxford to debate appropriate measures, but no legislation passed because the peers demanded an exemption from restrictive measures such as quarantine and insisted that no plague hospitals be erected near their own homes. One may wonder why the central government did not act unilaterally to save its capital; but, as a contemporary pamphlet pointed out, “their power was limited and they must proceed legally.” The rule of Oliver Cromwell and his army officers (which had ended only five years before) had left a bitter legacy, and Charles II dared not risk alienating his new subjects by imposing unpopular measures. The consequences of government nondecisions were therefore measured in the corpses of plague victims dumped daily into mass graves. In all, the epidemic killed 100,000 Londoners, one-quarter of the total population of the capital, plus 100,000 more victims elsewhere in England. Nevertheless, unlike Katrina and Sandy, this catastrophe proved to be a tipping point: the English government introduced and enforced stringent controls that ensured its citizens never again suffered a major plague epidemic.

Social psychologist Paul Slovic has argued that “the ability to sense and avoid harmful environmental conditions is necessary for the survival of all living organisms,” whereas “humans have an additional capability that allows them to alter their environment as well as respond to it”–but only if they deploy two distinct skills: learning processes (the observation, measurement and classification of natural phenomena) and learning steps (the development of techniques, practices and instructions designed to reduce vulnerability to future hazards). In order to activate this additional capability, humans apparently need to experience natural disasters “not only in magnitude but in frequency as well. Without repeated experiences, the process whereby managers evolve measures of coping with [disasters] does not take place.” The National Hurricane Center, a division of the United States National Weather Service, confirmed this insight in the wake of the disastrous hurricane seasons of 2004 and 2005 (which included not only Katrina but seven of the nine costliest storm systems ever to strike the United States). Another “disastrous loss of life is inevitable in the future,” they concluded sadly, because the majority of those living in areas at risk have “never experienced a direct hit by a major hurricane” and seemed incapable of envisaging what one is like, while the rest “only remember the worst effects of a hurricane for about seven years.” Adam Sobel reached a similar conclusion after Sandy:

When a particular type of event has not happened before, predictions that the risk of that event is significant do not, historically, generate the collective will necessary for us to make investments in resiliency. This is true even when the science indicates quite clearly that the event is quite likely to happen eventually, and that the consequences of being unprepared for it will be severe. Just as the vulnerability of New Orleans was known for decades before Katrina, the vulnerability of New York City and the coastal areas around it was known for decades before Sandy.

In the words of Australian Professor of Public Ethics Clive Hamilton, “Sometimes facing up to the truth is just too hard. When the facts are distressing it is easier to reframe or ignore them.”

Perhaps cognitive dissonance explains why many societies fail “to make investments in resiliency” to cope with our changing climate, despite the fact that the global temperature in 2016 was the warmest ever recorded, that it was the third consecutive year in which a new annual temperature record was set, and that the first sixteen years of the twenty-first century all ranked among the seventeen warmest on record; and despite the unprecedented extreme weather events–notably prolonged periods of unusual heat, heavy downpours, and flash floods–that disrupted people’s lives and damaged infrastructure. One may deny that the global climate is changing, but it is hard to deny that a heatwave in Europe in 2003 that lasted just two weeks caused the premature deaths of 70,000 people; that eighteen major flood events hit Texas, Louisiana, Oklahoma and Arkansas between March 2015 and August 2016; or that almost 18 inches of rain fell on Sacramento and other parts of California in January and February 2017, causing damage to roads, dams, and other infrastructure that may cost $1 billion to repair.

The scientific community is virtually unanimous on the reality of global warming: of 69,406 authors of recent peer-reviewed articles on the subject, only five rejected it. Likewise, only 2 per cent of the members of the American Association for the Advancement of Science, the world’s largest multidisciplinary scientific professional society, denied that the Earth is warming (the same proportion who rejected evolution), a figure that fell to 1 per cent among earth scientists surveyed. By contrast, a 2016 Pew Research Center survey of adults living in the United States revealed that less than half believe “that the Earth is warming mostly due to human activity,” and scarcely a quarter believe that “almost all climate scientists agree that human behavior is mostly responsible for climate change.”

These findings, and similar ones elsewhere, prompted a team of researchers to ask people in forty-seven different countries “How serious do you consider global warming?” and why. The responses revealed three broad reasons for popular scepticism:

  • A marked negative correlation between biblical fundamentalism and concern for the environment is evident, particularly among Christians in the United States, many of whom see natural disasters as divine punishments for sin, so that both preparation and mitigation are a waste of time and money (a peccatogenic outlook also common in the seventeenth century).
  • A negative correlation also exists between residence in regions highly exposed to natural disasters such as hurricanes and concern about global warming, either because hazard and disaster are accepted as aspects of daily life, or because it is hard to admit that buying property in such a high-risk location was foolish.
  • Respondents in rich countries, and in countries with large carbon dioxide emission levels, showed less concern about global warming than those in poor countries or in countries with low emissions, no doubt because it is harder to accept global warming as a problem when it requires recognition that it is partly your fault and when mitigation requires major personal sacrifices.

All three reasons reflect the culture of passivity described in A failure of initiative, but they receive reinforcement from a small but powerful group of activists.

In his blueprint for scientific research, the New Atlantis in 1626, Francis Bacon extolled the “merchants of light” who travelled afar to bring back scientific knowledge and then collaborated to debate its significance and apply its practical benefits. Almost all scientists today do precisely this, but on a few subjects they have to contend with “merchants of doubt” who seek to discredit their research and promote distrust of their conclusions. Over the past half-century, the merchants of doubt have vigorously challenged the scientific evidence for links between tobacco-smoking and lung cancer, between acid rain and environmental damage, and between chlorofluorocarbons (CFCs) and the depletion of the ozone layer, as well as between CO2 emissions and climate change. In each case, they have found and funded contrarian experts (almost always from a scientific field not relevant to the environmental or public health issues in question) and insisted that journalists devote equal attention to the deniers, however few and however unqualified. Although the deniers eventually lose, they still cause immense harm. Thus, in 2006, after seven years of litigation, US Federal Judge Gladys Kessler determined that the tobacco industry had engaged in an “unlawful conspiracy to deceive the American public about the health effects of smoking” since the 1950s. Moreover, because it “consistently, repeatedly, and with enormous skill and sophistication, denied these facts to the public, to the Government, and to the public health community,” it managed to delay the regulation of cigarettes by almost half a century, thereby causing “a staggering number of deaths per year, an immeasurable amount of human suffering and economic loss.”

In the case of climate change, the number of deaths, the amount of human suffering and the economic loss will be far higher. Extreme weather caused 91 per cent of almost 16,000 natural disasters recorded worldwide between 1980 and 2015. More specifically, in Europe, where extreme weather accounted for 92 per cent of all natural disasters reported in the same period, economic losses caused by weather-related events increased from a decadal average (adjusted for inflation) of over €7 billion in the 1980s, to over €13 billion in the 1990s, and over €14 billion in the 2000s.

These alarming statistics have attracted the attention of the world’s insurance companies. The International Association for the Study of Insurance Economics, also known as the Geneva Association, estimates that “losses from weather events are growing at an annual 6 per cent, thus doubling every twelve years”; it predicts that “rising loss trends will continue”; and it blames the loss trends on a synergy between climate change and human perversity.

The lack of preventive strategies (for instance, land zoning, building codes, and so on) in many countries’ development planning results in increasing vulnerabilities and risks due to disasters and climate change. Further, ever more people and assets are concentrated in exposed (urban) areas such as coastal regions in low- and middle-income countries. At the same time, interconnected global supply and manufacturing chains are highly vulnerable to disaster-induced disruption. And, last but not least, climate change is believed to add to the increasing severity and frequency of extreme events.
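The Association’s headline figure is simple compound arithmetic: losses growing at 6 per cent a year double in

$$t = \frac{\ln 2}{\ln 1.06} \approx 11.9 \text{ years},$$

which rounds to the twelve-year doubling period the Geneva Association cites.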

Amitav Ghosh has emphasized the special vulnerability of Asia. “If we consider the location of those who are most at threat from the changes that are now under way across the planet,” he wrote, “the great majority of potential victims are in Asia.” Because of rising sea-levels, falling aquifers and desertification, “the lives and livelihoods of half a billion people in South and Southeast Asia are at risk. Needless to add, the burden of those impacts will be borne largely by the region’s poorest people, and among them, disproportionately by women”–and yet “there exist very few polities or public institutions that are capable of implementing, or even contemplating, a managed retreat from vulnerable locations.”

High-impact natural disasters will not be confined to South and Southeast Asia. A report published in 2007 by the military advisory board of the Center for Naval Analyses, a non-profit research organization based in Arlington, Virginia, presciently warned that “Climate change acts as a threat multiplier for instability in some of the most volatile regions of the world. Projected climate change will seriously exacerbate already marginal living standards in many Asian, African, and Middle Eastern nations, causing widespread instability and the likelihood of failed states.” “The major impact on Europe from climate change is likely to be migrations,” they continued, “now from the Maghreb (Northern Africa) and Turkey, and increasingly, as climate conditions worsen, from Africa.” Their prediction soon came true. In Syria, a country whose population had grown sevenfold in two generations, a multi-season, multi-year extreme drought, which began in the winter of 2006-7, reduced yields of wheat and barley by one-half and two-thirds respectively, and destroyed livestock herds. Over 1 million hungry and homeless people fled from rural areas to the cities, just as their ancestors had done when a similar prolonged drought struck in the seventeenth century. There they raised the population of some cities by one-third, dramatically driving up food and housing prices, overstraining services such as hospitals and schools, increasing urban unemployment, economic dislocation, and social unrest. Civil war broke out in 2012, resulting in over 1 million starving refugees crossing into Turkey and thence into Europe. By 2016, half the population of Syria had been displaced.

What can be done to avoid such heart-rending scenarios? The Church of Jesus Christ of Latter-day Saints currently enjoins its members to prepare for a sudden natural disaster by taking “the amount of food you would need to purchase to feed your family for a day and multiply that by seven. That is how much food you would need for a one-week supply. Once you have a week’s supply, you can gradually expand it to a month, and eventually three months.” More modestly, the State of Ohio recommends that each family should store “enough food and water to last from several days up to two weeks.” In particular, since “you can exist on very little food for a long time, but after a short time without adequate water, your body will not be able to function… a family of 4 who wanted to keep a 1-week supply of water on hand would need to store 28 gallons.” And after that? The devastation caused by Hurricanes Katrina and Sandy and countless other abrupt high-impact weather events lasted for months, and in a few areas for years, far exceeding the resilience of even the most forward-looking family or local government acting alone.
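The State of Ohio’s figure implies the common emergency-planning allowance of one gallon of water per person per day:

$$4 \text{ persons} \times 7 \text{ days} \times 1 \text{ gallon per person per day} = 28 \text{ gallons}.$$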

In December 2012, two months after Sandy, Mayor Michael Bloomberg of New York City created a Special Initiative for Rebuilding and Resiliency and tasked it with producing a plan for “a stronger, more resilient New York.” Its report, 445 pages long, laid out over 250 initiatives to “make our city even tougher,” at a combined cost of almost $20 billion–but the city could not act alone. The report concluded: “Given the important role played by the Federal government in flood risk assessment, flood insurance, and coastal protection measures, a clear Federal agenda for the City to pursue (in partnership with the State and the Congressional delegation) is critical to the successful implementation of the plan outlined in this report.” It added: “While this list does not reflect all of New York City’s needs from the Federal government, it does reflect a set of priorities that require immediate attention.”

This recognition reflects the experience of other societies: preparing for, and coping with, a major weather-induced catastrophe requires resources that only a central government can command. The construction of the Thames Barrier in southeast England offers an instructive example. The river Thames has frequently burst its banks and flooded parts of London. In 1663 Samuel Pepys reported “the greatest tide that ever was remembered in England to have been in this river: all White Hall having been drowned.” Proposals were made to erect a barrier to prevent the recurrence of similar catastrophes but the opposition of London merchants, whose trade would suffer if ships could not sail up the Thames, and disagreements among competing jurisdictions over the cost, thwarted them. Then, in 1953, a tidal surge in the North Sea flooded some 150,000 acres of eastern England and drowned more than 300 people. The government declared: “We have had a sharp lesson, and we shall have only ourselves to blame if we fail to profit from it,” and set up a committee to propose remedies. It, too, recommended the immediate construction in the Thames estuary of a “suitable structure, capable of being closed,” but opposition from shipping interests and cash-strapped local authorities again prevented action.

In 1966 the government asked its chief scientific adviser, Hermann Bondi, to examine the matter afresh. A mathematician by training, Bondi devoted much attention to assessing risks; but he also consulted historical sources and found that the height of storm tides recorded at London Bridge had increased by more than 1 meter since 1791 (when records began). Although he could not identify the cause of this alarming development, Bondi predicted that sea levels would continue to rise, increasing the probability of another “major surge flood in London” that would deliver “a knock-out blow to the nerve center of the country.” He compared the likelihood of this with other low-probability/high-impact events, such as an asteroid or meteorite hitting central London, which would also cause immense damage; but concluded that the risk was remote and prevention almost impossible–whereas, given the rising level of the North Sea, another disastrous flood similar to, or worse than, that of 1953 was inevitable. He therefore unequivocally recommended the construction of a Thames Barrier.

Although shipping interests and local authorities did their best to thwart this plan, too, in 1972 Parliament passed the Thames Barrier and Flood Protection Act and promised to fund Bondi’s recommendation. By its completion in 1982, the barrier had cost £534 million–but the property it protects now exceeds £200 billion in value, and includes 40,000 commercial and industrial properties and 500,000 homes with over 1 million residents. If the Thames Barrier were not in place, and another flood were to “drown” Whitehall, the heart of government today as in the time of Pepys, it would displace the 87,000 members of the central administration who work there. It would also flood the new Docklands economic development as well as sixteen hospitals, eight power stations and many fire stations, police stations, roads, railway lines, rail stations and underground stations, as well as the shops and suppliers needed to repair and replace items damaged in the flood. Londoners would therefore lose not only their homes and their jobs but also the essential means of response and recovery. In short, without the Thames Barrier, London would resemble New Orleans in 2005: vulnerable to a natural disaster that, like Katrina, is sooner or later inevitable. By 2016 the Thames Barrier had been activated to prevent flooding 176 times, 50 of them over the winter of 2013-14.

Extreme weather is a great leveller. Despite all the obvious differences, humans in advanced societies have the same basic needs as humans elsewhere. We all need shelter, sufficient water and at least 2,000 calories a day; and we are all vulnerable during “the hungry time” (the term used by the Aboriginal people of Western Australia for the season between the end of one annual cycle and the beginning of the next), because if the power grid fails, very soon there will be no food on supermarket shelves and no water. The changing geographical distribution of the global population is increasing that vulnerability. In 1950, Europe had three times the population of Africa, but in 2016 the population of Africa was at least 50 per cent larger than that of Europe–a disparity that widens every year as the former grows and the latter declines. This shift increases the percentage of the global population that spends a high proportion of disposable income on basic needs such as water, food, energy, and housing, often in areas where even central governments lack effective means of dealing with major disasters, leaving those people more vulnerable to the effects of climate change. The Global Crisis of the seventeenth century prematurely ended the lives of millions of people. A natural catastrophe of similar proportions and duration today would prematurely end the lives of billions.

We face clear choices. As Britain’s chief scientific adviser observed in 2004, summarizing the research of nearly ninety leading experts on the risks of flooding, “We must either invest more in sustainable approaches to flood and coastal management or learn to live with increased flooding.” Anthony Zinni, former commander-in-chief of US Central Command (responsible for the Middle East, North Africa and Central Asia), made a similar point with characteristic bluntness in a 2007 interview: “We will pay for this one way or another,” he said. “We will pay to reduce greenhouse gas emissions today, and we’ll have to take an economic hit of some kind. Or we will pay the price later in military terms. And that will involve human lives. There will be a human toll. There is no way out of this that does not have real costs attached to it.” In 2011, a study of the impact of climate change, based on thirty years of empirical data, prepared for New York State and published (ironically) just eleven months before Sandy struck, quantified the equation: “There is an approximate 4-to-1 benefit-to-cost ratio of investing in protective measures to keep losses from disaster low.” In short, we can pay to prepare, and commit substantial resources now, or we can incur far greater costs to repair at some future date.

Like Cassandra, historians who prophesy rarely receive much attention from their colleagues (or anyone else), and those who prophesy doom (whether or not they are historians) are normally dismissed as whiners–hoggidiani, to use the dismissive term in Secondo Lancellotti’s 1623 bestseller Nowadays. Yet hoggidiani are not always wrong. As environmental historian Sam White has observed, “Studying climate without considering the history of climate is like driving without a rearview mirror: it provides not just parables but also parallels about past climate change and its effects.” Earth scientists Tim O’Riordan and Tim Lenton concurred; because “there is no single template for anticipating and adjusting” to weather-induced disasters, they argued that there is “no substitute for good case history of successful practice.” We therefore “need to follow examples of successful anticipation” and adjustment in the past “in order to offer the best set of learning experience for others to follow.”

The seventeenth-century Global Crisis offers two successful but very different learning experiences. Famine and unrest in Japan led Tokugawa Iemitsu and his advisers to create more granaries, upgrade the communications infrastructure, issue detailed economic legislation and avoid foreign wars in order to preserve sufficient reserves to cope with the consequences of extreme weather. But, as Ghosh notes, “Climate change poses a powerful challenge to what is perhaps the single most important political conception of the modern era: the idea of freedom, which is central not only to contemporary politics but also to the humanities, the arts, and literature.” The successful efforts of the Tokugawa to protect their subjects from starvation involved policies that few today would deem acceptable: they forbade freedom of speech, belief, assembly or movement; monopolized the possession and use of firearms; conducted constant surveillance and summarily executed offenders; and, when all else failed, they permitted mabiki, literally “thinning out,” but also the metaphor of choice for infanticide. England followed a different strategy for adjusting to weather-induced disasters. Successive dearths in the 1590s, 1629-31, 1647-49, and the 1690s gradually forced reluctant property owners to accept the central government’s argument that it was both cheaper and more efficient (as well as more humane) to support locally those who became old, widowed, ill, disabled, or unemployed, thus creating the foundations of the world’s first welfare state.

Nevertheless, not even England could cope with an abrupt change in the global climate, as George Gordon, Lord Byron, discovered in 1816 after he had fled the country amid accusations of incest, adultery, wife-beating, and sodomy. He planned to relax in a villa near Lake Geneva with a former mistress, his personal physician John Polidori, and a select group of close friends. Instead, the party spent a “wet, ungenial summer” (Switzerland was one of the areas worst affected by global cooling), which forced Byron and his companions to spend almost all their time indoors. Among other recreations, they competed to see who could compose the most frightening story. Mary Wollstonecraft Shelley began work on Frankenstein, one of the first horror novels to become a bestseller; Polidori wrote The vampyre, the progenitor of the Dracula genre of fiction; Byron composed a poem that he called “Darkness.” All three works reflected the disorientation and desperation that even a few weeks of sudden climate change can cause. As we debate whether it makes better sense today to invest more resources in mitigation or “continually seem to be one disaster behind,” we might re-read with profit Byron’s poem because, unlike our ancestors in 1816 (and in the seventeenth century), we possess both the resources and the technology to choose whether to prepare today or repair tomorrow.

Darkness by Lord Byron

I had a dream, which was not all a dream.
The bright sun was extinguish’d, and the stars
Did wander darkling in the eternal space,
Rayless, and pathless; and the icy Earth
Swung blind and blackening in the moonless air.
Morn came and went – and came, and brought no day,
And men forgot their passions in the dread
Of this their desolation; and all hearts
Were chill’d into a selfish prayer for light:
And they did live by watchfires – and the thrones,
The palaces of crowned kings – the huts,
The habitations of all things which dwell,
Were burnt for beacons; cities were consum’d,
And men were gather’d round their blazing homes…
And War, which for a moment was no more,
Did glut himself again: a meal was bought
With blood, and each sat sullenly apart
Gorging himself in gloom: no love was left.
All Earth was but one thought – and that was death,
Immediate and inglorious; and the pang
Of famine fed upon all entrails…

From Global Crisis: War, Climate Change and Catastrophe in the Seventeenth Century – Abridged and Revised Edition by Geoffrey Parker, published by Yale University Press in 2017. Reproduced by permission.


Geoffrey Parker is Andreas Dorpalen Professor of History and associate of the Mershon Center at The Ohio State University, and the 2012 winner of the Heineken Prize for History. He lives in Columbus, OH.


Featured Image: “A soldier directs Hurricane Katrina victims as they exit the back of an Army CH-47 Chinook helicopter during relief efforts in New Orleans, Sept. 3, 2005” by U.S. Navy Petty Officer 1st Class Robert McRill, a U.S. Department of Defense photograph in the public domain.
