The first age of pandemics followed in the wake of farming, cities and trade, because infections leverage proximity and numbers to survive and evolve. After millennia of mass mortality, followed by two centuries of progress against plagues driven by sanitary and medical revolutions, will we allow a second age of pandemic death to flourish in the dense and connected world that progress has created? A poxed century can be avoided if we cooperate to respond.
The shift from prehistory to history was a shift to density. Even the most inefficient, early agriculture was associated with populations per square mile that were ten or twenty times higher than among nomadic groups. Farming brought humans into close living not only with each other, but with domesticated pigs, birds and cows that harboured species-hopping infections. This density—alongside links between populations—was vital to the emergence of mankind’s most deadly microbial adversaries. Measles, for example, mutated from a disease of cattle. It needs about five hundred thousand people living in close contact to survive, otherwise it dies out for lack of fresh victims to attack. New infections repeatedly assailed an ancient world of connected cities across large parts of Eurasia. The plague of Athens was one of the earliest reliably recorded of those pandemics. The city state was a major trading power, importing three million bushels of wheat a year from the Black Sea alone. According to the chronicler Thucydides, plague spread through North Africa and then invaded Athens in 430 BCE via the port city of Piraeus. Symptoms included “redness and inflammation in the eyes, the inward parts, such as the throat or tongue, becoming bloody and emitting an unnatural and fetid breath.”
This plague was not as deadly or widespread as one that came soon after Rome’s ambassadors first reached the Han emperor in China in 166 CE. Trading networks across the Eurasian Steppe helped spread a disease that might have been smallpox, which killed more than one in four people in parts of the empire. Later, in the sixth century, another pandemic from the Steppe crippled what remained of the Roman Empire: an outbreak of Yersinia pestis, the plague of the Black Death. Rats carrying infected fleas boarded the grain ships of Alexandria, which travelled the Mediterranean, spreading the bacillus as they went.
The collapse of Rome was eventually followed by renewed population growth—precisely because Europe was less connected, less densely occupied, and therefore less vulnerable to infection. In turn, this led to a revival of cities and a rekindling of Eurasian trade. The cycle was completed in the fourteenth century, with the pandemic of the Black Death. Mortality rates were so high in the city of Avignon that the Pope simply consecrated the River Rhone as a burial site. Every morning, hundreds of corpses were thrown into the waters. Across Europe, more than half the population died.
Soon after, Europeans spread around the world, taking their diseases with them. The New World was comparatively free of its own infectious diseases—in part because there were no domesticated pigs, horses, or cattle (and domesticated guinea pigs apparently posed little infectious threat). Before Columbus reached the Americas, areas 50 miles wide or more around the Amazon and its major tributaries may have been densely occupied all the way into Peru and the base of the Andes, 2,500 miles from the sea. The scale of destruction caused by imported European and African diseases—including malaria, smallpox, and typhoid fever—caused complete social breakdown. The few survivors in the Amazon basin were reduced to a Stone Age existence, their descendants to be found today among the region’s so-called ‘uncontacted tribes.’
Imperialism and global trade had the potential to make the cycle of plagues a worldwide phenomenon. But by the time of the last great die-offs of previously unexposed populations, the planet was on the verge of dramatic progress against infectious death that would close the first age of pandemics.
Until the last century and a half, our response to infectious diseases was limited by almost no understanding of what they were or how they spread. Instinct sometimes helped: running from concentrations of sick people, rats and fleas really worked to reduce plague risk, for example. The Florentine author Boccaccio, who lived through the Black Death, wrote a fictional account of a group of young social influencers who decamped to a country villa and told each other stories to pass the time in isolation. He suggested such people were “the most sound, perhaps, in judgment, as they were also the most harsh in temper.” Shutting out the potentially afflicted was a similarly harsh but effective instinct-based approach, and quarantines were first introduced in response to the plague.
Meanwhile, China may have been spared the Black Death altogether, perhaps in part because less over-crowded cities with better sewage and waste disposal were home to fewer of the vermin that helped Yersinia pestis infiltrate human homes. China also introduced another sanitary innovation that reduced risk, at least for the lucky few: when Marco Polo visited the country in the thirteenth century and attended an imperial banquet, he reported that the wait staff “have their mouths and noses swathed in fine napkins of silk and gold, so that the food and drink are not contaminated by their breath or effluence.”
But most responses to infection probably did more harm than good. In Edinburgh, for example, the authorities banned leeks, chives and onions in an attempt to quell the Black Death. The disease was blamed on misaligned planets, poor quality air and—at a deadly cost to the accused—a Jewish plot to poison wells. Meanwhile, treatments like bleeding and purging made patients more likely to die. Applying the shaved rear end of a chicken to the plague bubo at least broadly aligned with the Hippocratic injunction to ‘first, do no harm.’
Comparative helplessness to respond to microbial threats began to abate in the nineteenth century. Sanitary advances—from sewage systems and clean water to food standards, housing codes and sterilisation—began raising life expectancies in the pestilential cities of the Industrial Revolution. In the second half of the twentieth century, with a medical revolution based on a solid understanding of microbial biology, progress against premature death spread worldwide.
Considerably reduced deaths from diarrheal diseases, respiratory infections, malaria, and infected wounds all played their part in reducing global child mortality to a fraction of its historical level. Edward Jenner predicted an end to smallpox soon after he demonstrated the first vaccine against the condition in 1796; it was finally accomplished in 1980. That victory alone has already saved hundreds of millions of lives, and was achieved by a global effort that saw cold war adversaries America and the Soviet Union cooperate to ensure vaccines were available worldwide. In the past few decades, we reached a huge milestone: globally, more people now die from non-infectious diseases than from infectious ones.
It’s not a coincidence that we’ve also recently passed another milestone. In 1960, 1 billion people lived in urban areas worldwide. Today that’s closer to 4 billion—and since 2007, towns and cities have been home to the majority of the world’s people. Something similar has happened to our domestic animal population: the number of chickens worldwide has climbed from 3.9 billion in 1961 to 21.7 billion today, and more than three-quarters of those chickens are factory-farmed. Meanwhile, we have become more connected than ever: 1.2 billion people travel internationally as tourists each (normal) year. Through much of history, the great majority of humans have lived in—and rarely left—communities no bigger than 250 people. A single train or plane can hold far more people than that, as it speeds them to offices, conferences, concerts and malls. History suggested what might follow when density and connection are at such unprecedented levels.
COVID-19 is only the latest in a succession of new infectious threats that have emerged in recent decades. Since 1970, we’ve faced bird flu, SARS, Ebola, Nipah virus, Marburg fever, cryptosporidiosis and hantavirus. Not to mention resurging diseases, which include monkeypox, dengue and yellow fever, drug-resistant malaria, and even plague. And while HIV, the virus that causes AIDS, may have emerged as early as the 1920s, it only became a mass global killer in the 1980s, responsible for over thirty million deaths since then. The irony of our progress against death from infection over the past two centuries is that our success has helped create the perfect environment for the emergence of a new disease outbreak, and for that outbreak to have catastrophic global social and economic impact.
Does that mean we are entering a new age of pandemics? Certainly, the risk is there. Evolutionary biologist Katherine Smith and colleagues at Brown University studied more than 12,000 reported disease outbreaks worldwide since 1980 and concluded that both the number of diseases and the number of outbreaks have been increasing over time. Given how rapidly they can mutate, newly evolved microbial threats will surely continue to hurl themselves at humanity. And a world of billions living in globally connected cities will always be prone to their rapid spread. An aging worldwide population increases the risk of serious illness or death from the threats that emerge. In the last year, a team from Metabiota estimated that the probability of a future zoonotic spillover event resulting in a pandemic of COVID-19 magnitude, or larger, may be as high as 3.3 percent each year.
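A 3.3 percent annual probability may sound small, but it compounds. Treating each year as an independent trial (a simplifying assumption, not part of the Metabiota estimate itself), the chance of at least one COVID-scale pandemic over a multi-decade horizon can be sketched as:

```python
def prob_at_least_one(annual_p: float, years: int) -> float:
    """P(at least one event across `years` independent yearly trials)."""
    # Complement rule: 1 minus the probability of zero events every year.
    return 1.0 - (1.0 - annual_p) ** years

# Cumulative risk at the cited 3.3% annual probability.
for horizon in (10, 25, 50):
    print(f"{horizon} years: {prob_at_least_one(0.033, horizon):.0%}")
```

On these assumptions, the cumulative risk approaches a coin flip within a generation—roughly 57 percent over 25 years.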
Adding to the danger—and despite everything we’ve learned since the first age of pandemics—the world responds poorly to the risk of outbreak infections. Not least, we could reduce that risk in the first place through far better sanitation, for humans and domestic animals alike. Yet, nearly a third of the world’s people are still going to the toilet in open pits or in the fields, multiplying the danger posed by diseases that travel the fecal-oral route. Factory farm animals all too often stand packed together on piles of their own ordure. We make that infectious threat worse by regularly feeding antibiotics to livestock, increasing the risk that bacteria evolve resistance to one of our most powerful defences against illness.
We could also spot and respond to new threats faster with better disease surveillance. But World Health Organization Joint External Evaluation exercises carried out before COVID-19 hit suggested that much of the world failed to meet even the minimum surveillance obligations imposed by the International Health Regulations that the WHO oversees.
We could do a better job at controlling outbreaks as they occur through testing, tracing and isolation. But while a few countries—including Vietnam and South Korea—used those strategies to rapidly control spread of COVID-19 in 2020, most of the world utterly failed. The US, for example, seriously started thinking about how to expand access to cheap COVID-19 tests only eighteen months into the pandemic. Contact tracing efforts in place at the start were rapidly overwhelmed by the spread of the disease.
We could support rapid research and development of tests, preventatives, and treatments, as well as global capacity to produce them. But while we’re funding research on new bioweapons, we’ve let mass killers like AIDS and tuberculosis fester without vaccines—in part simply because they mostly kill poor people. And the last year demonstrated that production for everything from surgical masks through vaccines lacked the capacity to respond rapidly to demand. Once again, the world’s poor paid the greatest price.
As a result, we have been thrown back on the instinct-based responses of our late medieval forebears. Limiting contact through lockdowns helped slow COVID’s spread. Similarly, in countries like New Zealand that had domestic transmission controlled, travel restrictions helped keep COVID-19 out. But these were immensely costly solutions to flatten the curve of infection in the absence of better options. And travel restrictions were introduced (and have remained) where they do more harm than good. The US’s advance announcement of a partial travel ban, made weeks after the country was already seeded with infections in early 2020, resulted in a rush to get home before the ban took effect, leading to airports choked with unmasked people waiting hours packed together in lines to clear customs. That may have been one factor in the early and horrible COVID-19 losses suffered by New York City, home to JFK airport.
Worse, the exclusion instinct extended to an immensely parochial response once vaccines provided real hope for controlling pandemic impact. Not least, countries put in place export restrictions on pharmaceuticals and ingredients while favouring full dosing—and even third doses—for low-risk populations in rich countries before first dosing high-risk groups in the developing world. Inequitable vaccine access will lead to unconscionably more deaths, but it also risks extending the pandemic: new strains are more likely to emerge where COVID-19 spreads unabated.
All that said, the last eighteen months have also shown we can develop effective responses to infectious disease in record time. The first tests for COVID-19 were developed a matter of weeks after the virus was detected. Countries that combined testing with tracing and isolation regimes—including New Zealand, South Korea and Vietnam—have weathered the pandemic with very few cases. If the rest of the world puts in place similar capacity before the next pandemic, the risk of hundreds of millions being infected would dramatically decline. Death rates amongst hospitalized cases in countries including the US and UK fell considerably in 2020, thanks in part to improved treatment regimes. The process of creating, testing and manufacturing safe and effective vaccines has taken less than twelve months. If we had more global capacity for vaccine production, the timespan of inequitable vaccine access could be reduced from years to months.
These are signs of how much better placed we are than in the first age of pandemics. For most of history, ‘flattening the curve’ of cases would have made little sense as a disease fighting strategy: hospitals couldn’t save more people even with more time, and vaccines would never have arrived.
But we should do so much better. Global progress against infection over the past two centuries was built on efforts to make connection less dangerous. The prosperity and innovation this unleashed have given us unprecedented potential to respond to pandemic threats. In the circumstances, hiding behind walls is not only selfish, but also an immensely costly and high-risk response. Instead, we need global cooperation to ensure truly universal access to sanitation and basic health services, including vaccination and treatments; far stronger surveillance with global coverage; and greater and more distributed global capacity to research, develop and produce tests, treatments and vaccines. The costs of such measures might reach into the hundreds of billions, but pandemic costs are in the trillions.
Through sufficient neglect or miscalculation, we could allow communicable diseases to fight back and reclaim their place as death’s most popular weapon. History suggests such a reversal would shape the coming century more than almost any other conceivable event—more than climate change, even. The threat is on a par with that of limited nuclear war. And even if that full threat doesn’t materialise, we could allow poor response to new diseases like COVID-19 to stifle global progress. We have the technologies and the know-how to flatten the plague cycle permanently and limit the human and economic costs. The choice to avoid a new age of pandemics is ours.