Even when the ice didn't extend that far during the glacial events, much of the life in the temperate zones was disrupted. Ice sheets blocked rivers, stopping them from flowing into the sea and causing massive flooding of coastal regions. What had been forests became icy tundra where few of the existing species could survive. And wherever there was new ice and snow, further cooling was encouraged. Just as positive feedback amplifies climate change now, the ice sheets produced a positive feedback of their own: the more ice there was, the more sunlight was reflected back without warming the surface, making things cooler still and enabling more ice to form.
The first clues to this very different climate came from unexpected boulders. In the eighteenth century it was noticed that there were boulders in Alpine valleys far away from the native rock that had spawned them. It was known that glaciers carried rocks like these, and the only explanation seemed to be that the glaciers had once extended much farther than they now did. As geological science improved, there were other, more subtle signs of ice sheets traveling far beyond their present limits, from linear scratching on rocks to the chemical analysis of different geological layers and the distribution of fossils. All indicated a regular advance and withdrawal of the ice sheets over thousands of years.
In the interglacial periods, such as the one we now occupy, the ice sheets withdraw, leaving only residual sheets on land areas like Antarctica and Greenland. These interglacial periods have often lasted around ten thousand to fifteen thousand years—about how long we have been in this interglacial period; so there was some concern until recently that we might soon be plunged back into a glacial period, with the ice rendering the United States, Europe, and much of Asia uninhabitable.
In fact, this fear rested on an oversimplification; all the evidence was that even before the impact of climate change, we were probably several thousand years away from any danger from the ice. The problems of climate change mean that this is even less of a threat. Nonetheless, we should be aware that sometime in the future temperatures will begin to drop again, and ice will once more threaten human existence—though exactly when that will occur is not clear.
With all the possible threats we face from the natural world, from our science and technology, from war and terrorism, it might seem that we should be looking to the future with fear. On a bad day, it appears almost inevitable that we are going to wipe ourselves out. Yet there is still hope for the human race. We have to remember that science has been anything but all bad for us. And, as we’ll see, the Pandora’s box we opened has brought good as well as evil.
It is science alone that can solve the problems of hunger and poverty, of insanitation and illiteracy, of superstition and deadening custom and tradition…. The future belongs to science and those who make friends with science.
Jawaharlal Nehru (1889–1964), quoted in The Making of Optical Glass in India, Proceedings of the National Institute of Sciences of India (1961)
When we look back over the 4.5 billion years or so that the Earth has been around, the vast majority of life-threatening disasters that have brought a form of Armageddon to the world have been natural rather than man-made. This is hardly surprising when you consider the mere pinprick in time that is Homo sapiens’ portion of the Earth’s existence. Allowing a very generous million years (most figures put the emergence of Homo sapiens between 100,000 and 200,000 years ago), we have been around for one 4,500th of the life of the Earth. Humanity’s significant civilizations have existed for about a millionth of the lifetime of the Earth.
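As a rough back-of-the-envelope check on those fractions (taking the round figures above, and reading "about a millionth" as implying roughly 4,500 years of significant civilization, a figure inferred here for illustration):

\[
\frac{1{,}000{,}000\ \text{years of}\ Homo\ sapiens}{4{,}500{,}000{,}000\ \text{years of Earth}} = \frac{1}{4{,}500},
\qquad
\frac{\approx 4{,}500\ \text{years of civilization}}{4{,}500{,}000{,}000\ \text{years}} \approx \frac{1}{1{,}000{,}000}.
\]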
This means that we have, so far, missed out on the megadisasters that have resulted in mass extinctions of species. Looking back in time, using the fossil record as a time telescope, it is possible to deduce events, known as extinction events, when immense swaths of life were wiped out on the Earth. The best-known we have already considered—the K/T event, which took place around 65 million years ago. At the boundary between the Cretaceous and Tertiary periods, the bulk of the dinosaurs perished.
The dinosaurs weren’t alone. Around 50 percent of genera (the classification above species in the hierarchy of the description of living things) disappeared. This wasn’t a uniform extinction—or we wouldn’t be here. The conditions that ended the dinosaurs were survived by enough varied species of mammals for us to finally come out of the mix.
There have been at least four other major events, all in the last 500 million years. This doesn't mean that this period has been any worse than the years before it, but rather shows the limitations of the fossil-record telescope. There will have been events much longer ago, but we can't get a picture of them from the fossil record, as fewer and fewer fossils are available, and for a considerable time the only life was microscopic. The oldest of the known events was the Cambrian/Ordovician, around 488 million years ago, followed by the bigger Ordovician/Silurian event around 450 million years ago, which wiped out around 57 percent of genera.
Then came the late Devonian event, around 370 million years ago, followed by the massive Permian/Triassic event, sometimes given the evocative name "the great dying." Occurring 250 million years ago, this shock to the Earth's living systems wiped out over 80 percent of genera, taking an even greater toll of marine species, of which a remarkable 96 percent perished. It's the only mass extinction that has had a significant effect on insects as well as other creatures. In many ways this was the inverse of the K/T event, with many mammal-like reptiles wiped out, while dinosaurs were given the chance to move into the ascendant.
Finally, before the K/T, there was the Triassic/Jurassic event around 205 million years ago, with just under 48 percent of genera being eliminated. Small by comparison with some of the others, this was still a terrible toll when you consider that practically half of the genera in existence were wiped out. These five mass extinctions were just the largest-scale events, with other smaller groups of extinctions punctuating the periods in between.
If we consider the whole of human existence, it has also typically been natural events that have most significantly reduced the population, whether it was ice age or earthquake or plague. But in the last hundred years we have seen our ability to reap mass destruction from our science and technology blossom like a horrible flower.
The First World War killed around 10 million military personnel and more than 6 million civilians. The “Great War” was considered the war to end all wars because of the sheer horror of the mortality numbers, combined with the even greater casualty count. But just twenty-one years later, the Second World War dwarfed that level of slaughter. This would see more than 22 million military deaths and somewhere between 34 million and 47 million civilian deaths, a total death toll probably exceeding 60 million. This is comparable in scale to the Black Death, which is believed to have killed up to 50 million in Europe (though the population was much lower in the fourteenth century, so the percentage who died was much greater). With the two world wars, science and technology proved all too well their capability as a mass destroyer.
Outside of war, industrial accidents have provided our biggest man-made disasters. The Chernobyl disaster may have resulted in as many as four thousand deaths, though this is hard to verify, as much of the mortality ascribed to the reactor explosion took place many years later, when the initial cause was hard to prove. More certain is the direct link between the Bhopal disaster in India in 1984 and its casualties. It killed at least four thousand people and may have resulted in as many as twenty-five thousand deaths.
The American chemical giant Union Carbide (now subsumed into the Dow Chemical Company) ran a large chemical plant in Bhopal, a city in the Madhya Pradesh state of India. On the night of December 2–3, 1984, storage tank number 610 at the site, containing forty-two tons of the dangerous chemical methyl isocyanate, used in the production of pesticides, was contaminated with a large quantity of water. It is still not certain how that water got into the tank.
In the ensuing reaction between the water and the methyl isocyanate, temperatures in the tank soared far above the chemical's boiling point of 39 degrees Celsius, reaching around 200 degrees Celsius (around 400 degrees Fahrenheit). The temperature and resulting pressure far exceeded the tank's capability to contain them. To avoid an explosion, the tank automatically vented gas, sending huge quantities of poisonous fumes into the atmosphere. Around half a million people were close enough to the plant to be affected by the gases.
Thousands died in their beds. Many more were injured in the struggle to get away from the area, or by inhaling the fumes. As well as methyl isocyanate, the population was exposed to a range of other noxious gases, from oxides of nitrogen to phosgene, as the overheated chemicals reacted furiously with the atmosphere. Union Carbide has since paid out millions of dollars in compensation, but maintains that it was not responsible for the accident, blaming sabotage by disgruntled workers. Whatever the initial trigger really was, it is the location of the plant that has to be seen as the most significant factor in this tragedy.
Not all the deaths that have arisen from our use of science and technology have been on a large scale, but they should not be ignored. There have also been small-scale tragedies like the death of Marie Curie already mentioned, and near misses like the partial reactor core meltdown at Three Mile Island in Pennsylvania. These might not be worthy of the “Armageddon” tag in their own right, but they indicate circumstances that could have resulted in much wider danger. Scientists don’t always have a great track record in keeping themselves and others safe.
There is no doubt that science, particularly the application of science through technology, carries with it dangers for humanity, dangers that time and again have come to terrible fruition. The old myth of Pandora’s box was never truer than it is now—in fact, if it hadn’t been dreamed up all those years ago, it would be necessary to create it today. The term “Pandora’s box” has become a cliché. So it’s worth spending a moment on just what the original story was before seeing how applicable it really is.
In ancient Greek mythology, Pandora was the equivalent of Eve, the first woman, created directly by the gods. She was given by her creators a jar that was never to be opened, containing all the ills of the world, from disease to suffering. This jar would become the legendary box due to some careless mistranslation in the sixteenth century. The jar was, in Greek, called a pithos, from which we probably get the word "pitcher."
The ancient Greek poet Hesiod had written an epic poem called Works and Days, which retold many of the Greek creation myths. In it, he referred to Pandora's pithos. But when the poem was translated into Latin by the Renaissance Dutch scholar Erasmus, he mistakenly turned pithos into pyxis, which is Latin for "box." So the original Pandora myth was closer to the Arabic stories where evil djinns are locked away in jars.
But whatever the container, the intensely curious Pandora opened it. She had to learn what was inside. As the stopper was removed, all the evils that beset us flew out to infect mankind, leaving behind in the jar only hope. (I'm not sure why hope was in there in the first place, but its symbolic role in Pandora's actions is clear.)
It’s easy to see parallels between Pandora’s opening of the jar and the story of the apple in Genesis. Here, in the second of the creation myths presented in the Bible, again it is the first woman who lets loose something dangerous, though in Eve’s case it is a more subtle danger. I need to be clear about what’s meant by myth here, as the word is often used in a derogatory way, and that isn’t my intention. A myth is a narrative with a purpose. It tells of something useful for our everyday life through a story, usually occurring far in the past or in a distant land. The originator of the myth uses this exotic setting to explain a universal truth, or to put across an important piece of information in a way that will make it easier to remember and absorb.
In chapter 2 of Genesis we hear that there was one special tree in Eden. God tells Adam, “You may eat from every tree in the garden, but not from the tree of the knowledge of good and evil; for on the day that you eat from it, you will certainly die.” Later, the newly created woman is told by the serpent, “Of course you will not die. God knows that as soon as you eat it, your eyes will be opened and you will be like gods knowing both good and evil.”
Eve likes the look of the fruit (never named as an apple in the Bible) and eats some of it, also giving a portion to Adam—so they are cast out, though strangely, the serpent’s version of events seems closer to the truth than God’s, because they don’t die on the day they eat the fruit.
In both these myths, curiosity, or the desire for something unattainable, perhaps something for which the human being is unworthy, produces suffering for humanity. Our innate desire to know more, to investigate and to learn, rewards us with pain. And there is no going back. We can’t close the box (or rather, reseal the jar), nor can we unbite the apple. There is no way to return to an age of innocence. As the second law of thermodynamics has it, entropy increases.
Yet surely, the mythical attitude to discovery reflects a misunderstanding that enables us to end this book on a note of cautious optimism. It's true that curiosity in a physically dangerous environment leads to risk. Yet it also leads to great reward. There was no Garden of Eden in the real world. We don't have an idyllic past that we could return to if only we abandoned all the advances of science and technology and became once more scavengers on nature's bounty.
The fact is that most of the time, nature is pretty harsh on those who suffer it without any mitigation. Sustenance is rarely plentiful, and there are many physical dangers that face us. Science and technology have mostly proved either harmless or vastly beneficial to humanity. The world has gained hugely from scientific discoveries in everything from medicine to information technology.
Yes, we have created terrible weapons of destruction, and we are responsible for the threats that arise from our abuse of the planet, bringing the very real menace of climate change. But we also live more comfortable lives, protected from many of the medical threats that are an everyday reality for those limited to a natural world. It has been pointed out that the life span of a healthy human hasn’t altered vastly since biblical times, when three score years and ten (seventy) was given as the target age—but this hides a terrible reality.
You only have to think of the changes in child mortality over the years. Bear in mind that until well into the nineteenth century the majority of babies would die before reaching adulthood. When there’s a funeral for a baby or a child it is always a very emotional and particularly sad occasion. It is sobering to think that not many years ago, and throughout all of history before that, the majority of funerals were for babies and children.
Perhaps even more significant than the medical benefits science has brought us are the changes in the quality of life that we experience. For practically every human being, until very recently in terms of human evolution, life was one long struggle. There was no time for enjoyment, for wonder, for all the things that arguably make life worth living. There was just the endless fight to keep food and drink coming in, to reproduce, and to avoid predation.
Sadly, for a shocking percentage of the world's population this is still the case. But those of us lucky enough to live in the rich world have transformed lives, largely thanks to science and technology. Just read the quote from former Indian prime minister Nehru at the opening of this chapter. This was no Western fat cat, selfishly appreciating what he had, but someone who was well aware of the need for the benefits that science could bring.
What’s more, despite the caricature of emotionless beings driven by faulty logic, scientists are human too. In fact, there is something facile and inconsistent about the way that scientists are often portrayed as monomaniacs who ignore the consequences for the human race that inevitably follow from their evil work. On the one hand, scientists are portrayed as being cold, calculating, virtually inhuman in their logic. On the other hand, they can’t follow through as simple a logical chain as “Produce something that kills lots of people and lots of people may well be killed.”
There have been many examples of scientists contemplating the moral implications of their work, and often society exerts considerable controls to restrain scientific endeavor and to avoid unnecessary risk. Take the whole area of human genetic manipulation. In all countries where such work is carried out there are strict rules and regulations, with agonizingly careful consideration of each problem that is thrown up by new developments. We might not all agree with another country's approach to genetic ethics, and there will always be the occasional maverick, like those who claim (almost certainly falsely) to have produced human clones, but the reality is that science tends to be better policed than most areas of human endeavor.