
On went the tests, in Nevada and in the Pacific, always with the same fearful urgency that had become part of the national psyche in America and in the Soviet Union as the two countries built out arsenals capable of destroying the planet many times over. H. G. Wells had predicted as much, back in 1914 in a book titled The World Set Free, in which he imagined warfare in the 1950s involving “atomic bombs.” The arms race had, in fact, begun in 1939—the same year that Paul Müller discovered the lethal properties of DDT in his lab in Switzerland—when Albert Einstein sent a letter to President Roosevelt telling him about recent experiments in physics that might lead to a new kind of weapon unlike anything before.

Dr. Einstein explained that scientists in several countries—including Germany—were working on nuclear fission. If fission could be induced in a sufficiently large quantity of uranium, Einstein said, it might lead to a chain reaction that would release “vast amounts of power and large quantities of new radium-like elements.” The likelihood of achieving this in the near future was all but certain. Fission, Einstein told the president, could be used for energy production or for building bombs. Einstein thought such bombs would be too big to be carried on airplanes, but one could perhaps be put on a boat and exploded near an enemy port, where it “might very well destroy the whole port together with some of the surrounding territory.”

An important consideration for America would be to secure a source of uranium, as there were poor stocks of the ore in the United States. Einstein pointed out that Czechoslovakia had good uranium mines, but that Germany had recently halted sales of the ore when it took control of Czechoslovakia. As this seemed to mean only one thing, Einstein urged that the United States—even though it was not yet at war with Germany—speed up its experimental work on nuclear fission at once.

After the war, the American fear of a competing nuclear state was transferred to a new enemy, the Soviet Union—whose 1949 test of an atomic bomb came ahead of the timeline predicted by U.S. intelligence and was followed by a hydrogen bomb in 1955. The Soviet program was believed to have been aided by an American husband-and-wife espionage team, Julius and Ethel Rosenberg, who supposedly handed over diagrams of atomic weapons and other documents that enabled the Soviets to accelerate their development efforts.

It was actually unclear whether the Rosenbergs gave up anything of real value to the Russians—or whether Ethel was even directly involved—but they were caught up in a new kind of public hysteria. The Rosenbergs were convicted of spying and were put to death in the electric chair at Sing Sing prison in 1953. Ethel’s execution was unusually brutal: two applications of the current failed to kill her, and the third, which finally ended her life, set her on fire.

President Truman all but shut down America’s civil defense efforts at the close of World War II, but after the Soviet atomic test in 1949 and the outbreak of war in Korea a year later—seen by some as a distraction initiated as a prelude to a Soviet move in Europe—Truman launched a new agency called the Federal Civil Defense Administration in early 1951. Not surprisingly—given its ludicrous mission of protecting the United States in the event of nuclear war—the agency was widely distrusted, chronically underfunded, and in constant dispute with states and cities as to who was responsible for what.

The agency’s mission was to develop plans for the evacuation and sheltering of the entire U.S. population in the event of nuclear war. In practice, the agency mainly distributed pamphlets and short films—many aimed at schoolchildren—with advice on how best to survive a nuclear attack. It also plastered public buildings with the civil defense logo to indicate which ones were suitable as bomb shelters in the event of an air raid. But as the decade progressed, the concept of civil defense grew increasingly tenuous. Just one month after the Castle Bravo test at Bikini Atoll, civil defense officials in New York said they had to completely rethink how best to respond to a nuclear attack in which a single hydrogen bomb could level the city. The only solution seemed to be a complete evacuation of eight million people—an improbable undertaking given the Air Force estimate that it could provide an advance warning of about one hour in the case of a Soviet attack with long-range bombers. A year later, citizens in Washington, D.C., were mortified to learn that Congress had allocated $115,000 for civil defense in the nation’s capital—compared with the $650,000 annual budget for the National Zoo.

In the early 1950s, the civil defense advice dispensed to the public emphasized the likelihood that most people would survive a nuclear war. Anyone unfortunate enough to be close to ground zero of an atomic bomb explosion—from a weapon comparable to the bombs used against Japan or possibly a little larger—would, of course, be killed outright, as would many others within a radius of a couple of miles. But the effects of heat, blast, and initial radiation, which were the main threats, diminished the farther away you were.

Civil defense officials also insisted that there was little to fear from the “radioactive clouds” sent high into the atmosphere in an atomic explosion, as the debris within them emitted much less radiation than was given off in the initial blast and would be “carried off harmlessly” and dispersed over a large area. This had been the case in the bombings of Hiroshima and Nagasaki, where all of the radiation sickness had been attributable to exposure to “explosive radiation” and not to lingering radioactivity that precipitated out of the sky.

It was also assumed that atomic bombs would be routinely set to detonate high over their targets—which would maximize the blast effects and also reduce the volume of radioactive debris sent skyward. People were told that it would be safe to leave cover “after a few minutes” following an atomic attack in order to assist the injured and help fight fires that were likely to have started. The notion that an atomic detonation would render a large area uninhabitable for a long time was dismissed as a “myth.” The U.S. Department of Defense said that fallout would become a significant danger over a longer period only in the event of the simultaneous explosion of “thousands” of atomic bombs.

Of course, as Castle Bravo demonstrated, a single hydrogen bomb could easily be a thousand times as powerful as an atomic bomb, and as the U.S. and Soviet nuclear arsenals shifted to hydrogen weapons, everyone had to rethink the meaning and nature of nuclear warfare. An aerial burst of a hydrogen bomb could be expected—depending on the size of the weapon—to cause massive damage as much as twenty miles away from ground zero and would fill the sky with tons of deadly radioactive debris. By the end of the decade fallout was seen as the major threat in a nuclear exchange that would kill millions of people initially and pose a continuing danger to tens of millions in its aftermath. This meant that, in theory at least, everyone in the United States needed access to a fallout shelter with supplies of food, water, medicines, and other necessities that could last at least two weeks. In 1955, the Civil Defense Administration spent $8.3 million trying to develop “survival plans” for communities across the country—a grim undertaking that in the end seemed pointless.

The folly of civil defense planning grew in direct proportion to the increasing danger inherent in a nuclear exchange with the Soviet Union. In the fall of 1957, the Russians launched Sputnik, the first manmade earth satellite—gaining a shocking advantage over the U.S. space program and signaling that the Soviet Union would soon be capable of attacking the United States with rockets armed with nuclear warheads. This would cut any advance warning of an attack from an hour or two to something like fifteen minutes—rendering meaningless the concept of mass evacuations from targeted areas.

The Americans and the Soviets had both been working on missile systems since World War II, and in the late 1950s and early ’60s, their nuclear arsenals were rapidly deployed among growing fleets of long-range, land-based rockets and missiles that could be launched from submarines anywhere in the world. Armageddon, which had been coming into view since Trinity, could now be envisioned as two great shadows rising from the earth simultaneously and passing each other in opposite directions above the atmosphere, curving toward the end of all things in a white-hot hell of thermonuclear doom.

It was understood, of course, that not every American city would be targeted. But the lesson from Castle Bravo and the tests that followed was that living through a nuclear attack was only the first step in surviving whatever would come after that, as radioactive fallout would prove similarly deadly over a much greater area. Nuclear warfare was a two-headed demon that killed whatever was close to the fire and poisoned everything else.

In 1958, a high-ranking civil defense official declared that the “saving grace” in a nuclear attack was that anyone who wasn’t vaporized in the initial blast would have some time to reach shelter before the fallout began to come down. But the original plan to build a nationwide system of fallout shelters never happened. Civil defense officials had imagined an elaborate complex of underground community shelters, protected areas in schools and other public buildings, and subsidized private shelters for individual property owners. But the cost of such a system—estimated at a then unimaginable $300 billion—was prohibitive. In 1957, President Eisenhower rejected a more modest proposal for a $40 billion national shelter system, on the novel theory that total nuclear war was an unacceptable proposition. “You can’t have this kind of war,” Eisenhower said. “There just aren’t enough bulldozers to scrape the bodies off the streets.” By the 1960s, the question some civil defense planners had begun to ask was whether in the event of nuclear war “the survivors would envy the dead.”

Anxiety over the prospect of nuclear confrontation—mixed with a growing awareness of the mushroom clouds rising regularly over the American desert—was reflected in what became a science fiction subgenre unto itself, the “radioactive mutant” film. The prototype—a giant lizard called Godzilla—came from Japan in 1954 and gave rise to a string of B-movies produced in America, all formulated on the idea that exposure to radioactivity could change the nature and appearance of someone or something, and always for the worse. One favorite was a 1957 film called The Amazing Colossal Man, about an army officer injured in a nuclear test who suddenly grows to a height of sixty feet—a size at which insufficient blood supply to his brain sends him rampaging. Another was Them!, a surprisingly well-cast and ingeniously written shocker that came out the same year as Godzilla. It was about an outbreak of radioactively enhanced killer ants that had morphed to the size of Studebakers and developed a paralyzing scream. These silly but nervous entertainments came after a more sober and cautionary offering from Hollywood, the 1951 classic The Day the Earth Stood Still, in which an alien arrives on earth accompanied by a robot with immense powers. Their mission is to inform the planet’s inhabitants that if they continue the development of nuclear weapons, an interplanetary police force will have no choice but to destroy the earth.

This was, of course, an indictment of the so-called Cold War between the United States and the Soviet Union—though the term seemed an antiseptic way of describing the grim images of nuclear holocaust that every American carried at all times, a mournful fear that one day—no one could say when or for sure—the sky might light up as if from a thousand suns and that would be it. None of this was lost on America’s children, who drilled regularly for the end of the world. If a teacher suddenly yelled “Flash!” every kid over the age of five knew that meant to “duck and cover” by whirling to the floor and crouching beneath their desks, arms wrapped tightly around their heads, to wait patiently for the shock wave to arrive. There were also panic-inducing policies concerning who was to go where in the event there was a warning of an imminent attack. For many kids, this meant that if you lived close enough to school to run home in less than fifteen minutes you could do so—and presumably then at least die with Mom and Dad. Those who lived farther away were to stay put and let death visit them at school.

These same children, having learned this lesson well, came of age in the 1960s having no problem believing that the world could end and that human technology could end it by means both seen and unseen.

Even after the attacks on Japan, atomic warfare—though a fearsome possibility—remained an abstraction for most people. Nuclear weapons were built in the belief that having them meant they would never be used. But this was not true of the other new weapon of the modern age. DDT had been invented to fight a war without end. The insecticide helped eradicate dwindling populations of malaria-carrying mosquitoes in the United States and in Europe, where the disease was already on the way out, and was then almost immediately commercialized for use in agriculture and many food service industries, in forestry, and for residential insect control. Nobody understood exactly how DDT worked, though it was clearly some kind of nerve poison. It was also relatively inexpensive to manufacture, and its lethal effects persisted long after it was applied, notably when it was sprayed on the inside walls of buildings. As synthesized, DDT is a whitish powder. It doesn’t dissolve in water but can be formulated into dusts, oil-based sprays, and aerosol “bombs” used for fumigating entire rooms. People sprayed their beds with DDT to control bedbugs, delighted that a single application worked “for months.” DDT went wherever people lived or ate or grew their food.

As the first in a wave of synthetic insecticides that came into use after World War II, DDT steadily replaced natural pesticides such as pyrethrum, which is derived from chrysanthemum flowers, and an assortment of arsenic-based compounds that were heavily used on crops such as cotton and tobacco—a poisonous additive that made cigarettes even more toxic than they inherently were. Nobody kept close records of how much DDT was used, or where—especially early on. But the speed with which it entered into widespread use was breathtaking.
