Absolute Zero and the Conquest of Cold

In addition to flocking to cities for jobs, Americans also became urbanites in the latter part of the nineteenth century because there seemed to be fewer hospitable open spaces into which an exploding population could expand. Large areas of the United States were too hot during many months of the year to sustain colonies of human beings; these included the Southwest and parts of the Southeast, with their tropical and semitropical climates, deserts and swamps. Looked at in retrospect, the principal limitation on people settling in those areas was the lack of air conditioning and home refrigeration.

In the second half of the nineteenth century, the use of cold in the home became an index of civilization. In New York, 45 percent of the population kept provisions in natural-ice home refrigerators. It was said in this period that if all the natural-ice storage facilities along the Hudson River in New York State were grouped together, they would account for 7 miles of its length. Consumption of ice in New York rose steadily from the 100,000-tons-per-year level of 1860 toward a million tons annually in 1880. But while the per capita use of ice in large American cities climbed to two-thirds of a ton annually, in smaller cities it remained lower, a quarter of a ton per person per year.

When New York apple growers felt competitively squeezed by western growers who shipped their products in by refrigerated railroad car, they hired experts to improve the quality of their own apples. A specialist was hired to help prevent blue mold, a disease affecting oranges, so that California's oranges would be more appealing to New York consumers than oranges from Central and South America. Believing there were not enough good clams to eat on the West Coast, the city fathers of San Francisco ordered a refrigerator carload of eastern bivalves to plant in San Francisco Bay, founding a new industry there. Commenting in 1869 on the first refrigerated railroad-car shipment of strawberries from Chicago to New York, Scientific American predicted, "We shall expect to see grapes raised in California and brought over the Pacific Railroad for sale in New York this season."

The desire for refrigeration continued to grow, almost exponentially, but the perils associated with using sulfuric acid, ammonia, ether, and other chemicals in vapor compression and absorption systems remained a constraint on greater use of artificial ice, as did the high costs of manufacturing ice compared with the low costs of what had become a superbly efficient natural-ice industry. Artificial refrigeration finally began to surpass natural-ice refrigeration in the American West and Midwest in the mid-1870s. In the space of a few years, as a result of the introduction of refrigeration, hog production grew 86 percent, and the annual export of American beef (in ice-refrigerated ships) to the British Isles rose from 109,500 pounds to 72 million pounds. Simultaneously, the number of refrigerated railroad cars in the United States skyrocketed from a few thousand to more than 120,000.

Growth of the American railroads and of refrigeration went hand in hand; moreover, the ability conveyed by refrigeration to store food and to transport slaughtered meat in a relatively fresh state led to huge, socially significant increases in the food supply, and to changes in the American social and geographical landscape. "Slaughter of livestock for sale as fresh meat had remained essentially a local industry until a practical refrigerator car was invented," Oscar Anderson's study of the spread of refrigeration in the United States reported. And because refrigeration permitted processing to go on year-round, hog farmers no longer had to sell hogs only at the end of the summer, the traditional moment for sale—and the moment when the market was glutted with harvest-fattened hogs—but could sell them whenever they reached their best weight.

In Great Britain, the Bell family of Glasgow, who wanted to replace the natural-ice storage rooms on trans-Atlantic ships with artificially refrigerated rooms that could make their own ice, sought advice from another Glaswegian, Lord Kelvin, who assisted the engineer J. Coleman in designing what became the Bell-Coleman compressed-air machine, which the Bells used to aid in the transport of meat to the British Isles from as far away as Australia. Because of refrigeration, every region of the world able to produce meat, vegetables, or fruit could now be used as a source for food to sustain people in cities even half a world away. Oranges in winter were no longer a luxury affordable only by kings.

Refrigeration in combination with railroads helped cause the wealth of the United States to begin to flow west, raising the per capita income of workers in the food-packing and transshipment centers of Chicago and Kansas City at the expense of workers in Boston, New York, and Philadelphia. Refrigeration enabled midwestern dairy farmers, whose cost of land was low, to undercut the prices charged for butter and cheese by the dairy farmers of the Northeast. Refrigeration made it possible for St. Louis and Omaha packers to ship dressed beef, mutton, or lamb to market at a lower price per pound than it cost to ship live animals, and when the railroad magnates tried to coerce the packers to pay the same rate for dressed meat as for live animals, the packers built their own refrigerated railcars and forced a compromise.

The enormous jump in demand for meat, accelerated by refrigerated storage and transport, spurred ranchers and the federal government to take over millions of acres in the American West for use in raising cattle. This action brought on the last phase of the centuries-long push by European colonizers to rid America of its native tribes, by forcing to near extinction the buffalo and the Native American tribes whose lives centered on the buffalo. The conventional view of American history is that it was the "iron horse" that finally killed off the "red man"; but one could with as much justification say that it was the refrigerator.

Cold of the temperature of ice—cold adequate for most tasks of preserving food and medicines, making beer, transporting crops, preventing hospital rooms from overheating—could be produced by ordinary refrigeration. But scientific explorers wanted to journey far beyond the shoreline of the country of the cold into a temperature region more than a hundred degrees below the freezing point of water. This was an arena beyond the sensory equipment of warm-blooded human beings, a region so cold that skin and nerves could not even register the intensity of its cold; the only way to measure its grade of cold was through thermometers. To conquer this region scientists would require a more powerful technology. They found it in the liquefaction of gases.

This was a rediscovery, for liquefaction had begun with van Marum and ammonia in 1787, and significant leaps forward had been taken in 1823, when Faraday had liquefied chlorine, and in the early 1830s by Thilorier, who actually went beyond liquefaction to create solid dry ice from carbon dioxide.

In 1838, for an audience at the Royal Institution, Faraday demonstrated the remarkably low temperature of −110°C, achieved by use of the "Thilorier mixture" of dry ice (carbonic-acid snow) and ether. He might have immediately gone further with liquefaction, using the Thilorier mixture, had he not suffered a mental collapse that friends attributed to the exhaustion of having done enough work in a single year to fill four scientific papers. Modern historians believe Faraday's illness may have been mercury poisoning, a malady not then recognized. Whatever the cause, bad health kept Faraday out of the laboratory until 1845; but as soon as he recovered, the possibilities for achieving lower temperatures by means of the Thilorier mixture induced him to return to liquefaction experiments. So enabled was Faraday by the Thilorier mixture that despite having otherwise primitive equipment—a hand pump to compress the gases, and a laboratory widely regarded as the worst then available to a serious experimenter in London—in a few months in 1845 he liquefied all the known gases except for six that resisted every effort, and which were dubbed the "permanent" gases: oxygen, nitrogen, hydrogen, nitric oxide, methane, and carbon monoxide.

The "permanent" gases were a significant scientific problem worthy of a strong attack by a scientist of Faraday's brilliance, but he seems to have decided in 1845 that he had exhausted the limits of his new tool, and he went no further in liquefaction, instead returning his attention to electricity and magnetism. Less brilliant researchers on the Continent took up the challenge. Recognizing that the weight of seawater produced high pressures, a French physicist named Aimé first compressed nitrogen and oxygen by ordinary means into metal cylinders, then lowered the cylinders into the ocean, to a depth of more than a mile. He recorded pressures of up to 200 atmospheres—about 3,000 pounds per square inch—but no liquefaction of the gases occurred. Johannes Natterer of Vienna, whom one historian calls an "otherwise undistinguished medical man," thought the problem of liquefaction basically simple: if Boyle's law held, all he needed to do was raise the amount of pressure on the gas, which should decrease its volume to the point of liquefaction. So he kept beefing up his apparatus until it was able to exert as much as 50,000 pounds of pressure on nitrogen gas. But even under such pressure, the gas would not liquefy.
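Natterer's reasoning rested on Boyle's law, which holds that at fixed temperature the pressure and volume of a gas vary inversely. A brief sketch (an editorial illustration, not material from the book) checks the unit conversion behind Aimé's "about 3,000 pounds per square inch" and shows the ideal-gas shrinkage Natterer expected pressure alone to produce:

```python
# Boyle's law (pV = constant at fixed temperature) predicts that volume
# shrinks without limit as pressure rises -- which is why Natterer expected
# enough pressure alone to liquefy nitrogen. Real gases break the law near
# liquefaction; this sketch only illustrates the ideal-gas expectation.

ATM_TO_PSI = 14.696  # standard atmospheres to pounds per square inch

def boyle_volume(v1, p1, p2):
    """Ideal-gas volume at pressure p2, given volume v1 at pressure p1."""
    return v1 * p1 / p2

# Aime's deep-sea cylinders reached about 200 atmospheres:
print(round(200 * ATM_TO_PSI))        # ~2939 psi, i.e. "about 3,000 psi"

# Under Boyle's law, gas at 1 atm compressed to 200 atm should occupy
# 1/200 of its original volume -- yet nitrogen stayed gaseous:
print(boyle_volume(1.0, 1.0, 200.0))  # 0.005
```

The failure of this prediction at extreme pressures was the first hint that something other than pressure, namely temperature, controlled whether a gas could be liquefied.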

Two abler researchers now addressed the problem, each from a different direction. One of the most astute scientists of the time, the Russian Dmitri Ivanovich Mendeleyev, compiler of the periodic table of atomic weights, started from the liquid state, trying to determine precisely at what temperature any liquefied gas could be again induced to become a vapor. That was a logical approach, but it did not prove fruitful. The opposite approach—to determine the conditions required to make a gas become a liquid—was adopted by a Scottish physician living in Belfast, Thomas Andrews.

The eldest son of a Belfast linen merchant, Andrews had a bent for chemistry, and at age seventeen, having exhausted the facilities for chemical study in Glasgow, he searched through several capitals in Europe for a laboratory to work in before making contact in Paris with a young chemist in the process of becoming a distinguished teacher, Jean-Baptiste Dumas. Andrews returned to Ireland in the late 1830s to study medicine and to teach chemistry. He practiced medicine for only a short time, then became absorbed in a series of experiments on the heat produced or consumed during chemical reactions. By 1849 he was the vice president of the new Queen's College in Belfast, its first professor of chemistry, and a Fellow of the Royal Society. He labored for five years on inconclusive experiments to determine the composition and density of ozone gas. But in the early 1860s these led him to his most important work, exploring in a systematic way what no one had adequately charted, the murky region in which gases and liquids transmuted into one another.

Andrews was not a highly innovative thinker—he missed some obvious things about ozone—but he picked the right substance to work with, carbon dioxide, known to change from gas to liquid under only moderate pressure. As he measured the volume of carbon dioxide gas while compressing it by various amounts of pressure and holding it at a constant temperature, he initially found that Boyle's law adequately predicted the lines on his graph. Those lines are called "isotherms" because they describe the relationship of pressure and volume along a line of a single, constant temperature. Until the time of van Marum, all investigators had found smooth isotherm lines. Van Marum had recorded a discontinuity—a change in the character of the line—at the point where pressure converted ammonia gas into a liquid, but he had not followed that experimental red flag by varying the temperature or by creating other isotherms along which to measure. Andrews did both. With better pressure equipment, he pushed carbon dioxide back and forth between gas and liquid, and he recorded precise measurements of its volume and composition at nearly a dozen different temperatures. He discovered that along the isotherms of an entire group of lower temperatures, certain combinations of volume and pressure kept gaseous and liquid carbon dioxide in equilibrium. Exploring further, Andrews found that so long as he kept carbon dioxide gas above its "critical temperature," no matter how much more pressure he applied, the gas would not liquefy.
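Andrews's critical-temperature finding can be made concrete with the van der Waals equation of state, which, as the chapter goes on to describe, van der Waals later built on Andrews's data. The sketch below is an editorial addition using standard textbook constants for carbon dioxide, not material from the book; it recovers Andrews's critical temperature of about 31°C and shows why isotherms above that temperature never exhibit a liquid state:

```python
# van der Waals equation for one mole: p = RT/(v - b) - a/v^2.
# The constants a and b are standard textbook values for carbon dioxide;
# this is an illustrative sketch, not the book's own calculation.

R = 0.08314   # gas constant, L·bar/(mol·K)
a = 3.640     # attraction constant for CO2, L^2·bar/mol^2
b = 0.04267   # excluded volume for CO2, L/mol

def vdw_pressure(T, v):
    """van der Waals pressure (bar) at temperature T (K) and molar volume v (L)."""
    return R * T / (v - b) - a / v**2

# The constants fix a critical temperature: Tc = 8a / (27Rb).
Tc = 8 * a / (27 * R * b)
print(round(Tc, 1))   # about 304 K, i.e. roughly 31 C -- Andrews's value for CO2

# Sample two isotherms across a range of molar volumes.
vols = [0.08 + 0.01 * i for i in range(40)]
above = [vdw_pressure(320.0, v) for v in vols]   # above Tc
below = [vdw_pressure(280.0, v) for v in vols]   # below Tc

# Above Tc, pressure falls steadily as volume grows: no liquefaction,
# no matter how hard the gas is squeezed.
monotonic = all(p1 > p2 for p1, p2 in zip(above, above[1:]))
print(monotonic)  # True

# Below Tc the isotherm develops a loop -- the region where Andrews found
# liquid and gaseous carbon dioxide coexisting in equilibrium.
has_loop = not all(p1 > p2 for p1, p2 in zip(below, below[1:]))
print(has_loop)   # True
```

The qualitative shape of these two curves is exactly the discontinuity Andrews measured: smooth Boyle-like isotherms above the critical temperature, and a flat coexistence region below it.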

This was quite a revelation, and it cried out for Andrews to deduce a new law from it: while at relatively high temperatures Boyle's law adequately described the inverse relationship between volume and pressure, below the critical temperature it did not, and some other law must apply—one Andrews initially shrank from trying to formulate. He was willing to make only the generalization "that there exists for every liquid a temperature at which no amount of pressure is sufficient to retain it in the liquid form." Later, becoming more daring, he articulated an idea that boggled other minds along with his own: gases and liquids were not independent and fixed forms of matter; rather, each substance (like carbon dioxide) existed along a continuum, and under certain pressure and temperature conditions it could become a liquid, a gas, or a solid.

Scientists had been inching up on this realization for the better part of a century, but before Andrews it had not been boldly stated, nor had experimental evidence been produced to support it. Even Andrews did not comprehensively formulate the notion, leaving unanswered many questions about the states of matter. But he did realize the implications of his work for the further exploration of the cold. Andrews predicted, "We may yet live to see, or at least we may feel with some confidence that those who come after us will see, such bodies as hydrogen and oxygen in the liquid, perhaps even in the solid state."

Andrews's continuum idea was greeted with some interest by his fellow British scientists, but with intense excitement by the Dutch physicist Johannes Diderik van der Waals. The son of a carpenter, van der Waals had struggled through his twenties to find a vocation; he became a grade-school teacher, then a high-school teacher, then a headmaster in The Hague. These positions did not satisfy him, and while he continued his supervisory work, he also studied physics at Leiden in the late 1860s. It was there that he encountered the work of Andrews and of van Marum, and of Boyle before them.
