100 Million Years of Food

Although many changes in diet and lifestyle were beneficial on the surface, rapid changes sometimes carried grave unintended consequences. As we have seen, our ancestors had been undergoing changes in diet and environment for millions of years. Our diets shifted from insects to fruits, meats, and agricultural products like wheat, rice, potatoes, and corn, then added milk and alcohol. Viewed against this backdrop of gradual transformation, the last thousand years of human history have been a storm of disruption driven by technological and scientific breakthroughs. Because biological evolution requires dozens or hundreds of generations to adapt organisms to new environments and foods, our bodies could not keep pace, and new afflictions began to appear. Some of these could be contained and defeated, but other outbreaks took their place, gathered force, and tore through populations with brutal ferocity.

As recounted in an excellent study by Kenneth John Carpenter, beginning in the seventh century, observers in densely populated areas of East and Southeast Asia periodically described an ominous constellation of symptoms including tremors, numbness, difficulty in walking, swelling of the limbs, wasting, and pervasive weakness; when the heart began to race, death was usually not far away. The patterns of the disease befuddled observers. It was prevalent in southern China but not northern China. Unlike cholera, which vaulted into Japan via seaports linked to China, this disease was not contagious; people who moved from afflicted regions did not carry the disease with them. Japanese doctors tried acupuncture and raising blisters along the spine with heated cylinders but to no avail. Western doctors theorized that noxious airs were the root cause of the affliction, but the disease was common on ships manned by Asian crews. In the 1870s, the rage in medicine was bacteria. Louis Pasteur successfully treated cholera and anthrax through manipulating bacteria; Robert Koch found the bacterial agent responsible for tuberculosis. Perhaps the disease afflicting East and Southeast Asia was another bacterial epidemic? However, experiments on chickens failed to reveal any bacterial agent.

People who ate barley did not seem to be affected, which focused attention on the possible role of a diet of rice. This led to another puzzle: Poor-quality rice did not seem to increase the risk of acquiring the disease; if anything, those who were privileged to eat better-tasting rice were more susceptible. Adding to the confusion, the disease, known in the Dutch Indies as beriberi, would also appear in areas of the world where rice was not eaten, such as Brazil and Canada.

Two surgeons, Japanese and Dutch, working with different navies, found that the disease could be greatly alleviated by adding sources of protein to the diet.
2
Although this was a relief for the navy men, rats fed on protein-adequate but otherwise nutritionally deficient diets failed to thrive, which disproved the protein hypothesis. Further experiments revealed that apart from protein, fat, and carbohydrates, rats required two additional substances for survival: a fat-soluble “vitamin A,” which could be obtained from cod liver oil and butter, and a “vitamin B,” which could be obtained from yeast, wheat germ, and nonfat milk powder. Further work isolated a vitamin B2 complex, and a vitamin B1, later named thiamine, which was able to prevent beriberi-like symptoms from appearing in rats and chickens. Purified crystals of thiamine were effective at restoring the health of rats and chickens even when administered in microscopic quantities.

Thiamine, it turns out, is present in the bran of rice and is removed when rice is milled, heated to very high temperatures, or boiled and then rinsed. Milling boosted the storage life of rice and increased its palatability but inadvertently stripped the thiamine-rich bran and germ from the diet. By introducing steam-powered milling to Asia, the colonial powers greatly increased the misery wrought by beriberi, though the Chinese and Japanese also used rice mills and were beset with the disease. (Diets of cassava in Brazil and of bread made from white flour and baking powder in isolated ports of Newfoundland also lacked thiamine and hence led to outbreaks of beriberi in those regions.)

There were several relatively simple ways of resolving the thiamine deficiency. The first method was to parboil rice, a traditional way of cooking rice in parts of South Asia that involves soaking it and then boiling it in its husk. This aids in removal of the husk but also helps the germ to retain nutrients from the husk, including thiamine. During the epidemic of beriberi in Asia, populations that ate parboiled rice were not affected. However, parboiled rice had a musty smell and a yellow-brown tinge and was not as fluffy as white rice, which made it unacceptable to East Asian populations.

The second traditional way of preparing rice was to pound or stamp upon rice kernels, then employ sifting or other means to remove the husks. Because this method only incompletely removes the husk, the rice still retains thiamine in the “silver skin” surrounding the kernel. However, people used to eating white rice also rejected hand-milled rice as unpalatable, defeating the efforts of public health officials. A third method was to combine thiamine-rich beans with white rice, a practice still carried on in parts of Asia. Eventually, the disease was resolved definitively by adding thiamine directly to polished rice, but not before beriberi had inflicted much misery upon populations.
3

*   *   *

While beriberi was ravaging urban areas in East Asia, physicians in Europe were coming across a novel disease whose symptoms included blisters, loss of appetite, depression, and a preoccupation with suicide. The disease was dubbed “pellagra,” a term denoting rough skin in the Lombard dialect of Italian. Pellagra's pattern was the reverse of beriberi's: it tended to afflict the poor rather than the rich, and its geography was inverted as well, with most cases recorded in Europe while beriberi primarily struck East and Southeast Asia.

American doctors probably spotted the first cases of pellagra in the nineteenth century, but since the disease was believed to be nonexistent on their side of the Atlantic, they refrained from announcing their observations. In 1902, a physician in Atlanta identified the disease in an impoverished farmer. The visibility of pellagra rapidly expanded. In 1906, there were eighty-eight cases of pellagra at Mount Vernon Hospital for the Colored Insane in Alabama. Eighty of these patients were female, and more than half died. Mysteriously, none of the nurses at the hospital contracted the disease. Other mental institutions reported outbreaks, and the epidemic spread as far west as Illinois. By 1912, around twenty-five thousand cases had been diagnosed, with four in ten victims dying. As with beriberi, expert opinion initially focused on microbial agents. Some people believed that pellagra arose from eating spoiled, moldy corn, and thus several states enacted laws to inspect corn. Pellagra was also thought to be infectious, and so pellagra victims, who invariably came from the most economically disadvantaged quarters, were shunned like lepers and denied access to hospitals.
4

In February 1914, the U.S. surgeon general invited a talented Jewish Hungarian American epidemiologist, Dr. Joseph Goldberger, to take over the Public Health Service's faltering pellagra investigations. By the time of his appointment, when he was forty years old, he had made a name for himself studying—and surviving—epidemic diseases. Shortly after launching his investigations, Dr. Goldberger surmised that the disease was not communicable, since health workers who were in close association with pellagra victims never acquired the disease. A more likely explanation lay in the classic “Three M's” diet of poor southerners: meat (fatty pork), molasses, and meal (cornmeal). Orphans and mental institution patients who ate monotonous meals along the Three M's pattern got pellagra, but workers at the same institutions who had access to more varied fare avoided the disease.

Dr. Goldberger carried out an experiment with volunteers at a Mississippi prison, who were offered pardon by the governor as a condition for their participation. Over the course of six months, more than half the volunteers who were fed a diet based on cornbread and cornstarch developed skin lesions (starting with the genitalia), while the remaining subjects developed less striking but still noticeable manifestations of the same disease. Although the experiment was carried out with great meticulousness by Dr. Goldberger, both he and the Mississippi governor were roundly criticized for its unorthodox nature. Moreover, not only did the results run contrary to the conventional line that pellagra was infectious, the nutritional hypothesis also drew attention to the poverty of the South, which provoked the ire of proud southern politicians and patriots.
5

Dr. Goldberger continued to try to convince critics that pellagra was noncommunicable, even going to the extraordinary extent of injecting himself, his wife, and colleagues with blood from pellagra victims, and swallowing skin scales, feces, and dried urine from pellagra sufferers, wrapped up in dough. Consuming this concoction yielded nausea and diarrhea, but no pellagra. Unable to sway his critics, but convinced by his own observations and efforts that amino acid deficiency rather than spoiled corn was key, Dr. Goldberger then tried to locate the missing amino acid. He died of renal cancer in 1929 before he could complete his life's mission.

In the end, it turned out that his hypothesis was correct: Corn was found to be deficient in tryptophan, which the human body can metabolize into niacin (also known as vitamin B3). By the 1940s, fortification of foods with vitamin B3 eliminated pellagra as a menace to poor Americans, though not before some 3 million had suffered from the disease, resulting in approximately 100,000 deaths. In Italy, pellagra cases peaked in the late nineteenth century among poor rural peasants in the north confined to monotonous diets of corn, then faded as economic conditions improved through emigration (which raised local wages and brought remittances from emigrant workers), industrialization, improvement of crop yields, and falling prices for wheat (which was substituted for niacin-deficient corn). Pellagra disappeared from Italy by the 1930s.
6

Industrial milling of American corn began in the early 1900s, which stripped the corn germ and thereby boosted the shelf life of processed corn. Unfortunately, the germ is also where niacin resides. During the epidemic, rates of pellagra were highest in areas adjacent to railroads, where people had ready access to stores and industrially milled cornmeal. In rural areas, people relied instead on traditional processing techniques, such as water-driven stone-milling, which preserved more of the corn germ and reduced the risk of pellagra. Indigenous groups in the Americas who domesticated corn over hundreds to thousands of years knew how to prepare their sacred crop for safe consumption. Through trial and error, and copying neighbors, tribes that relied heavily on corn learned to cook it with an alkaline substance such as lime or wood ashes, which helped to increase the availability of tryptophan and niacin in corn and thus helped them avoid pellagra. Another method of preserving niacin, practiced by the Tohono O'odham and the Hopi Indians, was to roast immature corn, which contains higher concentrations of niacin than mature corn.
7

*   *   *

While beriberi was crippling swaths of East Asia and pellagra was picking up steam in southern Europe, a different disease was ravaging northern European cities. In 1634, fourteen deaths in England were attributed to a condition in which children were left with a deformed spine and chest and crooked arms and legs. The disease had shown up in the Balkans in 9000 BC, in early Egypt, and in China around 300 BC, but it finally developed into a full-scale epidemic in industrialized urban European locales in the eighteenth century. Nor was this just a disease of children, as elderly women in northern European and North American cities and towns suffered from high rates of bone fractures.
8

Since prevailing medical theory centered on “humours,” the condition known as rickets was blamed on cold distemper. Herring, a rich potential source of vitamin D, was banned as “cold” food. Peasants found their own cure for the disease by consuming raven livers (livers are key organs in vitamin D metabolism). Fishermen around northern Europe had been taking fish liver as a household remedy for centuries. Swallowing cod liver oil was another matter, for it was prepared by allowing livers to spoil until the oil could be skimmed from the surface. The stench, understandably, was nauseating.
9

As medical practitioners continued to debate the merits of cod liver oil, sunlight, bloodletting, bone breaking and resetting, and racks and slings designed to stretch out children's bodies, rickets accompanied settlers migrating to the New World. Between 1910 and 1961, 13,807 deaths in the United States were officially attributed to rickets, mostly in infants less than a year old.
10
Dark-skinned children were especially vulnerable, particularly in northern cities. Finally, between 1919 and 1922, a series of experiments conducted by researchers in Vienna verified the efficacy of cod liver oil and sunlight in preventing and treating rickets, and thereafter supplementation with cod liver oil, vitamin-D-fortified milk, and the use of sunlight gradually eradicated rickets. However, cases continue to occur even to the present day.

*   *   *

Although the scourges of beriberi, pellagra, and rickets are largely behind us, there are important lessons to learn from the history of these diseases. In each case, a major rethinking—a paradigm shift—was required before progress could be made. In the case of rickets, the old theory of humours made European medical experts skeptical that herring, a “cold” food, could be beneficial, even though we now recognize that herring is a rich source of vitamin D and would have helped to alleviate rickets—a much better cure than the racks that were used to stretch out the deformed bodies of afflicted children. In the case of beriberi and pellagra, medical opinion clung to the notion that these diseases were caused by infectious germs, delaying the search for the underlying nutritional deficiencies.
