The Locavore's Dilemma

Authors: Pierre Desrochers

Overspecialization and Food Security
In the first edition of Thomas Robert Malthus's Essay on the Principle of Population, as It Affects the Future Improvement of Society (1798),[39] the English clergyman and economist sought to debunk the view that standards of living could be constantly improved through scientific and technological advances. Although he later adopted a more nuanced stance, his general argument is now widely understood to be that human populations have a tendency to grow more rapidly (geometrically, or exponentially) than their means of subsistence (arithmetically, or linearly). Population growth will therefore inexorably outstrip food and other supplies, resulting in ever-increasing misery, famine, disease, war, environmental destruction, and population crashes.[40]
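Malthus's contrast between the two growth paths can be stated schematically. The notation below is ours, not his, but the 25-year doubling period is the one he used in the Essay:

\[
P_t = P_0 \cdot 2^{t/25} \quad \text{(geometric)}, \qquad S_t = S_0 + c\,t \quad \text{(arithmetic)}.
\]

Over successive 25-year periods, population thus runs 1, 2, 4, 8, 16, ... while subsistence runs 1, 2, 3, 4, 5, ...; since the ratio of the first series to the second grows without bound, Malthus reasoned, the gap between mouths and food must eventually be closed by famine, disease, and war.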
Commenting on Malthus's dire predictions, the British economist Kenneth Smith observed in 1952 that the clergyman's forecasts about the fate of the English and Welsh populations (he considered both regions full and on the verge of collapse at a population of approximately 10 million individuals; approximately six times as many people live in England and Wales today) had failed to materialize "because the development of overseas territories opened up an enormous source of additional food." He added that individuals "are not compelled to subsist on the supplies grown in the area where they live; in a trading community it is only the total which must suffice" and that just as "townspeople live on the products of the countryside, so do industrialized nations draw their supplies from more primitive countries which specialize in the production of raw materials and foodstuff." Englishmen and Welshmen were thus "able to draw on the whole of the world."[41]
Although Smith's core economic insight was eminently sensible, his insistence that "primitive" regions should specialize in the production of a particular foodstuff has long been challenged on the grounds that a sudden drop in demand for a local specialty item (whether because of new competitors, changing consumer tastes, or the development of better substitutes) or an epidemic disease of massive proportions will rapidly make it impossible for laid-off workers to purchase the food imports on which they have come to rely. The concern is valid, but the problem is not unique to agriculture. In a time of rapid technological change, all of us invest in personal skills that are likely to become obsolete during our lifetimes. Similarly, as the economist Alfred Marshall observed more than a century ago, "a[n industrial] district which is dependent chiefly on one industry is liable to extreme depression, in case of a falling-off in the demand for its produce, or of a failure in the supply of the raw material which it uses."[42]
Fisheries can collapse. The remains of once prosperous mining communities litter the American landscape. The real issue is therefore not whether a poorly diversified economic base is undesirable, but rather whether specialized agricultural regions should revert to greater self-sufficiency and subsistence economies to prevent economic downturns. The answer is an unequivocal no, but again, one needs to look at the bigger picture to appreciate why.
First, while virtually all commercial products and economic skills will eventually sink into obsolescence, they are still worth producing as long as there is a market for them. Phonographs, vinyl discs, and tape cassettes are now at best collectors' items, but they created ample employment in earlier times by providing consumers a greater range of musical experiences than would otherwise have been the case. Should investors have refrained from putting their capital into this line of work? Should employees have refrained from acquiring once-valuable skills? Obviously not. Similarly, profitable monocultures should be pursued as long as they remain viable in a particular location. If they are suddenly no longer worth pursuing, alternative crops can often be grown in their place. In the 19th century, coffee production in what is now Sri Lanka was wiped out by a fungal infection, yet the local economy nonetheless forged ahead as tea, rubber (through rubber tree plantations), and coconut production proved profitable.[43] At the turn of the 20th century, Pierce's disease was one of two factors that doomed the wine-making industry of Southern California (the other being the poor quality of the product), but citrus fruits better suited to the local climate more than made up for the loss.
But, critics readily object, what about people being thrown out of work now, when no viable substitutes have been found? The short answer is that agricultural workers are ultimately no different from the former employees of horse-drawn carriage manufacturers. Old jobs need to be terminated so that resources can be redeployed to better uses, which will in turn create more and better jobs. Besides, skills developed in one context can often be used in another. Well then, activists typically add, what about the fate of local communities? Shouldn't people have a right to live where they want and where they belong? As we see things, humans gave up that "right" the moment they left Africa a very long time ago. True, having to leave one's rural community in search of better opportunities elsewhere might not be everybody's wish, but it sure beats the traditional "starving in fresh air" fate of subsistence farmers. Besides, the fact that humans are not plants and can escape from droughts, floods, and other natural and economic calamities should be viewed as a blessing, not a curse. It could also be pointed out that the food price spike of 2008 was somewhat softened by record remittances dispatched by migrants to their countries of origin, which totaled close to $340 billion, roughly a 40% increase over the $240 billion sent in 2007.[44]
In the end, too, abandoned agricultural lands quickly revert to a more “natural” state, a fact that should please environmental activists.
Like financial investors, producers in a monoculture region can reduce the risk of economic collapse by diversifying their "economic portfolio." More than a century ago, Alfred Marshall observed that the economic meltdown of mono-industrial districts could "in a great measure [be] avoided by those large towns or large industrial districts in which several distinct industries are strongly developed." Regional diversification, however, doesn't imply giving up on specialization, but rather developing multiple profitable specializations. Unfortunately, mainstream economists have long been in the habit of discussing the benefits of the geographical division of labor using the simple model of two countries, each only able to manufacture two different commodities. Basic economic reasoning then leads to the conclusion that each region or nation should specialize in the production of only one good, but this result is for all intents and purposes built into the model's unrealistic assumptions. Why economists persist in using this example is something we have never quite understood, given the realities of vast geographical entities made up of diverse landscapes and millions of people with different abilities. What ultimately matters is that individuals with different aptitudes and interests living in specific places specialize and trade with other individuals, in the process profitably concentrating on all kinds of endeavors and making abstract "entities" such as cities, regions, and nations more rather than less diverse over time.[45]
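The two-good, two-country model criticized above is easy to reproduce. Below are the labor costs (hours of labor per unit of output) from Ricardo's classic England-Portugal illustration; the numbers are Ricardo's, the presentation is ours:

\[
\begin{array}{l|cc}
 & \text{Cloth} & \text{Wine} \\ \hline
\text{England} & 100 & 120 \\
\text{Portugal} & 90 & 80
\end{array}
\]

Portugal is more productive in both goods, but producing a unit of wine costs it only 80/90 ≈ 0.89 units of forgone cloth, against 120/100 = 1.2 units in England, so each country gains by specializing completely, Portugal in wine and England in cloth. With only two goods, constant costs, and no other industries in the model, complete specialization in a single product is the only possible outcome, which is precisely the sense in which the conclusion is built into the assumptions.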
Many diverse cities are found throughout the American corn belt, yet corn producers in this area remain highly specialized. Should things go wrong with corn farming, local producers could find other lines of employment in their region, although this might entail a long commute or relocation to another city or town. The key to improving food security in the long run is to ensure that as many resources as possible are invested in the development of the profitable activities of tomorrow rather than squandered in a vain attempt to cling to the industries of yesterday. As long as new lines of work are developed and people are free to move, the fate of agricultural workers in declining monoculture regions and towns will be good by any historical standard, and certainly better than in a world shaped by locavore ideals.
One more way to convey this point is to look at the circumstances of the inhabitants of regions that were once agriculturally diversified and regularly subjected to hunger and famines, but which later became large-scale monocultures and practically famine-proof. Because almost all monoculture regions in advanced economies would qualify, we will limit ourselves to the case of a few square miles in the so-called "American Bottom," the approximately 175-square-mile Mississippi flood plain east of Saint Louis in southern Illinois. This area, once the home of the largest Native American settlement north of what is now Mexico, includes a six-square-mile complex of man-made earthen mounds, the Cahokia Mounds, the largest earthwork complex built in pre-Columbian times.
At its peak in the 13th century, Cahokia might have had a population of as many as 40,000 people (a high-end estimate that would have made it larger than London at the time), a figure that would only be exceeded in the United States by Philadelphia at the turn of the 19th century. As with all sizeable urban settlements in history, evidence has been found of goods that were brought in from long distances, in this case from as far away as the Gulf Coast (shells), Lake Superior (copper), Oklahoma (chert, a flintlike rock), and the Carolinas (mica). The local inhabitants grew corn, squash, goosefoot, amaranth, canary grass and other starchy seeds, beans, gourds, pumpkins, and sunflowers, which they supplemented with wild fruits and nuts, fish, waterfowl, and a few game animals. Despite corn storage in granaries, though, the Cahokians were subjected to recurring hunger and famine.[46]
By contrast, today the main mounds are part of the city of Collinsville, Illinois, which is not only home to the world's largest ketchup bottle but also the self-described "world's horseradish capital." Even if America's (and the world's) fondness for horseradish were somehow to fade away, the area's agricultural labor force could find work elsewhere or help local horseradish producers switch to other crops. As such, these workers are much more food secure than the site's ancient inhabitants.
As the above cases illustrate, monocultures can be a serious threat to food security only in the absence of broader economic development, scientific and technological advances, trade, and labor mobility.
The Irish potato famine, the standard case used by opponents of monocultures, is also a more telling illustration in this respect than most imagine. Without getting into too many details of a complex and still controversial story,[47] a key feature of the Irish famine of the 1840s is that it was not the result of a uniquely "Irish" disease, but rather of a problem that was then rampant in North America and Continental Europe. A little-appreciated feature of the Irish economy at the time is that it was home to a thriving export-oriented food sector that had long shipped goods such as dairy products, grains, livestock, fish, and potatoes to the rest of Europe and various American colonies.[48]
Not surprisingly, the best lands were devoted to the most lucrative products, while a rudimentary form of potato cultivation was concentrated on less fertile soils; the potato had displaced oats as the main staple of the poor because it delivered much higher yields. Potatoes, rich in vitamin C, also proved a fitting complement to then-abundant dairy products rich in vitamins A and D. As a result, despite many local and partial potato harvest failures and significant famines in 1799 and 1816, the Irish population grew faster than that of any other European country, from around 2 million people in 1750 to about 8.2 million in 1845.
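As a back-of-the-envelope check on these figures, the implied average growth rate over the 95 years separating them is

\[
\left(\frac{8.2}{2.0}\right)^{1/95} - 1 \approx 0.015,
\]

that is, roughly 1.5% per year sustained for nearly a century, a remarkable pace for a largely pre-industrial population.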
The downside of this demographic boom, however, was that on the eve of the Great Famine about a third of the population depended on potatoes for most of their food intake, and that relatively few potato varieties had been introduced from the Americas. As serious disease problems began to emerge in Western Europe and North America in the late 18th century, new South American varieties were introduced in an attempt to increase resistance. Unfortunately, they probably brought in or heightened vulnerability to the so-called late blight of potato, caused by the oomycete (a fungus-like microorganism) Phytophthora infestans, which attacked both tubers and foliage.
The disease that would forever be associated with Ireland actually first showed up in central Mexico and then reached Pennsylvania in 1843, from where it swept across an area stretching from Illinois and Virginia to Nova Scotia and Ontario over the next three years. It probably entered Belgium in 1845 through a shipment of American seed potatoes and soon ravaged potato fields all the way to Russia. As if things were not dire enough, below-average wheat and rye crops also plagued Europe at the time, giving rise to the moniker "the Hungry Forties." In Ireland, the disease destroyed a third of the potato crop in 1845 and most of the harvest in 1846 and 1848. The resulting loss of foodstuffs was of such magnitude (approximately 20 million tons for human consumption alone) that banning ongoing Irish grain and livestock exports, a measure requested by many at the time, would have made up at most one-seventh of it. Nearly a million people died in total, the majority from hunger-related diseases such as typhus, dysentery, and typhoid rather than from outright starvation. Another notable fact is that the areas that specialized in livestock and cereal production were largely unaffected by the famine.
In Continental Europe, the potato blight resulted in perhaps 100,000 deaths, while, to our knowledge, no specific death toll was recorded in North America. Large as the European number was, it was both absolutely and proportionately only a small fraction of Ireland's, despite the fact that many poor Western Europeans were also heavily dependent on potatoes for their sustenance. Much evidence suggests that the key difference between Ireland and Western Europe at the time was that the latter offered more employment and food options to its inhabitants, such as artisanal or cottage production, sales work in local markets, or part-time work in various industries. Many relatively poor Europeans were thus able to purchase other food commodities that were by then no longer available to the most impoverished segments of the Irish population. Many individuals whose nutritional intake was also heavily dependent on potatoes moved permanently to industrializing areas, for example from the Scottish Highlands to the Scottish Lowlands. By and large, however, the potato blight did not result in massive emigration from Western Europe. In New England, farmers gave up on potatoes (often the only crop that grew in their poor soil), culled their cattle and swine for want of feed, and moved away to rich grain (wheat and corn) lands further west, or else found manufacturing or other employment in the then rapidly developing American industrial belt.
