Sex at Dawn: The Prehistoric Origins of Modern Sexuality
Authors: Christopher Ryan, Cacilda Jethá
Tags: #Non-Fiction, #Sociology, #Psychology, #Science, #Social Science, #Psychology & Psychiatry, #History
Once our ancestors began cultivating land for food, they were running on a wheel, but never fast enough. More land provides more food. And more food means more children born and fed. More children provide more help on the farm and more soldiers. But this population growth creates demand for more land, which can be won and held only through conquest and war. Put another way, the shift to agriculture was accelerated by the seemingly irrefutable belief that it’s better to take strangers’ land (killing them if necessary) than to allow one’s own children to die of starvation.
Closer to our own time, the BBC reports that as many as 15 percent of the reported deaths of female infants in parts of southern India are due to infanticide. Millions more die in China, where female infanticide is prevalent and has been for centuries. A late nineteenth-century missionary living in China reported that of 183 sons and 175 daughters born in a typical community, 126 of the sons lived to age ten (69 percent), while only 53 of the daughters made it that far (30 percent).7 China’s one-child policy, combined with the cultural preference for sons, has only worsened the already dismal odds of survival for female infants.8
There are also problematic cultural assumptions lurking within demographers’ calculations, in which life is assumed to begin at birth. This view is far from universal. Societies that practice infanticide don’t consider newborn infants full human beings. Rituals ranging from baptism to naming ceremonies are delayed until it is determined whether or not the child will be permitted to live. If not, from this perspective, the child was never fully alive anyway.9
Is 80 the New 30?
Cartoon in The New Yorker: Two cavemen are shown chatting, one of whom is saying, “Something’s just not right—our air is clean, our water is pure, we all get plenty of exercise, everything we eat is organic and free-range, and yet nobody lives past thirty.”
Statistical distortions due to infanticide are not the only source of confusion concerning prehistoric longevity. As you might imagine, it’s not so easy to determine the age at death of a skeleton that’s been in the ground for thousands of years.
For various technical reasons, archaeologists often underestimate the age at death. For example, archaeologists estimated the ages at death of skeletons taken from mission cemeteries in California. After the estimates had been made, written records of the actual ages at death were discovered.
While the archaeologists had estimated that only about 5 percent had lived to age forty-five or beyond, the documents proved that seven times that many (37 percent) of the people buried in these cemeteries were over forty-five years of age when they died.10 If estimates can be so far off on skeletons just a few hundred years old, imagine the inaccuracies with remains that are tens of thousands of years old.
One of the most reliable techniques archaeologists use to estimate age at death is dental eruption. They look at how far the molars have grown out of the jawbone, which indicates roughly how old a young adult was at death. But our “wisdom teeth” stop “erupting” in our early to mid-thirties, which means that archaeologists note the age at death of skeletons beyond this point as “35+.” This doesn’t mean that thirty-five was the age of death, but that the person was thirty-five or older. He or she may have been anywhere from thirty-five to one hundred years old. Nobody knows.
Somewhere along the line this notation system was mistranslated in the popular press, leaving the impression that our ancient ancestors rarely made it past thirty-five. Big mistake. A wide range of data sources (including, even, the Old Testament) point to a typical human life span of anywhere from seventy (“three score and ten”) to over ninety years.
In one study, scientists calibrated brain and body-weight ratios across different primates, arriving at an estimate of sixty-six to seventy-eight years for Homo sapiens.11
These numbers bear up under observation of modern-day foragers.
Among the !Kung San, Hadza, and Aché (societies in Africa and South America), a female who lived to forty-five could be expected to survive another 20, 21.3, and 22.1 years, respectively.12 Among the !Kung San, most people who reached sixty could reasonably expect to live another ten years or so—active years of mobility and social contribution.
Anthropologist Richard Lee reported that one in ten of the !Kung he encountered in his time in Botswana were over sixty years of age.13
As mentioned in previous chapters, it’s clear that overall human health (including longevity) took a severe hit from agriculture. The typical human diet went from extreme variety and nutritional richness to just a few types of grain, possibly supplemented by occasional meat and dairy. The Aché diet, for example, includes 78 different species of mammal, 21 species of reptiles and amphibians, more than 150 species of birds, and 14 species of fish, as well as a wide range of plants.14
In addition to the reduced nutritional value of the agricultural diet, the diseases deadliest to our species began their dreadful rampage when human populations turned to agriculture.
Conditions were perfect: high-density population centers stewing in their own filth, domesticated animals in close proximity (adding their excrement, viruses, and parasites to the mix), and extended trade routes facilitating the movement of contagious pathogens from populations with immunity to vulnerable communities.15
When James Larrick and his colleagues studied the still relatively isolated Waorani Indians of Ecuador, they found no evidence of hypertension, heart disease, or cancer. No anemia or common cold. No internal parasites. No sign of previous exposure to polio, pneumonia, smallpox, chicken pox, typhus, typhoid, syphilis, tuberculosis, malaria, or serum hepatitis.16
This is not as surprising as it may seem, given that almost all these diseases either originated in domesticated animals or depend upon high-density population for easy transmission.
The deadliest infectious diseases and parasites that have plagued our species could not have spread until after the transition to agriculture.
Table 3: Deadly diseases from domesticated animals17

HUMAN DISEASE         ANIMAL SOURCE
Measles               Cattle (rinderpest)
Tuberculosis          Cattle
Smallpox              Cattle (cowpox)
Influenza             Pigs and birds
Pertussis             Pigs and dogs
Falciparum malaria    Birds
The dramatic increases in world population that paralleled agricultural development don’t indicate increased health, but increased fertility: more people living to reproduce, but lower quality of life for those who do. Even Edgerton, who repeatedly tells the longevity lie (Foragers’ “lives are short—life expectancy at birth ranges between 20 and 40 years …”), has to agree that, somehow, foragers managed to be healthier than agriculturalists: “Agriculturalists throughout the world were always less healthy than hunters and gatherers.” The urban populations of Europe, he writes, “did not match the longevity of hunter-gatherers until the mid-nineteenth or even twentieth century.”18
That’s in Europe. People living in Africa, most of Asia, and Latin America have still not regained the longevity typical of their ancestors and, thanks to chronic world poverty, global warming, and AIDS, it’s unlikely they will for the foreseeable future.
Once pathogens mutate into human populations from domesticated animals, they quickly migrate from one community to another. For these agents of disease, the initiation of global trade was a boon. Bubonic plague took the Silk Route to Europe. Smallpox and measles stowed away on ships headed for the New World, while syphilis appears to have hitched a ride back across the Atlantic, probably on Columbus’s first return voyage. Today, the Western world flutters into annual panics over avian flu scares emanating from the Far East. Ebola, SARS, flesh-eating bacteria, the H1N1 virus (swine flu), and innumerable pathogens yet to be named keep us all compulsively washing our hands.
While there were no doubt occasional outbreaks of infectious diseases in prehistory, it’s unlikely they spread far, even with high levels of sexual promiscuity. It would have been nearly impossible for pathogens to take hold in widely dispersed groups of foragers with infrequent contact between groups.
The conditions necessary for devastating epidemics or pandemics just didn’t exist until the agricultural revolution.
The claim that modern medicine and sanitation save us from infectious diseases that ravaged pre-agricultural people (something we hear often) is like arguing that seat belts and air bags protect us from car crashes that were fatal to our prehistoric ancestors.
Stressed to Death
If an infectious virus doesn’t get you, a stressed-out lifestyle and high-fat diet probably will. Cortisol, the hormone your body releases when under stress, is the strongest immunosuppressant known. In other words, nothing weakens our defenses against disease quite like stress.
Even something as seemingly unimportant as not getting enough sleep can have a dramatic effect on immunity.
Sheldon Cohen and his colleagues studied the sleep habits of 153 healthy men and women for two weeks before putting them in quarantine and exposing them to rhinovirus, which causes the common cold. The less an individual slept, the more likely he or she was to come down with a cold. Those who slept less than seven hours per night were three times as likely to get sick.19
If you want to live long, sleep more and eat less. To date, the only demonstrably effective method for prolonging mammalian life is severe caloric reduction. When pathologist Roy Walford fed mice about half of what they wanted to eat, they lived about twice as long—the equivalent of 160 human years. They not only lived longer, but stayed fitter and smarter as well (as judged by—you guessed it—running through mazes). Follow-up studies on insects, dogs, monkeys, and humans have confirmed the benefits of going through life hungry. Intermittent fasting was associated with more than a 40 percent reduction in heart disease risk in a study of 448 people published in the American Journal of Cardiology, which reported that “most diseases, including cancer, diabetes and even neurodegenerative illnesses, are forestalled” by caloric reduction.20
These studies lead to the slacker-friendly conclusion that in the ancestral environment, where our predecessors lived hand-to-mouth, a certain amount of dietary inconsistency—perhaps exacerbated by sheer laziness interrupted by regular aerobic exercise—would have been adaptive, even healthy. To put it another way, if you hunt or gather just enough low-fat food to forestall serious hunger pangs, and spend the rest of your time in low-stress activities such as telling stories by the fire, taking extended hammock-embraced naps, and playing with children, you’d be engaged in the optimal lifestyle for human longevity.21
Which brings us back to the eternal question asked by foragers offered the chance to join the “civilized” world and adopt farming: Why? Why work so hard when there are so many mongongo nuts in the world? Why stress over weeding the garden when there are “plenty fish, plenty fruits, and plenty birdies”?
We are here on Earth to fart around, and don’t let anybody tell you any different.
KURT VONNEGUT, JR.
In 1902 the New York Times carried a report headlined “Laziness Germ Discovered.” It seems one Dr. Stiles, a zoologist at the Department of Agriculture, had discovered the germ responsible for “degenerates known as crackers or poor whites” in the “Southern States.” But in fact, our laziness seems less in need of explanation than our frenzied industry.
How many beavers die in dam-construction accidents? Are birds subject to sudden spells of vertigo that send them falling from the sky? How many fish drown? Such events are all rather infrequent, we’d wager, but the toll exacted upon humans by the chronic stress many consider a normal part of human life is massive.
In Japan, there’s a word for it, karōshi: death from overwork.
Japanese police records indicate that as many as 2,200 Japanese workers committed suicide in 2008 due to overwhelming work conditions, and five times that number died from stress-induced strokes and heart attacks, according to Rengo, a labor union federation. But whether our language contains a handy term for it or not, the devastating effects of chronic stress are not limited to Japan. Heart disease, circulatory problems, digestive disorders, insomnia, depression, sexual dysfunction, and obesity—behind every one of them lurks chronic stress.