Paleofantasy: What Evolution Really Tells Us about Sex, Diet, and How We Live
Author: Marlene Zuk
Natural selection thus produces a bottleneck, through which only the individuals with the genes necessary for survival can pass. The problem is, that bottle also squeezes out a lot of other genetic variation along with the genes for susceptibility to the insecticide or the ailment. Suppose that genes for eye color or heat tolerance or musical ability happen to be located near the susceptibility genes on the chromosome. During the production of sex cells, as the chromosomes line up and the sperm and egg cells each get their share of reshuffled genes, those other genes will end up being disproportionately likely to be swept away when their bearer is struck down early in life by the selective force—the poison or pathogen. The net result is a winnowing out of genetic variants overall, not just those that are detrimental in the face of the current selection regime.
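To make that winnowing concrete, here is a minimal toy simulation (my illustration, not from the book): a haplotype-based, Wright-Fisher-style model in which strong selection against a "susceptible" allele also strips away variation at a nearby neutral locus, simply because the two are usually inherited together. The population size, selection strength, and recombination rate are arbitrary round numbers chosen only to make the effect visible.

```python
import random

POP_SIZE = 500        # number of haplotypes tracked (a deliberately small toy population)
GENERATIONS = 50
RECOMBINATION = 0.01  # chance per offspring that the two loci are separated
SELECTION = 0.5       # fitness cost of carrying the "susceptible" allele

def make_population():
    # Each haplotype is a pair: (selected locus, neutral locus).
    # 'S' = susceptible to the new poison or pathogen, 'R' = resistant.
    # 'a'/'b' are neutral variants (eye color, say) with no fitness effect.
    pop = [('S', random.choice(['a', 'b'])) for _ in range(POP_SIZE - 5)]
    # The new resistance mutation appears on just a few copies, all of which
    # happen to sit next to the 'a' variant at the neutral locus.
    pop += [('R', 'a')] * 5
    return pop

def next_generation(pop):
    # Fitness-weighted sampling: susceptible haplotypes contribute less often.
    weights = [1.0 if sel == 'R' else 1.0 - SELECTION for sel, _ in pop]
    offspring = []
    for _ in range(POP_SIZE):
        parent = random.choices(pop, weights=weights)[0]
        if random.random() < RECOMBINATION:
            # Rare recombination swaps in the neutral allele of a second parent,
            # which is the only way the neutral locus escapes the sweep.
            other = random.choices(pop, weights=weights)[0]
            parent = (parent[0], other[1])
        offspring.append(parent)
    return offspring

def neutral_diversity(pop):
    # Expected heterozygosity 2p(1 - p) at the neutral locus.
    freq_a = sum(1 for _, neut in pop if neut == 'a') / len(pop)
    return 2 * freq_a * (1 - freq_a)

pop = make_population()
print("neutral diversity before the sweep:", round(neutral_diversity(pop), 3))
for _ in range(GENERATIONS):
    pop = next_generation(pop)
print("resistant allele frequency now:   ",
      round(sum(1 for sel, _ in pop if sel == 'R') / POP_SIZE, 3))
print("neutral diversity after the sweep: ", round(neutral_diversity(pop), 3))
```

In most runs the resistant allele sweeps to high frequency and the neutral locus, which started out with diversity near 0.5, ends up far less variable, even though nothing ever selected on it directly.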
Future evolution, and the downside to immortality
Evolution is constantly at work, altering a gene here or a set of co-occurring attributes there. It’s not always visible, at least not at first, but it’s still happening. And it provides a little-considered flaw to that long-sought goal of humanity: immortality. Imagine that you, like a character in one of those vampire novels that are so popular these days, can live forever. Day in and day out, as the seasons change and the years go by, you remain deathless and unaltered, while those around you wither and die. Except for the inconvenience of having to hide your ageless physique from the mortals around you, and the necessity of catching up with new fashions not once during a single period of youth but over and over again as hemlines rise and fall, it would be perfect, right?
Or maybe not. And not for the usual literary-device reasons, like losing your purpose in life, lacking a need to leave your mark before you expire, or having to watch loved ones succumb to the ravages of time. No. As generations came and went, it would become increasingly apparent that the problem was the inability to evolve. Individuals never can evolve, of course; members of a population just leave more or fewer genetic representations of themselves. But since we are never around to see more than a couple of generations before or after us, we don’t notice the minute changes that are occurring in the rest of the group. After a while, and not all that long a while at that, your fifteenth-century vampire self would start looking, well, maybe not like a Neandertal, but just a bit different. You would be shorter than your peers, for example. And even if you didn’t look different, your insides would lack those latest-model advances, those features that make the new version the one to buy, such as resistance to newfangled diseases like malaria. Natural selection happened while you just kept on being a bloodsucker.
Evolutionary biologist Jerry Coyne, author of Why Evolution Is True and an eponymous website, says that the one question he always gets from public audiences is whether the human race is still evolving.[11]
On the one hand, modern medical care and birth control have altered the way in which genes are passed on to succeeding generations; most of us recognize that we wouldn’t stand a chance against a rampaging saber-toothed tiger without our running shoes, contact lenses, GPS, and childhood vaccinations. Natural selection seems to have taken a pretty big detour when it comes to humans, even if it hasn’t completely hit the wall. At the same time, new diseases like AIDS impose new selection on our genomes, by favoring those who happen to be born with resistance to the virus and striking down those who are more susceptible.
Steve Jones, University College London geneticist and author of several popular books, has argued for years that human evolution has been “repealed” because our technology allows us to avoid many natural dangers.[12]
But many anthropologists believe instead that the documented changes over the last 5,000–10,000 years in some traits, such as the frequency of blue eyes, mean that we are still evolving in ways large and small. Blue eyes were virtually unknown as little as 6,000–10,000 years ago, when they apparently arose through one of those random genetic changes that pop up in our chromosomes. Now, of course, they are common—an example of only one such recently evolved characteristic. Gregory Cochran and Henry Harpending even suggest that human evolution as a whole has, on the contrary, accelerated over the last several thousand years, and they also believe that relatively isolated groups of people, such as Africans and North Americans, are subject to differing selection.[13] That leads to the somewhat uncomfortable suggestion that such groups might be evolving in different directions—a controversial notion to say the least.
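As a rough back-of-the-envelope illustration of why a few thousand years can be enough, the standard one-locus selection recursion shows how quickly a rare but favored variant can become common. The sketch below is hypothetical: the selection coefficients and the 25-year generation time are assumptions for illustration, not estimates for blue eyes or any particular gene.

```python
def generations_to_reach(target, p=0.001, s=0.03):
    """Generations for an allele at frequency p with advantage s to reach target."""
    gens = 0
    while p < target:
        # Standard haploid selection recursion: p' = p(1 + s) / (1 + p*s)
        p = p * (1 + s) / (1 + p * s)
        gens += 1
    return gens

for s in (0.01, 0.03, 0.05):
    gens = generations_to_reach(0.5, p=0.001, s=s)
    years = gens * 25  # assuming roughly 25 years per human generation
    print(f"advantage s = {s:.2f}: ~{gens} generations "
          f"(~{years:,} years) to go from 0.1% to 50%")
```

With these made-up numbers, an advantage of a few percent moves a variant from one in a thousand to half the population in a few hundred generations, that is, within several thousand years; a weaker 1 percent advantage takes correspondingly longer.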
The “fish out of water” theme is common in TV and movies: city slickers go to the ranch, Crocodile Dundee turns up in Manhattan, witches try to live like suburban housewives. Misunderstandings and hilarity ensue, and eventually the misfits either go back where they belong or learn that they are not so different from everyone else after all. Watching people flounder in unfamiliar surroundings seems to be endlessly entertaining. But in a larger sense we all sometimes feel like fish out of water, out of sync with the environment we were meant to live in. The question is, did that environment ever exist?
In 2010 the New York Times ran an article titled “The New Age Cavemen and the City,” about modern-day followers of a supposedly evolution-based lifestyle.[1]
These people, mainly men, subsist largely on meat (they apparently differ about whether or not it is best, or even acceptable, to cook it), eschew any foods requiring that newfangled practice of cultivation, and exercise in bursts of activity intended to mimic a sprint after escaping prey. The Sydney Morning Herald ran a similar article, with one adherent noting, “The theory is that you only eat what our ancestors ate 10,000 years ago. It’s what you could get with a stick in the forest.”[2] Frequent blood donation is also practiced, stemming from the idea that cavemen were often wounded and hence blood loss would have been common. (University of Wisconsin anthropologist John Hawks noted that the resulting photograph of three practitioners of this paleo way of life looks “like the cast of Pleistocene Twilight.”[3]) Surprisingly, New York City turns out to be a hospitable place to practice such principles, partly because it is easier to walk to most of one’s destinations. One of those profiled does his walking “dressed in a tweed coat and Italian loafers,”[4] but the lack of adherence to an ancestral wardrobe of (presumably) skins and hides goes unremarked.
The “paleo” (a word many prefer to “caveman”) practitioners are a varied bunch; in addition to the group described in the New York Times, numerous diet books and blogs, exercise programs, and advice columns exist to help those trying not just to be healthy, but to do so in a way that hews to how our early human ancestors would have lived. According to a commenter on one such site, Cavemanforum.com, “I see more and more mistakes in moving away from paleo life. All these things we need, to feel happy, to be healthy—it sounds stupid, but I’ve starting to feel like agriculture really was the biggest mistake we ever did. Of course we can’t bring the times back, but in a strange way I wish we could. The solutions to our problems lay there. Not just food. I feel like we messed up, and we are paying for it.”[5]
References to masculinity are common (one blog links to “The art of manliness”), but Paleochix.com does offer a more female-oriented approach, with articles on skin care and motherhood. A Google search for “caveman lifestyle” garners over 200,000 hits. In The Omega Diet, Artemis P. Simopoulos and Jo Robinson claim, “Human-like creatures have existed on this planet for as long as four million years, and for roughly 99 percent of this time, they were hunters and gatherers . . . This means that when we’re sitting down to lunch, our stone-age bodies ‘expect’ to be fed the same types and ratios of fat that nourished our cave-dwelling ancestors.”[6]
And on Diabetescure101.com we see this statement: “When you realize you are stuck with a caveman body and realize you have been doing all sorts of ‘Modern Man’ things to it to screw up the system, and [as] a result it is not working, then you will take the time to stop and figure out what needs to be changed.”[7]
The articles were clearly a bit tongue in cheek, and even the most ardent followers of the paleo life are not seriously trying to live exactly like people would have 10,000 years ago or more. For one thing, where are all those caves going to come from? And does it count if the blood loss occurs via needle rather than bear tooth? At least one commenter on the New York Times Well blog sounds a skeptical note: “It is idiotic to model one’s behavior on the practices of pre-modern humans on the belief that it will make you live longer, or result in improved quality of life in later years. You are not a pre-modern human. Get over it.”[8]
Anthropologist Greg Downey of Macquarie University in Australia once mused to me that it’s interesting that people never yearn for the houses of yesteryear, or yester-epoch; we seem to have abandoned mud huts quite happily.[9]
But the proliferation of essays and conversations does show the appeal of trying to take on at least a few attributes of our ancestors, and at least some people are quite convinced that modern civilization has led us astray. The catch is, if we want to go back to a healthier way of life, what exactly should we emulate? How did those ancestors live anyway, and where do we get our ideas about early humans?
Furthermore, even if we reject the idea of living like a caveman, is it still reasonable to assume that we have spent the last 10,000 years or so being whipsawed through history without sufficient time to adapt? Asking questions about what it would be like if we lived more like our Paleolithic ancestors can lead to much more interesting questions than whether or not we should be eating grains or wearing glasses (several of the paleo proponents are very disparaging about the use of such devices[10]). Really understanding our history means understanding which human traits arose when, and which ones are likely to have changed recently. It also means deciding what we can use as a model for our own evolution, given that our evidence is so limited. Bones fossilize, but few of those have survived, and our ancestral behaviors, including our social arrangements, our love lives, and the way we raised our children, leave no physical traces. Increasingly, however, we can reconstruct our history by examining our genes—an undertaking that reveals a much more complex story than the assumption that our DNA has remained unchanged since the advent of agriculture. It all comes down to the pace of evolution.
On evolution and pinnacles
We’re all familiar with those cartoons that show evolutionary milestones, with a fish morphing into a lizard that crawls onto the land, followed by various types of mammals, and always concluding with a human, often clutching a spear, gazing off into the distance. Some versions start with humans at the outset, and in a graphic depiction of paleofantasy, show a knuckle-walking ape transforming into a beetle-browed guy with a club, followed by a rather well-muscled figure in a loincloth changing into a slouching paunchy fellow bent over a computer. Even taken as caricatures, these images contain a lot to object to: For one thing, they virtually never show women unless they are trying to make a point about sex roles, as if men did all the important evolving for our species. For another, they assume that evolution proceeded in a straight line, with each form leading seamlessly and inevitably into the next. But the real problem is that these evolutionary progressions make it seem as if evolution, well, progresses. And that simply isn’t the case.
It is true that natural selection weeds out the individuals, and genes, less suited to the environment, leaving behind the ones that can manage to survive and reproduce. And what we see now—halibut with eyes that migrate during development so that both end up on the same side of the fish as it lies on the sandy ocean floor, moths with tongues exactly the size of the tubular flowers they feed on—looks awfully perfect, as if the fishy or mothlike progenitors were intending to go there all the time. But in fact, as I mentioned in the Introduction, organisms are full of trade-offs, and the solutions we see are only single possibilities among many. Halibut ancestors weren’t striving to be flat, early forms of moths weren’t all about sipping nectar from only the longest flowers, and Miocene apes weren’t trying to end up driving cars and paying taxes, or even coming down out of the trees. Other paths were possible and might have led to flatfish with even better camouflage, or people with teeth that were not so prone to decay. So, that phrase “we evolved to . . .”—eat meat, have multiple sexual partners, or even just walk upright—is misleading, at least if it’s taken to mean that we were intending to get there.
What’s more, evolution can and often does occur without natural selection. Species change just because the precise combinations of genes in populations can shift, as individuals move from place to place or small accidents of fate wipe out whole sets of characteristics that are then no longer available for selection to act on. All of this means that we aren’t at a pinnacle of evolution, and neither are the moths or halibut. And those cartoons, amusing though they can be, are simply wrong.
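The point that gene frequencies can shift without selection can be illustrated with a minimal sketch of genetic drift (again my illustration, with arbitrary numbers): a perfectly neutral allele, resampled at random each generation in a finite population, still wanders in frequency and is eventually lost or fixed by chance alone.

```python
import random

def drift_trajectory(gene_copies=200, start_freq=0.5, max_generations=2000):
    """Wright-Fisher-style resampling of a neutral allele, with no selection at all."""
    count = int(gene_copies * start_freq)
    freqs = [count / gene_copies]
    for _ in range(max_generations):
        p = count / gene_copies
        # Every gene copy in the next generation is drawn at random from the
        # current gene pool; no variant is favored over any other.
        count = sum(1 for _ in range(gene_copies) if random.random() < p)
        freqs.append(count / gene_copies)
        if count in (0, gene_copies):
            break  # the allele has been lost or fixed purely by chance
    return freqs

for run in range(3):
    freqs = drift_trajectory()
    print(f"run {run}: started at {freqs[0]:.2f}, "
          f"ended at {freqs[-1]:.2f} after {len(freqs) - 1} generations")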
How we got here: A (very) brief history of human evolution
Before we can look at how fast humans are changing, we need to see where we came from. Although anthropologists still argue about many of the details of human evolution, they largely agree that more-humanlike primates arose from ape ancestors about 6 million years ago, when the Earth was cooling. We did not, of course, arise from apes that are identical, or even all that similar, to the modern chimpanzees, gorillas, and orangutans we see today, but from a different form entirely. Because of the sparseness of the fossil record, it is not clear exactly which kind of ape the earliest humans came from, but the first hominins (a name for the group that includes humans and our ancestors, but not chimpanzees, bonobos, orangutans, or gorillas) were different from their ape ancestors. They walked on two legs, rather than on all fours, and this bipedal mode of perambulating led to a variety of other skeletal changes to which we are arguably still adjusting, as evidenced by problems like lower back pain. And they ate different kinds of foods as they moved into the woods and savannas instead of the forests of more tropical Africa, leading to changes in the teeth and skull as natural selection acted on each generation.
[Figure: The major milestones in human evolution. (Courtesy of the University of California Museum of Paleontology—Understanding Evolution, http://evolution.berkeley.edu)]
Between 4 and 2 million years ago, several different hominin species coexisted in Africa, including some in the genus Homo, which is the same genus to which contemporary humans belong. Modern humans have a number of characteristics that distinguish them from modern apes, including a relatively large brain, that bipedal walking mode, a long period of time spent as a juvenile dependent on others, and symbolic language that helps us communicate about a complex culture that includes material objects such as tools. Other animals share some of these traits with us—cetaceans, for example, also have relatively large brains for their body size—but taken as a whole, the list demarcates humans as humans.
Some anthropologists like to select one or more of the characteristics as key, such as the use of fire for cooking food, which Harvard anthropologist Richard Wrangham believes led to a cascade of other evolutionary events that were essential to our becoming modern humans.[11] Others focus on selection for large brains and longer child development. Bipedalism is also an important feature of hominins, but while it certainly seems as if having one’s hands free allows all kinds of other human attributes, like carrying tools or weapons around and hugging one’s neighbors, these changes don’t seem to have been the critical trigger for the other traits, like larger brains, since early hominins were bipedal long before their skulls showed signs of increased brain size.
Regardless of your favorite demarcating trait, exactly when any one of these characteristics arose is still hotly contested among archaeologists, and the goalposts are shifting all the time. Until 2010, tool use, one of those key signs of technological advancement in human evolution, was thought to have arisen less than 3 million years ago. But careful examination of two animal bones from 800,000 years earlier, at the time when the species made famous by the skeleton called Lucy, Australopithecus afarensis, was roaming Africa, now suggests that ancestral humans might have been using stone tools to butcher meat as long as 3.4 million years ago. The rib and thigh bones of those animals carry tiny marks that sophisticated imaging techniques indicate were made by stone, not the teeth of predatory animals simply munching on their prey.[12]
Lucy and her contemporaries, however, were relatively small-brained compared with the various species of Homo, our own genus, that came later, which presumably means that carving up the meat one has hunted doesn’t necessarily require a lot of intelligence.
Other anthropologists are not convinced by the admittedly scanty fossil evidence, and another study, published later in 2010, makes the case that the marks could have been produced by decay, the hooves of animals stepping on the bones, or other innocuous sources.[13] More information is clearly needed before the verdict is in on when humans first started to use tools. Nevertheless, estimates for the first use of fire have similarly been pushed back in time, along with the first use of ground grains for flour. In addition to being important and interesting from the standpoint of reconstructing our family history, the shifting benchmarks underscore the riskiness of declaring how long humans have spent doing any one thing, whether that thing is hunting, using tools, or cultivating food. In turn, this uncertainty means that we are on shaky ground declaring that humans have been doing certain things, such as eating grains, for “only” short periods, and hence cannot be well adapted to the new environment. How long is long enough, and how short is too short?