
Our brain is like this too. Its perceptions are just as flawed in understanding the world as our body is in navigating it. Our species has a hard time wrapping its brain around this fact because we’ve devoted the last five thousand or so years of written history to convincing ourselves, through intense repetition, that we are the very best there is, the Xerox copy of an omnipotent deity. We are made in His image, or so nearly every scripture and holy writ on earth claims.

We can’t shake the idea that the brain is special, the center of self and the storehouse of being. We want it to be constant, Gibraltarian, immortal. But just as we have come to terms with the idea that the human body is not the fixed, instantaneous creation of a celestial architect (or that the universe is not a massively static amphitheater), the same can also be said of our glorious brain.

Evolutionary science has shown us that the brain is a patched-up jalopy—an improvised, makeshift do-over. The brain is reliable enough to get you to the next place, but when you finally look under the hood, it’s a little shocking to find a collection of barely successful workarounds and ad hoc fixes using what appear to be the cerebral equivalent of baling twine, chewing gum, duct tape, and haywire. The reptilian stub got a big makeover during the mammalian period, with an overlay of the new limbic portion of the brain, and then later the more humany stuff known as the neocortex got plopped on top of that—each a series of fixes and patches of the previous networks. For instance, two parts of the brain that evolved with Homo habilis (literally, “handy man,” because he was the first tool-making hominid) were the Broca’s and Wernicke’s areas, both crucial in putting things together to make tools. These parts of the brain evolved about two million years ago. Then, about one hundred thousand years ago, scientists now theorize, these tool-making portions of the brain were hot-wired and hijacked to form the centers of the complex human language you speak every day. That’s how evolution works: parts get re-adapted for new uses (“exapted,” in the jargon), or useless bits lying around (“spandrels,” in that same jargon) suddenly get appropriated to new uses. Ancient chewing bones in the jaws of reptiles got taken over a good while back and now serve as the scaffolding of the middle ear—the incus, malleus, and stapes—that allows us to hear.

Evolution did not set us on a trajectory toward the perfect brain, the best possible brain, or even, arguably, a decent brain. Rather, we got the amateur version, the unendingly fiddled-with version, a flawed instrument just good enough to get us through to reproductive age. After that achievement, evolution occurs (or not) without a central mission, which might explain the onset of loopy eccentricity in middle-aged aunts and uncles.

Had that evolution failed, we would have gone the way of the Neanderthals or the australopithecines, visible only in the fossil record. But the brain shambled its way through to right now. And so we live. Scientists studying the early brain have determined that on the savannahs of Africa we developed thousands of mental shortcuts to gain a quick and usually accurate depiction of reality. Our ability to make snap judgments was very handy—evolutionarily. We oversimplify the world we see—and take shortcuts in the viewing of it—in order to make quick sense of it. These shortcuts are called “heuristics,” and nowadays they can make navigating our way through a modern world, dominated by rational choice, quite dicey.

Take a quick one that we all hear our parents say when we’re kids: “Know what you are looking for.” It’s easy to project how helpful that would have been a hundred thousand years ago, when getting food was so difficult that being sidetracked even momentarily could mean total failure or death. But today, when that primitive tic rears its ancient bias, it more likely means we miss all kinds of new opportunities. In fact, many of these ancient heuristics have survived the eons to form a kind of distortion field through which we perceive the universe. And it’s only by looking at Kennewick through such a field that one can see anything like a wandering Caucasoid.

Take the most basic notion known as the fundamental attribution error: We are ourselves looking out at the world. That sounds fine, but it has serious and often unavoidable repercussions. When it comes to our own actions, we easily comprehend the state of affairs around us—the situation—because we are trapped in our bodies moving through time. But when we watch other people, we don’t see their situation; rather, we see their bodies as discrete actors in the world. So we are much more likely to ascribe menacing personal motives to another’s actions (“That guy who didn’t do his homework is lazy”), while we are very understanding when judging ourselves (“I didn’t do mine because of a family crisis”), sometimes extremely understanding (“My dog ate my homework”).

I once hung out with an abortion doctor in the Dakotas as he went about his rounds. He told me that pro-life women are no less likely to have abortions than pro-choice women. He said he sometimes found the protesters in one town showing up as patients in the next town (and after the abortion they would go right back to hurling insults at him on the street). But when queried, the pro-life woman would explain away her own choice to abort, saying that her circumstances were unique and one needed to understand the pressures she was under. The other women having abortions? They were baby killers.

Experiments confirm this tendency in every human endeavor. According to Thomas Gilovich, the author of How We Know What Isn’t So, 25 percent of college students “believe they are in the top 1% in terms of their ability to get along with others.” It’s everybody else who’s the asshole. According to the Journal of the American Medical Association, 85 percent of medical students believe politicians are behaving unethically when they accept gifts from lobbyists. But only about half as many medical students—46 percent—think it’s unacceptable for doctors to receive similar goodies from drug companies. We can trust our kind, sort of, but definitely not the other kind. There are hundreds of studies yielding the same type of statistic (another medical study found that young doctors believe that 84 percent of their colleagues might be corrupted by pharmaceutical companies’ freebies, but only 16 percent thought they personally were vulnerable).

We can excuse ourselves, literally, because we see so many legitimate excuses in front of us. Other people? Liars, baby killers, thieves. So are the Native Americans politically correct tools of the federal government? Are the scientists opportunistic liars relying on hokum to make an end run around the law? If you’re on the other side, absolutely.

We naturally and easily create a world of order out of events that, if examined more closely, have other causes or, often, no discernible causes at all. Our ability to craft meaning out of non-meaning is impressive and no doubt has been fairly useful these last two hundred thousand years. But our view of reality, like everything, is not necessarily the best possible view, or even the “real” view—just the one that got us through to right now. The fact is that we see the world from inside this distortion field, and the more researchers study it, the more we learn just how twisted and tenacious it is.

These perceptual flaws now have many names, are being studied continuously, and have generated mountains of papers. The taxonomy of our flawed selves is an explosive and growing field, and it is beginning to penetrate the world outside the lab. Many people have heard of the confirmation bias—the tendency to sort through evidence to confirm what we already know. That one has practically entered the common culture. Most days, it would appear that the Internet is little more than an exhausting orgy of confirmation bias.

There is a kingdom of graduate students and their notable mentors devising experiments to further understand dozens of fabulously named quirks: the Von Restorff Effect, the Status Quo Bias, Loss Aversion, the Semmelweis Reflex, Déformation Professionnelle, the Clustering Illusion, the Hawthorne Effect, the Ludic Fallacy, the Ostrich Effect, the Reminiscence Bump, Subjective Validation, the Texas Sharpshooter Fallacy, the Barnum Effect, Illusory Superiority, the Just-World Phenomenon, the Lake Wobegon Effect, the Out-group Homogeneity Bias, the Availability Heuristic, and the Informational Cascade.

One of the most important biases is called anchoring: the tendency to lean toward the first notion we were exposed to. Scientists have discovered that we “anchor and adjust” our beliefs. In other words, we never really cut off our relationship to that first big impression.

The most famous experiment is simple yet mind-boggling. Say I get people to spin a wheel imprinted with two numbers—15 and 65—and it lands on 15. Then I ask a completely unrelated question—How many African nations are members of the United Nations? Most will cluster their answers around the number in the spin. Crazy, but true. That line you heard from your mom about “always make a good first impression” is not only true but a kind of classic heuristic—i.e., a short nuggetlike axiom that long ago worked well for us but nowadays can lead us into a forest of nonsense. The anchoring tendency is so strong that business schools teach it as a fundamental exercise in negotiation theory. Always be the first to state a number in a salary negotiation. Why? Because the final number will, more often than not, cluster around the first number uttered.

With Kennewick, the anchor was that first racial utterance, a work of periphrastic art: “Caucasoid-like.” We can discuss Kennewick all day long, but every conversation veers back to some aspect of this issue—whether he is or is not Caucasoid-like.

Humans are wired to see things even when they aren’t there. This accounts for so many routine sightings of the Virgin Mary and Jesus and even Michael Jackson on toast, in the bark of trees, or in a photo of spaghetti. These sightings might sound ridiculous, but they are great examples of how the brain fills in the story we want to tell (or picture we want to see). Brain scientists will tell you that the medium for such appearances must always be grainy—like toast, tree bark, or a photo of smeared spaghetti sauce. A blurred medium will activate the portion of the brain that fills out a pattern into whatever the brain wants to see confirmed. In fact, if you can’t quite see the image, squinting often helps. Depriving the eye of the true specifics of the image allows the brain to fill in the image with its preconceptions, and there it is (like the blurry images of Bigfoot and the ivory-billed woodpecker). Has anyone ever wondered why, in a world where my local ice-cream parlor can print a high-resolution image of my daughter’s face into the icing of a chocolate cake, the Creator hasn’t updated his tree-bark appearances past the daguerreotype phase?

In the Kennewick controversy, this tendency to see Jesus in toast, technically known as pareidolia, is what explains the Solutrean Hypothesis. Only the theory’s most devoted zealots see similarities between the Solutrean laurel-leaf point and the Clovis point. Only those who most desire it can see in these few bits of stone an entire land-based culture that, without any evidence, could have turned into sailors; maritime Homo sapiens who left the countryside of Europe and managed to adapt overnight to Inuit-style living, camping on ice and fishing along the kelp highway. Even though the Solutreans disappeared some four thousand years before the appearance of Clovis, and even though they left no corroborating evidence behind, if you look at these dissimilar stone tools, you can see their entire voyage right there in the flutes of the Clovis points.

But only if you squint.

The language used to describe Kennewick is thoroughly infected with many of these biases. One of the most powerful is called the self-esteem bias: we more eagerly see things that flatter us than those that don’t. Putting together a skull and nudging it a few millimeters here and there to make it easier to see a “European” shape is a perfect example of the self-esteem bias on the part of white researchers.

Since there were a number of different ways to assemble the skull, and one of them trended closer toward confirming what these researchers deeply wanted to see in it, a skilled scientist would certainly have set up an experiment to work around this obvious tendency. If Chatters had sent precise molds of the skull to five different anthropologists, asked them to “assemble” it without telling them the age or the location of the find, and then asked them to explain what one might surmise from the skull—then you would have had an experiment, and possibly a clear-eyed view of the skull. Instead, putting it together yourself and then declaring that it just so happens to confirm what you so deeply long to see would make any cognitive scientist throw up her hands in despair.

The Kennewick court case itself is a classic example of another bias known as the Endowment Effect. Our ability to unconsciously create value for an object we are holding (or wish to hold) is impressive. In a famous experiment demonstrating this effect, researchers handed out free coffee mugs to half the people in a group. When the mug owners were later asked to sell, they demanded roughly twice as much as the people without mugs were willing to pay. Simply possessing the mug made it seem more valuable.

Because everyone was struggling to retain control of the skull and bones, they not only had to be valuable; the struggle itself tended to make people believe they had to be valuable in other ways too. Of course the skeleton had to be unique proof of a European presence prior to the Paleo-Indians. Why else were the Indians fighting so hard to take possession of it?

Priming is another cognitive bias that overwhelmed the popular media in this story from the beginning. For instance, if I asked you to think about your grandfather’s death and then asked you to categorize words as “negative” or “positive” as I read off “happy,” “singing,” and “crying,” you would more quickly categorize the word “crying” as negative because I had already primed your mind to be on the alert for negative things. This happens in all kinds of ways. But few of them are as textbook perfect as handing a reconstruction artist a skull with the explicit observation that you think the skull bears an uncanny resemblance to Patrick Stewart of Star Trek.
