The Rational Animal: How Evolution Made Us Smarter Than We Think

Authors: Douglas T. Kenrick, Vladas Griskevicius

Let’s think about the Walt Disney Company in light of what we know about how people’s different subselves do business.
Disney not only started out as an actual family company but continued to operate on familial principles even after Walt and his brother left the scene.
But then Michael Eisner arrived.
Eisner brought to the company a hard-nosed Wall Street emphasis on the bottom line, and the result was a harsher, more cutthroat atmosphere.
Whereas Walt and Roy O. kept the Disney family together despite occasional downturns in the bottom line, when Eisner started losing money, he was treated like a foreign invader from a strange land, and Roy E. arranged to have him thrown out on the street.

EACH OF OUR SUBSELVES negotiates according to a different rulebook.
Rather than treating everyone according to the same set of cold, objective, self-serving rules in every situation, we have a different set of biases depending on whom we’re playing with.
Those biases often lead us to put shallow self-interest aside.
But putting self-interest aside, and doing so in different ways for friends, relatives, and business associates, was good business practice for our ancestors.
Those biases led to more beneficial exchanges between our progenitors and their family members, friends, and trading partners.

But some of the biases that served our ancestors so well can sometimes produce costly errors.
We next take a closer look at our biases and mistakes in an attempt to understand seemingly senseless phenomena from everyday overconfidence to egregious miscalculation.
Next stop—Africa, where one particularly potent ancestral bias led the people of one country to choose to starve rather than accept help.

4
Smoke Detectors in the Mind

ON MAY 30, 2002, the African nation of Zambia declared a food crisis.
Even before this catastrophe, the country had been in bad shape.
The nation was mired in chronic poverty, with a typical family earning $395 per year, and one in ten infants dying soon after birth.
In 2002, though, things went from bad to worse.
The rainy season that usually begins in November and runs into April came to a sudden halt in mid-January.
Thousands of acres of crops planted the previous fall withered and died.
By mid-spring, with the country’s food reserves running out, Zambians had to resort to boiling poisonous wild roots for eight hours to make them edible and killing protected elephants for meat.
Despite these desperate measures, 3 million Zambians were on the verge of starvation when President Levy Mwanawasa declared a national food emergency.

With the emergency declared, things began to look more hopeful, as the world swiftly came to Zambia’s aid.
Within weeks the United States had sent thirty-five thousand tons of food to the distressed nation, enough to sustain the population until the next harvest.
Much of the food consisted of donations from American farmers, some of whom had surpluses from a bountiful harvest.

But few could have anticipated what happened next.
To the shock of the world, President Mwanawasa rejected the aid!

Some observers speculated that this startling episode might be a ploy by an evil dictator hoping to bring his people to their knees.
As a wealthy leader, after all, he wasn’t the one who was going to starve.
But in reality President Mwanawasa had support for the food boycott from the government and from much of the population.
Other observers conjectured that perhaps the food aid was so low in quality as to be barely edible.
But the supplies sent to Zambia were identical to the food many Americans consumed on a daily basis.
Instead, at the heart of the matter were two words that appeared in small print on the food crates: “genetically modified.”
Many outraged Zambians refused to even touch the crates, leading President Mwanawasa to categorically assert, “We would rather starve than get genetically modified foods.”
To the Zambians the thirty-five thousand tons of crops from the United States were not food but poison.

Although observers decried Zambia’s decision as an error of astounding proportions, we suspect that many people in President Mwanawasa’s shoes would have reacted just as he did.
The president’s decision may have been an error, but it stemmed from a deeply rational bias designed to avoid a much costlier mistake.
Here we take a closer look at the ancestral biases guiding people’s decisions.
While biases are often viewed as deficiencies and equated with poor decision making, an evolutionary perspective stresses that many biases stem from adaptive tendencies that helped our ancestors solve evolutionary problems.
We humans are born to be biased—and for good reason.
Although these innate biases can sometimes lead to errors, the very nature of these errors often reveals a deeply intelligent brain.

DEFECTIVE BRAINS

We have already looked at a few of the human biases discovered by behavioral economists, the scientists wearing Rolling Stones T-shirts under their white lab coats, who enjoy having a good chuckle at human behavior.
And why not?
It’s sometimes hard not to chuckle at our long list of gross errors in judgment.
Our perceptions of reality are grossly warped by phenomena such as the false consensus bias, our tendency to overestimate the degree to which other folks agree with us, which results in half of us being shocked after every election.
And reality is further warped by the overconfidence bias, the tendency for most of us to believe we are better than average on most traits, even though this is mathematically impossible (half of the population, by definition, falls at or below the median on any given trait, but of course we don’t mean you or me).
Overconfidence sometimes reaches absurd levels, as in the case of people hospitalized after auto accidents, who persist in believing they are better-than-average drivers.

Scrolling down the long list of documented errors and biases, it’s easy to see humans as being like Keanu Reeves’s character Neo in The Matrix.
Our perception of the world is so skewed by our brains that we seem to be out of touch with the true nature of reality.

But from the evolutionary psychologist’s perspective, it would be surprising if the human mind were really so woefully muddled.
If our ancestors were so out of touch with reality, how in the world would they have managed to survive, let alone reproduce and then raise offspring who themselves survived, thrived, and reproduced?
Brains are expensive mechanisms, requiring more energy than most other bodily organs (our brains make up only 2 percent of our bodily mass yet gobble up as much as 20 to 30 percent of the calories we consume).
These calories are not wasted, though.
Human brains, like those of other animals, are designed with remarkable efficiency, allowing us to thrive in an incredible range of environments.

We are not saying that the brain is free of biases or that people don’t sometimes make moronic choices.
But we are saying that it’s time to reconsider what it means for judgments and decisions to be considered smart.

DOES ADAPTIVE = ACCURATE?

A critical distinction between a traditional and an evolutionary perspective involves the question of whether it’s always smart to be accurate.
According to most social scientists, people should strive to uncover the pure and undistorted truth, which would enable them to make more accurate judgments.
But will natural selection necessarily build organisms designed to seek truth?
Maybe not.
Indeed, in some instances evolution might even disfavor truth and accuracy.
What matters instead is that people’s judgments enhance fitness.
And if it so happens that being biased and inaccurate helps achieve that end in some cases, then in those cases we should expect to see a mind that consistently produces biased and inaccurate judgments.
“The principal function of nervous systems,” according to cognitive scientist Patricia Churchland, is “to get the body parts where they should be in order that the organism may survive. . . . Truth, whatever that is, definitely takes the hindmost.”

We are not saying that someone who is hopelessly delusional and incapable of ever seeing the truth is going to be evolutionarily successful.
For most problems, it usually pays a higher dividend to be accurate rather than inaccurate.
But the mind is not designed to strive for accuracy and truth 100 percent of the time.
As we’ll see, there’s a good reason why it sometimes makes evolutionary sense to warp reality.

Consider the following example: if an object is moving toward you at 20 feet per second, and is currently 120 feet away, how long will it take for the object to hit you?
The accurate answer is 6 seconds (6.001 seconds if you’re a physicist concerned about air friction).
A guess of four seconds would certainly be inaccurate; indeed, it would be a clear error in judgment.
Yet the mind is wired to intentionally make this exact error.
When our eyes see an approaching object, our brains tell us that this object will hit us sooner than it actually will.
In fact, merely hearing the sound of an approaching object (the swoosh of a bird diving through the air, the rustling of someone in the bushes) will result in the same error.
The bias to sense that approaching sounds are going to arrive sooner than they really will is known as auditory looming.
One study found that this “error” was made by 100 percent of people.
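
To put the arithmetic in concrete terms, here is a minimal Python sketch of the time-to-contact calculation. The 20 percent safety margin is a hypothetical figure chosen purely for illustration; the book reports only that the bias exists, not its size.

    # Time-to-contact for an object approaching at constant speed.
    def true_time_to_contact(distance_ft, speed_ft_per_s):
        return distance_ft / speed_ft_per_s

    # The "perceived" estimate shaves time off the true answer, the way
    # auditory looming does. The 0.8 margin is illustrative, not measured.
    def perceived_time_to_contact(distance_ft, speed_ft_per_s, margin=0.8):
        return margin * true_time_to_contact(distance_ft, speed_ft_per_s)

    print(true_time_to_contact(120, 20))       # 6.0 seconds: the accurate answer
    print(perceived_time_to_contact(120, 20))  # 4.8 seconds: "jump now"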

Like many errors and biases that seem irrational on the surface, auditory looming turns out, on closer examination, to be pretty smart.
Other animals, like rhesus monkeys, have evolved the same bias.
This intentional error functions as an advance warning system, manned by the self-protection subself, providing individuals with a margin of safety when confronted with potentially dangerous approaching objects.
If you spot a rhinoceros or hear an avalanche speeding toward you, auditory looming will motivate you to jump out of the way now rather than wait until the last second.
The evolutionary benefits of immediately getting out of the way of approaching dangers were so strong that natural selection endowed us—and other mammals—with brains that intentionally see and hear the world inaccurately.
Although this kind of bias might inhibit economically rational judgment in laboratory tasks, it leads us to behave in a deeply rational manner in the real world.
Being accurate is not always smart.

NOT ALL ERRORS ARE CREATED EQUAL

Although many of our decision errors and biases might look like random design flaws at the surface level, a deeper look often reveals that they aren’t random flaws at all.
The mind evolved not simply to be biased but to make specific kinds of errors and not others.
Consider one of our friends, who recently purchased a house with his family.
During the first night in their new home, the family endured a frightening ordeal.
As everyone was sleeping, a ceiling-mounted smoke detector sounded a piercing alarm.
Our friend woke up to an acrid smell and quickly remembered in horror that he had left several boxes of packing materials next to the stove.
His wife hustled frantically to get the kids out of the house, while he managed to call 911.

Thankfully, it was a false alarm.
There was no fire after all—the neighbors were just burning wood in their grungy, resin-coated fireplace.
A few weeks later the smoke alarm shrieked again, waking everyone in the wee hours of morning.
Once again, there was no fire.
A few weeks later, there was another false alarm, and then another.
Annoyed that the smoke detector was making so many errors, our friend decided to adjust the device to reduce its sensitivity.
As he was about to make the smoke detector less responsive, though, his young daughter became visibly distraught.
“But daddy,” she cried, “what if there is a fire and the alarm doesn’t sound?”

This is the smoke detector dilemma.
Do you want to set your smoke detector to be less sensitive or more sensitive?
This depends on how much you value not being annoyed versus not being trapped in a burning home.
Most people would rather have an oversensitive smoke detector, because by tolerating the occasional irritating error they ensure that their families remain alive.
We intentionally bias the judgments of our smoke detectors because this helps ensure our survival.

The smoke detector dilemma is the same conundrum natural selection had to resolve in designing many of our own built-in decision-making systems.
For many decisions, our brains are wired like ancestral smoke detectors.
A smoke detector is designed to make judgment calls despite having incomplete information.
When it senses trace amounts of airborne particles that might resemble smoke, the device needs either to start screaming, “Fire! Fire! Fire!” or to remain silent.
Our brains have similarly evolved to make judgment calls without having all the pertinent information.

Imagine you’re about to run an errand and need to decide whether to take an umbrella.
You must choose whether to lug this clunky bumbershoot around town all day or take a chance and leave it at home.
Your decision will depend on whether you think it’s going to rain.
The forecast says there is a 50 percent chance of rain, and you see some clouds in the sky.
But you also know there can be clouds without rain, and weather forecasts are notoriously inaccurate.
Like a smoke detector, you need to make a decision based on imperfect information.

The decision-making process will inevitably produce some errors.
We simply can’t be right all the time in a world with imperfect and incomplete information.
But not all errors are created equal.
There are two fundamentally different types of errors, one of which is much costlier than the other.
In the umbrella case, one type of error is that you bring the umbrella, and it doesn’t rain.
Known as a false alarm, this error is like that of a smoke detector sounding without a fire.
It’s annoying but not a huge deal.
Alternatively, a second type of error is that it rains, and you fail to bring your umbrella; this is known as a miss.
Misses are often much costlier than false alarms in detecting threats.
In the umbrella case, a miss means you may be drenched to the bone and ruin your new dress jacket.
In the smoke detector case, a miss has even more dire consequences: you could die.
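
The asymmetry between misses and false alarms can be written as a simple expected-cost comparison: act whenever the probability of the threat times the cost of a miss exceeds the probability of no threat times the cost of a false alarm. The sketch below uses made-up costs to show why the oversensitive rule wins in both the umbrella and the smoke detector cases; this decision rule is the standard signal-detection formulation, not a formula quoted from the book.

    # Error-management logic: act (bring the umbrella, sound the alarm)
    # whenever the expected cost of a miss exceeds the expected cost of
    # a false alarm. All costs below are invented for illustration.
    def should_act(p_threat, cost_miss, cost_false_alarm):
        return p_threat * cost_miss > (1 - p_threat) * cost_false_alarm

    # Umbrella: a 50 percent chance of rain, and getting drenched hurts
    # roughly ten times more than lugging an umbrella on a dry day.
    print(should_act(p_threat=0.5, cost_miss=10, cost_false_alarm=1))      # True

    # Smoke detector: even a 1 percent chance of fire justifies tolerating
    # many annoying false alarms, because a miss can be fatal.
    print(should_act(p_threat=0.01, cost_miss=10000, cost_false_alarm=1))  # True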
