Brain Bugs


Author: Dean Buonomano

Another study by Kahneman and Tversky showed that doctors’ decisions to recommend one of two medical treatments were influenced by whether they were told that one of the procedures had a 10 percent survival or a 90 percent mortality rate.[8]

Marketers understood the importance of framing well before Kahneman and Tversky set about studying it. Companies have always known that their product should be advertised as costing 10 percent less than their competitors’, not 90 percent as much. Likewise, a company may announce “diet Chocolate Frosted Sugar Bombs, now with 50% less calories,” but they would never lead their marketing campaign with “now with 50% as many calories.” In some countries stores charge customers who pay with credit cards more than those who pay with cash (because credit card companies generally take a 1 to 3 percent cut of your payment). But the difference between credit and cash is always framed as a discount to the cash customers, never as a surcharge to those paying with their credit card.[9]

In another classic experiment Kahneman and Tversky described a different cognitive bias: anchoring. In this study they asked people if they thought the percentage of African countries in the United Nations was above or below a given value: 10 percent for one group and 65 percent for the other (the subjects were led to believe these numbers were chosen at random). Next, the subjects were asked to estimate the actual percentage of African countries in the United Nations. The values 10 percent and 65 percent served as “anchors,” and it turns out they contaminated people’s estimates. People in the low anchor group (10 percent) on average estimated 25 percent, whereas those in the high anchor group (65 percent) came up with an estimate of 45 percent.[10] This anchoring bias captures the fact that our numerical estimates can be unduly influenced by the presence of irrelevant numbers.

Frankly, I have always been a bit skeptical about exactly how robust cognitive biases such as the anchoring effect are, so I performed an informal experiment myself. I asked everyone I bumped into the following two questions: (1) how old do you think Vice President Joseph Biden is? and (2) how old do you think the actor Brad Pitt is? Every time I asked someone these questions I switched the order, so there were two groups: Joe then Brad (Joe/Brad) and Brad then Joe (Brad/Joe). My first and most surprising finding was that I knew 50 people. The second finding was that when I averaged the age estimates of Brad and Joe in the Brad/Joe group, they were 42.9 and 61.1; whereas in the Joe/Brad group the average estimates of Brad’s and Joe’s ages were 44.2 and 64.7 (at the time Brad Pitt was 45 and Joe Biden was 66). The estimates of Joe Biden’s age were significantly lower when they were “anchored” by Brad Pitt’s age.[11] The estimates of Brad’s age were higher when they were anchored by Joe’s; however, this difference was not statistically significant. The anchoring effect occurs when people are making guesstimates—no matter what the anchor, no effect will be observed if Americans are asked how many states there are in the United States. So it is possible that Brad influenced estimates of Joe’s age more than the other way around, because people had more realistic estimates of Brad Pitt’s age (I live in Los Angeles).

Being misled by Brad Pitt’s age when estimating Joe Biden’s is not likely to have any palpable consequences in real life. In other contexts, however, the anchoring effect is one more vulnerability to be exploited. We have all heard the stratospheric amounts some plaintiffs sue big companies for—in one recent case a jury ordered that a cigarette company pay a single individual $300 million.[12] These astronomical values are not merely driven by a fuzzy understanding of zeros by the jury, but represent a rational strategy by the plaintiff’s attorneys to exploit the anchoring effect by planting vast sums in the minds of the jury during the trial. Similarly, in salary negotiations it is likely that the anchoring effect plays an important role, particularly in situations when both parties are unfamiliar with the worth of the services in question. As soon as one party mentions an initial salary, it will serve as an anchor for all subsequent offers and counteroffers.[13]

Both the framing and anchoring biases are characterized by the influence of prior events—the wording of a question or the exposure to a given number—on subsequent decisions. Evolutionarily speaking, these particular biases are clearly recent arrivals, since language and numbers are themselves newcomers. But framing and anchoring are simply examples of a more general phenomenon: context influences outcome. Human beings are nothing if not “context-dependent” creatures, and language is one of the many sources from which we derive context. The “meaning” of a syllable is determined in part by what precedes it (today/yesterday; belay/delay). The meaning of words is often determined by the words that precede them (bed bug/computer bug; a big dog/a hot dog). And the meaning of sentences is influenced by who said them and where (“He fell off the wagon” means very different things depending upon whether you hear it at a playground or in a bar). If you give yourself a paper cut, you react differently depending on whether you are home alone or in a business meeting. If someone calls you a jerk, your reaction depends on whether that person is your best friend, your boss, or a stranger. Context is key.

It is irrational that we are swayed by whether an option is framed as “1/3 of the people will live” or “2/3 will die.” But such interchangeable scenarios represent the exception. Most of the time the choice of words is not arbitrary but purposely used to convey context and provide an additional communication channel. If one option is posed as “1/3 of the people will live” and another as “2/3 will die,” perhaps the questioner is giving us a hint that the first option is the best one. Indeed, we all use the framing effect automatically and unconsciously. Who among us, when faced with having to relay the expected outcome of an emergency surgical procedure to a shaken sibling, would say, “There is a 50 percent chance Dad will die” rather than “There is a 50 percent chance Dad will survive”? Although most of us don’t consciously think about how we frame questions and statements, we intuitively grasp the importance of framing. Even children seem to grasp that when Dad asks whether they ate their vegetables they are better off framing their answer as “I ate almost all of my veggies” rather than “I left a few of my veggies.”

The framing and anchoring biases are simply examples of situations in which we would be better off not being sensitive to context.

Loss Aversion

If you are like me, losing a $100 bill is more upsetting than finding a $100 bill is rewarding. Similarly, if your stock investment starts at $1000, proceeds to rise to $1200 in a week, and a week later it falls back to $1000, the ride down is more agonizing than the ride up is enjoyable. That a loss carries more emotional baggage than an equivalent gain is referred to as loss aversion.

In one prototypical experiment half the students in a classroom were given a coffee mug with the school logo on it. Next, the students with the mugs were asked to set a sale price for their mugs, and the mugless students were asked how much they would be willing to pay for the mugs. The median asking price for the mugs was $5.25, and the median buying offer was around $2.50.[14] The mug owners overvalued their newly acquired mugs, at least in comparison to what the other members of the class believed they were worth. Loss aversion (related to another cognitive bias referred to as the endowment effect) is attributable to placing a higher value on the things we already own—the fact that it is my mug makes it more valuable and more difficult to part with.[15]

Loss aversion is the source of irrational decisions in the real world. Among investors a typical response to an initial investment going down in value is “I’ll sell it as soon as it goes back up”; this is referred to as “chasing a loss.” In some cases this behavior can lead to even more dramatic losses that could have been avoided if the investor had been able to come to terms with a relatively small loss and sold the stock at the first sign of danger.[16] One also suspects that loss aversion contributes to our distaste for paying taxes. Parting with our hard-earned money can be traumatic even though we know that a nation cannot be held together without a taxation system and that taxes in the United States are well below the average of developed countries.

Most people will not accept an offer in which there is a 50 percent chance they will lose $100 and a 50 percent chance they will win $150, even though it is a more than fair proposition.[17]
Standard economic theory argues that taking the bet is the rational choice, which it is, if our goal is to maximize our potential net worth. The very notion of investing and accumulating wealth is, however, a modern contrivance; one that is still primarily limited to those citizens of the world who do not need to spend any time worrying about where their next meal will come from.
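To make “more than fair” concrete, here is a minimal Python sketch (my own illustration, not from the book) computing the expected value of the bet just described: a 50 percent chance of losing $100 against a 50 percent chance of winning $150.

```python
import random

def expected_value(p_win, win, loss):
    """Expected value of a two-outcome gamble: p_win * win + (1 - p_win) * loss."""
    return p_win * win + (1 - p_win) * loss

# The bet from the text: 50% chance to win $150, 50% chance to lose $100.
print(expected_value(0.5, 150, -100))  # 25.0 -- the bet nets $25 on average

# A rough simulation of repeatedly taking the bet (illustrative only).
rng = random.Random(0)
n = 100_000
total = sum(150 if rng.random() < 0.5 else -100 for _ in range(n))
print(total / n)  # close to 25
```

A wealth-maximizer should accept any gamble with a positive expected value; loss aversion explains why most of us nevertheless refuse this one.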

Money is a recent cultural invention, one that provides an easily quantifiable and linear measure of value. Our neural operating system did not evolve to make decisions expressed numerically or that involve exchanges of pieces of paper whose worth lies in a shared belief that they are valuable.

An ecologically realistic scenario would be to consider propositions relating to a more tangible resource, such as food. When food is at stake, loss aversion begins to make a bit more sense. If our ancestor, Ug, who lived in the African savannah, had a stash of food that would last a couple of days, and a Martian anthropologist popped up and offered him 2:1 odds on his food supply, Ug’s disproportionate attachment to his current food supply would seem quite rational. First, if Ug is hungry and food is scarce, a loss could result in death. Also, in contrast to money, food is not a “linear” resource. Having twice as much food is not necessarily twice as valuable; food is perishable and one can only eat so much of it.[18] Finally, a proposition such as a bet assumes that there is a very high degree of trust between the parties involved—something we take for granted when we play the Lotto or go to a casino but was unlikely to exist early in human evolution. The brain’s built-in loss aversion bias is probably a carryover from the days our primate ancestors made decisions that involved resources that didn’t obey the tidy linear relationships of monetary wealth or the simple maxim, the more, the better.

Probability Blindness

Imagine that I have a conventional six-sided die with two faces painted red (R) and the other four painted green (G), and that I rolled this die 20 times. Below I’ve written down three potential partial sequences, one of which actually occurred. Your job is to pick the sequence that you think is most likely to be the real one.

1. R-G-R-R-R

2. G-R-G-R-R-R

3. G-R-R-R-R-R

Which would you choose? When Kahneman and Tversky performed this experiment, 65 percent of the college students in the study picked sequence 2, and only 33 percent correctly picked sequence 1.[19]
We know that G is the most likely outcome of any single roll (2/3 probability versus 1/3 probability), so we expect fewer Rs. Most people seem to favor sequence 2 because at least it has two Gs; however, the significance of the fact that there are fewer elements in sequence 1 is often missed: any specific sequence of five rolls is more likely than any given sequence of six rolls. And sequence 2 is sequence 1 with a G in front of it. So if we calculate the probability of sequence 1, P(1) = 1/3 × 2/3 × 1/3 × 1/3 × 1/3, the likelihood of sequence 2 has to be less: P(2) = 2/3 × P(1).
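The multiplication is easy to check mechanically. A small sketch of my own (not from the book), using exact fractions with P(R) = 1/3 and P(G) = 2/3:

```python
from fractions import Fraction

# Die from the text: 2 red faces, 4 green, so P(R) = 1/3 and P(G) = 2/3.
FACE_PROB = {"R": Fraction(1, 3), "G": Fraction(2, 3)}

def seq_prob(seq):
    """Probability of rolling exactly this sequence, e.g. "R-G-R-R-R"."""
    prob = Fraction(1)
    for face in seq.split("-"):
        prob *= FACE_PROB[face]
    return prob

p1 = seq_prob("R-G-R-R-R")      # 2/243
p2 = seq_prob("G-R-G-R-R-R")    # 4/729
p3 = seq_prob("G-R-R-R-R-R")    # 2/729
print(p1, p2, p3)

# Sequence 2 is sequence 1 with a G prepended, so it must be less likely.
assert p2 == Fraction(2, 3) * p1
assert p1 > p2 > p3
```

Sequence 1 turns out to be three times as likely as sequence 3 and half again as likely as sequence 2, despite "looking" less representative of a green-heavy die.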

Which brings us to the conjunction fallacy. The probability of any event A and any other event B occurring together has to be less likely than (or equal to) the probability of event A by itself. So, as remote as it is, the probability of winning the Lotto in tomorrow’s draw is still higher than the probability of winning the Lotto and of the sun rising. We are suckers for conjunction errors. What do you think is more likely to be true about my friend Robert: (A) he is a professional NBA player, or (B) he is a professional NBA player and is over six feet tall? For some reason option B seems more plausible, even though the probability of B must be less than (or equal to) the probability of A.[20] And speaking of basketball, there is a particularly severe epidemic of conjunction fallacies in the arena of sports. Sportscasters offer a continuous stream of factoids such as “He’s the first teenager in the last 33 years with three triples and two intentional walks in one season”[21] or “He’s the first to throw for three touchdowns on a Monday night game with a full moon on the same day the Dow Jones fell more than 100 points.” Okay, that last one I made up. But my point is that the more conditions are added to a statement (the more conjunctions), the less likely the full set of events becomes. The conjunction fallacy allows sportscasters to feed the illusion that we are witnessing a one-of-a-kind event by adding largely irrelevant conjunctions—and in doing so increase the likelihood we will stay tuned in.
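The inequality P(A and B) ≤ P(A) can be demonstrated with a toy simulation. The base rates below are invented purely for illustration (nothing in the text specifies them); the point is only that the conjunction can never be counted more often than either event alone:

```python
import random

rng = random.Random(1)
n = 100_000

# Hypothetical, made-up rates: A = "is an NBA player", B = "is over six feet tall".
P_NBA = 0.001
P_TALL_IF_NBA = 0.99      # assumption: nearly all NBA players are over six feet
P_TALL_OTHERWISE = 0.15   # assumption for everyone else

count_a = 0          # NBA players
count_a_and_b = 0    # NBA players who are also over six feet

for _ in range(n):
    is_nba = rng.random() < P_NBA
    is_tall = rng.random() < (P_TALL_IF_NBA if is_nba else P_TALL_OTHERWISE)
    count_a += is_nba
    count_a_and_b += is_nba and is_tall

print(count_a, count_a_and_b)
assert count_a_and_b <= count_a  # the conjunction is never the more likely option
```

No matter what rates you plug in, every person counted for the conjunction was already counted for A alone, which is the whole of the argument.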

The Monty Hall problem provides perhaps the best-known example of the inherent counterintuitiveness of probability theory. In the early nineties the Monty Hall problem created a national controversy, or at least as much of a controversy as a brain teaser has ever generated. In 1990 a reader posed a question to the Parade magazine columnist Marilyn vos Savant. The question was based on the game show Let’s Make a Deal, hosted by Monty Hall. Contestants on the show were asked to pick one of three doors: one had a major prize behind it, and the other two doors had goats behind them (I’m unclear on whether the contestants could keep the goat). In this example of the game, before revealing what was behind the door the contestant had chosen, Monty Hall would always open another door (always one with a goat behind it) and ask, “Would you like to change your choice?” Would you?
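For readers who want to test their answer, here is a short simulation, my own sketch rather than anything from the book, of the game as described: the host always opens a goat door the contestant did not pick, and the contestant either sticks or switches.

```python
import random

def monty_hall(switch, trials=100_000, seed=42):
    """Fraction of wins over many simulated games of the three-door problem."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(3)     # door hiding the prize
        choice = rng.randrange(3)    # contestant's initial pick
        # Host opens a door that is neither the pick nor the prize (a goat door).
        opened = next(d for d in range(3) if d != choice and d != prize)
        if switch:
            # Switch to the one remaining closed door.
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == prize)
    return wins / trials

print(monty_hall(switch=False))  # close to 1/3
print(monty_hall(switch=True))   # close to 2/3
```

Sticking wins only when the initial pick was right (1/3 of the time); switching wins whenever it was wrong (2/3 of the time), which was vos Savant’s famously disputed answer.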
