Against the Gods: The Remarkable Story of Risk

The moral of these results is disturbing. Invariance is normatively
essential [what we should do], intuitively compelling, and psychologically unfeasible.12

The failure of invariance is far more prevalent than most of us realize. The manner in which questions are framed in advertising may persuade people to buy something despite negative consequences that, in a
different frame, might persuade them to refrain from buying. Public
opinion polls often produce contradictory results when the same question
is given different twists.

Kahneman and Tversky describe a situation in which doctors were
concerned that they might be influencing patients who had to choose
between the life-or-death risks in different forms of treatment.13 The
choice was between radiation and surgery in the treatment of lung cancer. Medical data at this hospital showed that no patients die during
radiation but that they have a shorter life expectancy than patients
who survive the risk of surgery; the overall difference in life expectancy was not
great enough to provide a clear choice between the two forms of treatment. When the question was put in terms of risk of death during
treatment, more than 40% of the choices favored radiation. When the
question was put in terms of life expectancy, only about 20% favored
radiation.

One of the most familiar manifestations of the failure of invariance
is in the old Wall Street saw, "You never get poor by taking a profit."
It would follow that cutting your losses is also a good idea, but investors
hate to take losses, because, tax considerations aside, a loss taken is an
acknowledgment of error. Loss aversion combined with ego leads investors to gamble by clinging to their mistakes in the fond hope that
some day the market will vindicate their judgment and make them
whole. Von Neumann would not approve.

The failure of invariance frequently takes the form of what is known
as "mental accounting," a process in which we separate the components
of the total picture. In so doing we fail to recognize that a decision affecting each component will have an effect on the shape of the whole.
Mental accounting is like focusing on the hole instead of the doughnut.
It leads to conflicting answers to the same question.

Kahneman and Tversky ask you to imagine that you are on your
way to see a Broadway play for which you have bought a ticket that
cost $40.14 When you arrive at the theater, you discover you have lost
your ticket. Would you lay out $40 for another one?

Now suppose instead that you plan to buy the ticket when you
arrive at the theater. As you step up to the box office, you find that you
have $40 less in your pocket than you thought you had when you left
home. Would you still buy the ticket?

In both cases, whether you lost the ticket or lost the $40, you would
be out a total of $80 if you decided to see the show. You would be out
only $40 if you abandoned the show and went home. Kahneman and
Tversky found that most people would be reluctant to spend $40 to
replace the lost ticket, while about the same number would be perfectly
willing to lay out a second $40 to buy the ticket even though they had
lost the original $40.

This is a clear case of the failure of invariance. If $80 is more than
you want to spend on the theater, you should neither replace the
ticket in the first instance nor buy the ticket in the second. If, on the
other hand, you are willing to spend $80 on going to the theater, you
should be just as willing to replace the lost ticket as you are to spend
$40 on the ticket despite the disappearance of the original $40. There
is no difference other than in accounting conventions between a cost and a loss.

Prospect Theory suggests that the inconsistent responses to these
choices result from two separate mental accounts, one for going to the
theater, and one for putting the $40 to other uses (next month's lunch
money, for example). The theater account was charged $40 when the
ticket was purchased, depleting that account. The lost $40 was charged
to next month's lunch money, which has nothing to do with the theater account and is off in the future anyway. Consequently, the theater
account is still awaiting its $40 charge.
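
A small sketch in Python makes the bookkeeping concrete. The account names, the $40 budget cap, and the decision rule below are illustrative assumptions of mine, not anything Kahneman and Tversky specified; the point is only to show how charging the two losses to different accounts yields opposite answers to what is economically the same question:

    # Illustrative sketch of mental accounting in the theater example.
    # The budget figure and decision rule are assumptions, not the study's.
    THEATER_BUDGET = 40  # willing to charge at most $40 to "theater"

    def will_buy_ticket(theater_charges, ticket_price=40):
        """Buy only if the theater account stays within its budget."""
        return theater_charges + ticket_price <= THEATER_BUDGET

    # Case 1: the lost ticket was already charged to the theater account,
    # so a replacement would push that account to $80.
    print(will_buy_ticket(theater_charges=40))  # False

    # Case 2: the lost $40 in cash was charged to next month's lunch
    # money; the theater account is still empty.
    print(will_buy_ticket(theater_charges=0))   # True

    # An invariant decision maker would ignore the accounts and see that
    # attending costs $80 of total wealth in both cases.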

Thaler recounts an amusing real-life example of mental accounting.15 A professor of finance he knows has a clever strategy to help him deal with minor misfortunes. At the beginning of the year, the professor plans for a generous donation to his favorite charity. Anything untoward that happens in the course of the year (a speeding ticket, replacing
a lost possession, an unwanted touch by an impecunious relative) is
then charged to the charity account. The system makes the losses painless, because the charity does the paying. The charity receives whatever
is left over in the account. Thaler has nominated his friend as the world's
first Certified Mental Accountant.

In an interview with a magazine reporter, Kahneman himself confessed that he had succumbed to mental accounting. In his research
with Tversky he had found that a loss is less painful when it is just an
addition to a larger loss than when it is a free-standing loss: losing a second $100 after having already lost $100 is less painful than losing $100
on each of two separate occasions. Keeping this concept in mind when moving into a new home, Kahneman and his wife bought all their furniture
within a week after buying the house. If they had looked at the furniture as a separate account, they might have balked at the cost and ended
up buying fewer pieces than they needed.16

We tend to believe that information is a necessary ingredient to
rational decision-making and that the more information we have, the
better we can manage the risks we face. Yet psychologists report circumstances in which additional information gets in the way and distorts
decisions, leading to failures of invariance and offering opportunities for
people in authority to manipulate the kinds of risk that people are willing to take.

Two medical researchers, David Redelmeier and Eldar Shafir,
reported in the Journal of the American Medical Association on a study
designed to reveal how doctors respond as the number of possible
options for treatment is increased.17 Any medical decision is risky; no
one can know for certain what the consequences will be. In each of
Redelmeier and Shafir's experiments, the introduction of additional
options raised the probability that the physicians would either choose
the original option or decide to do nothing.

In one experiment, several hundred physicians were asked to prescribe treatment for a 67-year-old man with chronic pain in his right hip. The doctors were given two choices: to prescribe a named medication or to "refer to orthopedics and do not start any new medication"; just about half voted against any medication. When the number
of choices was raised from two to three by adding a second medication
option, along with "refer to orthopedics," three-quarters of the doctors
voted against medication and for "refer to orthopedics."

Tversky believes that "probability judgments are attached not to
events but to descriptions of events ... the judged probability of an
event depends upon the explicitness of its description."18 As a case in
point, he describes an experiment in which 120 Stanford graduates
were asked to assess the likelihood of various possible causes of death.
Each student evaluated one of two different lists of causes; the first
listed specific causes of death and the second grouped the causes under
a generic heading like "natural causes."

The following table shows some of the estimated probabilities of
death developed in this experiment:

These students vastly overestimated the probabilities of violent deaths
and underestimated deaths from natural causes. But the striking revelation in the table is that, under either set of circumstances, the
estimated probability of dying was higher when the specific causes
were spelled out than when the students were asked to estimate
only the total from natural or unnatural causes.

In another medical study described by Redelmeier and Tversky,
two groups of physicians at Stanford University were surveyed for their
diagnosis of a woman experiencing severe abdominal pain.19 After
receiving a detailed description of the symptoms, the first group was
asked to decide on the probability that this woman was suffering from
ectopic pregnancy, a gastroenteritis problem, or "none of the above."
The second group was offered three additional possible diagnoses along
with the choices of pregnancy, gastroenteritis, and "none of the above"
that had been offered to the first group.

The interesting feature of this experiment was the handling of the
"none of the above" option by the second group of doctors. Assuming
that the average competence of the doctors in each group was essentially equal, one would expect that that option as presented to the first
group would have included the three additional diagnoses with which
the second group was presented. In that case, the second group would
be expected to assign a probability to the three additional diagnoses
plus "none of the above" that was approximately equal to the 50%
probability assigned to "none of the above" by the first group.

That is not what happened. The second group of doctors assigned a
69% probability to "none of the above" plus the three additional diagnoses and only 31% to the possibility of pregnancy or gastroenteritis, to which the first group had assigned a 50% probability. Apparently, the
greater the number of possibilities, the higher the probabilities assigned
to them.
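
The arithmetic of the two surveys can be laid side by side. The sketch below is simply a restatement of the percentages quoted above, with variable names of my own choosing; invariance would require the residual category to carry the same total weight whether or not its contents are spelled out:

    # Group 1: two named diagnoses plus a catch-all "none of the above".
    group1_named    = 0.50  # ectopic pregnancy or gastroenteritis
    group1_residual = 0.50  # "none of the above" (implicitly covering the
                            # three diagnoses named only to group 2)

    # Group 2: the same two diagnoses, three more named ones, and the
    # catch-all; the text lumps the last four together at 69%.
    group2_named    = 0.31
    group2_residual = 0.69

    # Invariance would demand group2_residual == group1_residual, since
    # unpacking a category should not change its total probability.
    print(round(group2_residual - group1_residual, 2))  # 0.19 of "extra"
                                                        # judged probability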

Daniel Ellsberg (the same Ellsberg who later released the Pentagon
Papers) published a paper back in 1961 in which he defined a phenomenon he called "ambiguity aversion."20 Ambiguity aversion means
that people prefer to take risks on the basis of known rather than
unknown probabilities. Information matters, in other words. For example, Ellsberg offered several groups of people a chance to bet on drawing either a red ball or a black ball from two different urns, each
holding 100 balls. Urn 1 held 50 balls of each color; the breakdown in
Urn 2 was unknown. Probability theory would suggest that Urn 2 was
also split 50-50, for there was no basis for any other distribution. Yet the overwhelming preponderance of the respondents chose to bet on
the draw from Urn 1.
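
Ellsberg's setup also lends itself to a quick sanity check. The simulation below is a sketch under one assumption: every mix of Urn 2, from 0 to 100 red balls, is taken as equally likely (the symmetric prior the text alludes to). Under that assumption the overall chance of drawing red is one-half from either urn, so expected value alone gives no reason to prefer Urn 1:

    import random

    def draw_red_urn1():
        # Urn 1: known composition, 50 red / 50 black.
        return random.random() < 0.5

    def draw_red_urn2():
        # Urn 2: unknown composition; assume every count of red balls
        # from 0 to 100 is equally likely (the symmetric prior).
        reds = random.randint(0, 100)
        return random.random() < reds / 100

    trials = 200_000
    print(sum(draw_red_urn1() for _ in range(trials)) / trials)  # ~0.5
    print(sum(draw_red_urn2() for _ in range(trials)) / trials)  # ~0.5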

Tversky and another colleague, Craig Fox, explored ambiguity
aversion more deeply and discovered that matters are more complicated
than Ellsberg suggested.21 They designed a series of experiments to discover whether people's preference for clear over vague probabilities
appears in all instances or only in games of chance.

The answer came back loud and clear: people will bet on vague
beliefs in situations where they feel especially competent or knowledgeable, but they prefer to bet on chance when they do not. Tversky
and Fox concluded that ambiguity aversion "is driven by the feeling of
incompetence ... [and] will be present when subjects evaluate clear
and vague prospects jointly, but it will greatly diminish or disappear
when they evaluate each prospect in isolation."22
