Individuals’ behavior in the Ultimatum Game, then, provides no support for the Robin Hood theory. But now consider
the Dictator Game. First thought up by the psychologist Daniel Kahneman and his colleagues, this is just like the Ultimatum Game except that it removes the stage where the recipients get to make a choice. The participants get sums of money and can give however much they want to anonymous strangers. And that’s it—they keep what they choose to keep.
Plainly, a self-interested agent would give nothing. But this is not what people do. There have been more than a hundred published studies on dictator games, and it turns out that most people do give, and the average gift is between
20 and 30 percent. Some studies find even greater generosity, reporting that many people give half or just a bit less than half.
Unlike in the Ultimatum Game, this generosity cannot be explained as due to fear of retaliation. So one interpretation of these findings is the Robin Hood analysis—the dictator hands over the money out of a sense of fairness. That is, the person who gets to make the choice has managed to put aside her particular position in the world and is considering the optimal solution from the position of an uninvolved bystander. Since there is no reason for the dictator to get more than the other person, she is driven to split the windfall evenly (though, because of human frailty, she might keep a bit extra for herself).
Now, I’m not the first to point out that there’s something odd about this interpretation. While it’s true that some people believe in the egalitarian principle that the best world is one where resources are evenly divided, we don’t, as a rule, feel compelled to give away half of our money to the person standing next to us.
We are often generous, but not in this sort of indiscriminate way. This is true even when the money comes as a windfall. Suppose you find twenty dollars on the sidewalk. Do you immediately hand over ten dollars to the next person who walks by, on the logic that it was just luck that it was you who found it and not he? Likely not.
So why are people so nice in these laboratory experiments? A different sort of explanation is social pressure. The participants know that they are in a study that is looking
at kindness and fairness. The situation is typically framed so that one can act on a continuum of generosity, and the worst thing is to give nothing. The finding that most people give something might largely be explained by the fact that nobody wants to look like an ass.
To see the effects of an audience, just imagine, as one researcher suggested, playing the Dictator Game on national television, with all your family and friends watching. Wouldn’t this make you more generous? It is not surprising that laboratory studies find that
the more observable one’s choice is, the more one gives.
Even pictures of eyes on the wall or on the computer screen make people kinder, presumably because they trigger thoughts of being watched. The idea here was nicely expressed by
Tom Lehrer, in his song about the Boy Scouts: “Be careful not to do / Your good deeds when there’s no one watching you.”
The standard Dictator scenario is supposed to be anonymous, but participants might not believe the experimenters’ assurances that this is so. And they’re
right
to be suspicious; sometimes they are being lied to. Furthermore, the motivation to make a good impression on others might be operative even when one consciously believes that there is no audience.
This might all seem picky. If people give generously, what difference does it make if this generosity is motivated by worries about how others will see them? It turns out, though, that a pure egalitarian impulse is one thing and the desire to look good is quite another. Two clever sets of experiments make this point.
In the first,
the psychologist Jason Dana and his colleagues tweaked the standard Dictator Game. They set up the basic game with $10, but now some participants could choose either to play the regular game or to take $9 and leave the game. They were told that if they chose this second option, the recipient would never learn that he or she was ever in a Dictator Game.
A selfish individual, just in it for the money, would agree to play the game and keep the whole $10 for the maximum gain. A generous individual, on the other hand, would agree to play and give away some chunk of the $10. Neither would opt out for $9, because this option would give the player less than $10 (and so doesn’t make sense from a selfish perspective) and would give the other person nothing (and so doesn’t make sense from a generous perspective).
Still, over a third of participants chose to take the $9. This is likely because they wanted the money but didn’t want to be put in a position where they would feel pressured to give a substantial amount of it away. The analogy here is walking down a street where a beggar is waiting. If you were cold-blooded, you would walk by and do nothing; if you were generous, you would walk by and hand over some money. But if you didn’t want to be put in a position of feeling obliged to give, you might take a third option: you might cross the street to avoid the beggar altogether.
The second set of experiments was done by the economist John List. He started with a game where the dictator was given $10 and the recipient was given $5. As usual, the dictator could give as much of his money to the other
person as he wanted. In this simple condition, the average gift was $1.33, a sensibly generous amount.
A second group of participants were told that they could give as much as they wanted—but they could also take $1 from the other person. Now, the average gift dropped to 33 cents. And a third group were told that they could give as much as they wanted but could also take as much as they wanted, up to the whole $5. Now, they
took
on average $2.48, and very few gave anything.
We should stop and marvel at how weird this is. If the standard explanation of giving in the Dictator Game is correct—that it reflects an impulse to share the wealth—it shouldn’t matter if someone adds the option of taking. But suppose now that the giving is motivated, at least in part, by a desire to look good. Now the option of taking makes a difference, because the worst possible option is no longer giving nothing, it’s taking away all of the other person’s money. The participant might think: A real jerk would leave this person with nothing. I don’t want to look like a jerk—I’ll just take a little. Taken together, these studies suggest that the behavior in the Dictator Game is influenced by factors that have little to do with altruistic and egalitarian motives and much to do with
looking
altruistic and egalitarian.
The economist Ernst Fehr and his colleagues were among the first to explore how children behave when faced with economic games. They tested Swiss children, from three to eight years old, and instead of money, they used candies. In the experiments I’ll discuss here, the children were told
that their decisions would affect unfamiliar children from the same playschool, kindergarten, or school.
One of the games was a variant of the Dictator Game: each child was given two candies and had the option of either keeping one and giving the other away or of keeping both. In this condition, the seven- and eight-year-olds were generous—about half of them gave away a candy. But the younger children were greedy—only about 20 percent of the five- and six-year-olds gave away a candy, and only about 10 percent of the three- and four-year-olds did so. This selfishness on the part of young children fits with
more recent research on the Dictator Game in different parts of the world—including America, Europe, China, Peru, Brazil, and Fiji—that finds that young children are much less likely than older children or adults to give away what they have to a stranger.
One might conclude that young children don’t care at all about equality when they themselves are involved. But perhaps this is unfair. Maybe the youngest children have the same equity/kindness/fairness impulse as the older ones, but they have less self-control and so, unlike the older children, they cannot overcome their self-interest. Their appetite overwhelms their altruism.
To test this theory, Fehr and colleagues developed another game—the Prosocial Game—that avoids this conflict between altruism and self-control. Here the child gets a candy no matter what; the choice is whether to give the other individual a candy as well. This allows children to be altruistic (and fair, and egalitarian) at no cost to themselves.
The seven- and eight-year-olds did as one would expect: about 80 percent gave away a candy. For the younger children, however, only about half did. That is, about half of the children in the younger age groups chose not to give away a candy to a stranger—even though it cost them
nothing.
Other studies explore children’s emotional reactions to fair and unfair distributions that affect them directly.
The psychologist Vanessa LoBue and her colleagues tested three-, four-, and five-year-olds in preschools and did so up close and personal—unlike the studies so far, this one didn’t have children deal with anonymous strangers. Instead, the experimenters would pair up two children from the same class. The children played together with blocks in a quiet room for five minutes and then they put the blocks away. An adult came in to tell them that since they had helped to clean up, they were going to get stickers. In full view of both the children (named, for example, Mary and Sally), the experimenter handed over stickers one at a time, giving a running tally: “One sticker for Mary, one sticker for Sally. Two stickers for Mary, two stickers for Sally. Three stickers for Mary. Four stickers for Mary.” So Sally would end up with two and Mary would end up with four. Then the experimenter paused for seven seconds, doing nothing and avoiding eye contact, as the children’s spontaneous responses were captured on video. The children were then asked whether the distribution was fair.
Children in Sally’s position usually said that it was not fair, they looked unhappy, and they often asked for more. If asked, children in Mary’s position were likely to agree that
this was unfair, but they didn’t respond to this unfairness in the same way—they weren’t bothered by it. The nicest thing that an advantaged child was seen to do was hand over a sticker after a disadvantaged child complained—but fewer than one in ten did this. And remember that these children weren’t dealing with anonymous strangers; they were sitting next to their classmates, often their friends.
Children are sensitive to inequality, then, but it seems to upset them only when they themselves are the ones getting less.
In this regard, they are similar to monkeys, chimpanzees, and dogs, all of whom show signs of being bothered by getting a smaller reward than someone else. For instance, researchers have done studies with pairs of dogs, in which each dog does a trick. One dog is then rewarded with a nice treat, while the other gets a lesser treat. The researchers find that the dog offered a lesser treat will sometimes act, well, pissed, and refuse to eat it.
Children can also be spiteful in their preferences. The psychologists Peter Blake and Katherine McAuliffe paired up four- to eight-year-olds who had never met, placing them in front of a special apparatus that was set up to distribute two trays of candy. One of the children had access to a lever that gave her the choice either to tilt the trays toward herself and the other child (so that each child got whatever amount of candy was on the nearest tray) or to dump both trays (so that nobody got any candy).
When there was an equal amount of candy in each tray, the children almost never dumped. They also almost never dumped when the distribution favored them—say,
four candies on their tray, and one candy on the other child’s tray—though some of the eight-year-olds did reject this choice. But when this distribution was reversed to favor the other child, children at every age group frequently chose to dump both trays. They would rather get nothing than have another child, a stranger, get more than they did.
Further evidence of children’s spiteful natures comes from a series of experiments I’ve just completed in collaboration with Karen Wynn and Yale graduate student Mark Sheskin. We offered children between five and ten years of age a series of choices about how to divide tokens (which could later be exchanged for toys) with another child whom they would never meet. For instance, they would choose between a distribution where each child got one token and a distribution where each child got two tokens. Reasonably enough, when we offered them this choice, they tended to choose the latter option—they got more, and so did the other person.
But we also found that social comparison matters. Consider an option where the chooser and the other child each get one token versus an option where the chooser gets two tokens and the other child gets three. You might think that the latter is the better choice because both children get more; it’s greedier
and
it’s more generous. But choosing a 2/3 split over a 1/1 split means that the chooser will get relatively less than the other child. This was unpleasant to the children we tested, and they often chose 1/1, giving up an extra token so that they wouldn’t end up with a relative disadvantage.
Or take an option where each gets two tokens versus an option where the chooser gets one and the other child gets nothing. The 2/2 option is better for all involved in absolute terms, but the advantage of the 1/0 option is that the chooser gets relatively more than the other child. The older children preferred 2/2, but the five- and six-year-olds preferred the 1/0 option; they would rather have a relative advantage, even at a cost to themselves. Such responses are reminiscent of
a medieval Jewish folktale about an envious man who was approached by an angel and told that he could have anything he wanted—but his neighbor would get double. He thought for a moment, then asked to have one of his eyes plucked out.
Fairness is more than deciding the best way to distribute the positive. We also have to determine how to allocate the negative. This brings us to punishment and revenge, the darker side of morality.