Wired for Culture: Origins of the Human Social Mind


Author: Mark Pagel


Axelrod points out that marriage exploits the shadow of the future in the wedding vows: “til death do us part.” In the absence of a belief in the afterlife, this is about as long a shadow of the future as any relationship can be expected to produce. Whether or not this is an argument for making divorce difficult to obtain, we do know that durable exchanges can even help enemies to get along. The most striking illustration of this was the live-and-let-live system that spontaneously arose during the trench warfare of World War I. Enemy combat units, facing each other from their trenches and engaging in daily bouts of deadly warfare across no-man’s-land, evolved sophisticated measures to avoid killing each other. Artillery would be fired at the same time every day, and always a bit short. Snipers would aim high. Famously, some of these enemies even shared Christmas gifts and played soccer one Christmas Day. Commanders had to use threats of courts-martial to break up these spontaneous reciprocal relationships.

Even with a long shadow of the future, the prospect of a defection looms large in any cooperative relationship. Someone might “forget” to return your favor, or simply make a mistake and fail to return your kindness. What should you do? If you do nothing, they might get the idea they can cheat you every now and then. Experiments with volunteers and studies using computers to simulate cooperation have shown that a simple strategy of repaying kindness with kindness and betrayal with revenge is surprisingly effective. If your partner betrays you, punish them. Axelrod called it “tit for tat.” It is not very costly to you, and defectors quickly learn that they will not be tolerated.
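
To see the logic concretely, here is a minimal sketch of tit for tat in a repeated prisoner’s dilemma, written in Python. The payoff values are the standard ones from Axelrod’s computer tournaments; the function names and the ten-round horizon are illustrative assumptions, not details from the book.

    # Tit for tat in an iterated prisoner's dilemma (illustrative sketch).
    C, D = "cooperate", "defect"
    # (my payoff, partner's payoff) for each pair of moves
    PAYOFF = {(C, C): (3, 3), (C, D): (0, 5),
              (D, C): (5, 0), (D, D): (1, 1)}

    def tit_for_tat(my_history, their_history):
        # Open with kindness; thereafter repay the partner's last move.
        return C if not their_history else their_history[-1]

    def always_defect(my_history, their_history):
        return D

    def play(strategy_a, strategy_b, rounds=10):
        hist_a, hist_b, score_a, score_b = [], [], 0, 0
        for _ in range(rounds):
            move_a = strategy_a(hist_a, hist_b)
            move_b = strategy_b(hist_b, hist_a)
            pay_a, pay_b = PAYOFF[(move_a, move_b)]
            score_a, score_b = score_a + pay_a, score_b + pay_b
            hist_a.append(move_a)
            hist_b.append(move_b)
        return score_a, score_b

    # Tit for tat is exploited only on the first exchange, then repays
    # every betrayal in kind:
    print(play(tit_for_tat, always_defect))   # (9, 14)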

On the other hand, Mahatma Gandhi famously pointed out that this simple “eye for an eye” strategy “makes the whole world blind,” as formerly happy cycles of cooperation can disintegrate into endless cycles of punishment and revenge. Indeed, anthropologists’ accounts of tribal conflict cite tit-for-tat cycles of revenge and counterrevenge in response to homicides and thievery as the most common cause of skirmishes and warfare between groups. In War Before Civilization, Lawrence Keeley recounts the history of violence between two groups in New Guinea that he discreetly labels A and B:

Village A owed village B a pig as reward for B’s help in a previous war in which the latter had killed one of A’s enemies. Meanwhile, a man from village A heard some (untrue) gossip that a man from village B had seduced his young wife; so, with the aid of a relative, he assaulted the alleged seducer. Village B then “overreacted” to this beating by making two separate raids on village A, wounding a man and a woman… . These two raids by village B led to a general battle in which several warriors on both sides were wounded, but no one was killed… later… a warrior from village B, to avenge a wound suffered by one of his kinsmen during the battle, ambushed and wounded a village A resident. The following day battle was resumed and a B villager was killed. After this death, the war became general: all the warriors of both villages, plus various allies, began a series of battles and ambushes that continued intermittently for the next two years.

One way to end tit-for-tat cycles of revenge and counterrevenge might be to acquire the dispositions that encourage you to exterminate your enemy in a great rush of violence. It might be just such dispositions that fuelled the brutality we saw earlier between the Tutsi and Hutus. On the other hand, if cooperation has been valuable in our past, then we might expect it to have given us strategies of forgiveness as a way of avoiding these cycles. And indeed a strategy of ignoring the first act of betrayal, then waiting before resuming cooperation, can be shown to work better than tit for tat. Assume the betrayal was a mistaken judgment, a moment of weakness, or maybe just a slipup. This allows groups of generous and forgiving cooperators to overcome the occasional bout of moral weakness or mere mistake from someone within their ranks.
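
The simulations behind this claim are not spelled out in the text, so as a hedged sketch we can borrow the closely related “generous tit for tat” from the game-theory literature, which forgives a betrayal some fraction of the time. The 30 percent rate and the function name below are assumptions for illustration, continuing the code above.

    import random

    def generous_tit_for_tat(my_history, their_history):
        # As tit for tat, but let a betrayal pass roughly 30 percent
        # of the time (an illustrative rate, not a figure from the book).
        if not their_history or their_history[-1] == C:
            return C
        return C if random.random() < 0.3 else D

Against a partner who occasionally errs, this rule can restore cooperation where strict tit for tat would lock both sides into an endless cycle of mutual recrimination.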

An even better strategy is more wily and self-serving. It is sometimes confusingly called win-stay, lose-shift, even though it is subtly different from the straightforward version of that strategy we saw in Chapter 3. Colloquially, we might think of it as a mild form of sulking, but with an added twist. In this setting, a person responds with cooperation so long as the other person is cooperative—if you are winning, you stay. Confronted by a betrayal, you don’t respond with punishment; rather you simply withhold cooperation—this is the sulking, and it corresponds to “if you lose, shift tactics.” On the next exchange, though, you switch back to cooperating, and continue to cooperate so long as the other person cooperates.

This form of win-stay, lose-shift is the policy we follow when we have a brief argument and then make up. It gives people a second chance, and if they take that chance, cooperation is maintained. If they don’t and continue to betray you, win-stay, lose-shift again switches back to withholding cooperation. By merely withholding cooperation rather than overtly punishing someone who has betrayed you, the strategy avoids having a series of exchanges dissolve into cycles of betrayal and revenge. At the same time it makes it clear to cheaters and others who might “free-ride” on your goodwill that their behavior won’t work, and it offers incentives in the form of glimpses of what cooperation can look like.
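
A rough sketch of this rule, continuing the earlier code: repeat your last move whenever it earned a good payoff, and switch whenever it earned a bad one. This payoff-based formulation is the standard one in the game-theory literature; treat the details as my reconstruction rather than the author’s exact rule.

    def win_stay_lose_shift(my_history, their_history):
        # Cooperate on the first exchange.
        if not my_history:
            return C
        my_payoff = PAYOFF[(my_history[-1], their_history[-1])][0]
        if my_payoff >= 3:
            return my_history[-1]                  # winning: stay
        return D if my_history[-1] == C else C     # losing: shift

When two such players suffer a single accidental betrayal, there is one round of mutual sulking and then both return to cooperation: the brief argument followed by making up.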

But what is the twist? This strategy also has a self-serving trick up its sleeve. Every now and then it tries defecting. Why would it do this? Remember, natural selection is not about goodness and light; it is about strategies that promote replicators. For all the win-stay, lose-shift strategist knows, it might be playing against a Good Samaritan who always behaves cooperatively, or simply a gullible person who always does the nice thing. The win-stay, lose-shift strategy cunningly exploits them by defecting. A Good Samaritan will nevertheless continue to cooperate, so win-stay, lose-shift, being on a winning streak, stays and exploits them again. The success of this strategy against other forgiving but less wily strategies tells us to expect that natural selection might have built into us an emotion for taking advantage of the weak or gullible—an emotion that we sadly cannot easily deny is part of our species.
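
Continuing the sketch, the twist amounts to one extra step: every so often, defect without provocation. The 10 percent probe rate below is my own illustrative choice. Against an unconditional cooperator the probe earns the top payoff, so the win-stay logic keeps the defection going and the exploitation continues.

    import random

    def wily_win_stay_lose_shift(my_history, their_history):
        move = win_stay_lose_shift(my_history, their_history)
        # The self-serving probe: an occasional unprovoked defection
        # (the 10 percent rate is an assumption, not from the book).
        if move == C and random.random() < 0.1:
            return D
        return move

    def good_samaritan(my_history, their_history):
        return C    # always cooperates, no matter what

    # Once a probe lands, it keeps "winning" against a Good Samaritan,
    # so the wily strategy stays with defection and racks up the score:
    print(play(wily_win_stay_lose_shift, good_samaritan, rounds=50))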

AN EXPECTATION FOR FAIRNESS

OF ALL the emotions associated with getting acts of reciprocity to work, our expectation for fairness is perhaps the most intriguing and explosive. If forgiveness and generosity are like investments in keeping a cooperative relationship going, our sense of fairness is more like a police force. It is the emotion behind our belief that it is wrong for others to take advantage of us, and it might take the form in our own minds of our conscience, telling us that it is wrong to take advantage of others. It can be schizophrenic in its effects, capable of producing violence on the one hand, and startling altruism on the other. Its violent side disposes us to punish people whose actions reveal them as selfish, and for this reason it is sometimes called moralistic aggression. We like to think it is something only others do, but honking horns at people who cut into traffic, or heckling people who jump lines, are commonplace instances of moralistic aggression deriving from a sense that someone’s actions are not fair.

Once, travelling in Vienna, I was waiting for a tram, and even as it was arriving and the doors were still opening, an older woman on board, wrapped in a head scarf, wagged her finger disapprovingly and hissed at me indignantly. Evidently I wasn’t leaping forward quickly enough to help a younger woman with a baby in a pushchair who was preparing to clamber down the stairs to the sidewalk. In 2009, a man in the city of Guangzhou in China threatened to commit suicide by jumping from a bridge. His presence on the bridge caused traffic jams, and eventually he was approached by a passer-by who shoved him over the edge, telling a newspaper later on that he was fed up with the desperate man’s “selfish activity.”

On the other hand, the altruistic side of our sense of fairness can produce surprising acts of human kindness that, if we observed them in any other animal, we would think we were watching an animated Disney film. In the late 1960s, the social scientist Henry Hornstein left wallets in public places throughout New York City. The wallets contained money and identification so that if someone found a wallet, he or she could contact the owner. To everyone’s surprise, around half of the wallets—along with the money—were returned. True, wallets with more money in them were less likely to be returned, but the majority of people went out of their way to return a wallet to someone unknown to them, at a personal cost of time and effort and with no promise of any recompense.

Doing the right thing is something we take for granted, even in this anonymous situation, but in no other animal on Earth would the thought that returning the wallet was the “right” thing to do even come to mind. Why do we behave this way? The economist Kaushik Basu points out that this expectation for fairness runs deep in our minds. Consider, he says, that when you ride in a taxi and get to your destination, if you are like most people you reflexively pay the taxi driver rather than running off. It is an action you probably give very little thought to—just what we do in such situations. Even so, Basu reminds us that this is a revealing action, because most of the time no one else observes us, and the thought of running off must have occurred to everyone who has ever ridden in a taxi, even though almost none of us do it. Equally, when we pay taxi drivers, they do not turn around and demand further payment. Both parties might be behaving this way out of fear of getting the police involved or of violent reprisal by the other, but still we somehow feel such behavior would be wrong or unfair.

Are we programmed somehow to do the right thing—as when we return lost wallets or pay taxi drivers—even at a cost to ourselves, just because it is the “right” thing to do, and to expect the same from others? Social scientists and economists who get people to participate in an economic exchange called “the ultimatum game” think so. In this game, volunteers are given a sum of money, say, $100. They are told they have to give some of it to an anonymous other person, but the amount they offer is up to them. The other person can either accept the offer, in which case both people keep their portions of the money, or reject the offer, in which case neither person gets anything. Volunteers are told they will not ever see each other and that the experiment involves just this one exchange.

Now, recipients should accept any offer, as something free is better than nothing. Knowing this, the person with the money should offer the smallest amount possible. But neither party behaves this way. The game has been played with university students and in cultures around the world, including hunter-gatherer societies. Time after time, recipients reject as “unfair” offers below 20–40 percent of the sum given to the first person, and both parties walk away empty-handed. Those making offers seem to expect this, and the typical offer is often around 40 percent of the total. In a related exchange known as “the dictator game,” those making the offers are told they can give whatever amount they wish and that the recipient does not have any choice in the matter. Offers are lower, but people still give away some of their money.
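
The structure of these games is simple enough to state in a few lines of code, continuing in Python. The pot size and the responder’s threshold below are illustrative assumptions, while the 20–40 percent range comes from the experiments just described.

    def ultimatum(pot, offer, rejection_threshold):
        # The responder rejects any offer below their fairness
        # threshold, leaving both players empty-handed.
        if offer < rejection_threshold * pot:
            return 0, 0
        return pot - offer, offer   # (proposer's share, responder's share)

    # The "rational" minimal offer is rejected by a typical responder:
    print(ultimatum(100, 5, rejection_threshold=0.3))    # (0, 0)
    # The commonly observed offer of about 40 percent is accepted:
    print(ultimatum(100, 40, rejection_threshold=0.3))   # (60, 40)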

What is going on? Would you reject a low offer? If so, why? Would you give more than the minimum? If so, why? Remember you are not going to see this person ever again. Some researchers interpret our behavior in these games as evidence that humans are hard-wired for altruism—that we both offer it and expect it from others. They say our actions are governed by a principle of strong reciprocity, a deep moral sense to behave in ways that benefit others, even when this means suffering a personal cost. According to these researchers, our strong reciprocity is evidence that human social behavior evolved by the process of group selection. This is the idea we saw earlier that natural selection can choose among competing groups of people. The most successful groups in our past were those in which individuals put aside their own interests to pull together, even when that meant sacrificing their own well-being. This altruistic behavior is supposedly what we are seeing in the ultimatum game: donors give more than they need to, and recipients expect this. When donors give a small amount, recipients punish them, even though this means that the recipient gives up some money. Strong reciprocity is, according to some social scientists and economists, why we pay taxi drivers and why we return wallets, but also why we might even go to war for our country.

Group selection can work, but its effects are weak, and it will always be opposed by selection promoting individual interest over the good of the group: when everyone else is pulling together, it might pay you to hold back. Could it be, then, that people reject low offers in the ultimatum games not out of any sense of duty to the group but simply because a low offer is not a fair way for two people to divide up money that neither one really has much claim to? My hunch is that if you are like most people reading this account of the ultimatum game, you are feeling that it would indeed be unfair for someone to make you a low offer. This is especially true in the ultimatum game experiments, because the donor and recipient know their roles are arbitrary and could just as easily have been reversed. They also know—because they have been told—that the money has to be divided. In such circumstances, it is indeed only fair to divide the money equally, or at least nearly so, if we wish to acknowledge that someone is entitled to more just by the luck of the draw.

Fairness sounds good, but still, why do we think things must be fair? And why do we turn down the offer of free money? Why not just take whatever is on offer, and walk out of the experiment better off for it? What use is it turning down what you consider to be a miserly offer to “punish” someone (and thus yourself as well) whom you are never going to see again? Well, when something is unfair, most of us feel an emotion of indignation or anger, and it makes us want to lash out and punish the other person. We do so by rejecting their offer. But why do we do this? If we think about it, this is a spiteful act on our part. Yes, we punish that other person, maybe we feel better for it, and perhaps the punishment makes it less likely the person will behave that way in the future. But our spite also benefits anyone else who might do business with him or her. Given that it has cost us to behave this way, this is an act of altruism on our part. We don’t expect this kind of altruism to evolve, because your actions help others at a cost to you.
