The Unpersuadables: Adventures with the Enemies of Science

*

‘I’m not the man my actions would suggest.’
The lyric is effective, because the plea it contains is not true. We are what we do, no matter how desperately we might try to insist otherwise. When I was a boy, despite my trying so hard, I would regularly steal – crisps from cupboards, pens from classmates, money from purses. My parents would catch me, again and again. I would have done anything not to be a thief any more. But something kept happening; some force would take me over. It was as if I would temporarily become another person. And then, the moment that the sin was committed, I would beg of myself, 'Why did you do it? Why can't you stop?'
It was the same when I was older and buried within the ropes of paranoid jealousy, and again when I developed an alcohol problem. During my years of madness, all of my actions suggested that I was a bad man; that I was, in some elementary way, *wrong*.

I felt these feelings return when I failed to assist the crying woman. It was as if I was not in control of my own actions. Throughout my battles in love, theft and drinking I came to know all too well that feeling of reason, of will, of better information, failing to influence my actions. And in the midst of it all, I always knew that I was being mad; inhuman.
*That is not how humans behave.* We are in control of ourselves. We are not victim to convenient 'invisible forces'. We are one single person, with one set of values and one gallery of beliefs about the world. We are rational. We take in information about the world, we judge its worth and adjust our behaviour accordingly. We are agents of reason.

Everything we know about people tells us that this is so. It is this quality that elevates us above animals. It is the predicate upon which we evaluate the moral worth of other people. It is the basis of our legal system of judgement and punishment. But, as I kneeled in that pagoda, all of that seemed to break down. I wanted to go over there. But I didn't. I *couldn't*. Some devil overcame me. And I began to wonder, is it like this for other people?

Then I read about the events that spiralled from a single phone call to a Kentucky branch of McDonald's on 9 April 2004, and I discovered that it is.

*

It rang sometime after 5 p.m. and Donna Summers, the assistant manager, picked up the receiver. Straight away, she knew it was important. ‘I’m a police officer. My name is Officer Scott,’ said the caller. ‘I’ve got McDonald’s corporate on the line here, and the store manager. We have reason to believe that one of your employees – you know: young, small, dark hair – has stolen the purse of a customer. Do you know who I’m talking about?’

Summers knew who it sounded like – Louise Ogborn, the pretty eighteen-year-old who was working to support her family after her mother had fallen ill and lost her job. Officer Scott confirmed that it was indeed Louise and instructed Summers to fetch her, empty her pockets and confiscate her purse and car keys. She would then be required to perform a thorough search of the suspect. When Louise – a former Girl Scout and regular church attendee – was informed of what was about to happen, she began to cry. ‘I didn’t do anything wrong,’ she said. ‘I’ve been out there, working. You can ask anyone. I couldn’t steal!’ Summers instructed Louise to remove one item of clothing at a time and examined each as it was passed to her. When Louise was naked, Summers took the bagged garments outside, ready for collection by Officer Scott’s colleagues, who would be arriving soon.

Louise had been detained, wearing nothing but a dirty apron, for more than an hour when Summers told the policeman that she had to get back to work. ‘The problem is we’re currently having Louise’s home searched for drugs,’ said Officer Scott. ‘Do you have, say, a husband who can watch her for the time being?’ Summers did not. But she did have a fiancé.

Soon afterwards, her partner Walter Nix Jr. – a churchgoer and youth basketball coach – dutifully arrived to guard Louise. Nix took the phone and followed Officer Scott’s instructions precisely. He made Louise dance with her hands in the air to see if any stolen goods would ‘shake out’. He made her open out her vaginal cavity with her fingers, in case anything was hidden in there. He made her turn around and touch her toes, stand on a desk. He made her kiss him, so he could check for alcohol on her breath. When she refused to call Nix ‘sir’, Officer Scott demanded she be reprimanded with a spanking. Nix did just that, for more than ten minutes. Two and a half hours after the initial phone call, Louise was on her knees, tearfully performing fellatio. It only occurred to any of them that Officer Scott might be a hoaxer when the branch’s fifty-eight-year-old odd-job man became suspicious. He refused to take over, despite being reassured by Summers that the whole thing had been ‘approved by corporate’.

Walter Nix Jr. has been described by a friend as ‘a great community guy … a great role model for kids’ who ‘had never even had a ticket’. When he drove home, later that night, he telephoned his best friend. He told him, ‘I have done something terribly bad.’

This was not an isolated incident. Similar calls had been coming in for years. Ultimately, the scam was pulled in more than seventy restaurants across the United States.

‘The point is that this did not happen occasionally,’ Philip Zimbardo, Professor Emeritus of Psychology at Stanford University, tells me. ‘If it happened just once or twice, you’d say, “Gee, how dumb are these people? How gullible?” But it worked most of the time. The scenario that was created was so compelling that people got trapped in it.’

Zimbardo served as an expert witness in one of the trials that related to the so-called ‘strip-search scams’. He was called because, back in the 1970s, he had become an authority on the invisible
processes which compel good people to do bad things when he carried out a study that remains darkly notorious among students of the psychology of evil. In an attempt to examine the effects of prison life on ordinary individuals, Zimbardo created a mock gaol on the grounds of his university and recruited the twelve most ‘normal and healthy’ young men from a cohort of seventy-five applicants. ‘We randomly assigned half to be guards and half to be prisoners,’ he recalls. ‘I had to end it in six days because it was out of control. Normal healthy college students were having emotional breakdowns. Five of them had to be released early because of the cruelty and sadism of the guards towards them. It demonstrated in a powerful way how situations can overwhelm the best and the brightest.’

The Stanford Prison Experiment is a legendary study in the realm of what became known as ‘situational psychology’. It helped to reveal a terrible flaw in the way humans typically view themselves. We tend to assume that we are in control of ourselves; that inner forces such as character and conscience captain our actions and define our behaviour. But the work of Zimbardo revealed the hitherto unimaginable power of outside forces to affect us. ‘My research and the research of many social psychologists has demonstrated very powerfully that people can be corrupted into behaving in evil ways, often without the awareness of the power of the situation that they find themselves in.’

According to Zimbardo, there is a kind of recipe for creating evil. 'How did evil come about during the prison experiment?' he asks. 'It was people playing a role. You're assigned a role as a guard, a prisoner, a teacher or a military trainer – any of the roles we play in life. Although you start off thinking those roles are arbitrary and not the real you, as you live them, they become you. The second thing is the power of the group. You're a guard but you're in a cadre of other guards, so you put pressure on each other to be tough. Groups can have powerful influences on individual behaviour. Our guards were in uniform and they wore sunglasses to conceal their eyes. We call that de-individuation. You take away somebody's individuality. You make them anonymous. The next process is called dehumanisation, where you begin to think of other people as different from you and then as different from your kind and kin, and then as less than human. You take away their humanity. Once you do that, you can do anything to them – harm, hurt, torture, rape, kill. These were the basic processes operating in the abuses at Abu Ghraib prison in Iraq, which I studied at length because I was an expert witness for one of the guards.'

Of course, generating evil was not the intent of the Buddhists at the Vipassana centre in Blackheath. All the people who attend their courses do so voluntarily and a number of those present were returnees who evidently received huge benefit from their practice. When I look back upon my days there, I realise now that, psychologically speaking, I was unprepared, under-researched and weak. Really, I should never have gone. But listening to Zimbardo, I cannot help but wonder if situational forces had an accidental impact on my inability, despite myself, to stand up and attend to the screaming woman. Perhaps it was the role I had taken on, as serious, studious Buddhist; the pressure of the group to conform; the anonymity of the darkness and the prohibition against communicating with anyone. If so, I believe that there was also another powerful engine in play. When Louise Ogborn was asked in court why she did not simply leave the room in which she was being abused, she replied, ‘I was scared, because they were a higher authority to me.’ It was the same reason why her assistant manager and her fiancé behaved as they did. They believed that they were being instructed by someone senior to them.

‘Excessive obedience’ to authority is a flaw in humans that has been known to social psychologists for a long time. This is, in part, due to a set of extremely famous experiments carried out by Professor Stanley Milgram at Yale University in 1961, during which it was discovered that two-thirds of participants were prepared to deliver potentially fatal electric shocks to strangers, simply because they had been told to do so by a man in a white coat.

We are invisibly influenced not only by those in authority, but by those who populate our work and social lives. In a 2012 paper, neuroscientist Professor Chris Frith reviews a trove of well-replicated studies that demonstrate how, when we are in the company of others, we can automatically switch from 'I mode' to 'we mode'. 'We can't help taking into account the views of others,' he writes. 'The brain creates the illusion that we are all independent entities who make our own decisions. In reality there are powerful unconscious processes that embed us in the social world. We tend to imitate others and share their goals, knowledge and beliefs, but we are hardly aware of this. This is why strange narratives work best when they are shared by a group.'

In 1951, Professor Stanley Milgram's boss, Dr Solomon Asch, conducted a simple but devastating test that explored the ease with which we let the opinions of others affect our own. He showed a hundred and twenty-three participants a series of pairs of simple straight lines and asked them to say whether the first was longer than, shorter than or the same length as the second. Each person was in a group of eight and, initially, everything was easy. As you would expect, most of the time everyone gave the same answer: *the same*, *longer*, *shorter*, *shorter*, *longer*, *the same*, and so on. But gradually, for one person in the group, everything turned weird. Because all the others began to give answers that were wrong. What Asch wanted to know was this: when it came to their turn, what would that one person do? Go their own way and give the right answer? Or copy all the others?

This was a test to see if pressure from the group (who were actually actors) could compel individuals to defy the evidence of their own eyes. Asch found that around 70 per cent of people did just that. But as amazing and troubling as that finding was, it failed to answer a critical question: did the opinions of others simply intimidate the participant into calling it wrong? Or was his finding evidence of another, infinitely stranger hypothesis? That the group changed how the person actually perceived the line? It is a radical idea. Can it really be true? Can the view of the many actually *change the world* of the one?

It took the development of some advanced technologies before the tantalising beginnings of an answer could be sensed. In 2005, Dr Gregory Berns, a psychiatrist and neuroscientist at Emory University in Atlanta, conducted a test based on Asch's lines, which involved judging the 'sameness' of various objects while under social pressure to give the wrong answer. This time, however, the participants were in an fMRI scanner, having their brain activity recorded.

In Berns's study, people bowed to peer pressure 41 per cent of the time. But did they make a conscious decision to lie? Or were they somehow pressured into actually *seeing* the wrong answer? Were the situational forces so great that they altered their perception of reality? Checking the fMRI data, Berns's team found that in the moments prior to a participant giving their answer, there was no corresponding activity in areas of the brain that are associated with conscious decision-making. And yet there was corresponding activity in the area which is involved in the judging of spatial awareness. To put it simply, when these people were considering their response, it seemed as if they were not *analysing* their opinion, but *seeing* it.

Before we leap too high for our conclusion, it must be pointed out that there has recently been a significant backlash in scientific circles against inappropriate levels of confidence in the kinds of things that fMRI scans can tell us. But if further research reinforces these findings, the implications will be weird and dazzling. In an interview with the *New York Times* after his paper was published, Berns said, 'We like to think that seeing is believing, but the study's findings show that seeing is believing what the group tells you to believe.' In his book *The Lucifer Effect*, Zimbardo writes that this test 'calls into question the nature of truth itself.'
