Welcome to Your Brain

Authors: Sam Wang, Sandra Aamodt

Tags: #Neurophysiology-Popular works, #Brain-Popular works


When people look at complicated pictures like the ones shown here, they can identify differences if the images remain still. But if the image flickers during the transition from one to another, then they have a lot more trouble. This happens because our visual memory isn’t very good.

Experiments of this sort led psychologists to push their luck and try more outrageous ways of getting people to fail to notice things. In one of our favorites, a researcher approaches someone on the street and asks for directions. While the person is replying, workmen carry a large door between the two people, blocking their view of each other. Behind the cover of the door, the person who asked for directions is replaced by another researcher, who carries on the conversation as if nothing has happened. Even when the second person looks very different from the first, the person giving the directions has only about a 50 percent chance of noticing the change.

In another experiment, subjects watch a video in which three students in white shirts pass a basketball around, while another three students in black shirts pass a second basketball. The viewers are asked to count the number of passes made by the white-shirted team. As the two groups mingle, a person in a gorilla suit walks into the game from one side and walks out the other side, after pausing to face the camera and beat his chest. About half of the viewers fail to notice this event. These experiments illustrate that you perceive only a fraction of what’s going on in the world.

Myth: We use only 10 percent of our brains

Ask a group of randomly chosen people what they know about the brain, and the most common response is likely to be that we only use 10 percent of its capacity. This belief causes neuroscientists around the world to cringe. The ten percent myth was established in the U.S. more than a century ago, and it is now believed by half the population in countries as far away as Brazil.

To scientists who study the brain, though, the idea doesn’t make any sense at all; the brain is a very efficient device, and pretty much all of it appears to be necessary. To stick around so long, the myth must be saying something that we really want to hear. Its impressive persistence may depend on its optimistic message. If we only use 10 percent of our brains normally, think what we could do if we could use even a tiny bit of that other 90 percent! That’s certainly an attractive idea, and it’s also sort of democratic. After all, if everyone has so much spare brain capacity, there aren’t any dumb people, only a bunch of potential Einsteins who haven’t learned to use enough of their brains.

This brand of optimism has been exploited by self-help gurus to sell an unending series of programs to improve brainpower. Dale Carnegie used the idea to win book sales and influence readers in the 1940s. He gave the myth a big boost by attributing the idea to a founder of modern psychology, William James. But no one has found the 10 percent number in James’s writings or speeches. James did tell his popular audiences that people have more mental resources than they use. Perhaps some enterprising listener made the idea sound more scientific by specifying a percentage.

This idea is particularly popular among people who are interested in extrasensory perception (ESP) and other psychic phenomena. Believers often use the ten percent myth to explain the existence of these abilities. Grounding a belief that is outside the realm of science in a scientific fact is nothing new, but it’s particularly egregious when even the “scientific fact” is known to be false.

In reality, you use your whole brain every day. If big chunks of brain were never used, damaging them would not cause noticeable problems. This is emphatically not the case! Functional imaging methods that allow the measurement of brain activity also show that simple tasks are sufficient to produce activity throughout the entire brain.

One possible explanation for how the ten percent myth got started is that the functions of certain brain regions are complicated enough that the effects of damage are subtle. For instance, people with damage to the cerebral cortex’s frontal lobes can often still perform most of the normal actions of everyday life, but they don’t select the correct behavior for the context. Such a patient might stand up in the middle of an important business meeting and walk out in search of lunch. Needless to say, patients like this have a hard time getting around in the world.

Early neuroscientists may have had some trouble figuring out the functions of frontal brain areas partly because they were working with laboratory mice. In the laboratory, mice have a pretty simple life. They have to be able to see their food and water, walk over to it, and consume it. Beyond that, they don’t have to do much of anything to survive. None of that requires the frontal areas of the brain, and some early neuroscientists developed the idea that maybe these areas didn’t do anything much. Later, more sophisticated tests disproved that view, but the myth had already taken hold.

We’ve established that your memory of the past is unreliable and your perception of the present is highly selective. At this point, you probably won’t be surprised to hear that your ability to imagine the future also is worthy of suspicion. As Daniel Gilbert explains in Stumbling on Happiness, when we try to project ourselves into the future, our brains tend to fill in many details, which may be unrealistic, and leave out many others, which may be important. Depending on our imagined reality as though it were a movie of the future, we are prone to overlook pitfalls and opportunities alike as we plan our lives.

By now, you may be wondering if you can trust anything your brain tells you, but millions of years of evolution lie behind its seemingly peculiar choices. Your brain selectively processes details in the world that have historically been most relevant to survival—paying particular attention to events that are unexpected. As we’ve seen, your brain rarely tells you the truth, but most of the time it tells you what you need to know anyway.

Chapter 2

Gray Matter and the Silver Screen: Popular Metaphors of How the Brain Works

If you want to see what happens when the brain goes out of whack, please don’t go to the movies. Movie characters are continually getting themselves into neurological scrapes, losing their memories, changing personalities, and getting schizophrenia or Parkinson’s disease (not to mention sociopathy and other psychiatric disorders). The brain goes haywire in Hollywood far more often than in real life, and sometimes it can be hard to tell science from science fiction. Movie depictions of mental disorders span the spectrum from mostly accurate to totally wrong. At their worst, movie depictions of neurological illness can reinforce common, but wrong, ideas about how the brain works.

By far the most common mental disorder in the movies is amnesia. Memory loss in the movies constitutes its own genre, as predictable as boy meets girl, boy loses girl, and boy gets girl back. But instead of losing a love interest, the thing lost might instead be, to pick an example, the awareness that one is a trained assassin, as in The Bourne Identity (2002) or Total Recall (1990).

Neuropsychologist Sallie Baxendale conducted an extensive survey of memory loss in the movies, going all the way back to the silent era. She sorted incidents into categories, most of which are filled with wrong science but all of which are entertaining. A common dramatic theme is a trauma that triggers memory loss, typically followed by a new start of some kind. Our hero or heroine then has a series of adventures and misadventures, but is able to live normally and form new memories. Another common cause of amnesia in the movies is a psychologically traumatic event. These events, which satisfy the dramatic need to drive the plot, include anything from killing someone to getting married. As a final twist, a character might regain his or her memory by getting whacked in the head a second time, or through a brilliant act of neurosurgery, hypnosis, or the sight of a significant and well-loved object from the past. Roll credits.

Did you know? Depictions of brain disorders in the movies

ACCURATE
• Memento
• Sé Quién Eres
• Finding Nemo
• A Beautiful Mind
• Awakenings

INACCURATE
• Total Recall
• 50 First Dates
• Men in Black
• The Long Kiss Goodnight
• Nit-Witty Kitty (Tom and Jerry)
• Murder by Night
• Les Dimanches de Ville d’Avray

There also seems to be an inverse correlation between the incidence of amnesia and the artistic merit of a television program. Soap operas and situation comedies are rife with such cases. The 1960s television series Gilligan’s Island, which is loved for its entertainment value rather than its accuracy, over three seasons featured no fewer than three cases of amnesia. Another offender is the movie 50 First Dates (2004), which portrays a pattern of memory loss that never occurs in any known neurological condition. Drew Barrymore plays a character who collects new memories each day and then discards them all overnight, clearing the way for a brand-new beginning the next day. In this way she is able to tolerate more than one date with Adam Sandler. This pattern—the ability to store memories but subsequently lose them on a selective, timed basis—exists only in the imaginations of scriptwriters who get their knowledge of the brain from other scriptwriters.

The head-bonk model of memory loss can even be traced to precinema literature. Edgar Rice Burroughs, creator of the Tarzan novels, was particularly fond of the concept and applied it to quite a few of his potboiler plots. In one of Burroughs’s finer literary moments, Tarzan and the Jewels of Opar (1918), he manages to separate memory loss neatly from any other neurological damage:

His eyes opened upon the utter darkness of the room. He raised his hand to his head and brought it away sticky with clotted blood. He sniffed at his fingers, as a wild beast might sniff at the life-blood upon a wounded paw … No sound reached to the buried depths of his sepulcher. He staggered to his feet, and groped his way about among the tiers of ingots. What was he? Where was he? His head ached; but otherwise he felt no ill effects from the blow that had felled him. The accident he did not recall, nor did he recall aught of what had led up to it.

Burroughs may have drawn upon an existing belief that head injury could lead to amnesia. In the 1901 book The Right of Way, by Gilbert Parker, a snobbish, drink-sodden attorney named Charley Steele, with a nagging wife and a lazy thief of a brother-in-law, suffers amnesia in a barroom assault. This memory loss allows him to escape his many problems and start life over. He finds a new love and is happy until his memory—and old obligations—return. Hollywood loved this plot, making Charley Steele movies in 1915, 1920, and 1931.

Did you know? Head injury and personality

Head injury can sometimes lead to personality change. In real life, this can occur with blows to the front of the head, which can affect the prefrontal cortex. Typical outcomes include the loss of inhibition and judgment. What is not typical is wholesale transposition of personality. In one episode of Gilligan’s Island, the girlish Mary Ann develops the delusion that she is the sultry starlet Ginger after a blow to the head. Such delusional
