Brain Rules: 12 Principles for Surviving and Thriving at Work, Home, and School

Emotionally charged events can be divided into two categories: those that no two people experience identically, and those that everybody experiences identically.

When my mother got angry (which was rare), she went to the kitchen, washing LOUDLY any dishes she discovered in the sink. And if there were pots and pans, she deliberately would crash them together as she put them away. This noise served to announce to the entire household (if not the city block) her displeasure at something. To this day, whenever I hear loudly clanging pots and pans, I experience an emotionally competent stimulus—a fleeting sense of “You’re in trouble now!” My wife, whose mother never displayed anger in this fashion, does not associate anything emotional with the noise of pots and pans. It’s a uniquely stimulated, John-specific ECS.

Universally experienced stimuli come directly from our evolutionary heritage, so they hold the greatest potential for use in teaching and business. Not surprisingly, they follow strict Darwinian lines of threats and energy resources. Regardless of who you are, the brain pays a great deal of attention to these questions:

“Can I eat it? Will it eat me?”
“Can I mate with it? Will it mate with me?”
“Have I seen it before?”

Any of our ancestors who didn’t remember threatening experiences thoroughly or acquire food adequately would not live long enough to pass on his genes. The human brain has many dedicated systems exquisitely tuned to reproductive opportunity and to the perception of threat. (That’s why the robbery story grabbed your attention—and why I put it at the beginning of this chapter.) We also are terrific pattern matchers, constantly assessing our environment for similarities, and we tend to remember things if we think we have seen them before.

One of the best TV spots ever made used all three principles in an ever-increasing spiral. Stephen Hayden produced the commercial, introducing the Apple computer in 1984. It won every major advertising award that year and set a standard for Super Bowl ads. The commercial opens onto a bluish auditorium filled with robot-like men all dressed alike. In a reference to the 1956 movie 1984, the men are staring at a screen where a giant male face is spouting off platitude fragments such as “information purification!” and “unification of thought!” The men in the audience are absorbing these messages like zombies. Then the camera shifts to a young woman in gym clothes, sledgehammer in hand, running full tilt toward the auditorium. She is wearing red shorts, the only primary color in the entire commercial. Sprinting down the center aisle, she throws her sledgehammer at the screen containing Big Brother. The screen explodes in a hail of sparks and blinding light. Plain letters flash on the screen: “On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like 1984.”

All of the elements are at work here. Nothing could be more threatening to a country marinated in free speech than the totalitarian society of George Orwell’s 1984. There is sex appeal, with the revealing gym shorts, but there is a twist. Mac is a female, so-o-o … IBM must be a male. In the female-empowering 1980s, a whopping statement on the battle of the sexes suddenly takes center stage. Pattern matching abounds as well. Many people have read 1984 or seen the movie. Moreover, people who were really into computers at the time made the connection to IBM, a company often called Big Blue for its suit-clad sales force.

2) Meaning before details

What most people remember about that commercial is its emotional appeal rather than every detail. There is a reason for that. The brain remembers the emotional components of an experience better than any other aspect. We might forget minute details of an interstate fender bender, for example, yet vividly recall the fear of trying to get to the shoulder without further mishap.

Studies show that emotional arousal focuses attention on the “gist” of an experience at the expense of peripheral details. Many researchers think that’s how memory normally works—by recording the gist of what we encounter, not by retaining a literal record of the experience. With the passage of time, our retrieval of gist always trumps our recall of details. This means our heads tend to be filled with generalized pictures of concepts or events, not with slowly fading minutiae. I am convinced that America’s love of retrieval game shows such as Jeopardy! exists because we are dazzled by the unusual people who can invert this tendency.

Of course, at work and at school, detailed knowledge often is critical for success. Interestingly, our reliance on gist may actually be fundamental to finding a strategy for remembering details. We know this from a fortuitous series of meetings that occurred in the 1980s between a brain scientist and a waiter.

Watching J.C. take an order is like watching Ken Jennings play Jeopardy! J.C. never writes anything down, yet he never gets the order wrong. As the menu offers more than 500 possible combinations of food (entrees, side dishes, salad dressing, etc.) per customer, this is an extraordinary achievement. J.C. has been recorded taking the orders of 20 people consecutively with a zero percent error rate. J.C. worked in a restaurant frequented by University of Colorado brain scientist K. Anders Ericsson. Noticing how unusual J.C.’s skills were, Ericsson asked J.C. if he would submit to being studied. The secret of J.C.’s success lay in the deployment of a powerful organization strategy. He always divided the customer’s order into discrete categories, such as entree, temperature, side dish, and so on. He then coded the details of a particular order using a lettering system. For salad dressing, Blue Cheese was always “B,” Thousand Island always “T,” and so on. Using this code with the other parts of the menu, he assigned the letters to an individual face and remembered the assignment. By creating a hierarchy of gist, he easily could apprehend the details.

J.C.’s strategy employs a principle well known in the brain-science community: Memory is enhanced by creating associations between concepts. This experiment has been done hundreds of times, always achieving the same result: Words presented in a logically organized, hierarchical structure are much better remembered than words placed randomly—typically 40 percent better. This result baffles scientists to this day. Embedding associations between data points necessarily increases the number of items to be memorized. More pieces of intellectual baggage to inventory should make learning more difficult. But that is exactly what was not found. If we can discern the meaning of the words in relation to one another, we can much more easily recall the details. Meaning before details.

John Bransford, a gifted education researcher who edited the well-received How People Learn, one day asked a simple question: In a given academic discipline, what separates novices from experts? Bransford eventually discovered six characteristics, one of which is relevant to our discussion: “[Experts’] knowledge is not simply a list of facts and formulas that are relevant to their domain; instead, their knowledge is organized around core concepts or ‘big ideas’ that guide their thinking about their domains.”

Whether you are a waiter or a brain scientist, if you want to get the particulars correct, don’t start with details. Start with the key ideas and, in a hierarchical fashion, form the details around these larger notions.

3) The brain cannot multitask

Multitasking, when it comes to paying attention, is a myth. The brain naturally focuses on concepts sequentially, one at a time. At first that might sound confusing; at one level the brain does multitask. You can walk and talk at the same time. Your brain controls your heartbeat while you read a book. Pianists can play a piece with left hand and right hand simultaneously. Surely this is multitasking. But I am talking about the brain’s ability to pay attention. It is the resource you forcibly deploy while trying to listen to a boring lecture at school. It is the activity that collapses as your brain wanders during a tedious presentation at work. This attentional ability is not capable of multitasking.

Recently, I agreed to help the high-school son of a friend of mine with some homework, and I don’t think I will ever forget the experience. Eric had been working for about a half-hour on his laptop when I was ushered to his room. An iPod was dangling from his neck, the earbuds cranking out Tom Petty, Bob Dylan, and Green Day as his left hand reflexively tapped the backbeat. The laptop had at least 11 windows open, including two IM screens carrying simultaneous conversations with MySpace friends. Another window was busy downloading an image from Google. The window behind it had the results of some graphic he was altering for MySpace friend No. 2, and the one behind that held an old Pong game paused mid-ping.

Buried in the middle of this activity was a word-processing program holding the contents of the paper for which I was to provide assistance. “The music helps me concentrate,” Eric declared, taking a call on his cell phone. “I normally do everything at school, but I’m stuck. Thanks for coming.” Stuck indeed. Eric would make progress on a sentence or two, then tap out a MySpace message, then see if the download was finished, then return to his paper. Clearly, Eric wasn’t concentrating on his paper. Sound like someone you know?

To put it bluntly, research shows that we can’t multitask. We are biologically incapable of processing attention-rich inputs simultaneously. Eric and the rest of us must jump from one thing to the next. To understand this remarkable conclusion, we must delve a little deeper into the third of Posner’s trinity: the Executive Network. Let’s look at what Eric’s Executive Network is doing as he works on his paper and then gets interrupted by a “You’ve got mail!” prompt from his girlfriend, Emily.

step 1: shift alert

To write the paper from a cold start, blood quickly rushes to the anterior prefrontal cortex in Eric’s head. This area of the brain, part of the Executive Network, works just like a switchboard, alerting the brain that it’s about to shift attention.

step 2: rule activation for task #1

Embedded in the alert is a two-part message, electricity sent crackling throughout Eric’s brain. The first part is a search query to find the neurons capable of executing the paper-writing task. The second part encodes a command that will rouse the neurons, once discovered. This process is called “rule activation,” and it takes several tenths of a second to accomplish. Eric begins to write his paper.

step 3: disengagement

While he’s typing, Eric’s sensory systems pick up the email alert from his girlfriend. Because the rules for writing a paper are different from the rules for writing to Emily, Eric’s brain must disengage from the paper-writing rules before he can respond. This occurs. The switchboard is consulted, alerting the brain that another shift in attention is about to happen.

step 4: rule activation for task #2

Another two-part message seeking the rule-activation protocols for emailing Emily is now deployed. As before, the first is a command to find the writing-Emily rules, and the second is the activation command. Now Eric can pour his heart out to his sweetheart. As before, it takes several tenths of a second simply to perform the switch.

Incredibly, these four steps must occur in sequence every time Eric switches from one task to another. It is time-consuming. And it is sequential. That’s why we can’t multitask. That’s why people find themselves losing track of previous progress and needing to “start over,” perhaps muttering things like “Now where was I?” each time they switch tasks. The best you can say is that people who appear to be good at multitasking actually have good working memories, capable of paying attention to several inputs one at a time.
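The four steps above amount to a simple cost model: every switch of attention pays the shift-alert and rule-activation toll before any work resumes. Here is a back-of-the-envelope sketch of that model. The specific delay value is an illustrative assumption standing in for the “several tenths of a second” the text describes, not a measured figure.

```python
# Assumed per-switch cost: "several tenths of a second" for shift alert
# plus rule activation. 0.3 s is a placeholder for illustration.
SWITCH_COST = 0.3

def total_time(task_chunks, interruptions):
    """Seconds to finish work split into chunks, paying the switching toll.
    Each interruption forces two switches: away from the task, then back to it."""
    work = sum(task_chunks)
    switches = 2 * interruptions
    return work + switches * SWITCH_COST

# One solid minute of writing versus the same minute broken by three emails.
uninterrupted = total_time([60.0], interruptions=0)      # 60.0 seconds
fragmented = total_time([15.0] * 4, interruptions=3)     # 61.8 seconds of clock time
```

The model deliberately ignores the bigger real-world cost the text mentions next, the “Now where was I?” re-orientation after each switch, so it understates the damage. Even so, the total only grows as interruptions accumulate; it never shrinks.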

Here’s why this matters: Studies show that a person who is interrupted takes 50 percent longer to accomplish a task. Not only that, he or she makes up to 50 percent more errors.

Some people, particularly younger people, are more adept at task-switching than others. If a person is familiar with the tasks, completion times and error rates are much lower than if the tasks are unfamiliar. Still, taking your sequential brain into a multitasking environment can be like trying to put your right foot into your left shoe.

A good example is driving while talking on a cell phone. Until researchers started measuring the effects of cell-phone distractions under controlled conditions, nobody had any idea how profoundly they can impair a driver. It’s like driving drunk. Recall that large fractions of a second are consumed every time the brain switches tasks. Cell-phone talkers are a half-second slower to hit the brakes in emergencies, slower to return to normal speed after an emergency, and more erratic in maintaining their following distance behind the vehicle in front of them. In a half-second, a driver going 70 mph travels 51 feet. Given that 80 percent of crashes happen within three seconds of some kind of driver distraction, increasing your amount of task-switching increases your risk of an accident. More than 50 percent of the visual cues spotted by attentive drivers are missed by cell-phone talkers. Not surprisingly, they get in more wrecks than anyone except very drunk drivers.
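The 51-feet figure follows directly from unit conversion, and checking it takes only a few lines. The helper function here is mine, not from the studies the text cites; it simply converts miles per hour to feet per second and multiplies by the delay.

```python
def feet_traveled(mph, seconds):
    """Distance covered at a constant speed, converting mph to feet per second."""
    feet_per_second = mph * 5280 / 3600  # 70 mph is roughly 102.7 ft/s
    return feet_per_second * seconds

# The extra half-second of braking delay at highway speed:
extra_distance = feet_traveled(70, 0.5)  # about 51.3 feet, matching the text's 51
```

The same conversion gives the familiar rule of thumb that 60 mph is exactly 88 feet per second, which is why even fractions of a second matter at highway speeds.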
