Brain Rules: 12 Principles for Surviving and Thriving at Work, Home, and School

Actually, I saw eight people die. Son of a career Air Force official, I was very used to seeing military airplanes in the sky. But I looked up one afternoon to see a cargo plane do something I had never seen before or since. It was falling from the sky, locked in a dead man’s spiral. It hit the ground maybe 500 feet from where I stood, and I felt both the shock wave and the heat of the explosion.

There are two things I could have done with this information. I could have kept it entirely to myself, or I could have told the world. I chose the latter. After immediately rushing home to tell my parents, I called some of my friends. We met for sodas and began talking about what had just happened. The sounds of the engine cutting out. Our surprise. Our fear. As horrible as the accident was, we talked about it so much in the next week that the subject got tiresome. One of my teachers actually forbade us from bringing it up during class time, threatening to make T-shirts saying, “You’ve done enough talking.”

Why do I still remember the details of this story? T-shirt threat notwithstanding, my eagerness to yap about the experience provided the key ingredient. The gabfest after the accident forced a consistent re-exposure to the basic facts, followed by a detailed elaboration of our impressions. The phenomenon is called elaborative rehearsal, and it’s the type of repetition shown to be most effective for the most robust retrieval. A great deal of research shows that thinking or talking about an event immediately after it has occurred enhances memory for that event, even when accounting for differences in type of memory. This tendency is of enormous importance to law-enforcement professionals. It is one of the reasons why it is so critical to have a witness recall information as soon as is humanly possible after a crime.

Ebbinghaus showed the power of repetition in exhaustive detail more than a century ago. He even created “forgetting curves,” which showed that a great deal of memory loss occurs in the first hour or two after initial exposure. He demonstrated that this loss could be lessened by deliberate repetitions. This notion of timing in the midst of re-exposure is so critical, I am going to explore it in three ways.
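A quick way to see what such a curve implies: Ebbinghaus’s data are often summarized with a simple exponential decay of retention over time since exposure. The short sketch below uses that textbook form only as an illustration; the two-hour “stability” constant is an invented number, not one of Ebbinghaus’s measurements.

    import math

    def retention(hours_since_exposure, stability_hours=2.0):
        # Textbook-style forgetting curve: retention falls off exponentially
        # with time since the last exposure; larger stability means slower loss.
        # The default stability of 2 hours is purely illustrative.
        return math.exp(-hours_since_exposure / stability_hours)

    for t in (0, 1, 2, 6, 24):
        print(f"{t:>2} hours after exposure: {retention(t):.0%} retained")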

space out the input

Much like concrete, memory takes an almost ridiculous amount of time to settle into its permanent form. And while it is busy hardening, human memory is maddeningly subject to amendment. This probably occurs because newly encoded information can reshape and wear away previously existing traces. Such interference is especially true when learning is supplied in consecutive, uninterrupted glops, much like what happens in most boardrooms and schoolrooms. The probability of confusion is increased when content is delivered in unstoppable, unrepeated waves, poured into students as if they were wooden forms.

But there is happy news. Such interference does not occur if the information is delivered in deliberately spaced repetition cycles. Indeed, repeated exposure to information in specifically timed intervals provides the most powerful way to fix memory into the brain. Why does this occur? When the electrical representations of information to be learned are built up slowly over many repetitions, the neural networks recruited for storage gradually remodel the overall representation and do not interfere with networks previously recruited to store similarly learned information. This idea suggests that continuous repetition cycles create experiences capable of adding to the knowledge base, rather than interfering with the resident tenants.

There is an area of the brain that always becomes active when a vivid memory is being retrieved. The area is within the left inferior prefrontal cortex. The activity of this area, captured by an fMRI (that’s “functional magnetic resonance imaging”) machine during learning, predicts whether something that was stored is being recalled in crystal-clear detail. This activity is so reliable that if scientists want to know if you are retrieving something in a robust manner, they don’t have to ask you. They can simply look in their machine and see what your left inferior prefrontal cortex is doing.

With this fact in mind, scientist Anthony Wagner designed an experiment in which two groups of students were required to memorize a list of words. The first group was shown the words via massed repetition, reminiscent of students cramming for an exam. The second group was shown the words in spaced intervals over a longer period of time, no cramming allowed. In terms of accurate retrieval, the first group fared much worse than the second; activity in the left inferior prefrontal cortex was greatly reduced. These results led Harvard psychology professor Dan Schacter to say: “If you have only one week to study for a final, and only 10 times when you can hit the subject, it is better to space out the 10 repetitions during the week than to squeeze them all together.”
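Schacter’s advice is easy to make concrete with a toy model. In the sketch below, each review refreshes the memory and strengthens it more when more forgetting has occurred since the previous pass, which is one common way of describing the spacing effect. Every constant in it is an illustrative assumption rather than a number from Wagner’s experiment, but the qualitative result matches his finding: ten passes spread across the week leave far more standing at exam time than ten passes squeezed into a single sitting.

    import math

    def predicted_retention(review_hours, test_hour, base_stability=2.0):
        # Toy spacing-effect model. Each review resets the forgetting clock and
        # boosts stability in proportion to how much had been forgotten since the
        # previous exposure. All constants are illustrative assumptions.
        stability, last = base_stability, 0.0
        for hour in sorted(review_hours):
            forgotten = 1.0 - math.exp(-(hour - last) / stability)
            stability *= 1.0 + 2.0 * forgotten
            last = hour
        return math.exp(-(test_hour - last) / stability)

    crammed = [1 + 0.25 * i for i in range(10)]   # ten passes in one evening
    spaced = [16 * i for i in range(1, 11)]       # ten passes spread over the week
    print(f"crammed: {predicted_retention(crammed, test_hour=168):.0%} retained at the exam")
    print(f"spaced:  {predicted_retention(spaced, test_hour=168):.0%} retained at the exam")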

Taken together, these findings make the relationship between repetition and memory clear. Deliberately re-expose yourself to the information if you want to retrieve it later. Deliberately re-expose yourself to the information more elaborately if you want the retrieval to be of higher quality. Deliberately re-expose yourself to the information more elaborately, and in fixed, spaced intervals, if you want the retrieval to be the most vivid it can be. Learning occurs best when new information is incorporated gradually into the memory store rather than when it is jammed in all at once. So why don’t we use such models in our classrooms and boardrooms? Partly it’s because educators and business people don’t regularly read the Journal of Neuroscience. And partly it’s because the people who do aren’t yet sure which time intervals supply all the magic. Not that timing issues aren’t a powerful research focus. In fact, we can divide consolidation into two categories based on timing: fast and slow. To explain how timing issues figure into memory formation, I want to stop for a moment and tell you about how I met my wife.

sparking interest

I was dating somebody else when I first met Kari—and so was she. But I did not forget Kari. She is a physically beautiful, talented, Emmy-nominated composer, and one of the nicest people I have ever met. When we both became “available” six months later, I immediately asked her out. We had a great time, and I began thinking about her more and more. Turns out she was feeling the same. I asked her out again, and soon we were seeing each other regularly. After two months, it got so that every time we met, my heart would pound, my stomach would flip-flop, and I’d get sweaty palms. Eventually I didn’t even have to see her to raise my pulse. Just a picture would do, or a whiff of her perfume, or … just music! Even a fleeting thought was enough to send me into hours of rapture. I knew I was falling in love.

What was happening to effect such change? With increased exposure to this wonderful woman, I became increasingly sensitive to her presence, needing increasingly smaller “input” cues (perfume, for heaven’s sake?) to elicit increasingly stronger “output” responses. The effect has been long-lasting, with a tenure of almost three decades.

Leaving the whys of the heart to poets and psychiatrists, the idea that increasingly limited exposures can result in increasingly stronger responses lies at the heart of how neurons learn things. Only it’s not called romance; it’s called long-term potentiation.

To describe LTP, we need to leave the high-altitude world of behavioral research and drop down to the more intimate world of cellular and molecular research. Let’s suppose you and I are looking at a laboratory Petri dish where two hippocampal neurons happily reside in close synaptic association. I will call the presynaptic neuron the “teacher” and the postsynaptic neuron the “student.” The goal of the teacher neuron is to pass on information, electrical in nature, to the student cell. Let’s give the teacher neuron some stimulus that inspires the cell to crack off an electrical signal to its student. For a short period of time, the student becomes stimulated and fires excitedly in response. The synaptic interaction between the two is said to be temporarily “strengthened.” This phenomenon is termed early LTP.

Unfortunately, the excitement lasts only for an hour or two. If the student neuron does not get the same information from the teacher within about 90 minutes, the student neuron’s level of excitement will vanish. The cell will literally reset itself to zero and act as if nothing happened, ready for any other signal that might come its way.

Early LTP is at obvious cross-purposes with the goals of the teacher neuron and, of course, with real teachers everywhere. How does one get that initial excitement to become permanent? Is there a way to transform a student’s short-lived response into a long-lived one?

You bet there is: The information must be repeated after a period of time has elapsed. If the signal is given only once by the cellular teacher, the excitement will be experienced by the cellular student only transiently. But if the information is repeatedly pulsed in discretely timed intervals (the timing for cells in a dish is about 10 minutes between pulses, done a total of three times), the relationship between the teacher neuron and the student neuron begins to change. Much like my relationship with Kari after a few dates, increasingly smaller and smaller inputs from the teacher are required to elicit increasingly stronger and stronger outputs from the student. This response is termed “late LTP.” Even in this tiny, isolated world of two neurons, timed repetition is deeply involved in whether or not learning will occur.
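The same timing rule can be caricatured in a few lines of code. In the toy model below, a single pulse produces an “early LTP” bump that simply decays away, while pulses repeated about 10 minutes apart push the synapse past a threshold, after which the strengthened state is treated as permanent, standing in for “late LTP.” The threshold, decay rate, and pulse size are invented for illustration; only the qualitative pattern follows the biology described here.

    def synapse_after(pulse_minutes, total_minutes=180):
        # Toy model of early vs. late LTP. Each pulse adds 1.0 to the synaptic
        # strength; strength decays back toward baseline unless it ever crosses
        # the threshold, after which it is treated as consolidated (late LTP).
        # All constants are illustrative assumptions, not measured values.
        threshold, decay_per_minute = 2.5, 1.0 / 90.0
        strength, consolidated = 0.0, False
        for minute in range(total_minutes + 1):
            if minute in pulse_minutes:
                strength += 1.0
            if strength >= threshold:
                consolidated = True
            if not consolidated:
                strength -= decay_per_minute * strength
        return round(strength, 2), consolidated

    print(synapse_after({0}))           # one pulse: strength fades, no late LTP
    print(synapse_after({0, 10, 20}))   # three pulses, 10 minutes apart: late LTP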

The interval required for synaptic consolidation is measured in minutes and hours, which is why it is called fast consolidation. But don’t let this small passage of time fool you into discounting its importance. Any manipulation—behavioral, pharmacological, or genetic—that interferes with any part of this developing relationship will block memory formation in total.

Such data provide rock-solid evidence that repetition is critical in learning—at least, if you are talking about two neurons in a dish. How about between two people in a classroom? The comparatively simple world of the cell is very different from the complex world of the brain. It is not unusual for a single neuron to have hundreds of synaptic connections with other neurons.

This leads us to a type of consolidation measured in decidedly longer terms, and to stronger end-use implications. It is sometimes called “system consolidation,” sometimes “slow consolidation.” As we shall see, slow is probably the better term.

a chatty marriage

Nuclear annihilation is a good way to illustrate the differences between synaptic and system consolidation. On August 22, 1968, the Cold War got hot. I was studying history in junior high at the time, living with my Air Force-tethered family at an air base in central Germany, unhappily near ground zero if the atomics were ever to fly in the European theater.

If you could have visited my history class, you wouldn’t have liked it. For all the wonderfully intense subject matter—Napoleonic Wars!—the class was taught in a monotonic fashion by a French national, a teacher who really didn’t want to be there. And it didn’t help my concentration to be preoccupied with the events of the previous day. August 21, 1968, was the morning when a combined contingent of Soviet and Warsaw Pact armies invaded what used to be Czechoslovakia. Our air base went on high alert, and my dad, a member of the U.S. Air Force, had left the evening before. Ominously, he had not yet come home.

The instructor pointed to a large and beautiful painting of the Battle of Austerlitz on the wall, tediously discussing the early wars of Napoleon. I suddenly heard her angry voice say, “Do I need to ask zees twice?” Jolted out of my worried distraction, I turned around to find her looming over my desk. She cleared her throat. “I said, ‘Who were Napoleon’s enemies in zees battle?’” I suddenly realized she had been talking to me, and I blurted out the first words that came to my addled mind. “The Warsaw Pact armies! No? Wait! I mean the Soviet Union!” Fortunately, the teacher had a sense of humor and some understanding about the day. As the class erupted with laughter, she quickly thawed, tapped my shoulder, and walked back to her desk, shaking her head. “Zee enemies were a coalition of Russian and Austrian armies.” She paused. “And Napoleon cleaned their clocks.”

Many memory systems are involved in helping me to retrieve this humiliating memory, now almost four decades old. I want to use some of its semantic details to describe the timing properties of system consolidation.

Like Austerlitz, our neurological tale involves several armies of nerves. The first army is the cortex, that wafer-thin layer of nerves that blankets a brain the way an atmosphere blankets a battlefield. The second is a bit of a tongue twister, the medial temporal lobe. It houses another familiar old soldier, the oft-mentioned hippocampus. Crown jewel of the limbic system, the hippocampus helps shape the long-term character of many types of memory. That other teacher-student relationship we were discussing, the one made of neurons, takes place in the hippocampus.

How the cortex and the medial temporal lobe are cabled together tells the story of long-term memory formation. Neurons spring from the cortex and snake their way over to the lobe, allowing the hippocampus to listen in on what the cortex is receiving. Wires also erupt from the lobe and wriggle their way back to the cortex, returning the eavesdropping favor. This loop allows the hippocampus to issue orders to previously stimulated cortical regions while simultaneously gleaning information from them. It also allows us to form memories, and it played a large role in my ability to recount this story to you.
