Hooking Up
by Tom Wolfe

In fact—nobody wanted their damnable IQ Cap! It wasn’t simply that no one believed you could derive IQ scores from brain waves—it was that nobody wanted to believe it could be done. Nobody wanted to believe that human brainpower is … that hardwired. Nobody wanted to learn in a flash that … the genetic fix is in. Nobody wanted to learn that he was … a hardwired genetic mediocrity … and that the best he could hope for in this Trough of Mortal Error was to live out his mediocre life as a stress-free dim bulb. Barry Sterman of UCLA, chief scientist for a firm called Cognitive Neurometrics, who has devised his own brain-wave technology for market research and focus groups, regards brain-wave IQ testing as possible—but in the current atmosphere you “wouldn’t have a Chinaman’s chance of getting a grant” to develop it.
Here we begin to sense the chill that emanates from the hottest field in the academic world. The unspoken and largely unconscious premise of the wrangling over neuroscience’s strategic high ground is: We now live in an age in which science is a court from which there is no appeal. And the issue this time around, at the beginning of the twenty-first century, is not the evolution of the species, which can seem a remote business, but the nature of our own precious inner selves.
The elders of the field, such as Wilson, are well aware of all this and are cautious, or cautious compared to the new generation. Wilson still holds out the possibility—I think he doubts it, but he still holds out the possibility—that at some point in evolutionary history culture began to influence the development of the human brain in ways that cannot be explained by strict Darwinian theory. But the new generation of neuroscientists are not cautious for a second. In private conversations, the bull sessions, as it were, that create the mental atmosphere of any hot new science—and I love talking to these people—they express an uncompromising determinism.
They start with the second most famous statement in all of modern philosophy, Descartes’s “Cogito ergo sum,” “I think, therefore I am,” which they regard as the essence of “dualism,” the old-fashioned notion that the mind is something distinct from its mechanism, the brain and the body. (I will get to the most famous statement in a moment.) This is also known as the “ghost in the machine” fallacy, the quaint belief that there is a ghostly “self” somewhere inside the brain that interprets and directs its operations. Neuroscientists involved in three-dimensional electroencephalography will tell you that there is not even any one place in the brain where consciousness or self-consciousness (Cogito ergo sum) is located. This is merely an illusion created by a medley of neurological systems acting in concert. The young generation takes this yet one step further. Since consciousness and thought are entirely physical products of your brain and nervous system—and since your brain arrived fully imprinted at birth—what makes you think you have free will? Where is it going to come from? What “ghost,” what “mind,” what “self,” what “soul,” what anything that will not be immediately grabbed by those scornful quotation marks is going to bubble up your brain stem to give it to you? I have heard neuroscientists theorize that, given computers of sufficient power and sophistication, it would be possible to predict the course of any human being’s life moment by moment, including the fact that the poor devil was about to shake his head over the very idea. I doubt that any Calvinist of the sixteenth century ever believed so completely in predestination as these, the hottest and most intensely rational young scientists in the United States in the twenty-first.
Since the late 1970s, in the Age of Wilson, college students have been heading into neuroscience in job lots. The Society for Neuroscience was founded in 1970 with 1,100 members. Today, one generation later, its membership exceeds 26,000. The society’s latest convention, in Miami, drew more than 20,000 souls, making it one of the biggest professional conventions in the country. In the venerable field of academic philosophy, young faculty members are jumping ship in embarrassing numbers and shifting into neuroscience. They are heading for the laboratories. Why wrestle with Kant’s God, Freedom, and Immortality when it is only a matter of time before neuroscience, probably through brain imaging, reveals the actual physical mechanism that fabricates these mental constructs, these illusions?
Which brings us to the most famous statement in all of modern philosophy: Nietzsche’s “God is dead.” The year was 1882. The book was Die Fröhliche Wissenschaft (The Gay Science). Nietzsche said this was not a declaration of atheism, although he was in fact an atheist, but simply the news of an event. He called the death of God a “tremendous event,” the greatest event of modern history. The news was that educated people no longer believed in God, as a result of the rise of rationalism and scientific thought, including Darwinism, over the preceding 250 years. But before you atheists run up your flags of triumph, he said, think of the implications. “The story I have to tell,” wrote Nietzsche, “is the history of the next two centuries.” He predicted (in Ecce Homo) that the twentieth century would be a century of “wars such as have never happened on earth,” wars catastrophic beyond all imagining. And why? Because human beings would no longer have a god to turn to, to absolve them of their guilt; but they would still be racked by guilt, since guilt is an impulse instilled in children when they are very young, before the age of reason. As a result, people would loathe not only one another but themselves. The blind and reassuring faith they formerly poured into their belief in God, said Nietzsche, they would now pour into a belief in barbaric nationalistic brotherhoods: “If the doctrines … of the lack of any cardinal distinction between man and animal, doctrines I consider true but deadly”—he says in an allusion to Darwinism in Untimely Meditations—“are hurled into the people for another generation … then nobody should be surprised when … brotherhoods with the aim of the robbery and exploitation of the non-brothers … will appear in the arena of the future.”
Nietzsche’s view of guilt, incidentally, is also that of neuroscientists a century later. They regard guilt as one of those tendencies imprinted in the brain at birth. In some people the genetic work is not complete, and they engage in criminal behavior without a twinge of remorse—thereby intriguing criminologists, who then want to create Violence Initiatives and hold conferences on the subject.
Nietzsche said that mankind would limp on through the twentieth century “on the mere pittance” of the old decaying God-based moral codes. But then, in the twenty-first, would come a period more dreadful than the great wars, a time of “the total eclipse of all values” (in The Will to Power). This would also be a frantic period of “revaluation,” in which people would try to find new systems of values to replace the osteoporotic skeletons of the old. But you will fail, he warned, because you cannot believe in moral codes without simultaneously believing in a god who points at you with his fearsome forefinger and says “Thou shalt” or “Thou shalt not.”
Why should we bother ourselves with a dire prediction that seems so far-fetched as “the total eclipse of all values”? Because of man’s track record, I should think. After all, in Europe, in the peaceful decade of the 1880s, it must have seemed even more far-fetched to predict the world wars of the twentieth century and the barbaric brotherhoods of Nazism and Communism. Ecce vates! Ecce vates! Behold the prophet! How much more proof can one demand of a man’s powers of prediction?
A hundred years ago those who worried about the death of God could console one another with the fact that they still had their own bright selves and their own inviolable souls for moral ballast and the marvels of modern science to chart the way. But what if, as seems likely, the greatest marvel of modern science turns out to be brain imaging? And what if, ten years from now, brain imaging has proved, beyond any doubt, that not only Edward O. Wilson but also the young generation are, in fact, correct?
The elders, such as Wilson himself and Daniel C. Dennett, the author of Darwin’s Dangerous Idea: Evolution and the Meanings of Life, and Richard Dawkins, author of The Selfish Gene and The Blind Watchmaker, insist that there is nothing to fear from the truth, from the ultimate extension of Darwin’s dangerous idea. They present elegant arguments as to why neuroscience should in no way diminish the richness of life, the magic of art, or the righteousness of political causes, including, if one needs an example, political correctness at Harvard or Tufts, where Dennett is Director of the Center for Cognitive Studies, or Oxford, where Dawkins is something called Professor of Public Understanding of Science. (Dennett and Dawkins, every bit as much as Wilson, are earnestly, feverishly, politically correct.) Despite their best efforts, however, neuroscience is not rippling out into the public on waves of scholarly reassurance. But rippling out it is, rapidly. The conclusion people out beyond the laboratory walls are drawing is: The fix is in! We’re all hardwired! That, and: Don’t blame me! I’m wired wrong!
This sudden switch from a belief in Nurture, in the form of social conditioning, to Nature, in the form of genetics and brain physiology, is the great intellectual event, to borrow Nietzsche’s term, of the late twentieth century. Up to now the two most influential ideas of the century have been Marxism and Freudianism (see page 82). Both were founded upon the premise that human beings and their “ideals”—Marx and Freud knew about quotation marks, too—are completely molded by their environment. To Marx, the crucial environment was one’s social class; “ideals” and “faiths” were notions foisted by the upper orders upon the lower as instruments of social control. To Freud, the crucial environment was the Oedipal drama, the unconscious sexual plot that was played out in the family early in a child’s existence. The “ideals” and “faiths” you prize so much are merely the parlor furniture you feature for receiving your guests, said Freud; I will show you the cellar, the furnace, the pipes, the sexual steam that actually runs the house. By the mid-1950s even anti-Marxists and anti-Freudians had come to assume the centrality of class domination and Oedipally conditioned sexual drives. On top of this came Pavlov, with his “stimulus-response bonds,” and B. F. Skinner, with his “operant conditioning,” turning the supremacy of conditioning into something approaching a precise form of engineering.
So how did this brilliant intellectual fashion come to so screeching and ignominious an end?
The demise of Freudianism can be summed up in a single word: lithium. In 1949 an Australian psychiatrist, John Cade, gave five days of lithium therapy—for entirely the wrong reasons—to a fifty-one-year-old mental patient who was so manic-depressive, so hyperactive, unintelligible, and uncontrollable, he had been kept locked up in asylums for twenty years. By the sixth day, thanks to the lithium buildup in his blood, he was a normal human being. Three months later he was released and lived happily ever after in his own home. This was a man who had been locked up and subjected to two decades of Freudian logorrhea to no avail whatsoever. Over the next twenty years antidepressant and tranquillizing drugs completely replaced Freudian talk-talk as treatment for severe mental disturbances. By the mid-1980s, neuroscientists looked upon Freudian psychiatry as a quaint relic based largely upon superstition (such as dream analysis—dream analysis!), like phrenology or mesmerism. In fact, among neuroscientists, phrenology now has a higher reputation than Freudian psychiatry, since phrenology was in a certain crude way a precursor of electroencephalography. Freudian psychiatrists are now regarded as quacks with sham medical degrees, as ears that people with more money than sense can hire to talk into.
Marxism was finished off even more suddenly—in a single year, 1973—with the smuggling out of the Soviet Union and the publication in France of the first of the three volumes of Aleksandr Solzhenitsyn’s The Gulag Archipelago. Other writers, notably the British historian Robert Conquest, had already exposed the Soviet Union’s vast network of concentration camps, but their work was based largely on the testimony of refugees, and refugees were routinely discounted as biased and bitter observers. Solzhenitsyn, on the other hand, was a Soviet citizen, still living on Soviet soil, a zek himself for eleven years, zek being Russian slang for concentration-camp prisoner. His credibility had been vouched for by none other than Nikita Khrushchev, who in 1962 had permitted the publication of Solzhenitsyn’s novella of the gulag, One Day in the Life of Ivan Denisovich, as a means of cutting down to size the daunting shadow of his predecessor Stalin. “Yes,” Khrushchev had said in effect, “what this man Solzhenitsyn has to say is true. Such were Stalin’s crimes.” Solzhenitsyn’s brief fictional description of the Soviet slave labor system was damaging enough. But The Gulag Archipelago, a two-thousand-page, densely detailed, nonfiction account of the Soviet Communist Party’s systematic extermination of its enemies, real and imagined, of its own countrymen, by the tens of millions, through an enormous, methodical, bureaucratically controlled “human sewage disposal system,” as Solzhenitsyn called it—The Gulag Archipelago was devastating. After all, this was a century in which there was no longer any possible ideological detour around the concentration camp. Among European intellectuals, even French intellectuals, Marxism collapsed as a spiritual force immediately. Ironically, it survived longer in the United States before suffering a final, merciful coup de grâce on November 9, 1989, with the breaching of the Berlin Wall, which signaled in an unmistakable fashion what a debacle the Soviets’ seventy-two-year field experiment in socialism had been. (Marxism still hangs on, barely, acrobatically, in American universities in a Mannerist form known as Deconstruction, a literary doctrine that depicts language itself as an insidious tool used by the powers that be to deceive the proles and peasants.)
