Believing Bullshit: How Not to Get Sucked into an Intellectual Black Hole
Author: Stephen Law
The Christian Science movement has no interest in conducting such careful checks. They just stick with their accumulation of anecdotes, which they dress up as “science.”
Christian Science has cost lives. People can and have died as a result of their rejecting conventional medical services and plumping for the power of Christian Science instead. Hundreds, perhaps thousands, of children have received Christian Science treatment rather than conventional medicine. One of the most notorious cases is the 1979 incident involving twelve-year-old Michael Schram. As Michael started showing increasingly serious symptoms of gastric distress, his mother, a devout Christian Scientist, decided to rely on the services not of a doctor but of a Christian Science practitioner. As a result, Michael died unnecessarily from a ruptured appendix. We don't know how many children have died in this way, because Christian Science keeps no record of its failures.
Christian Science is undoubtedly an Intellectual Black Hole and a potentially dangerous one at that. While other mechanisms described in this book also play a role in providing Christian Science with a veneer of reasonableness, it is Piling Up the Anecdotes that does the bulk of the work.
But what if these methods aren't available? Suppose I have little or no evidence to support the belief I nevertheless want people to accept. Suppose I can't just show them that it's true. How else might I get them to believe?
I might try to dupe them, of course. I could produce fraudulent evidence and bogus arguments. But what if I suspect this won't be enough? What if I think my deceit is likely to be detected?
Another option is to dispense with even the pretense of rational persuasion and rely on the strategy I call Pressing Your Buttons instead. As I'll explain, Pressing Your Buttons, applied in a systematic and dedicated way, amounts to brainwashing.
I'll set out exactly what I mean by Pressing Your Buttons toward the end of this chapter. In the meantime, in order to prepare the ground for that discussion, let's survey some of the many mechanisms that can influence what we believe.
BELIEF-SHAPING MECHANISMS
One very obvious way in which our beliefs are shaped is by social and psychological mechanisms such as peer pressure and a desire to conform. Finding ourselves believing something of which our community disapproves can be a deeply uncomfortable experience, an experience that may lead us unconsciously to tailor what we believe so that we remain in step with others. We're far more susceptible to such social pressures than we like to believe (as several famous psychological studies have shown).1
Belief can also be shaped through the use of reward and punishment. A grandmother may influence the beliefs of her grandson by giving him a sweet whenever he expresses the kind of beliefs of which she approves, and by ignoring or smacking him when he expresses the “wrong” sort of belief. Over time, this may change not just the kind of beliefs her grandson expresses, but also the kinds of beliefs he holds.
Perhaps certain beliefs might also be directly implanted in us. Some suppose God has implanted certain beliefs in at least some of us. Our evolutionary history may also produce certain beliefs, or at least certain predispositions to belief. For example, there's growing evidence that a disposition toward religious belief is part of our evolutionary heritage, bestowed on us by natural selection. But even if neither God nor evolution has implanted beliefs in us, perhaps one day we'll be able to implant beliefs using technology. Perhaps we'll be able to strap a brain-state-altering helmet onto an unwitting victim while they sleep, dial in the required belief, press the red button and Bing! our victim will wake up with the belief we've programmed them to hold. That would be a rather cruel trick. Some hypnotists claim a similar ability to, as it were, directly “inject” beliefs into people's minds.
Obviously, such causal mechanisms can operate on us without our realizing what's going on. I might think I condemn racism because I have good grounds for supposing racism is morally wrong, but the truth is I have merely caved in to peer pressure and my desire not to be ostracized by my liberal family and friends. If a belief has been implanted in me by, say, natural selection, or by some brain-state-altering device, then, again, I may not be aware that this is the reason why I believe. Suppose, for example, that some prankster uses the belief-inducing helmet described above to program me to believe I have been abducted by aliens. I wake up one morning and find, as a result, that I now very strongly believe I was taken aboard a flying saucer during the night. I have no awareness of the real reason why I now hold that belief—of the mechanism that actually produced the belief in me. If asked how I know I was abducted, I'll probably say, “I Just Know!”
ISOLATION, CONTROL, UNCERTAINTY, REPETITION, EMOTION
I'm going to focus here on five important belief-shaping mechanisms that can play a major role in producing and sustaining an Intellectual Black Hole: isolation, control, uncertainty, repetition, and emotion.
Isolation is a useful belief-shaping tool. An isolated individual is more vulnerable to various forms of psychological manipulation. If you want someone to believe something that runs contrary to what their friends and family believe, it's a good idea to have them spend some time at a retreat or remote training camp where their attachment to other ideas can more easily be undermined. Cults often isolate their members in this way. Cult leader Jim Jones physically moved both himself and all his followers to the Guyanese jungle (where they all eventually committed suicide).
Isolation is also recommended by some within more mainstream religions. In many countries, hermetically sealed-off religious schools are commonplace.2
A related mechanism is control. If you want people to accept your belief system, it's unwise to expose them to alternative systems of belief. Gain control over the kind of ideas to which they have access and to which they are exposed. Censor beliefs and ideas that threaten to undermine your own. This kind of control is often justified on the grounds that people will otherwise be corrupted or confused. Totalitarian regimes will often remove “unhealthy” books from their libraries if the books contradict the regime. All sorts of media are restricted on the grounds that they will only “mislead” people. Schools under strict religious regimes will sometimes justify preventing children from discovering or exploring other points of view on the grounds that they will only succeed in “muddling” or “corrupting” the children. Take a leaf out of the manuals of such regimes and restrict your followers' field of vision so that everything is interpreted through a single ideological lens—your own.
If you want people to abandon their beliefs and embrace your own, or if you want to be sure they won't reject your beliefs in favor of others, it also helps to raise as much doubt and uncertainty as possible about those rival beliefs. Uncertainty is a potent source of stress, so the more you associate alternative beliefs with uncertainty, the better. Ideally, offer a simple set of easily formulated and remembered certainties designed to give meaning to and cover every aspect of life. If you constantly harp on the vagaries, uncertainties, and meaninglessness of life outside your belief system, the simple, concrete certainties you offer may begin to seem increasingly attractive to your audience.
Encourage repetition. Get people to recite what you want them to believe over and over again in a mantra-like way. Make the beliefs trip unthinkingly off their tongues. It doesn't matter whether your subjects accept what they are saying, or if they even fully understand it to begin with. There's still a fair chance that belief will eventually take hold. Mindless repetition works especially well when applied in situations in which your subjects feel powerful pressure to conform. Lining pupils up in playgrounds for a daily, mantra-like recitation of your key tenets, for example, combines repetition with a situation in which any deviation by an individual will immediately result in a hundred pairs of eyes turned in their direction.
Emotion can also be harnessed to shape belief. Fear is particularly useful. In George Orwell's novel Nineteen Eighty-Four, the regime seeks control not just over people's behavior but, even more important, over what they think and feel. When the hapless rebel Winston is finally captured, his “educators” make it clear that what ultimately concerns them are his thoughts:
“And why do you imagine that we bring people to this place?”
“To make them confess.”
“No, that is not the reason. Try again.”
“To punish them.”
“No!” exclaimed O'Brien. His voice had changed extraordinarily, and his face had suddenly become both stern and animated. “No! Not merely to extract your confession, not to punish you. Shall I tell you why we have brought you here? To cure you! To make you sane! Will you understand, Winston, that no one whom we bring to this place ever leaves our hands uncured? We are not interested in those stupid crimes that you have committed. The Party is not interested in the overt act: the thought is all we care about.”3
It is of course fear—the terrifying contents of Room 101—that ultimately causes Winston to succumb. He ends up genuinely believing that if Big Brother says that two plus two equals five, then two plus two does equal five. Many real regimes have been prepared to employ similarly brutal methods to control what goes on in people's minds. But emotional manipulation can take much milder forms yet still be effective. For example, you might harness the emotional power of iconic music and imagery. Ensure people are regularly confronted by portraits of Our Leader accompanied by smiling children and sunbeams emanating from his head (those Baghdad murals of Saddam Hussein spring to mind). Ensure your opponents and critics are always portrayed accompanied by images of catastrophe and suffering, and perhaps even Hieronymus Bosch-like visions of hell. Make people emotionally dependent on your belief system. Ensure that what self-esteem and sense of meaning, purpose, and belonging they have is derived as far as possible from their belonging to your system of belief. Make sure they recognize that abandoning those beliefs will involve the loss of things about which they care deeply.
It goes without saying that these five mechanisms of thought control are popular with various totalitarian regimes. They are also a staple of many extreme religious cults.
Applied determinedly and systematically, these mechanisms can be highly effective in shaping belief and suppressing “unacceptable” lines of thought. They are particularly potent when applied to children and young adults, whose critical defenses are weak, and who have a sponge-like tendency to accept whatever they are told.
Note that traditional, mainstream religious education has sometimes also involved heavy reliance on many, sometimes all, of these five mechanisms. I was struck by a story a colleague once told me: as a teenage pupil of a rather strict Catholic school in the 1960s, she put up her hand in class to ask why contraception was wrong. She was immediately sent to the headmaster, who asked her why she was obsessed with sex. Interestingly, my colleague added that, even before she asked the question, she knew she shouldn't. While never explicitly saying so, her school and wider Catholic community had managed to convey to her that asking such a question was unacceptable. Her role was not to think and question but passively to accept. My colleague added that, even today, nearly half a century later, despite the fact that she no longer has any religious conviction, she still finds herself feeling guilty if she dares to question a Catholic belief. So effective was her religious upbringing in straitjacketing her thought that she still feels instinctively that to do so is to commit a thought crime.
Of course, religious education doesn't have to be like this. Often it isn't. An open, questioning attitude can be encouraged rather than suppressed. Still, it's clear that some mainstream religions have historically been very reliant upon such techniques so far as the transmission of the faith from one generation to the next is concerned. In some places, they still are.
BRAINWASHING
Applied in a consistent and systematic fashion, these five techniques add up to what many would call “brainwashing.” Kathleen Taylor, a research scientist in physiology at the University of Oxford, upon whose work I am partly drawing here, has published a book on brainwashing. In an associated newspaper article, Taylor writes:
One striking fact about brainwashing is its consistency. Whether the context is a prisoner of war camp, a cult's headquarters or a radical mosque, five core techniques keep cropping up: isolation, control, uncertainty, repetition and emotional manipulation.4
Taylor adds that within the discipline of psychology, brainwashing is an increasingly superfluous word. It can be a misleading term, associated as it is with Manchurian Candidate–type stories of seemingly ordinary members of the public transformed into presidential assassins on hearing a trigger phrase. As Taylor says, that kind of brainwashing is a myth. Case studies suggest there is
no “magic” process called “brainwashing,” though many (including the U.S. government) have spent time and money looking for such a process. Rather the studies suggest that brainwashing … is best regarded as a collective noun for various, increasingly well-understood techniques of non-consensual mind-change.5