A Supposedly Fun Thing I’ll Never Do Again
Authors: David Foster Wallace
What explains the pointlessness of most published TV criticism is that television has become immune to charges that it lacks
any meaningful connection to the world outside it. It’s not that charges of nonconnection have become untrue but that they’ve
become deeply irrelevant. It’s that any such connection has become otiose. Television used to point beyond itself. Those of
us born in, say, the ’60s were trained by television to look where it pointed, usually at versions of “real life” made prettier,
sweeter, livelier by succumbing to a product or temptation. Today’s mega-Audience is way better trained, and TV has discarded
what’s not needed. A dog, if you point at something, will look only at your finger.
metawatching
It’s not like self-reference is new to U.S. entertainment. How many old radio shows—Jack Benny, Burns and Allen, Abbott and
Costello—were mostly about themselves as shows? “So, Lou, and you said I couldn’t get a big star like Miss Lucille Ball to
be a guest on our show, you little twerp.” Etc. But once television introduces the element of watching, and once it informs
an economy and culture like radio never could have, the referential stakes go way up. Six hours a day is more time than most
people (consciously) do any other one thing. How human beings who absorb such high doses understand themselves will naturally
change, become vastly more spectatorial, self-conscious. Because the practice of “watching” is expansive. Exponential. We
spend enough time watching, pretty soon we start watching ourselves watching. Pretty soon we start to “feel” ourselves feeling,
yearn to experience “experiences.” And that American subspecies into fiction writing starts writing more and more about…
The emergence of something called Metafiction in the American ’60s was hailed by academic critics as a radical aesthetic,
a whole new literary form, literature unshackled from the cultural cinctures of mimetic narrative and free to plunge into
reflexivity and self-conscious meditations on aboutness. Radical it may have been, but thinking that postmodern Metafiction
evolved unconscious of prior changes in readerly taste is about as innocent as thinking that all those college students we
saw on television protesting the Vietnam war were protesting only because they hated the Vietnam war. (They may have hated
the war, but they also wanted to be seen protesting on television. TV was where they’d seen this war, after all. Why wouldn’t they go about hating it on the very medium that made their hate possible?) Metafictionists
may have had aesthetic theories out the bazoo, but they were also sentient citizens of a community that was exchanging an
old idea of itself as a nation of doers and be-ers for a new vision of the U.S.A. as an atomized mass of self-conscious watchers
and appearers. For Metafiction, in its ascendant and most important phases, was really nothing more than a single-order expansion
of its own great theoretical nemesis, Realism: if Realism called it like it saw it, Metafiction simply called it as it saw
itself seeing itself see it. This high-cultural postmodern genre, in other words, was deeply informed by the emergence of
television and the metastasis of self-conscious watching. And (I claim) American fiction remains deeply informed by television…
especially those strains of fiction with roots in postmodernism, which even at its rebellious Metafictional zenith was less
a “response to” televisual culture than a kind of abiding-in-TV. Even back then, the borders were starting to come down.
It’s strange that it took television itself so long to wake up to watching’s potent reflexivity. Television shows about the
business of television shows were rare for a long time.
The Dick Van Dyke Show was prescient, and Mary Tyler Moore carried its insight into her own decade-long exploration of local-market angst. Now, of course, there’s been everything from Murphy Brown to Max Headroom to Entertainment Tonight.
And with Letterman, Miller, Shandling, and Leno’s battery of hip, sardonic, this-is-just-TV schticks, the circle back to
the days of “We’ve just got to get Miss Ball on our show, Bud” has closed and come spiral, television’s power to jettison
connection and castrate protest fueled by the very ironic postmodern self-consciousness it had first helped fashion.
It will take a while, but I’m going to prove to you that the nexus where television and fiction converse and consort is self-conscious
irony. Irony is, of course, a turf fictionists have long worked with zeal. And irony is important for understanding TV because
“TV,” now that it’s gotten powerful enough to move from acronym to way of life, revolves off just the sorts of absurd contradictions
irony’s all about exposing. It is ironic that television is a syncretic, homogenizing force that derives much of its power
from diversity and various affirmations thereof. It is ironic that an extremely canny and unattractive self-consciousness
is necessary to create TV performers’ illusion of unconscious appeal. That products presented as helping you express individuality
can afford to be advertised on television only because they sell to enormous numbers of people. And so on.
Television regards irony sort of the way educated lonely people regard television. Television both fears irony’s capacity
to expose, and needs it. It needs irony because television was practically made for irony. For TV is a bisensuous medium. Its displacement of radio wasn’t picture displacing sound; it was picture added.
Since the tension between what’s said and what’s seen is irony’s whole sales territory, classic televisual irony works via
the conflicting juxtaposition of pictures and sounds. What’s seen undercuts what’s said. A scholarly article on network news
describes a famous interview with a corporate guy from United Fruit on a CBS special about Guatemala: “I sure don’t know of
anybody being so-called ‘oppressed,’” this guy, in a ’70s leisure suit and bad comb-over, tells Ed Rabel. “I think this is just something that some reporters have thought up.”7
The whole interview is intercut with commentless footage of big-bellied kids in Guatemalan slums and union organizers lying
in the mud with cut throats.
Television’s classic irony function came into its own in the summer of 1974, as remorseless lenses opened to view the fertile
“credibility gap” between the image of official disclaimer and the reality of high-level shenanigans. A nation was changed,
as Audience. If even the president lies to you, whom are you supposed to trust to deliver the real? Television, that summer,
got to present itself as the earnest, worried eye on the reality behind all images. The irony that television is itself a
river of image, however, was apparent even to a twelve-year-old, sitting there, rapt. After ’74 there seemed to be no way
out. Images and ironies all over the place. It’s not a coincidence that Saturday Night Live, that Athens of irreverent cynicism, specializing in parodies of (1) politics and (2) television, premiered the next fall (on television).
I’m worried when I say things like “television fears…” and “television presents itself…” because, even though it’s kind of
a necessary abstraction, talking about television as if it were an entity can easily slip into the worst sort of anti-TV paranoia,
treating TV as some autonomous diabolical corrupter of personal agency and community gumption. I am concerned to avoid
anti-TV paranoia here. Though I’m convinced that television today lies, with a potency somewhere between symptom and synecdoche,
behind a genuine crisis for U.S. culture and literature, I do not agree with reactionaries who regard TV as some malignancy
visited on an innocent populace, sapping IQs and compromising SAT scores while we all sit there on ever fatter bottoms with
little mesmerized spirals revolving in our eyes. Critics like Samuel Huntington and Barbara Tuchman who try to claim that
TV’s lowering of our aesthetic standards is responsible for a “contemporary culture taken over by commercialism directed to
the mass market and necessarily to mass taste”8
can be refuted by observing that their Propter Hoc isn’t even Post Hoc: by 1830, de Tocqueville had already diagnosed American
culture as peculiarly devoted to easy sensation and mass-marketed entertainment, “spectacles vehement and untutored and rude”
that aimed “to stir the passions more than to gratify the taste.”9
Treating television as evil is just as reductive and silly as treating it like a toaster w/pictures.
It is of course undeniable that television is an example of Low Art, the sort of art that has to please people in order to
get their money. Because of the economics of nationally broadcast, advertiser-subsidized entertainment, television’s one goal—never
denied by anybody in or around TV since RCA first authorized field tests in 1936—is to ensure as much watching as possible.
TV is the epitome of Low Art in its desire to appeal to and enjoy the attention of unprecedented numbers of people. But it
is not Low because it is vulgar or prurient or dumb. Television is often all these things, but this is a logical function
of its need to attract and please Audience. And I’m not saying that television is vulgar and dumb because the people who compose
Audience are vulgar and dumb. Television is the way it is simply because people tend to be extremely similar in their vulgar
and prurient and dumb interests and wildly different in their refined and aesthetic and noble interests. It’s all about syncretic
diversity: neither medium nor Audience is faultable for quality.
Still, for the fact that individual American human beings are consuming vulgar, prurient, dumb stuff at the astounding average
per-household dose of six hours a day—for this both TV and we need to answer. We are responsible basically because nobody
is holding any weapons on us forcing us to spend amounts of time second only to sleep doing something that is, when you come
right down to it, not good for us. Sorry to be a killjoy, but there it is: six hours a day is not good.
Television’s greatest minute-by-minute appeal is that it engages without demanding. One can rest while undergoing stimulation.
Receive without giving. In this respect, television resembles certain other things one might call Special Treats (e.g. candy,
liquor), i.e. treats that are basically fine and fun in small amounts but bad for us in large amounts and really bad for us if consumed in the massive regular amounts reserved for nutritive staples. One can only guess at what volume of
gin or poundage of Toblerone six hours of Special Treat a day would convert to.
On the surface of the problem, television is responsible for our rate of its consumption only in that it’s become so terribly
successful at its acknowledged job of ensuring prodigious amounts of watching. Its social accountability seems sort of like
that of designers of military weapons: unculpable right up until they get a little too good at their job.
But the analogy between television and liquor is best, I think. Because (bear with me a second) I’m afraid good old average
Joe Briefcase might be a teleholic. I.e., watching TV can become malignantly addictive. It may become malignantly addictive
only once a certain threshold of quantity is habitually passed, but then the same is true of Wild Turkey. And by “malignant”
and “addictive” I again do not mean evil or hypnotizing. An activity is addictive if one’s relationship to it lies on that
downward-sloping continuum between liking it a little too much and really needing it. Many addictions, from exercise to letter-writing,
are pretty benign. But something is malignantly addictive if (1) it causes real problems for the addict, and (2) it offers itself as a relief from the very problems it causes.10
A malignant addiction is also distinguished for spreading the problems of the addiction out and in, in interference patterns,
creating difficulties for relationships, communities, and the addict’s very sense of self and spirit. In the abstract, some
of this hyperbole might strain the analogy for you, but concrete illustrations of malignantly addictive TV-watching cycles
aren’t hard to come by. If it’s true that many Americans are lonely, and if it’s true that many lonely people are prodigious
TV-watchers, and it’s true that lonely people find in television’s 2-D images relief from their stressful reluctance to be
around real human beings, then it’s also obvious that the more time spent at home alone watching TV, the less time spent in
the world of real human beings, and that the less time spent in the real human world, the harder it becomes not to feel inadequate
to the tasks involved in being a part of the world, thus fundamentally apart from it, alienated from it, solipsistic, lonely.
It’s also true that to the extent one begins to view pseudo-relationships with Bud Bundy or Jane Pauley as acceptable alternatives
to relationships with real people, one will have commensurately less conscious incentive even to try to connect with real
3-D persons, connections that seem pretty important to basic mental health. For Joe Briefcase, as for many addicts, the Special
Treat begins to substitute for something nourishing and needed, and the original genuine hunger—less satisfied than bludgeoned—subsides
to a strange objectless unease.
TV-watching as a malignant cycle doesn’t even require special preconditions like writerly self-consciousness or neuroallergic
loneliness. Let’s for a second imagine Joe Briefcase as now just an average U.S. male, relatively unlonely, adjusted, married,
blessed with 2.3 apple-cheeked issue, utterly normal, home from hard work at 5:30, starting his average six-hour stint in
front of the television. Since Joe B. is average, he’ll shrug at pollsters’ questions and answer averagely that he most often
watches television to “unwind” from those elements of his day and life he finds unpleasant. It’s tempting to suppose that
TV enables this unwinding simply because it offers an Auschlanderian “distraction,” something to divert the mind from quotidian
troubles. But would mere distraction ensure continual massive watching? Television offers way more than distraction. In lots
of ways, television purveys and enables dreams, and most of these dreams involve some sort of transcendence of average daily life. The modes of presentation that work best
for TV—stuff like “action,” with shoot-outs and car wrecks, or the rapid-fire “collage” of commercials, news, and music videos,
or the “hysteria” of prime-time soap and sitcom with broad gestures, high voices, too much laughter—are unsubtle in their
whispers that, somewhere, life is quicker, denser, more interesting, more… well, lively than contemporary life as Joe Briefcase knows it. This might seem benign until we consider that what good old average Joe
Briefcase does more than almost anything else in contemporary life is watch television, an activity which anyone with an average
brain can see does not make for a very dense and lively life. Since television must seek to attract viewers by offering a
dreamy promise of escape from daily life, and since stats confirm that so grossly much of ordinary U.S. life is watching TV,
TV’s whispered promises must somehow undercut television-watching in theory (“Joe, Joe, there’s a world where life is lively,
where nobody spends six hours a day unwinding before a piece of furniture”) while reinforcing television-watching in practice
(“Joe, Joe, your best and only access to this world is TV”).