This would have been just another clever low-concept ’80s TV story, where the final cap-tossing coyly undercuts Dr. Auschlander’s putdown of television, were it not for the countless layers of ironic, involuted TV imagery and data that whirled around this incredibly high-concept installment. Because another of this episode’s cameo stars, drifting through a different subplot, is one Betty White, Sue-Ann Nivens of the old Mary Tyler Moore Show, here playing a tortured NASA surgeon (don’t ask). It is with almost tragic inevitability, then, that Ms. White, at 32 minutes into the episode, meets up with the TV-deluded pseudo-Mary in their respective tortured wanderings through the hospital’s corridors, and that she greets the mental patient’s inevitable joyful cries of “Sue-Ann!” with a too-straight face as she says that he must have her confused with someone else. Of the convolved levels of fantasy and reality and identity here—e.g. the patient simultaneously does, does not, and does have Betty White “confused” with Sue-Ann Nivens—we needn’t speak in detail; doubtless a Yale Contemporary Culture dissertation is under way on Deleuze & Guattari and just this episode. But the most interesting levels of meaning here lie, and point, behind the lens. For NBC’s St. Elsewhere, like The Mary Tyler Moore Show and The Bob Newhart Show before it, was created, produced, and guided into syndication by MTM Studios, owned by Mary Tyler Moore and overseen by her erstwhile husband, eventual NBC CEO Grant Tinker; and St. Elsewhere’s scripts and subplots are story-edited by Mark Tinker, Mary’s stepson, Grant’s heir. The deluded mental patient, an exiled, drifting veteran of one MTM program, reaches piteously out to the exiled, drifting (literally—NASA, for God’s sake!) veteran of another MTM production, and her deadpan rebuff is scripted by MTM personnel, who accomplish the parodic undercut of MTM’s Dr. Auschlander with the copyrighted MTM hat-gesture of one MTM veteran who’s “deluded” he’s another. Dr. A.’s Fowleresque dismissal of TV as just a “distraction” is less naïve than insane: there is nothing but television on this episode. Every character and conflict and joke and dramatic surge depends on involution, self-reference, metatelevision. It is in-joke within in-joke.
So then why do I get the in-joke? Because I, the viewer, outside the glass with the rest of the Audience, am in on the in-joke. I’ve seen Mary Tyler Moore’s “real” toss of that fuzzy beret so often it’s moved past cliché into warm nostalgia. I know the mental patient from Bob Newhart, Betty White from everywhere, and I know all sorts of intriguing irrelevant stuff about MTM Studios and syndication from Entertainment Tonight. I, the pseudo-voyeur, am indeed “behind the scenes,” primed to get the in-joke. But it is not I the spy who have crept inside television’s boundaries. It is vice versa. Television, even the mundane little businesses of its production, has become my—our—own interior. And we seem a jaded, weary, but willing and above all knowledgeable Audience. And this knowledgeability utterly transforms the possibilities and hazards of “creativity” in television. St. Elsewhere’s episode was nominated for a 1988 Emmy. For best original teleplay.
The best TV of the last five years has been about ironic self-reference like no previous species of postmodern art could ever have dreamed of. The colors of MTV videos, blue-black and lambently flickered, are the colors of television. Moonlighting’s David and Bueller’s Ferris throw asides to the viewer every bit as bald as an old melodrama villain’s monologued gloat. Segments of the new late-night glitz-news After Hours end with a tease that features harried earphoned guys in the production booth ordering the tease. MTV’s television-trivia game show, the dry-titled Remote Control, got so popular it burst out of its MTV-membrane and is now syndicated band-wide. The hippest commercials, with stark computerized settings and blank-faced models in mirrored shades and plastic slacks genuflecting before various forms of velocity, excitement, and prestige, seem like little more than TV’s vision of how TV offers rescue to those lonely Joe Briefcases passively trapped into watching too much TV.
What explains the pointlessness of most published TV criticism is that television has become immune to charges that it lacks any meaningful connection to the world outside it. It’s not that charges of nonconnection have become untrue but that they’ve become deeply irrelevant. It’s that any such connection has become otiose. Television used to point beyond itself. Those of us born in, say, the ’60s were trained by television to look where it pointed, usually at versions of “real life” made prettier, sweeter, livelier by succumbing to a product or temptation. Today’s mega-Audience is way better trained, and TV has discarded what’s not needed. A dog, if you point at something, will look only at your finger.
metawatching
It’s not like self-reference is new to U.S. entertainment. How many old radio shows—Jack Benny, Burns and Allen, Abbott and Costello—were mostly about themselves as shows? “So, Lou, and you said I couldn’t get a big star like Miss Lucille Ball to be a guest on our show, you little twerp.” Etc. But once television introduces the element of watching, and once it informs an economy and culture like radio never could have, the referential stakes go way up. Six hours a day is more time than most people (consciously) do any other one thing. How human beings who absorb such high doses understand themselves will naturally change, become vastly more spectatorial, self-conscious. Because the practice of “watching” is expansive. Exponential. We spend enough time watching, pretty soon we start watching ourselves watching. Pretty soon we start to “feel” ourselves feeling, yearn to experience “experiences.” And that American subspecies into fiction writing starts writing more and more about…
The emergence of something called Metafiction in the American ’60s was hailed by academic critics as a radical aesthetic, a whole new literary form, literature unshackled from the cultural cinctures of mimetic narrative and free to plunge into reflexivity and self-conscious meditations on aboutness. Radical it may have been, but thinking that postmodern Metafiction evolved unconscious of prior changes in readerly taste is about as innocent as thinking that all those college students we saw on television protesting the Vietnam war were protesting only because they hated the Vietnam war. (They may have hated the war, but they also wanted to be seen protesting on television. TV was where they’d seen this war, after all. Why wouldn’t they go about hating it on the very medium that made their hate possible?) Metafictionists may have had aesthetic theories out the bazoo, but they were also sentient citizens of a community that was exchanging an old idea of itself as a nation of doers and be-ers for a new vision of the U.S.A. as an atomized mass of self-conscious watchers and appearers. For Metafiction, in its ascendant and most important phases, was really nothing more than a single-order expansion of its own great theoretical nemesis, Realism: if Realism called it like it saw it, Metafiction simply called it as it saw itself seeing itself see it. This high-cultural postmodern genre, in other words, was deeply informed by the emergence of television and the metastasis of self-conscious watching. And (I claim) American fiction remains deeply informed by television… especially those strains of fiction with roots in postmodernism, which even at its rebellious Metafictional zenith was less a “response to” televisual culture than a kind of abiding-in-TV. Even back then, the borders were starting to come down.
It’s strange that it took television itself so long to wake up to watching’s potent reflexivity. Television shows about the business of television shows were rare for a long time. The Dick Van Dyke Show was prescient, and Mary Moore carried its insight into her own decade-long exploration of local-market angst. Now, of course, there’s been everything from Murphy Brown to Max Headroom to Entertainment Tonight. And with Letterman, Miller, Shandling, and Leno’s battery of hip, sardonic, this-is-just-TV schticks, the circle back to the days of “We’ve just got to get Miss Ball on our show, Bud” has closed and come spiral, television’s power to jettison connection and castrate protest fueled by the very ironic postmodern self-consciousness it had first helped fashion.
It will take a while, but I’m going to prove to you that the nexus where television and fiction converse and consort is self-conscious irony. Irony is, of course, a turf fictionists have long worked with zeal. And irony is important for understanding TV because “TV,” now that it’s gotten powerful enough to move from acronym to way of life, revolves off just the sorts of absurd contradictions irony’s all about exposing. It is ironic that television is a syncretic, homogenizing force that derives much of its power from diversity and various affirmations thereof. It is ironic that an extremely canny and unattractive self-consciousness is necessary to create TV performers’ illusion of unconscious appeal. And it is ironic that products presented as helping you express individuality can afford to be advertised on television only because they sell to enormous numbers of people. And so on.
Television regards irony sort of the way educated lonely people regard television. Television both fears irony’s capacity to expose, and needs it. It needs irony because television was practically made for irony. For TV is a bisensuous medium. Its displacement of radio wasn’t picture displacing sound; it was picture added. Since the tension between what’s said and what’s seen is irony’s whole sales territory, classic televisual irony works via the conflicting juxtaposition of pictures and sounds. What’s seen undercuts what’s said. A scholarly article on network news describes a famous interview with a corporate guy from United Fruit on a CBS special about Guatemala: “I sure don’t know of anybody being so-called ‘oppressed,’” this guy, in a ’70s leisure suit and bad comb-over, tells Ed Rabel. “I think this is just something that some reporters have thought up.”[7] The whole interview is intercut with commentless footage of big-bellied kids in Guatemalan slums and union organizers lying in the mud with cut throats.
Television’s classic irony function came into its own in the summer of 1974, as remorseless lenses opened to view the fertile “credibility gap” between the image of official disclaimer and the reality of high-level shenanigans. A nation was changed, as Audience. If even the president lies to you, whom are you supposed to trust to deliver the real? Television, that summer, got to present itself as the earnest, worried eye on the reality behind all images. The irony that television is itself a river of image, however, was apparent even to a twelve-year-old, sitting there, rapt. After ’74 there seemed to be no way out. Images and ironies all over the place. It’s not a coincidence that Saturday Night Live, that Athens of irreverent cynicism, specializing in parodies of (1) politics and (2) television, premiered the next fall (on television).
I’m worried when I say things like “television fears…” and “television presents itself…” because, even though it’s kind of a necessary abstraction, talking about television as if it were an entity can easily slip into the worst sort of anti-TV paranoia, treating of TV as some autonomous diabolical corrupter of personal agency and community gumption. I am concerned to avoid anti-TV paranoia here. Though I’m convinced that television today lies, with a potency somewhere between symptom and synecdoche, behind a genuine crisis for U.S. culture and literature, I do not agree with reactionaries who regard TV as some malignancy visited on an innocent populace, sapping IQs and compromising SAT scores while we all sit there on ever fatter bottoms with little mesmerized spirals revolving in our eyes. Critics like Samuel Huntington and Barbara Tuchman who try to claim that TV’s lowering of our aesthetic standards is responsible for a “contemporary culture taken over by commercialism directed to the mass market and necessarily to mass taste”[8] can be refuted by observing that their Propter Hoc isn’t even Post Hoc: by 1830, de Tocqueville had already diagnosed American culture as peculiarly devoted to easy sensation and mass-marketed entertainment, “spectacles vehement and untutored and rude” that aimed “to stir the passions more than to gratify the taste.”[9] Treating television as evil is just as reductive and silly as treating it like a toaster w/pictures.
It is of course undeniable that television is an example of Low Art, the sort of art that has to please people in order to get their money. Because of the economics of nationally broadcast, advertiser-subsidized entertainment, television’s one goal—never denied by anybody in or around TV since RCA first authorized field tests in 1936—is to ensure as much watching as possible. TV is the epitome of Low Art in its desire to appeal to and enjoy the attention of unprecedented numbers of people. But it is not Low because it is vulgar or prurient or dumb. Television is often all these things, but this is a logical function of its need to attract and please Audience. And I’m not saying that television is vulgar and dumb because the people who compose Audience are vulgar and dumb. Television is the way it is simply because people tend to be extremely similar in their vulgar and prurient and dumb interests and wildly different in their refined and aesthetic and noble interests. It’s all about syncretic diversity: neither medium nor Audience is faultable for quality.
Still, for the fact that individual American human beings are consuming vulgar, prurient, dumb stuff at the astounding average per-household dose of six hours a day—for this both TV and we need to answer. We are responsible basically because nobody is holding any weapons on us forcing us to spend amounts of time second only to sleep doing something that is, when you come right down to it, not good for us. Sorry to be a killjoy, but there it is: six hours a day is not good.