11 | Coming of Age

In the mid-2000s, the Pfizer corporation ran a major print and television advertising campaign for Zoloft. The concept was a bit avant-garde for a pharmaceutical product. Instead of human actors, the ads featured cartoons. The main character was a crudely drawn oval, white with a rough black outline, that most viewers referred to as a “ball” or an “egg.” At the beginning of each TV commercial, the ball suffered pathetically: cowering in a corner with social anxiety disorder, or languishing under a dark cloud of depression. By the end, successfully treated with antidepressants, the ball smiled and bounced, joining a room full of fellow balls celebrating in party hats, or horsing around playfully with its friends, a red ladybug and a blue butterfly.

In fact, the series was brilliant. Using a cartoon character represented a stroke of genius; somehow the abstract blob swept away the resistance it’s so easy to feel when watching live actors. “I just want the sad, badly drawn circle to regain its happiness,” read a typical comment in an online forum. “I feel more empathy and goodwill toward it than to the legion of live actors in various commercials. Kind of strange, really.” The simplicity of the ball character, just a circular line with two eyes and a mouth, belied an incredible expressiveness. It was kind of strange how easy it was to get emotionally involved with the little guy. It was also hard to watch the commercials without wondering, at least for a moment, whether you could use some Zoloft yourself; the distance from emotional involvement to identification, and from identification to imagining how the product might make you feel, was just two brief hops.

When I search for a way to describe the change antidepressants have brought about during my generation, I think of what’s communicated in these commercials. To live in America today is to be invited, again and again, to ask ourselves whether our problems are symptoms, to consider whether we need or would simply benefit from a psychiatric medication. For some people, antidepressants are more legitimately considered a need than an option. But for millions of others who occupy the large middle ground between definitely requiring medication and definitely not, antidepressants exist as a possibility: available and acceptable, they are always on the table, a potential we remain well aware of, whether we use them or not. Access to medication is controlled by physicians, but the urging to “ask your doctor” is commercial and omnipresent. In a very real sense, medication has become a consumer choice, just one more in the ocean of such choices that define our modern lives.

Certainly there’s much to celebrate about the antidepressant option. Placebo or not, the medications work, alleviating serious depression, “mere” sadness, and borderline states alike. People who are grateful for their medication number in the millions. Among the antidepressant users and former users I talked to, even those who felt ambivalent about antidepressants often expressed appreciation that they exist. There’s a small fortitude that comes from simply knowing that there’s help available, something we could try if we wanted to.

But while having the choice to use antidepressants represents freedom, it also brings a type of anxiety all its own. Social psychologists have repeatedly shown that despite the advantages they confer, proliferating options sap decision-making energy and multiply the possibilities for regret.[1] Medication is no exception. My peers and I have gained the power of being able to change how we feel, but we have also had to assume the necessity of wondering, at any point, whether we are choosing correctly. To medicate, or not to medicate? This question, with its accompanying undertow of slight worry, has become part of the atmosphere. To live in the age of SSRIs is to know that there are possibilities, and to have to choose one mode of living over another. It’s hard not to wonder whether there’s something about the road not taken that would have been better, or even to feel that either path leaves something to be desired. We exist in the bind that Carl Elliott’s work on authenticity pointed to: shall we be happy but unnatural, or natural but less than perfectly happy?

As the use of psychiatric medication becomes more prevalent in children and younger teens, the anxiety of choice around medication is increasingly transferred onto parents. In the course of my research for this book, I spoke to several parents who approached the decision to place their kids on medication sure-footedly, without much doubt about which course of action their situation demanded. But others described a deep and lasting uncertainty. “The sense of parental guilt is enormous,” said a father, fifty, who had considered antidepressants for his teenage son and ADHD medication for his younger daughter, after a school friend’s parents and a teacher suggested that the children might benefit. He and his wife eventually decided against medication for both of their children, but they still felt haunted by the choice, and they revisited it occasionally. A mother of three in the Midwest told me that her debate with her husband about whether to start their precocious five-year-old on Ritalin was the fiercest conflict they had faced in their marriage to date. She too reported feeling condemned to a sense of guilt no matter what she chose: she could keep her son off medication and worry about denying him something that he might need or gain from, or she could place him on medication and worry about interfering with his natural development. Both possibilities seemed equally frightening; when we spoke, she had recently agreed to a trial period on Ritalin for her son, but she reported feeling far from peaceful about the choice.

Despite the existence of experts to help parents face these decisions, the call often feels frustratingly subjective. To some extent, it is: there are still no objective or physical tests for mental disorder. Indeed, as the bioethicists Erik Parens and Josephine Johnston point out, the question of whether or not to medicate children and adolescents who are not severely ill can legitimately be considered a question of values. Reasonable people, including doctors, can and do disagree about where to draw the line between normal and abnormal feelings in kids and teenagers, and they disagree about how to weigh the advantages and disadvantages of medicating.[2] In the end, the burden of this uncertainty falls on parents, who find that they must face the choice of whether to use medication, and hold the possibility of making a mistake, on their own.

It is ironic that medication has become a choice that we’re perennially hovering over. Despite the DSM task force’s “increased commitment to reliance on data as the basis for understanding mental disorders,” the last twenty years have seen an increase in our collective confusion both about what mental illness is and about where the boundaries of normal should be drawn. Diagnostic brackets creep, as Peter Kramer once said. As our vocabulary for sadness, conflict, alienation, and exhaustion merges with the language of biomedical mental disorder, we lose the language of ordinary distress. The nonmedical words come to seem imprecise or old-fashioned. As it spreads, the new language breeds uncertainty, until almost any uncomfortable feeling comes to seem potentially abnormal. In our age, it has become increasingly hard to feel sad, angry, or overwhelmed (or to have someone close to you feel that way) without wondering if you, or they, are sick. While no doubt there remain places in this country where mental health care is still badly underdelivered, in the pockets where awareness reigns, we crossed over some time ago into what a psychologist would call a state of hypervigilance.

Maybe some part of this puzzlement about normalcy is old, just the latest vestige of a long-standing anxiety about how we ought to feel. The question of how much happiness we should feel and express has been an active one in America for centuries, with different answers prevailing in different times and contexts. Perhaps the question seems written into our Declaration of Independence itself: maybe there has always been slippage in our minds between the idea of a right to pursue happiness, and a duty to be happy, a sense that if we aren’t sucking the marrow out of life, aren’t using our extraordinary freedom to its greatest advantage at all times, we aren’t, somehow, fulfilling our job as Americans.

Whatever its causes, the extraordinary proliferation of psychiatric drug use that began twenty-five years ago with the arrival of Prozac shows few signs of slowing. Spending on prescription drugs in the United States more than doubled between 1999 and 2008, thanks in part to sales of psychopharmaceuticals.[3] As of 2009, 9 percent of five-to-seventeen-year-olds in America had been diagnosed with ADHD at some point in their lives so far, representing a steady rise since the 1990s.[4] Over a third of foster children in the United States use a psychotropic medication, and over 40 percent of that group use three or more such medications at the same time.[5]

Advertising and marketing help keep the consumption of medication high and rising. Researchers exploring the effects of direct-to-consumer advertising on patient and physician behavior found that in 1995, 3 percent of physician office visits by youth aged fourteen to eighteen resulted in a prescription for a psychiatric drug; in 2001, 8 percent of office visits did. Their data points frame the year 1997, when the FDA relaxed its rules to allow direct-to-consumer advertising of prescription drugs on television.[6]

While data reflecting population-level use of pharmaceuticals roll out slowly, the latest analyses suggest that SSRI use among children under eighteen underwent a modest decline, around 15 percent, in the few years after the FDA mandated that a black-box warning label about the risk of suicidal behaviors in children be placed on SSRIs’ packaging in 2004. (There was a concomitant small rise in the number of children treated with talk therapy.)[7] It remains to be seen whether these changes last, and whether they are part of a larger move away from psychopharmaceutical use in children, or merely a sign that the medication frontier has moved elsewhere.

If sales figures are a guide, that frontier may now consist of a family of drugs called atypical antipsychotics, which became the top-selling drug class by revenue in the United States in 2009. (Though they are used by many fewer people than use SSRIs, they are much more expensive.)[8] Over half a million children and adolescents in the United States now take atypicals,[9] whose brand names include Abilify, Zyprexa, Risperdal, and Seroquel. In children with severe behavioral problems, atypicals, which are also known as “major tranquilizers,” are often prescribed to augment the stimulants used to treat ADHD.[10] The use of atypicals in children and teens is controversial because the drugs cause pronounced weight gain, increase the risk of diabetes, and can cause muscle spasms, twitches, and tics that may or may not go away even after the patient stops the drugs.[11] In spite of these dangers, antipsychotic medications have found a use in children because the drugs “can settle almost any extreme behavior, often in minutes, and doctors have few other answers for desperate families.”[12]

The companies that make atypical antipsychotics have promoted their use in young patients. A New York Times investigation into public records in Minnesota found that over a third of the state’s psychiatrists accepted payments from drug makers, that an increase in payments over recent years was associated with a ninefold increase in atypical antipsychotic prescriptions for children on the state’s Medicaid rolls, and that the doctors who accepted the most money from the manufacturers of atypicals appeared to prescribe those drugs most often.[13]

Consumers in the United States use far more psychiatric medication than people in many other countries. Rates of antidepressant use by youth in the United States are between three and fifteen times higher than in continental Europe,[14] and the United Kingdom has been slower to embrace antidepressants for young people as well.[15] In some parts of Europe, antidepressant use by youth less than doubled over the course of the 1990s; in the United States, it increased sixfold from the late 1980s to the mid-1990s, then doubled again, and again.[16] Polypharmacy, the practice of prescribing a second or third psychotropic medication to augment the benefits or combat the side effects of a first, is rare outside of the United States, but it is prevalent and rising in popularity here.[17] The reasons for these discrepancies are not definitively understood, but they likely include variation in cultural beliefs about the point at which a behavior becomes a pathology, and also the influence of direct-to-consumer pharmaceutical advertising, which is not currently permitted in any European country.

Whatever the total constellation of reasons, psychiatric medications have become a part of us: so much so that scientists collecting samples downstream from a wastewater treatment plant in North Texas in 2003 discovered metabolites of Prozac and other antidepressants in every single fish they tested.[18]

In 2013, the American Psychiatric Association is expected to release the next edition of the diagnostic manual, DSM-5. In keeping with tradition, the book will have expanded to include new categories of disorders. For the first time, a number of diagnoses will include severity scales, making it possible to have certain disorders in mild, moderate, or severe degrees. While I appreciate the move away from the binary, the skeptic in me expects the possibility of having a “mild” case of something to result in further diagnostic-bracket creep, more diagnoses, more prescriptions, and ever-greater shrinkage of that beleaguered old category of normal.

People who defend the biomedical turn in psychiatry often claim that this move has been vital in reducing the stigma associated with mental illness. In the past, the story goes, people with depression and other serious mental disorders were viewed not as ill but as weak of character and were shamed or told unhelpfully to “snap out of it.” Today, thanks to the push to see mental illnesses as legitimate, physical diseases, those who suffer are accorded the respect and care that are due them as people with a true affliction. There is much that is valid in this account. Mental health awareness has risen in recent decades. Several of the people I interviewed felt that they’d benefited personally from the chance to think of their problems as a kind of disease, which reduced the attached stigma. The popularization of the idea that depression is an organic disorder may have given many people, for the first time, a way to talk about mental and emotional problems at all.
