Restless Giant: The United States From Watergate to Bush v. Gore

James T. Patterson

In the summer of 1994, however, Congress renewed the statute, whereupon a three-person panel of federal judges charged by the law with the authority to appoint such counsel intervened. The panel, named by Chief Justice Rehnquist, determined that an independent prosecutor (that is, Fiske) ought not to be named by an executive official such as Reno, a presidential appointee. The judges replaced Fiske with Kenneth Starr, who had been solicitor general of the United States during the Bush administration.

This proved to be a major step toward the widening of probes into the doings of the Clintons. Fiske had investigated impartially and had not discovered evidence incriminating the Clintons. Starr, however, proved to be a zealously partisan prosecutor. Given wider authority over time, his investigations into the Clintons’ involvement in the complicated Whitewater matter, which dated to the 1970s, soon broadened to include the president’s handling of “Travelgate,” the flap in early 1993 over his firing of White House travel office employees, and of “Filegate,” a controversy that had surfaced in December 1993 over the mysterious disappearance from the White House of files relating to Whitewater and other matters.

By the time of Starr’s appointment, the president had already been confronting accusations of sexual harassment brought against him by Paula Corbin Jones, a former Arkansas state employee who claimed that Clinton, while governor (and therefore her boss), had exposed himself to her in a Little Rock hotel room in 1991. Because Jones named a state trooper whom she accused of bringing her to the hotel room for a sexual liaison with Clinton, the media labeled this story, too, as a “-gate”—this time, “Troopergate.” In May 1994, she filed suit in an Arkansas federal court seeking $700,000 in damages.
61

While in private practice between May and August 1994, Starr served as an adviser to Jones’s legal team. When his name surfaced as a possible successor to Fiske, critics cried angrily but to no avail that he had a conflict of interest and should not accept such an appointment. Later, when Starr’s investigations widened to focus on the president’s sexual activities, furious defenders of the president, notably Mrs. Clinton, insisted that his appointment had been a partisan move that had launched a vast right-wing conspiracy.

These legal battles received extraordinarily wide exposure in the media. In one week during mid-March 1994, when the Jones suit was front-page news, the three TV networks carried 126 stories about Clinton’s alleged involvement in Whitewater and other matters, compared to 107 during the first three months of 1994 concerning bloodshed in Bosnia, 56 concerning tensions in the Middle East, and 42 concerning the ongoing struggle for health care reform.
62
Amid sensationalism of this sort, it was hardly surprising that many Americans wondered if “Slick Willie,” already known as a philanderer, might be guilty as charged. Jay Leno, host of The Tonight Show, quipped that Clinton had complained about “powerful forces threatening to bring down his administration. I think that they are called hormones.”
63

Gingrich then acted boldly to promote GOP success in the 1994 elections. In an unprecedented move, he drew up a so-called Contract with America and in late September succeeded in getting 367 Republican House candidates to endorse it. Its preamble proclaimed that the election “offers the chance, after four decades of one-party control, to bring the House a new majority that will transform the way Congress works. That historic change would be the end of government that is too big, too intrusive, and too easy with the public’s money. It can be the beginning of a Congress that respects the values and shares the faith of the American family.”
64

The Contract wisely skirted divisive cultural issues such as abortion or school prayer. Otherwise, however, it was a concise summation of long-cherished conservative positions concerning economic, foreign, and military policies. It opened by calling for a series of measures that promised to reform procedures in the House, including the establishment of limits on the terms of committee chairs. It then highlighted ten broader goals. These included approval of a constitutional “balanced budget/tax limitation amendment” and of a “legislative line-item veto”; tougher action against welfare programs and crime, including “effective death penalty provisions”; “tax incentives for private long-term care insurance to let Older Americans keep more of what they have earned over the years”; a “Common Sense Legal Reform Act” that would set “reasonable limits on punitive damages and reform product liability laws to stem the endless tide of litigation”; cuts in capital gains taxes; a $500-per-child tax credit; a prohibition against American troops being placed under U.N. command; stronger defense efforts that would “maintain our credibility around the world”; and a “first-ever vote on term limits to replace career politicians with citizen legislators.”

By moving toward the center in 1994, Clinton had already tried to narrow the distance between his own policies and conservative goals such as these. A budget balancer, free trader, welfare reform advocate, and self-described crime fighter, he was by no means the ardent liberal that Gingrich portrayed him to be. It is therefore hard to say whether the Contract greatly influenced voters, most of whom had only a dim idea of what it said. But as the off-year elections approached it was obvious that Republicans, aided by religious voters who had mobilized behind the Christian Coalition, had managed to turn the upcoming elections into a referendum on Clinton himself. Well-organized foes of gun control, directed by the National Rifle Association, were especially active in denouncing the administration. All these groups did their best to paint the president as a knee-jerk liberal. Moreover, polls indicated that voters had a low opinion of him. Many Democratic congressional candidates avoided being closely associated with him.

The results in November were devastating to Democrats. Republicans scored the most impressive off-year comeback in modern times, adding nine members in the Senate, where they recaptured a majority (of 52 to 48) for the first time since 1986. They gained fifty-four seats in the House, taking command there, 230 to 204, for the first time since 1954. In 1995, the House was to include seventy-three freshman Republicans, many of them southerners who were ideologically to the right of Gingrich. Harvey Mansfield, a conservative professor of government at Harvard, said that the election meant the “end of the New Deal” and the “completion of what Ronald Reagan began.”
65

To a degree, that was wishful thinking on his part. The major programs of the New Deal and the Great Society survived. Still, it was obvious after 1994 that politicians on the right were high in the saddle and that the familiar phenomenon of divided government—Congress vs. the White House—had returned with a vengeance. The GOP, having wrested control of the House, maintained it for the next decade and more. During the remainder of Clinton’s time in the White House, it did its best to dash his high hopes of being remembered as a great American president.

11
Prosperity, Partisanship, Terrorism

Of the many developments highlighting the last years of the twentieth century in the United States, two stood out. The first was a surge in the economy. As rising prosperity promoted good feelings, it stimulated still higher expectations, which in turn continued to produce many of the anxieties that had troubled Americans since the 1960s. The second was a heating up of the partisan warfare that had already beset the first two years of Clinton’s presidency. Intensifying to unprecedented levels, this warfare polarized the politics of his second term, relegating even concerns about terrorism to the background.

In 1993–94, when the United States was still recovering from recession, some analysts of economic trends continued to emphasize the theme of decline. One historian noted a widespread belief that “the American economy is weak and failing, destined to be a second echelon participant in a new twenty-first-century world economic order.” Another writer, Edward Luttwak, lamented the “spent power of Calvinism” in the United States and complained that Americans, refusing to save or invest in infrastructure, were piling up huge amounts of personal debt. The major question to be asked about the economy, he concluded, was not if the United States would become a “third world country,” but when. He speculated, “The date might be as close as the year 2020.” He concluded, “If present trends continue, all but a small minority of Americans will be impoverished soon enough, left to yearn hopelessly for the last golden age of American prosperity.”
1

Declinists such as these ticked off a number of familiar trends to bolster their case. Economic growth, they said, remained sluggish; productivity, though showing signs of resurgence, was still smaller than it had been in the 1960s; inequality of wealth and income, ascendant since the 1970s, was sharpening; poverty (in 1994) still afflicted 14.5 percent of the population, or 38 million people, including 21 percent of children under 18; public schools, especially in the inner cities, continued to falter; the downtown areas of many large cities, though glitzier here and there, were still languishing; jobs were still disappearing in the Rust Belt and other centers of American manufacturing; and the United States, facing strong competition from abroad, was running up large trade and payments deficits. Knowledgeable observers wrote that America’s economy was becoming dangerously dependent on overseas investors, notably central banks that bought Treasury securities.

The laments went on: The real wages of production and non-supervisory workers in manufacturing, having slipped slowly since the 1970s, were showing no signs of improvement; “outsourcing” of jobs to cheap-labor nations was throwing Americans, including white-collar employees, out of work; apparently insatiable consumerism was ratcheting up credit card debt and diverting money from productive investment; ever greater corporate centralization was fattening the salaries and perks of CEOs and swallowing small businesses; the rise of huge, anti-union retail chains such as Wal-Mart was accelerating the growth of low-wage service sector work; and “downsizing” was threatening middle managers as well as blue-collar workers, thereby fostering what some pessimists were calling the “democratization of insecurity.”
2

Contemporary critics especially deplored the persistence of social and environmental problems, caused by what they perceived as the scandalously excessive wastefulness and materialism of life in the United States, the world’s leading “throw-away society.” The old adage “Waste not, want not,” they complained, had gone the way of the horse and buggy. Echoing earlier pessimists, they railed at the political influence of growth-at-all-costs developers and at the rapid expansion of “exurban sprawl,” where office parks, malls, fast-food outlets, and ticky-tacky subdivisions were said to be blighting the countryside.
3

In their familiarly grim descriptions of suburban and exurban life as “cultural wastelands,” many of these critics continued to be elitist and patronizing: Contrary to the message of movies such as American Beauty (1999), it was surely not the case that the majority of suburbanites were bored, tasteless, or neurotic. Most city-dwelling Americans who moved to suburbs and exurbs—which varied considerably in size and levels of personal income—hoped to find better schools and safer neighborhoods. They yearned for more space. Those who settled in “ticky-tacky” housing tracts were not tasteless; they were relatively poor. Struggling to get ahead, they moved to places that they could afford. They shopped at stores such as Wal-Mart because that was where goods were cheapest. Still, critics continued to bewail the mores and tastes of suburban and exurban folk: Like far too many other Americans, they asserted, many exurbanites were mindless, acquisitive consumers.

Critics of American society in the 1990s further maintained that the nation’s obsession with automobiles had roared out of control, creating mammoth traffic jams and leaving the nation ever more reliant on foreign production of oil. Increases in the number of gas-guzzling SUVs, pickups, and other high-powered motor vehicles, they added, were endangering people and befouling air that was dirtied already by emissions from carelessly regulated power plants, refineries, and manufacturing industries.
4
Many of these facilities loomed over low-income and predominantly minority neighborhoods, subjecting children in such areas to asthma attacks and raising the risk of chronic bronchitis among adults. Greenhouse gases, environmentalists insisted, were seriously exacerbating global warming. Al Gore declared in 1992 that environmental degradation was threatening the “very survival of the civilized world.”
5

Other commentators focused grimly on the stressful culture of work in the United States. Americans, they pointed out, toiled for far more hours per week than did people in most other industrialized nations. Workers, stressed out, had little time to spend with their families or to volunteer for community activities. Wages and salaries, though rising for most workers in the late 1990s, never seemed sufficient. As one unhappy Chicagoan, managing director of a company, despaired in 1997, “I earn more in a month than my dad did in a year, but I feel my life is more difficult.” He added, “I don’t gamble, I don’t have season tickets to the Bulls. How can I earn this much but not have anything left over?”
6

Some of these many complaints about American economic, environmental, and social conditions in the mid- and late 1990s were on target. Until very late in the decade, poverty remained stubborn, reflecting not only the large number of low-income female-headed families—and racial inequalities—but also the persistence of holes in the nation’s safety net, which was still more porous than those of most industrialized countries. Thanks to poverty, drug abuse, and lack of adequate prenatal services in many low-income areas, infant mortality in the United States, though roughly half of what it had been in the 1970s, continued to be higher than it was in twenty-five other industrialized nations.
7
The “underclasses” in America’s urban ghettos, Native Americans on reservations, and migrant workers and other low-income people in depressed rural areas still struggled to stay afloat. As in the 1970s and 1980s, long-range structural trends advancing the spread of relatively low-wage service work, as well as competition from abroad, threatened American manufacturing jobs.
8
The wages of production and non-supervisory workers continued to stagnate. Though Congress raised the minimum wage in 1996 (from $4.25 to $5.15 an hour), its real buying power, having fallen since the 1970s, continued to decline.
9

Americans with full-time employment (still reckoned, as it long had been, at forty hours per week) did in fact work considerably longer hours on the average—perhaps 350 to 400 more a year—than did Western Europeans, who enjoyed shorter workdays and more holidays.
10
Many Europeans (living in the cradle of Calvinism) were stunned by the strength of the work ethic in the United States and deplored the stress that they said it created. Critics were also correct to point out that American energy use remained enormous: With approximately 6 percent of the world’s population, the United States in the late 1990s was annually responsible for a quarter of the planet’s total energy consumption and emitted a quarter of the world’s greenhouse gases. By 2002, the United States had to import 54 percent of its crude oil, compared to less than 40 percent during the frightening energy crises of the late 1970s.
11

It was also true that America’s eager consumers and investors were continuing to amass levels of personal debt that were far higher than those in other nations. People were also gambling more than ever, and speculating eagerly in the stock market, sometimes as day traders and as members of proliferating investment clubs. Given the listed value of stocks, this activity was hardly surprising: Between January 1991, when the Dow Jones industrial average hit a low of 2,588, and January 2000, by which time it had soared to a high of 11,722, stock prices more than quadrupled.
12
In the same month of 2000, starry-eyed (though, as it later turned out, badly blurred) visions of the future led AOL to acquire Time Warner for $180 billion in stock and debt. This, the largest corporate merger in United States history, was but the most spectacular instance of a merger mania that dwarfed that of the Reagan years. Successful investors such as Warren Buffett (the “oracle of Omaha”) of Berkshire Hathaway and Peter Lynch, who managed the Magellan Fund of Fidelity Investments, received adulatory attention in the media, in a culture that seemed more mesmerized than ever by dreams of moneymaking. By 2001, 51 percent of American families had some investments in stock, compared to 32 percent in 1989 and 13 percent in 1980.
13

Trouble lay ahead, especially for tech-obsessed buyers who were plunging their money into increasingly overpriced “dot-com” stocks. Federal Reserve chairman Alan Greenspan, who most of the time intoned that the United States was entering a “new age” economy of huge potential, paused in December 1996 to warn against America’s “irrational exuberance.”
14
Greenspan did not want to stick a pin in the bubble, however, and stock prices continued to increase greatly until early 2000. By that time, enthusiastic onlookers were declaring that the United States had become a “shareholder nation.” The boom in stocks, part of the larger advance of prosperity in the late 1990s, did much to give Americans—already feeling good about the end of the Cold War—a triumphant but illusory sense of empowerment.
15

Most economists agreed that inequality of income, as measured by shares of national earnings held by various levels of the income pyramid, was not only continuing to rise in the United States but also that it was sharper than in other industrial nations. The share of aggregate income held by the poorest one-fifth of American households declined from 4.4 percent of total income in 1975 to 3.7 percent in 1995, or by almost one-sixth. The share held by the richest fifth increased in the same years from 43.2 percent to 48.7 percent, a rise of more than 12 percent. The IRS reported in 1999 that 205,000 American households had incomes of more than $1 million.
16
The very wealthy, including many CEOs, were enjoying salaries, perks, and comforts on an unprecedented scale. By 1998, the average income of the 13,000 wealthiest families in the United States was 300 times that of average families. These families earned as much income as the poorest 20 million families.
17

Why this inequality continued to mount remained disputed. Some writers emphasized that top corporate leaders had become greedier and less paternalistic and that tax cuts favoring the very wealthy were to blame.
18
Others stressed that racial discrimination still played a key role, and that female-headed families, which were disproportionately African American, and steadily larger numbers of relatively poor immigrants weighted the bottom of the income pyramid. The increase in immigration was surely an important source of ascending inequality. Virtually all analysts agreed that another major cause of inequality was lack of growth in relatively well paid manufacturing employment and the ongoing rise in the number of low-wage service-sector jobs. Many of these openings were taken out of necessity by women, recent immigrants, and other people with low levels of education and skill.

All these factors helped account for the worsening of economic inequality. So did the actions of some large corporations. The inflated sense of monetary entitlement expressed by a large number of corporate executives—“we made big profits for the company, and we deserve big rewards,” they insisted—exposed in an especially crass and magnified fashion the entitlement mentality of much of the culture at large.
19
Some major corporations, anxious to lessen huge obligations, began cutting back or discontinuing long-promised defined-benefit pension and medical plans. Many employers continued to take a hard line with trade unions, whose losses of members badly sapped the bargaining power of organized labor.
20
Lobbying effectively in Washington and in state capitals, representatives of big business, including titans of agribusiness, demanded—and often received—generous subsidies, protections, and tax breaks from legislators. Not without cause, liberals (and others) concluded that the harshly dog-eat-dog approach of many American business leaders in the 1990s was creating a new, often nastier “corporate culture.”
