Author: David Graeber
Allowing them, through yet another piece of arcane magic that none of us could possibly understand, to end up, after an initial near-$400-billion dip, with far larger reserves than they had ever had before.
At this point, some U.S. creditors clearly feel they are finally in a position to demand that their own political agendas be taken into account.
CHINA WARNS U.S. ABOUT DEBT MONETIZATION
Seemingly everywhere he went on a recent tour of China, Dallas Fed President Richard Fisher was asked to deliver a message to Federal Reserve Chairman Ben Bernanke: “stop creating credit out of thin air to purchase U.S. Treasuries.”[18]
Again, it’s never clear whether the money siphoned from Asia to support the U.S. war machine is better seen as “loans” or as “tribute.” Still, the sudden advent of China as a major holder of U.S. treasury bonds has clearly altered the dynamic. Some might question why, if these really are tribute payments, the United States’ major rival would be buying treasury bonds to begin with—let alone agreeing to various tacit monetary arrangements to maintain the value of the dollar, and hence, the buying power of American consumers.[19]
But I think this is a perfect case in point of why taking a very long-term historical perspective can be so helpful.
From a longer-term perspective, China’s behavior isn’t puzzling at all. In fact it’s quite true to form. The unique thing about the Chinese empire is that it has, since the Han dynasty at least, adopted a peculiar sort of tribute system whereby, in exchange for recognition of the Chinese emperor as world-sovereign, they have been willing to shower their client states with gifts far greater than they receive in return. The technique seems to have been developed almost as a kind of trick when dealing with the “northern barbarians” of the steppes, who always threatened Chinese frontiers: a way to overwhelm them with such luxuries that they would become complacent, effeminate, and unwarlike. It was systematized in the “tribute trade” practiced with client states like Japan, Taiwan, Korea, and various states of Southeast Asia, and for a brief period from 1405 to 1433, it even extended to a world scale, under the famous eunuch admiral Zheng He. He led a series of seven expeditions across the Indian Ocean, his great “treasure fleet”—in dramatic contrast to the Spanish treasure fleets of a century later—carrying not only thousands of armed marines, but endless quantities of silks, porcelain, and other Chinese luxuries to present to those local rulers willing to recognize the authority of the emperor.[20]
All this was ostensibly rooted in an ideology of extraordinary chauvinism (“What could these barbarians possibly have that we really need, anyway?”), but, applied to China’s neighbors, it proved extremely wise policy for a wealthy empire surrounded by much smaller but potentially troublesome kingdoms. In fact, it was such wise policy that the U.S. government, during the Cold War, more or less had to adopt it,
creating remarkably favorable terms of trade for those very states—Korea, Japan, Taiwan, certain favored allies in Southeast Asia—that had been the traditional Chinese tributaries; in this case, in order to contain China.[21]
Bearing all this in mind, the current picture begins to fall easily back into place. When the United States was far and away the predominant world economic power, it could afford to maintain Chinese-style tributaries. Thus these very states, alone amongst U.S. military protectorates, were allowed to catapult themselves out of poverty and into first-world status.[22]
After 1971, as U.S. economic strength relative to the rest of the world began to decline, they were gradually transformed back into a more old-fashioned sort of tributary. Yet China’s getting in on the game introduced an entirely new element. There is every reason to believe that, from China’s point of view, this is the first stage of a very long process of reducing the United States to something like a traditional Chinese client state. And of course, Chinese rulers are not, any more than the rulers of any other empire, motivated primarily by benevolence. There is always a political cost, and what that headline marked was the first glimmerings of what that cost might ultimately be.
All that I have said so far merely serves to underline a reality that has come up constantly over the course of this book: that money has no essence. It’s not “really” anything; therefore, its nature has always been and presumably always will be a matter of political contention. This was certainly true throughout earlier stages of U.S. history, incidentally—as the endless nineteenth-century battles between goldbugs, greenbackers, free bankers, bi-metallists, and silverites so vividly attest—or, for that matter, the fact that American voters were so suspicious of the very idea of central banks that the Federal Reserve system was only created on the eve of World War I, three centuries after the Bank of England. Even the monetization of the national debt is, as I’ve already noted, double-edged. It can be seen—as Jefferson saw it—as the ultimate pernicious alliance of warriors and financiers; but it also opened the way to seeing government itself as a moral debtor, and freedom as something literally owed to the nation. Perhaps no one put it so eloquently as Martin Luther King Jr., in his “I Have a Dream” speech, delivered on the steps of the Lincoln Memorial in 1963:
In a sense we’ve come to our nation’s capital to cash a check. When the architects of our republic wrote the magnificent words
of the Constitution and the Declaration of Independence, they were signing a promissory note to which every American was to fall heir. This note was a promise that all men, yes, black men as well as white men, would be guaranteed the “unalienable Rights” of “Life, Liberty and the pursuit of Happiness.” It is obvious today that America has defaulted on this promissory note, insofar as her citizens of color are concerned. Instead of honoring this sacred obligation, America has given the Negro people a bad check, a check which has come back marked “insufficient funds.”
One can see the great crash of 2008 in the same light—as the outcome of years of political tussles between creditors and debtors, rich and poor. True, on a certain level, it was exactly what it seemed to be: a scam, an incredibly sophisticated Ponzi scheme designed to collapse in the full knowledge that the perpetrators would be able to force the victims to bail them out. On another level it could be seen as the culmination of a battle over the very definition of money and credit.
By the end of World War II, the specter of an imminent working-class uprising that had so haunted the ruling classes of Europe and North America for the previous century had largely disappeared. This was because class war was suspended by a tacit settlement. To put it crudely: the white working class of the North Atlantic countries, from the United States to West Germany, were offered a deal. If they agreed to set aside any fantasies of fundamentally changing the nature of the system, then they would be allowed to keep their unions, enjoy a wide variety of social benefits (pensions, vacations, health care …), and, perhaps most important, through generously funded and ever-expanding public educational institutions, know that their children had a reasonable chance of leaving the working class entirely. One key element in all this was a tacit guarantee that increases in workers’ productivity would be met by increases in wages: a guarantee that held good until the late 1970s. Largely as a result, the period saw both rapidly rising productivity and rapidly rising incomes, laying the basis for the consumer economy of today.
Economists call this the “Keynesian era” since it was a time in which John Maynard Keynes’ economic theories, which already formed the basis of Roosevelt’s New Deal in the United States, were adopted by industrial democracies pretty much everywhere. With them came Keynes’ rather casual attitude toward money. The reader will recall that Keynes fully accepted that banks do, indeed, create money “out of thin air,” and that for this reason, there was no intrinsic reason that
government policy should not encourage this during economic downturns as a way of stimulating demand—a position that had long been dear to the heart of debtors and anathema to creditors.
Keynes himself had in his day been known to make some fairly radical noises, for instance calling for the complete elimination of that class of people who lived off other people’s debts—“the euthanasia of the rentier,” as he put it—though all he really meant by this was their elimination through a gradual reduction of interest rates. As in so much of Keynesianism, this was much less radical than it first appeared. Actually it was thoroughly in the great tradition of political economy, hearkening back to Adam Smith’s ideal of a debtless utopia, and especially to David Ricardo’s condemnation of landlords as parasites, their very existence inimical to economic growth. Keynes was simply proceeding along the same lines, seeing rentiers as a feudal holdover inconsistent with the true spirit of capital accumulation. Far from a revolution, he saw it as the best way of avoiding one:
I see, therefore, the rentier aspect of capitalism as a transitional phase which will disappear when it has done its work. And with the disappearance of its rentier aspect much else in it besides will suffer a sea-change. It will be, moreover, a great advantage of the order of events which I am advocating, that the euthanasia of the rentier, of the functionless investor, will be nothing sudden … and will need no revolution.[23]
When the Keynesian settlement was finally put into effect, after World War II, it was offered only to a relatively small slice of the world’s population. As time went on, more and more people wanted in on the deal. Almost all of the popular movements of the period from 1945 to 1975, even perhaps revolutionary movements, could be seen as demands for inclusion: demands for political equality that assumed equality was meaningless without some level of economic security. This was true not only of movements by minority groups in North Atlantic countries who had first been left out of the deal—such as those for whom Dr. King spoke—but also of what were then called “national liberation” movements from Algeria to Chile, and, finally, and perhaps most dramatically, in the late 1960s and 1970s, feminism. At some point in the ’70s, things reached a breaking point. It would appear that capitalism, as a system, simply cannot extend such a deal to everyone. Quite possibly it wouldn’t even remain viable if all its workers were free wage laborers; certainly it will never be able to provide everyone in the world the sort of life lived by, say, a 1960s auto worker in Michigan or Turin
with his own house, garage, and children in college—and this was true even before so many of those children began demanding less stultifying lives. The result might be termed a crisis of inclusion. By the late 1970s, the existing order was clearly in a state of collapse, plagued simultaneously by financial chaos, food riots, oil shock, widespread doomsday prophecies of the end of growth and ecological crisis—all of which, it turned out, proved to be ways of putting the populace on notice that all deals were off.
The moment that we start framing the story this way, it’s easy to see that the next thirty years, the period from roughly 1978 to 2009, follows nearly the same pattern. Except that the deal, the settlement, had changed. Certainly, when both Ronald Reagan in the United States and Margaret Thatcher in the UK launched a systematic attack on the power of labor unions, as well as on the legacy of Keynes, it was a way of explicitly declaring that all previous deals were off. Everyone could now have political rights—even, by the 1990s, most everyone in Latin America and Africa—but political rights were to become economically meaningless. The link between productivity and wages was chopped to bits: productivity rates have continued to rise, but wages have stagnated or even atrophied.[24]