Modern Times: The World From the Twenties to the Nineties
Paul Johnson
In the 1930s, however, the Chicago scientist Warder Allee published Animal Aggregations (1931) and The Social Life of Animals (1938), which gave illuminating examples of the effect of evolution on social behaviour. The real breakthrough came at roughly the same time as the Watson-Crick discovery, when the British ecologist V.C. Wynne-Edwards published Animal Dispersion in Relation to Social Behaviour (1962). He showed that virtually all social behaviour, such as hierarchies and pecking-orders, the securing of territory, bird-flocking, herding and dances, was a means of regulating numbers and preventing species from exceeding their available food supplies. Socially subordinate members were prevented from breeding; each animal sought to maximize its own reproduction; the fittest succeeded. In 1964 another British scientist, the geneticist W.D. Hamilton, showed in The Genetical Evolution of Social Behaviour how important devotion to one’s own genes was in ordering social behaviour: parental ‘protection’ was a case of concern for others in proportion as they shared the parents’ genes. The unselfishness or altruism found in natural selection, therefore, was not moral in origin, nor did it imply a conscience or personal motivation: there were altruistic chickens, even viruses. Genetic kin theory stated that the occurrence of altruistic behaviour increased in proportion to the number of genes shared through common ancestry. It had a cost-benefit element, being more likely to occur when the cost to the donor was small and the recipient’s benefit large. Kin theory was refined by the Harvard biologist Robert Trivers, who developed the notions of ‘reciprocal altruism’ (a form of enlightened self-interest) and ‘parental investment’, which enhanced the offspring’s chances of survival at the cost of the parents’ ability to invest in subsequent offspring. Females invested more than males, since eggs ‘cost’ more than sperm. Female choice was largely responsible for the evolution of mating systems, being attuned to the maximizing of evolutionary fitness. As this new methodology developed, it became possible to show that social patterns in almost any species had their origins in evolutionary natural selection.
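The cost-benefit condition Hamilton identified is conventionally summarized as Hamilton’s rule; the notation below is the standard textbook formulation rather than Johnson’s own:

\[
rB > C
\]

where \(r\) is the coefficient of relatedness between donor and recipient, \(B\) the reproductive benefit to the recipient, and \(C\) the reproductive cost to the donor. Altruistic behaviour is favoured by selection when the benefit, discounted by relatedness, outweighs the cost.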
In 1975 the Harvard scientist Edward Wilson brought together two decades of specialist research in his book, Sociobiology: The New Synthesis.
His own work lay with insects but he drew on a vast array of detailed empirical studies to mount his case that the time was ripe for a general theory analogous to the laws of Newton or Einstein. This book, and other studies, drew attention to the biological process of self-improvement which is going on all the time and is a vital element in human progress. They suggested the process should be studied by empirical science, not metaphysics, and by the methodology so brilliantly categorized by Karl Popper, in which theory is made narrow, specific and falsifiable by empirical data, as opposed to the all-purpose, untestable and self-modifying explanations offered by Marx, Freud, Lévi-Strauss, Lacan, Barthes and other prophets.
What was clear, by the last decade of the century, was that Alexander Pope had been right in suggesting, ‘The proper study of mankind is man.’
For man, as a social being, was plainly in need of radical improvement. He was, indeed, capable of producing scientific and technical ‘miracles’ on an ever-increasing scale. The ability to create new substances further accelerated the communications and electronics revolution that had started in the 1970s and
gathered pace throughout the 1980s and into the 1990s. As the number of circuits which could be imprinted on a given area multiplied, calculators and computers grew in capacity and fell in price. The first true pocket calculator, on which mankind had been working since the time of Pascal in the mid-seventeenth century, was produced by Clive Sinclair in 1972 and cost £100; by 1982 a far more powerful model cost £7. The emergence of the silicon chip led directly to the development of micro-processors. Whereas complex electronic controls had previously to be specifically constructed for each job, the micro-processor was a general-purpose device which could be cheaply made in vast numbers. Its emergence was followed, in December 1986, by high-temperature super-conductors, materials which lose all resistance to electric currents at temperatures far higher than conventional superconductors require. These, and other new materials and processes, not only advanced the frontiers of high technology, thus making possible the kind of long-distance space probes common in the 1980s and early 1990s, advanced laser surgery and the devastating military technology employed in the Gulf War, but introduced mass-manufactured, low-cost devices which affected the life and work of hundreds of millions of ordinary people. Video machines and micro-discs transformed popular entertainment. Portable phones and car phones gave a new, and often unattractive, dimension to work. Conventional telephone cables were replaced by fibre-optic ones, whose signals, coded as light-pulses, enabled thousands of phone conversations and scores of TV channels to be carried simultaneously along a single circuit. While the capacity of specialist computers enabled governments and businesses to perform prodigies of computation in micro-seconds, word-processors transformed office work throughout the advanced nations and were employed by ever-widening ranges of people, even including humble world-historians. Machines, often of astonishing complexity, now entered and often dominated the lives of the masses.
Yet, in the early 1990s, as many people died of starvation as ever before in world history. Moreover, many innovations designed to increase human happiness ended by diminishing it. In the West, the spread of contraception, in a variety of forms, and the growing availability of abortion on demand, made fortunes for pharmaceutical firms and clinics, but, in a hedonistic and heedless society, did not appreciably diminish the number of unwanted children. One striking and unwelcome phenomenon of the 1970s and still more of the 1980s was the growth of what were euphemistically termed ‘one-parent families’, in most cases mothers, usually dependent on welfare payments, looking after children on their own. These deprived children were the products of promiscuity and divorce-by-consent.
The numbers of illegitimate children, in societies which called themselves advanced, grew at an astonishing rate in the 1980s. By the spring of 1991, one in four live births in Britain was illegitimate; in parts of Washington DC, capital of the richest nation on earth, the proportion was as high as 90 per cent. There was no point in trying to pretend that one-parent families and illegitimacy were anything other than grave social evils, devastating for the individuals concerned and harmful for society, leading, as they inevitably did in many cases, to extreme poverty and crime. Crime-rates rose everywhere, fuelled by growing abuse of alcohol and drugs. The spread of unlawful drug habits was just as likely to be prompted by affluence as by poverty. By the end of the 1980s it was calculated that the illegal use of drugs in the United States now netted its controllers over $110 billion a year. On 6 September 1989, President Bush announced plans to reduce drug abuse in the United States by half by the year 2000, and to spend $7.86 billion in federal funds on the effort. Few expressed much confidence in the project.
Another self-inflicted wound in the advanced nations was the spread of AIDS (Acquired Immune Deficiency Syndrome). The origins of this fatal and seemingly incurable disease, which destroys the body’s self-defence system against infection, remained obscure even in the early 1990s, despite much research. It appeared to be spreading most rapidly in black Africa, where heterosexuals acted as transmitters. In the West, however, it was largely confined to male homosexuals and (to a much lesser extent) to drug-users. It was the product of drug abuse and, far more seriously, of the homosexual promiscuity which, often in extreme form, had followed the decriminalization of homosexuality in the 1960s and 1970s. Some male homosexuals were shown to have had 300 or more sexual partners in a single year, and against this background the disease spread rapidly. First reports of its seriousness came on 31 December 1981, when 152 cases had emerged, chiefly in San Francisco, Los Angeles and New York; one was an intravenous drug-abuser; the rest were male homosexuals. By 13 October 1985 the World Health Organization had declared that the disease had reached epidemic proportions. By February 1989 it was widely reported that those who tested positive for AIDS were being denied life insurance; others were losing their jobs. Drugs like azidothymidine (AZT) were used to delay (not cure) the progress of the disease, but often with horrific side-effects. On 9 February 1989 it was announced that a new agent called CD4 had been developed in San Francisco; this promised to delay the fatal consequences of AIDS, possibly for years, and with minimal side-effects. But no actual cure appeared in sight despite vast expenditure and effort. Uncertainties about the disease produced bitter political arguments. Governments were particularly anxious to prevent its spread among the community as a whole, and spent many millions on advertising campaigns designed to reduce heterosexual promiscuity and encourage the use of condoms. Again, the pharmaceutical industry benefited, but whether government expenditure had any other effect, no one knew. By the early 1990s it was generally believed that the likelihood of an epidemic among heterosexuals, once confidently forecast by the homosexual lobby, was negligible.
Hugely expensive and probably ineffectual government campaigns against drug-abuse and AIDS saw the modern state in a characteristic twentieth-century posture – trying to do collectively what the sensible and morally educated person did individually. The disillusion with socialism and other forms of collectivism, which became the dominant spirit of the 1980s, was only one aspect of a much wider loss of faith in the state as an agency of benevolence. The state was, up to the 1980s, the great gainer of the twentieth century; and the central failure. Before 1914 it was rare for the public sector to embrace more than 10 per cent of the economy; by the end of the 1970s, and even beyond, the state took up to 45 per cent or more of the GNP in liberal countries, let alone totalitarian ones. But whereas, at the time of the Versailles Treaty in 1919, most intelligent people believed that an enlarged state could increase the sum total of human happiness, by the 1990s this view was held by no one outside a small, diminishing and dispirited band of zealots, most of them academics. The experiment had been tried in innumerable ways; and it had failed in nearly all of them. The state had proved itself an insatiable spender, an unrivalled waster. It had also proved itself the greatest killer of all time. By the 1990s, state action had been responsible for the violent or unnatural deaths of some 125 million people during the century, more perhaps than it had succeeded in destroying during the whole of human history up to 1900. Its inhuman malevolence had more than kept pace with its growing size and expanding means.
The fall from grace of the state likewise, by the early 1990s, had begun to discredit its agents, the activist politicians, whose phenomenal rise in numbers and authority was one of the most important and baleful human developments of modern times. It was Jean-Jacques Rousseau who had first announced that human beings could be transformed for the better by the political process, and that the agency of change, the creator of what he termed the ‘new man’, would be the state, and the self-appointed benefactors who controlled it for the good of all. In the twentieth century his theory
was finally put to the test, on a colossal scale, and tested to destruction. As we have noted, by the year 1900 politics was already replacing religion as the chief form of zealotry. To archetypes of the new class, such as Lenin, Hitler and Mao Tse-tung, politics – by which they meant the engineering of society for lofty purposes – was the one legitimate form of moral activity, the only sure means of improving humanity. This view, which would have struck an earlier age as fantastic, even insane, became to some extent the orthodoxy everywhere: diluted in the West, in virulent form in the Communist countries and much of the Third World. At the democratic end of the spectrum, the political zealot offered New Deals, Great Societies and welfare states; at the totalitarian end, cultural revolutions; always and everywhere, Plans. These zealots marched across the decades and hemispheres: mountebanks, charismatics, exaltés, secular saints, mass murderers, all united by their belief that politics was the cure for human ills: Sun Yat-sen and Ataturk, Stalin and Mussolini, Khrushchev, Ho Chi Minh, Pol Pot, Castro, Nehru, U Nu and Sukarno, Perón and Allende, Nkrumah and Nyerere, Nasser, Shah Pahlevi, Gadafy and Saddam Hussein, Honecker and Ceausescu. By the 1990s, this new ruling class had lost its confidence and was rapidly losing ground, and power, in many parts of the world. Most of them, whether alive or dead, were now execrated in their own homelands, their grotesque statues toppled or defaced, like the sneering head of Shelley’s Ozymandias. Was it possible to hope that ‘the age of politics’, like the ‘age of religion’ before it, was now drawing to a close?
Certainly, by the last decade of the century, some lessons had plainly been learned. But it was not yet clear whether the underlying evils which had made possible its catastrophic failures and tragedies – the rise of moral relativism, the decline of personal responsibility, the repudiation of Judeo-Christian values, not least the arrogant belief that men and women could solve all the mysteries of the universe by their own unaided intellects – were in the process of being eradicated. On that would depend the chances of the twenty-first century becoming, by contrast, an age of hope for mankind.