Modern Mind: An Intellectual History of the 20th Century

Authors: Peter Watson

The issues just discussed are partly psychological, partly sociological. In The Decomposition of Sociology (1994), Irving Louis Horowitz, Hannah Arendt Distinguished Professor of Sociology at Rutgers University and president of Transaction/Society, a sociological publishing house, laments both the condition and direction of the discipline to which he has given his life.61 His starting point, and the reason why his book appeared when it did, was the news in February 1992 that the sociology departments in three American universities had been closed down and the one at Yale cut back by more than half. At the same time, the number of students graduating in sociology was 14,393, well down on the 35,996 in 1973. Horowitz is in no doubt about the cause of this decline, a decline that, he notes, is not confined to the United States: ‘I firmly believe that a great discipline has turned sour if not rancid.’62
Strong words, but that all-important change, he said, has been brought about by the injection of ideology into the discipline – to wit, a belief that a single variable can explain human behaviour: ‘Thus, sociology has largely become a repository of discontent, a gathering of individuals who have special agendas, from gay and lesbian rights to liberation theology’;63 ‘Any notion of a common democratic culture or a universal scientific base has become suspect. Ideologists masked as sociologists attack it as a dangerous form of bourgeois objectivism or, worse, as imperialist pretension…. That which sociology once did best of all, support the humanistic disciplines in accurately studying conditions of the present to make the future a trifle better, is lost. Only the revolutionary past and the beatific future are seen as fit for study, now that the aim of sociology has become to retool human nature and effect a systematic overhaul of society.’64
The result, he said, has been the departure from sociology of all those scholars for whom social science is linked to public policy – social planners, penologists, demographers, criminologists, hospital administrators, and international development specialists.65 Sociology, rather than being the study of ideology, has become ideology itself – in particular, Marxist ideology. ‘Every disparity between ghetto and suburb is proof that capitalism is sick. Every statistic concerning increases in homicide and suicide demonstrates the decadence of America or, better, resistance to America. Every child born out of wedlock is proof that “the system” has spun out of control.’66

For Horowitz, the way to rehabilitate and reinvent sociology is for it to tackle some big sympathetic issues, to describe those issues in detail and without bias, and to offer explanation. The Holocaust is the biggest issue, he wrote, still – amazingly – without a proper sociological description or a proper sociological explanation. Other areas where sociology should seek to offer help – to government and public alike – are in drug abuse, AIDS, and an attempt to define ‘the national interest,’ which would help foreign policy formulation. He also outlines a sociological ‘canon,’ a list of authors with whom, he said, any literate sociologist should be familiar. Finally, he makes a point very germane to the thesis of this chapter, that the positive hour, or ‘positive bubble’ as he put it, might not always last, or produce a vision of society that we can live with.67 It is, above all, he said, the sociologist’s job to help us see past this bubble, to explore how we might live together. Horowitz’s book finishes up far more positive in tone than it starts out, but it cannot be said that sociology has changed much as a result; its decomposition is still its dominant feature.

Horowitz’s thoughts bring us back to the Introduction, and to the fact that, in this book, I have sought to shift the focus away from political and military events. Of course, as was said at the beginning, this is an artificial division, a convenience merely for the sake of exploring significant and interesting issues often sidelined in more conventional histories. Yet one of the more challenging aspects of politics lies in the attempt to adapt such findings as those reported here to the governance of peoples. Whole books could be written about both the theory and practicalities of such adaptation, and while there is certainly no space to attempt such an exercise in the present work, it is necessary to acknowledge such a limit, and to make (as I see it) one all-important point.

This is that neither side of the conventional political divide (left versus right) holds all the virtues when it comes to dealing with intellectual and social problems. From the left, the attempted marriage of Marx and Freud has failed, as it was bound to do, being based on two rigid and erroneous theories about human nature (Freud even more so than Marx). The postmodern tradition is more successful as a diagnosis and description than as a prognosis for a way forward, except in one respect – that it cautions us to be wary of ‘big’ ideas that work for all people, in all places, at all times.

Looking back over the century, and despite the undoubted successes of the free-market system, one wonders whether the theorists of the right have any more reason to feel satisfied. Too often a substantial part of what they have offered is a directive to do nothing, to allow matters to take their ‘natural’ course, as if doing nothing is somehow more natural than doing something. The theories of Milton Friedman or Charles Murray, for example, seem very plausible, until one thinks of the writings of George Orwell. Had Friedman and Murray been writing in the 1930s, they would probably have still been arguing for the status quo, for economics to take its ‘natural’ course, for no intervention. Yet who can doubt that Orwell helped bring about a shift in sensibility that, combined with the experience of war, wrought a major change in the way the poor were regarded? However unsatisfactory the welfare state is now, it certainly improved living conditions for millions of people across the world. This would not have happened if left to laissez-faire economists.

Perhaps Karl Popper had it about right when he said that politics is like science, in that it is – or ought to be – endlessly modifiable. Under such a system, a welfare state might be a suitable response to a certain set of circumstances. But, once it has helped to create a healthier, wealthier population in which far greater numbers survive into old age, with all the implications that has for disease and the economic profile of an entire people, surely a different set of circumstances is called for? We should know by now – it is one of the implicit messages of this book – that in a crowded world, the world of mass society (a twentieth-century phenomenon), every advance is matched by a corresponding drawback or problem. In this regard, we should never forget that science teaches us two lessons, one just as important as the other. While it has revealed to us some of the fundamentals of nature, science has also taught us that the pragmatic, piecemeal approach to life is by far the most successful way of adapting. We should beware grand theories.

As the century drew to its close, the shortcomings and failures first recognised by Gunther Stent and John Horgan began to grow in importance – in particular the idea that there are limits to what science can tell us and what, in principle, we can know. John Barrow, professor of astronomy at the University of Sussex, put these ideas together in his 1998 book Impossibility: The Limits of Science and the Science of Limits.68
‘Science,’ said Barrow in his concluding chapter, ‘exists only because there are limits to what Nature permits. The laws of Nature and the unchanging “constants” of Nature define the borders that distinguish our Universe from a host of other conceivable worlds where all things are possible…. On a variety of fronts we have found that growing complexity ultimately leads to a situation that is not only limited, but self-limiting. Time and again, the development of our most powerful theories has followed this path: they are so successful that it is believed that they can explain everything…. The concept of a “theory of everything” occasionally rears its head. But then something unexpected happens. The theory predicts that it cannot predict: it tells us that there are things it cannot tell us.’69 In particular, Barrow says, taking as his starting point Kurt Gödel’s 1931 incompleteness theorem, there are things mathematics cannot tell us; there are limits that arise from our humanity and the evolutionary heritage we all share, which determine our biological nature and, for instance, our size. There are limits to the amount of information we can process; the great questions about the nature of the universe turn out to be unanswerable, because for one thing the speed of light is limited. Chaoplexity and randomness may well be beyond us in principle. ‘Whether it be an election, a bank of linked computers, or the “voting” neurones inside our head, it is impossible to translate individual rational choices into collective rationality.’70

Not everyone agrees with Barrow, but if he is right, then the end of the century has brought with it yet another change in sensibility, perhaps the most important since Galileo and Copernicus: we are living near the end of the positive hour, and a ‘post-scientific age’ awaits us. For many, this can’t come soon enough, but it is important not to overstate the case – as John Maddox has shown, there is still plenty of science to be done. Nevertheless, science has always promised, however far down the road, an ultimate explanation of the universe. If, as Barrow and others tell us, that now looks like a theoretical impossibility, who can tell what the consequences will be? Where will the evolution of knowledge forms next lead?

One thing seems clear: as Eliot said, there’s no going back. The arch-critics of science, with their own brand of secular zealotry, while they often skilfully describe why science can never be a complete answer to our philosophical condition, usually have little to add to it or replace it with. They tend either to look back to an age of religion or to recommend some sort of Heideggerean ‘submission’ to nature, to just ‘be.’ They lament the ‘enchantment’ that has disappeared as we have turned away from God, but are unclear as to whether ‘reenchantment’ could ever be meaningful.

The British philosopher Roger Scruton is one of the most articulate of such thinkers. His An Intelligent Person’s Guide to Modern Culture (1998) brilliantly punctures the pretensions, postures, and vacuities of modernist and popular culture, its failure to provide the ‘experience of membership’ that was true in an age of shared religious high culture, and laments how we can ever learn to judge ‘in a world that will not be judged.’ His view of science is sceptical: ‘The human world is a world of significances, and no human significance can be fully grasped by science.’ For Scruton, fiction, the imagination, the world of enchantment, is the highest calling, for it evokes sympathy for our condition, toleration, shared feelings, a longing that cannot be fulfilled, and ‘processes’ that, like Wagner’s operas, lie deeper than words.71

Scruton is nostalgic for religion but does not make the most of its possibilities. Perhaps the most sophisticated religious postscientific argument has come from John Polkinghorne. A physicist by training, Polkinghorne studied with Paul Dirac, Murray Gell-Mann, and Richard Feynman, became professor of mathematical physics at Cambridge and therefore a close colleague of Stephen Hawking, and in 1982 was ordained as a priest in the Anglican Church. His thesis in Beyond Science (1996) has two elements: one, that ‘our scientific, aesthetic, moral and spiritual powers greatly exceed what can convincingly be claimed to be needed in the struggle for survival, and to regard them as merely a fortunate but fortuitous by-product of that struggle is not to treat the mystery of their existence with adequate seriousness’;72 and two, that ‘the evolution of conscious life seems the most significant thing that has happened in cosmic history and we are right to be intrigued by the fact that so special a universe is required for its possibility.’73
In fact, Polkinghorne’s main argument for his belief in a creator is the anthropic principle – that our universe is so finely tuned, providing laws of physics that allow for our existence, that a creator must be behind it all. This is an updated argument as compared with those of the Bishop of Birmingham and Dean Inge in the 1930s, but Polkinghorne’s case for God still lies in the details that we don’t – and maybe can’t – grasp. In that sense it is no different from any of the arguments about religion and science that have gone before.74

In his intellectual autobiography, Confessions of a Philosopher (1997), Bryan Magee writes as follows: ‘Not being religious myself, yet believing that most of reality is likely to be permanently unknowable to human beings, I see a compelling need for the demystification of the unknowable. It seems to me that most people tend either to believe that all reality is in principle knowable or to believe that there is a religious dimension to things. A third alternative – that we can know very little but have equally little ground for religious belief – receives scant consideration, and yet seems to me to be where the truth lies.’75
I largely share Magee’s views as expressed here, and I also concur with the way he describes ‘the main split in western philosophy.’ There is, he says, the analytic approach, mainly identified with the logical positivists and British and American philosophers, who are fascinated by science and its implications and whose main aim is ‘explanation, understanding, insight.’76 In contrast to them are what are known in Britain and America as the ‘continental’ school of philosophers, led by such figures as Husserl and Heidegger but including Jacques Lacan, Louis Althusser, Hans-Georg Gadamer, and Jürgen Habermas, and looking back to German philosophy – Kant, Hegel, Marx, and Nietzsche. These philosophers are not so interested in science as the analytic ones are, but they are interested in Freudian (and post-Freudian) psychology, in literature, and in politics. Their approach is rhetorical and partisan, more interested in comment than in understanding.77
This is an important distinction, I think, because it divides some of our deepest thinkers between science, on the one hand, and Freud, literature, and politics on the other. Whatever we do, it seems we cannot get away from this divide, these ‘two cultures,’ and yet if I am right the main problems facing us require us to do so. In the twentieth century, what we may characterise as scientific/analytic reason has been a great success, by and large; political, partisan, and rhetorical reason, on the other hand, has been a catastrophe. The very strengths of analytic, positive/passive reason have lent political rhetorical reason an authority it does not deserve. George Orwell, above and before everyone, saw this and sought to bring home the point. Oswald Spengler and Werner Sombart’s distinction between heroes and traders is recast as one between heroes and scientists.
