Stranger Than We Can Imagine
John Higgs
As the century progressed scientists and mathematicians, such as Heisenberg, Freud, Gödel and Lorenz, reinforced the non-existence of the omphalos and instead stressed uncertainty, incompleteness and our lack of a self-contained, non-paradoxical system. At the same time artists and philosophers such as Picasso, Korzybski, Joyce and Leary continued their exploration of the human psyche, and came to much the same conclusion. The multiple-perspective models they pioneered made possible the individualism preached by Crowley, Rand, Thatcher and The Rolling Stones. We were all separate, and different, and considered our own perspectives to be personally valid. Our worldview could only be absolute if we could force everyone else to share it, and not even Hitler or Stalin could achieve that.
Our personal realities, then, were relative. We simply did not have anything absolute to orientate ourselves to. The closest thing the people of the northern hemisphere had to a fixed point in their lives was Polaris, the North Star, the only point in the heavens that remains fixed over the span of a human life. And even Polaris wobbles a little.
We might not like this. We might curse relativity, and crave an absolute position. But that doesn’t change the fact that we do not have one.
Postmodernism was not some regrettable intellectual folly, but an accurate summation of what we had learnt. Even if it was intellectual quicksand from which escape did not appear possible. And which nobody seemed to like very much.
By the early twenty-first century the entire edifice of postmodernism had become routinely rejected. That, unfortunately, tended to include all the understanding that led up to it. Our current ideology stresses that of course there is an absolute. Of course there is truth. Richard Dawkins makes this argument when he says that ‘We face an equal but much more sinister challenge from the left, in the shape of cultural relativism – the view that scientific truth is only one kind of truth and it is not to be especially privileged.’ Or as Pope Benedict XVI said in his pre-conclave speech in 2005, ‘Today, a particularly insidious obstacle to the task of education is the massive presence in our society and culture of that relativism which, recognising nothing as definitive, leaves as the ultimate criterion only the self with its desires. And under the semblance of freedom it becomes a prison for each one, for it separates people from one another, locking each person into his or her own ego.’ As Martin Luther King put it, ‘I’m here to say to you this morning that some things are right and some things are wrong. Eternally so, absolutely so. It’s wrong to hate. It always has been wrong and it always will be wrong.’ Or to quote the British philosopher Roger Scruton, ‘In argument about moral problems, relativism is the first refuge of the scoundrel.’ The existence of absolute truth has also been declared by neoliberalists and socialists, by terrorists and vigilantes, and by scientists and hippies. The belief in certainty is a broad church indeed.
All these people disagree on what form this absolutism takes, unfortunately. But they’re pretty sure that it exists.
This faith in absolute certainty is not based on any evidence for the existence of certainty. It can sometimes appear to stem from a psychological need for certainty which afflicts many people, particularly older men. Cultural debate in the early twenty-first century has, as a result, descended into a War of the Certain. Different factions, all of whom agree about the existence of absolute truth, are shouting down anyone who has a different definition of that absolute truth.
Fortunately, true absolutism is rare. Most people, scientists and non-scientists alike, unconsciously adopt a position of multiple-model agnosticism. This recognises that we make sense of the world by using a number of different and sometimes contradictory models. A multiple-model agnostic would not say that all models are of equal value, because some models are more useful than others, and the usefulness of a model varies according to context. They would not concern themselves with infinite numbers of interpretations as that would be impractical, but they understand that there is never only one interpretation. Nor would they agree that something is not ‘real’ because our understanding of it is a cultural or linguistic construct. Things can still be real, even when our understanding of them is flawed. Multiple-model agnostics are, ultimately, pretty loose. They rarely take impractical, extreme positions, which may be why they do not do well on the editorial boards of academic postmodern journals.
Multiple-model agnosticism is an approach familiar to any scientist. Scientists do not possess a grand theory of everything, but they do have a number of competing and contradictory models, which are valid at certain scales and in certain circumstances. A good illustration of this point is the satellite navigation device in a car. The silicon chip inside it utilises our understanding of the quantum world; the GPS satellite it relies on to find its position was placed in orbit by Newtonian physics; and that satellite relies on Einstein’s theory of relativity in order to be accurate. Even though the quantum, Newtonian and relativity models all contradict each other, the satnav still works.
Scientists generally don’t lose too much sleep over this. A model is, by definition, a simplified incomplete version of what it describes. It may not be defendable in absolutist terms, but at least we can find our route home.
There is still, however, a tendency to frame these contradictory models as part of a hidden absolute, perhaps in order to avoid the whiff of postmodernism.
The absolutist approach to the contradictory nature of scientific models is to say that while all those models are indeed flawed, they will be superseded by a grand theory of everything, a wonder theory that does not contain any paradoxes and which makes sense of everything on every scale. The 2005 book about the quest for a Theory of Everything by the Canadian science journalist Dan Falk was called Universe on a T-Shirt, due to the belief that such a grand theory would be concise enough, like the equation E=mc², to be printed on a T-shirt.
To a multiple-model agnostic, this idea is a leap of faith. It is reminiscent of Einstein’s mistaken belief that quantum uncertainty must be wrong, because he didn’t like the thought of it. Of course if such a theory were found, multiple-model agnostics would be out celebrating with everyone else. But until that day comes it is not justifiable to assume that such a theory is out there, waiting. It is a call to an external ideal that is currently not supported by the data. A scientist who says that such a theory must exist is displaying the same ideological faith as someone who says God must exist. Both could be brilliant, but presently we should remain relativist enough to recognise them as unproven maybes.
In 1981 the American pop artist Andy Warhol began a series of paintings of the dollar sign. Warhol was a commercial artist from Pittsburgh who found fame in the 1960s with his gaudily coloured, mass-produced screen-prints of cultural icons. His most famous work, a series of prints of a Campbell’s soup can, probably captured the essence of the postwar Golden Age better than any other work of visual art.
The dollar sign paintings Warhol began in the 1980s were not the last work he did. He died in 1987, and he continued to mass-produce typically Warholian canvases until the end. It is tempting, however, to see the dollar sign as the moment he finally ran out of ideas. With the exception of an increasing preoccupation with death, there is little in his 1980s work that was new. The dollar sign paintings seem in many ways to be the conclusion of his life’s work.
They were big paintings, over two metres tall and just short of two metres wide. Each single dollar canvas could take up an entire wall in a gallery. The experience of walking into a white space, empty except for huge, bright dollar signs, is an uncomfortable one. Initially it is tempting to dismiss them as superficial, but there remains a lingering doubt that maybe Warhol had a point. Perhaps there really was nothing else he could do but paint the dollar sign as big and bright as he could. Perhaps the neoliberalists were correct and their dollar god was the only genuine power in the world. Maybe we had had a valid omphalos all along.
Money, it seemed in the 1980s, was the only thing solid enough for people to orientate themselves by. Individualism and Do What Thou Wilt had become the fundamental principle of living, so the power to achieve your desires became all-important. That power was most potently distilled in the form of money. It was money that allowed you to do what you wanted, and a lack of money that stopped you. It did not matter that a world where the dollar sign was the only true subject of worship was fundamentally grim. In a postmodern culture, all such judgement calls were subjective. Our artists, thinkers and scientists were free to offer up alternatives to the court of public opinion. The fact that they failed to do so seemed telling.
In 1992, the American political scientist Francis Fukuyama published his most influential book, The End of History and the Last Man. Fukuyama argued that, with the collapse of the Soviet Union, the neoliberal argument had won. Capitalism was the only option on the table and liberal democracies were the only justifiable form of state. Fukuyama claimed that we had reached the predetermined, final form of society, an argument that was essentially teleological and therefore religious in nature. He wrote in a deliberately prophetic, evangelical tone, proclaiming the ‘Good News’ of the eternal triumph of the capitalist paradise.
In this context, Warhol’s paintings of dollar signs made complete sense. Was this, then, the endpoint of the twentieth century? Was this how our story was going to end?
Fukuyama was, fortunately, entirely wrong, as he would now be the first to admit. He split from the neoconservative movement over the US invasion of Iraq, a conflict he initially was in favour of, and voted for Barack Obama in 2008.
The individualism that had fuelled the neoliberal triumph was not the end point to humanity that people like Fukuyama or Margaret Thatcher once believed it to be. It was a liminal period. The twentieth century had been the era after one major system had ended but before the next had begun. Like all liminal periods it was a wild time of violence, freedom and confusion, because in the period after the end of the old rules and before the start of the new, anything goes.
The coming era would keep the individual freedom that had been so celebrated in the twentieth century, but it would wed it to the last thing that the likes of The Rolling Stones wanted. Individual freedom was about to connect to the one thing it had always avoided. Freedom was about to meet consequence, and a new period of history was about to begin. In the research centres of Silicon Valley, a feedback loop was being built.
A Color Run participant in Barcelona taking a photo with a selfie stick, 2014
(Artur Debat/Getty)
The twenty-first century began over a twenty-four-hour period, which started at 11 a.m. on 31 December 1999. Or at least, that’s how it appeared to those in Britain watching the BBC’s coverage of the global New Year celebrations.
At 11 a.m. GMT it was midnight in New Zealand, which marked the occasion with a celebratory firework display in Auckland. This was beamed around the world by orbiting communication satellites, technology that was then about thirty-five years old and already taken for granted. Two hours later, a spectacular display over Sydney Harbour marked eastern Australia’s entrance into the twenty-first century. The television coverage continued as the hours passed and a procession of major cities from East to West left the twentieth century behind.
In the Hebrew calendar the date was 22 Teveth 5760. Under the Islamic calendar it was 23 Ramadan 1420. To the UNIX computer operating system, it was 946598400. The significance of the moment was only a product of the perspective used to define it.
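(As an aside, the UNIX figure is easy to check: UNIX time simply counts the seconds elapsed since midnight UTC on 1 January 1970, so 946598400 seconds corresponds to midnight UTC on 31 December 1999. A minimal sketch in Python — the language choice is mine, not the book’s:)

```python
from datetime import datetime, timezone

# UNIX time counts seconds since the "epoch": 1970-01-01 00:00:00 UTC.
ts = 946_598_400

# Convert the timestamp back into a calendar date, in UTC.
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.strftime("%d %B %Y, %H:%M UTC"))  # 31 December 1999, 00:00 UTC
```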
As we noted at the start, New Year’s Day 2000 was not technically the beginning of the new millennium. That honour should have come a year later, on 1 January 2001. A few people were bothered by this, and wrote to newspapers, but were largely ignored. New Year’s Eve 1999 was going to mark the end of the twentieth century, because that’s how the planet of individuals wanted it. They had spent the last eighteen years listening to Prince singing that he wanted to ‘party like it’s 1999’. They were impatient to party that way, too. This was a world where the population drove cars and understood the strange appeal of watching all the digits in the milometer change at the same time. Who wanted to wait another year? Seeing 1999 turn into 2000 was clearly more fun than seeing 2000 turn into 2001. People no longer recognised the claims to authority that institutions such as the Royal Observatory at Greenwich had once had. The twenty-first century began at midnight on 31 December 1999 by mutual consent. That’s what people wanted.
In London, to the crowd enjoying a drunken singalong on the banks of the River Thames, the twenty-first century was a blank slate and full of potential. It still felt clean at that point, unsullied by what was to come. They had no inkling of 9/11, the War on Terror, or the coming global financial crash. Among the many songs that those revellers put their heart and soul into was a rendition of the classic Frank Sinatra standard ‘My Way’. That song, perhaps more than any other, could symbolise the twentieth century. The lyrics are based around words such as ‘I’ or ‘me’, never ‘we’ or ‘us’. I’ll make it clear, he says, I’ll state my case, of which I’m certain. It is far from the only song to base its lyrics on ‘me, me, me’, of course, but the proud manner of Sinatra’s delivery marks it out as something special. When it was released it sounded heroic, a statement of pride in one man’s ability to face life on his own terms. And yet, as we entered the twenty-first century, it was becoming possible to see that lyric in a different light. It is not the song of a man who understands himself to be part of something larger, or connected to a wider context in any meaningful way. It’s the song of an isolated man who has spent his entire life attempting to force his own perspective on the world.