Present Shock: When Everything Happens Now

That is, if the smart phone only sat there and waited for us to read it. More often than not, it’s the phone or laptop demanding our attention, alerting us to the upcoming event in our schedule, or unpacking one of a seemingly infinite number of its processes into our attention. Indeed, if the Axial Age was coordinated by the calendar, and the clockwork universe by the schedule, the digital era subjects us to the authority of code. Our children may have their afternoons scheduled, but we adults live in a world that is increasingly understood as a program.

Where a schedule simply lists an appointment we are supposed to attend or a task we are to complete, a program carries out the actual process with us or for us. The clock dictated only time; the computer (and smart phone, and biometric device on the wrist) tells us what we should be doing when. We once read books at our own pace; computer animations, YouTube movies, and video games play out at a speed predetermined by the programmer. The GPS on the dashboard knows our entire route to the dentist’s office and offers us the turns we need to take on a need-to-know basis. The computer graphic on the exercise machine tells us how fast to pedal and turns the simulation uphill when it knows we need it. We surrender this authority over our pacing because we know that, for the most part, the programs will be better at helping us meet our stated goals than we are. The events carried out by computers are less like schedules than they are like simulations; less like sheet music, more like a piano roll. In a scheduled world, you are told you have half an hour to peruse an exhibit at the museum; in a programmed world, you are strapped into the ride at Disneyland and conveyed through the experience at a predetermined pace. However richly conceived, the ride’s complexity is limited because its features are built-in, appearing and unfolding in sync with the motions of your cart. The skeleton dangles, the pirate waves, and the holographic ship emerges from the cloud of smoke. In the programmed life, the lights go on and off at a specified time, the coffee pot activates, the daylight-balanced bulb in the programmable clock radio fades up. Active participation is optional, for these events will go on without us and at their own pace.

As we grow increasingly dependent on code for what to do and when to do it, we become all the more likely to accept a model of human life itself as predetermined by the codons in our genomes. All we can do is watch the program of our DNA unfold or, at best, change our fates by altering the sequence of genes dictating what’s supposed to happen. Free will and autonomy are eventually revealed to us to be a simulation of some sort, while the reality we think we’re participating in is really just the predetermined dance of pure information. The timekeeper is no longer the controller of the clock but the programmer of the computer.

And instead of taking our cues from the central clock tower or the manager with the stopwatch, we carry our personal digital devices with us. Our daily schedule, dividing work time from time off, is discarded. Rather, we are always-on. Our boss isn’t the guy in the corner office, but a PDA in our pocket. Our taskmaster is depersonalized and internalized—and even more rigorous than the union busters of yesterday. There is no safe time. If we are truly to take time away from the program, we feel we must disconnect altogether and live “off the grid,” as if we were members of a different, predigital era.

Time in the digital era is no longer linear but disembodied and associative. The past is not something behind us on the timeline but dispersed through the sea of information. Like a digital unconscious, the raw data sits forgotten unless accessed by a program in the future. Everything is recorded, yet almost none of it feels truly accessible. A change in file format renders decades of stored files unusable, while a silly, forgotten Facebook comment we wrote when drunk can resurface at a job interview.

In the digital universe, our personal history and its sense of narrative are succeeded by our social networking profile—a snapshot of the current moment. The information itself—our social graph of friends and likes—is a product being sold to market researchers in order to better predict and guide our futures. Using past data to steer the future, however, ends up negating the present. The futile quest for omniscience we looked at earlier in this chapter encourages us, and particularly our businesses, to seek ever fresher, up-to-the-minute samples, as if this will render the present coherent to us. But we are really just chasing after what has already happened and ignoring whatever is going on now. Similarly, as individuals, our efforts to keep up with the latest tweet or update do not connect us to the present moment but ensure that we remain focused on what just happened somewhere else. We guide ourselves and our businesses as if steering a car by watching a slide show in the rearview mirror. This is the disjointed, misapplied effort of digiphrenia.

Yet instead of literally coming to our senses, we change our value system to support the premises under which we are operating, abstracting our experience one step further from terra firma. The physical production of the factory worker gives way to the mental production of the computer user. Instead of measuring progress in acres of territory or the height of skyscrapers, we do it in terabytes of data, whose value is dependent on increasingly smaller units of time-stamped freshness.

Time itself becomes just another form of information—another commodity—to be processed. Instead of measuring change from one state of affairs to another, we measure the rate of change, and the rate at which that rate is changing, and so on. Instead of proceeding from the past to the future, time now proceeds along derivatives, from location to speed to acceleration and beyond. We may like to think that the only constant is change, except for the fact that it isn’t really true—change is changing, too. As Mark McDonald, of IT research and advisory company Gartner, put it, “The nature of change is changing because the flow and control of information has become turbulent, no longer flowing top down, but flowing in every direction at all times. This means that the ability to manage and lead change is no longer based on messaging, communication and traditional sponsorship. Rather it is based on processes of informing, enrolling and adapting that are significantly more disruptive and difficult to manage for executives and leaders.”10

Or as Dave Gray, of the social media consultancy Dachis Group, explains it, “Change is not a once-in-a-while thing so much as something that is going to be happening all the time. Change is accelerating, to the point where it will soon be nearly continuous. Periods of sustained competitive advantage are getting shorter, and there are a host of studies that confirm that. It’s not just something that is happening in technology, either. It’s happening in every industry.”11

These analysts are describing the new turbulence of a present-shock universe where change is no longer an event that happens, but a steady state of existence. Instead of managing change, we simply hope to be iterated into the next version of reality that the system generates. The only enduring truth in such a scheme is evolution, which is why the leading spokespeople for this world-after-calendars-and-clocks tend to be evolutionary scientists: we are not moving through linear time; we are enacting the discrete, punctuated steps of a program. What used to pass for the mysteriousness of consciousness is shrugged off as an emergent phenomenon rising from the complexity of information. As far as we know, they may be right.

It’s not all bad, of course. There are ways to inhabit and direct the programs in our lives instead of simply responding to their commands. There are ways to be in charge. Unlike the workers of the Industrial Age who stood little chance of becoming one of the managing elite, we are not excluded from computing power except through lack of education or the will to learn. As we’ll see, becoming one of the programmers instead of the programmed is probably a better position from which to contend with digitality.

CHRONOBIOLOGY

Thanks to our digital tools, we are living in a new temporal order, one no longer defined by the movement of the heavens, the division and succession of the years, or the acceleration of progress. We are free of the confines of nature, capable of creating simulated worlds in which we can defy gravity, program life, or even resurrect ourselves. And as anyone who has gotten lost in World of Warcraft only to look up at the clock and realize four hours have gone by can tell you, we are also free of time—more so, cognitive studies show, than when reading a book or watching a movie, and with temporal distortions that linger for hours afterward.12

At least our virtual selves enjoy this freedom. We flesh-and-blood humans living back in the real world have still aged four hours, missed lunch, denied ourselves bathroom breaks, and allowed our eyes to dry up and turn red. Like an astronaut traveling at light speed for just a few seconds who returns to an Earth on which ninety years have passed, our digital selves exist in a time unhinged from that of our bodies. Eventually the two realities conflict, leading to present shock. If tribal humans lived in the “total” time of the rotating Earth, digital humans attempt to live in the “no” time of the computer. We simply can’t succeed at it if we bring our bodies along for the ride. Yet when we try to leave them behind, both nature and time come back to reassert their authority over us.

Digital technology is not solely to blame. Indeed, the microchip may be less the cause of this effort to defeat time than its result. Since the prehistoric age, humankind has been using technology to overcome the dictates of nature’s rhythms. Fire allowed us to travel to colder climates and defeat the tyranny of the seasons. It also gave us the ability to sit up past sundown and cook food or tell stories. In the early 1800s, the proliferation of gaslight utterly changed the landscape and culture of London, making the night streets safer and illuminating the urban environment at all hours. New cultures emerged, with new relationships to the time of day and its activities. Nighttime cafés and bars gave rise to new forms of music and entertainment. New flexibility in work scheduling allowed for around-the-clock shifts and factories whose stacks emitted smoke day and night. The invention of jet planes gave us even more authority over time, allowing us to traverse multiple time zones in a single day.

But no matter how well technology overcame the limits of natural time, our bodies had difficulty keeping up. Stress and fatigue among night workers and those whose shifts change frequently are only now being recognized as more than a ploy by labor unions, and if it weren’t for the rather obvious symptoms of jet lag, we might still not have acknowledged the existence of the biological clocks underlying our bodies’ rhythms. For jet lag is more than just a woozy feeling, and it took many years for scientists to realize it had a real influence on our effectiveness. Back in the 1950s, for example, when jet passenger service was still quite novel, Secretary of State John Foster Dulles flew to Egypt to negotiate the Aswan Dam treaty. His minders assumed he would sleep on the plane, and they scheduled his first meeting for shortly after he arrived. He could not think straight, his perceptual and negotiating skills were compromised, and he failed utterly. The USSR won the contract instead, and many still blame this one episode of jet lag for provoking the Cold War.

A decade later, in 1965, the FAA finally began to study the effects of air travel on what were now admitted to be human biological rhythms. For some unknown reason, subjects traveling east to west experienced much greater decline in “psychological performance” than those traveling in other directions.13 The next year, the New York Times Sports section acknowledged the little-understood effects of jet lag on Major League Baseball players: “The Jet Lag can make a recently debarked team too logy for at least the first game of any given series.”14 Coaches became aware of the various behavior patterns associated with travel in each direction, but no one could tell them the mechanisms at play or how to counteract them.

By the 1980s, NASA got on the case. Its Ames Fatigue/Jet Lag Program was established to “collect systematic, scientific information on fatigue, sleep, circadian rhythms, and performance in flight operations.”15 Leaving people in rooms with no external time cues, researchers found that the average person’s biological clock would actually lengthen to a twenty-five-hour cycle. This, they concluded, is why traveling east, which shortens the day, is so much more disorienting than traveling west, which lengthens it. Most important, however, the studies showed that there were clocks inside us, somehow governed by the body, its metabolism, and its chemical processes. Or perhaps we were syncing to something unseen, like the moon, or shifting magnetic fields. Or both. Circadian rhythms, as they came to be called, were real.

The phenomenon had been discovered and measured in plants centuries earlier. In the 1700s, the Swedish botanist Carolus Linnaeus designed a garden that told time: adjacent sections of flowers opened and closed their blossoms an hour apart, around the clock. But if human activities were really governed by the same mysterious cycles as tulips and cicadas, we would have to consider whether our ability to transcend nature through technology was limited by forces beyond our control.

The relatively new field of chronobiology hopes to unravel some of these mysteries, but each new discovery seems to lead to even bigger questions. Some biological cues are clearly governed by simple changes in sunlight. The eye’s photoreceptors sense the darkening sky and send a signal to release melatonin, which makes us sleepy. Watching TV or staring at a bright computer screen in the evening delays or prevents this reaction, leading to sleeplessness. But if sunlight were the only cue by which the body regulated itself, then why and how does a person who can set his own hours drift toward a twenty-five-hour day?
