14
For an argument about the pleasures of limited worlds in another technological realm, see Natasha Schüll’s work on gambling, Addiction by Design: Machine Gambling in Las Vegas (Princeton, NJ: Princeton University Press, forthcoming).
15
See, for example, Bill Gates, “A Robot in Every Home,” Scientific American, January 2007, www.scientificamerican.com/article.cfm?id=a-robot-in-every-home (accessed September 2, 2009).
16
See Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995). On life as performance, the classic work is Erving Goffman, The Presentation of Self in Everyday Life (Garden City, NY: Doubleday Anchor, 1959).
17
The apt phrase “identity workshop” was coined by my then student Amy Bruckman. See “Identity Workshop: Emergent Social and Psychological Phenomena in Text-Based Virtual Reality” (unpublished essay, Media Lab, Massachusetts Institute of Technology, 1992), www.cc.gatech.edu/~asb/papers (accessed September 2, 2009).
18
Sociologists distinguish between strong ties, those of family and close friendship, and weak ties, the bonds of acquaintanceship that make us comfortable at work and in our communities. Facebook and Twitter, friending rather than friendship—these are worlds of weak ties. Today’s technology encourages a celebration of these weak ties as the kind we need in the networked life. The classic work on weak ties is Mark S. Granovetter, “The Strength of Weak Ties,” American Journal of Sociology 78, no. 6 (May 1973): 1360-1380.
19
See, for example, Matt Richtel, “In Study, Texting Lifts Crash Risk by Large Margin,” New York Times, July 27, 2009, www.nytimes.com/2009/07/28/technology/28texting.html (accessed September 1, 2009). On the pressure that friends and family members put on drivers who text, see “Driver Texting Now an Issue in Back Seat,” New York Times, September 9, 2009, www.nytimes.com/2009/09/09/technology/09distracted.html (accessed September 9, 2009). As I complete this book, Oprah Winfrey has made texting while driving a personal crusade, encouraging people across America to sign an online pledge not to text and drive. See “Oprah’s No Phone Zone,” Oprah.com, www.oprah.com/packages/no-phone-zone.html (accessed May 30, 2010).
20
The teenage national average as of January 2010 is closer to thirty-five hundred texts a month; my affluent, urban neighborhood has a far higher number. Roger Entner, “Under-aged Texting: Usage and Actual Cost,” Nielsen.com, January 27, 2010, http://blog.nielsen.com/nielsenwire/online_mobile/under-aged-texting-usage-and-actual-cost (accessed May 30, 2010). On texting’s impact on teenage life, see Katie Hafner, “Texting May Be Taking Its Toll,” New York Times, May 25, 2009, www.nytimes.com/2009/05/26/health/26teen.html?_r=2&8dpc (accessed July 21, 2009).
21
To find friends in the neighborhood, people turn to Loopt, a popular iPhone “app.”
22
A witty experiment suggests that Facebook “friends” won’t even show up when you invite them to a party. Hal Niedzviecki, “Facebook in a Crowd,” New York Times, October 24, 2008, www.nytimes.com/2008/10/26/magazine/26lives-t.html (accessed July 27, 2010).
23
From Winston Churchill’s remarks to the English Architectural Association in 1924, available at the International Centre for Facilities website at www.icf-cebe.com/quotes/quotes.html (accessed August 10, 2010). Churchill’s comment is, of course, very similar to the spirit of Marshall McLuhan. See, for example, Understanding Media: The Extensions of Man (1964; Cambridge, MA: MIT Press, 1994).
CHAPTER 1: NEAREST NEIGHBORS
 
1
Weizenbaum had written the program a decade earlier. See Joseph Weizenbaum, “ELIZA—a Computer Program for the Study of Natural Language Communication Between Man and Machine,” Communications of the ACM 9, no. 1 (January 1966): 36-45.
2
See Joseph Weizenbaum, Computer Power and Human Reason: From Judgment to Calculation (San Francisco: W. H. Freeman, 1976).
3
Whatever kind of companionship is sought, a classic first step is to make robots that are physically identical to people. In America, David Hanson has an Albert Einstein robot that chats about relativity. At the TED conference in February 2009, Hanson discussed his project to create robots with empathy as the “seeds of hope for our future.” See http://www.ted.com/talks/david_hanson_robots_that_relate_to_you.html (accessed August 11, 2010). On Hanson, also see Jerome Groopman, “Robots That Care: Advances in Technological Therapy,” The New Yorker, November 2, 2009, www.newyorker.com/reporting/2009/11/02/091102fa_fact_groopman (accessed November 11, 2009).
These days, you can order a robot clone in your own image (or that of anyone else) from a Japanese department store. The robot clone costs $225,000 and became available in January 2010. See “Dear Santa: I Want a Robot That Looks Just Like Me,” Ethics Soup, December 17, 2009, www.ethicsoup.com/2009/12/dear-santa-i-want-a-robot-that-looks-like-me.html (accessed January 12, 2010).
4
Brandon Griggs, “Inventor Unveils $7,000 Talking Sex Robot,” CNN, February 1, 2010, www.cnn.com/2010/TECH/02/01/sex.robot/index.html (accessed June 9, 2010).
5
Raymond Kurzweil, The Singularity Is Near: When Humans Transcend Biology (New York: Viking, 2005). On radical images of our future, see Joel Garreau, Radical Evolution: The Promise and Peril of Enhancing Our Minds, Our Bodies—and What It Means to Be Human (New York: Doubleday, 2005).
6
For my further reflections on computer psychotherapy, see “Taking Things at Interface Value,” in Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995), 102-124.
7
There is, too, a greater willingness to enter into a relationship with a machine if people think it will make them feel better. On how easy it is to anthropomorphize a computer, see Byron Reeves and Clifford Nass, The Media Equation: How People Treat Computers, Television and New Media Like Real People and Places (New York: Cambridge University Press, 1996). See also, on computer psychotherapy, Harold P. Erdman, Marjorie H. Klein, and John H. Greist, “Direct Patient Computer Interviewing,” Journal of Consulting and Clinical Psychology 53 (1985): 760-773; Kenneth Mark Colby, James B. Watt, and John P. Gilbert, “A Computer Method for Psychotherapy: Preliminary Communication,” Journal of Nervous and Mental Diseases 142, no. 2 (1966): 148-152; Moshe H. Spero, “Thoughts on Computerized Psychotherapy,” Psychiatry 41 (1978): 281-282.
8
For my work on early computational objects and the question of aliveness, see Sherry Turkle, The Second Self: Computers and the Human Spirit (1984; Cambridge, MA: MIT Press, 2005). That work on aliveness continued with a second generation of computational objects in Turkle, Life on the Screen. My inquiry, with an emphasis on children’s reasoning rather than their answers, is inspired by Jean Piaget, The Child’s Conception of the World, trans. Joan Tomlinson and Andrew Tomlinson (Totowa, NJ: Littlefield, Adams, 1960).
9
On the power of the liminal, see, for example, Victor Turner, The Ritual Process: Structure and Anti-Structure (Chicago: Aldine, 1969), and The Forest of Symbols: Aspects of Ndembu Ritual (1967; Ithaca, NY: Cornell University Press, 1970). See also Mary Douglas, Purity and Danger: An Analysis of Concepts of Pollution and Taboo (London: Routledge and Kegan Paul, 1966).
10
Piaget, The Child’s Conception.
11
Turkle, The Second Self, 33-64.
12
Children, in fact, settled on three new formulations. First, when it came to thinking through the aliveness of computational objects, autonomous motion was no longer at the heart of the matter. The question was whether computers had autonomous cognition. Second, they acknowledged that computer toys might have some kind of awareness (particularly of them) without being alive. Consciousness and life were split. Third, computers seemed alive because they could think on their own but were only “sort of alive” because their histories undermined their autonomy. So an eight-year-old said that Speak & Spell was “sort of alive” but not “really alive” because it had a programmer. “The programmer,” he said, “gives it its ideas. So the ideas don’t come from the game.” These days, sociable robots, with their autonomous behavior, moods, and faces, seem to take the programmer increasingly out of the picture. And with the formulation “alive enough,” children put the robots on a new terrain. As for cognition, it has given way in children’s minds to the capacity to show attention, to be part of a relationship of mutual affection.
13
Turkle, Life on the Screen, 169.
14
Turkle, Life on the Screen, 173-174.
15
The quotation is from a journal entry by Emerson in January 1832. The passage reads in full, “Dreams and beasts are two keys by which we are to find out the secrets of our nature. All mystics use them. They are like comparative anatomy. They are our test objects.” See Joel Porte, ed., Emerson in His Journals (Cambridge, MA: Belknap Press, 1982), 81.
16
According to psychoanalyst D. W. Winnicott, objects such as teddy bears, baby blankets, or a bit of silk from a first pillow mediate between the infant’s earliest bonds with the mother, who is experienced as inseparable from the self, and other people, who will come to be experienced as separate beings. These objects are known as “transitional,” and the infant comes to know them both as almost inseparable parts of the self and as the first “not me” possessions. As the child grows, these transitional objects are left behind, but the effects of early encounters with them remain. We see them in the highly charged relationships that people have with later objects and experiences that call forth the feeling of being “at one” with things outside the self. The power of the transitional object is associated with religion, spirituality, the perception of beauty, sexual intimacy, and the sense of connection with nature. And now, the power of the transitional object is associated with computers and, even more dramatically, with sociable robots. On transitional objects, see D. W. Winnicott, Playing and Reality (New York: Basic Books, 1971).
17
In the early 1980s, children’s notion of people as “emotional machines” seemed to me an unstable category. I anticipated that later generations of children would find other formulations as they learned more about computers. They might, for example, see through the apparent “intelligence” of the machines by developing a greater understanding of how they were created and operated. As a result, children might be less inclined to see computers as kin. However, in only a few years, things moved in a very different direction. Children did not endeavor to make computation more transparent. Like the rest of the culture, they accepted it as opaque, a behaving system. Children taking sociable robots “at interface value” are part of a larger trend. The 1984 introduction of the Macintosh encouraged its users to stay on the surface of things. The Macintosh version of “transparency” stood the traditional meaning of that word on its head. Transparency used to refer to the ability to “open up the hood” and look inside. On a Macintosh it meant double-clicking an icon. In other words, transparency had come to mean being able to make a simulation work without knowing how it worked. The new transparency is what used to be called opacity. For more on this question, see Turkle, Life on the Screen, especially 29-43, and Sherry Turkle, Simulation and Its Discontents (Cambridge, MA: MIT Press, 2009).
18
Our connections with the virtual intensify when avatars look, gesture, and move like us; these connections become stronger still when we move from the virtual to the embodied robotic. Computer scientist Cory Kidd studied attachment to a computer program. In one condition, the program issued written commands that told study participants what to do. In a second condition, an on-screen avatar issued the same instructions. In a third condition, a physical robot gave the same instructions. The robot engendered the greatest attachment. Cory Kidd, “Human-Robot Interaction” (master’s thesis, Massachusetts Institute of Technology, 2003).
19
The Tamagotchi website cautions about unfavorable outcomes: “If you neglect your little cyber creature, your Tamagotchi may grow up to be mean or ugly. How old will your Tamagotchi be when it returns to its home planet? What kind of virtual caretaker will you be?” The packaging on a Tamagotchi makes the agenda clear: “There are a total of 4 hearts on the ‘Happy’ and ‘Hunger’ screens and they start out empty. The more hearts that are filled, the better satisfied Tamagotchi is. You must feed or play with Tamagotchi in order to fill the empty hearts. If you keep Tamagotchi full and happy, it will grow into a cute, happy cyberpet. If you neglect Tamagotchi, it will grow into an unattractive alien.” The manufacturer of the first Tamagotchi is Bandai. Its website provides clear moral instruction that links nurturance and responsibility. See the Bandai website at www.bandai.com (accessed October 5, 2009).
