
7
A sympathetic reading of the possibilities of deep human-robot connections is represented in Peter H. Kahn Jr. et al., “What Is Human? Toward Psychological Benchmarks in the Field of Human-Robot Interaction,” Interaction Studies 8, no. 3 (2007): 363-390, and Peter H. Kahn Jr. et al., “Social and Moral Relationships with Robotic Others?” in Proceedings of the 13th International Workshop on Robot and Human Interactive Communication (RO-MAN ’04) (Piscataway, NJ: Institute of Electrical and Electronics Engineers, 2004), 545-550. Yet in their 2006 paper “Robotic Pets in the Lives of Preschool Children” (Interaction Studies: Social Behavior and Communication in Biological and Artificial Systems 7, no. 3 [2006]: 405-436), Kahn and his colleagues cite John Searle’s 1992 critique of AI in formulating their own. See John Searle, The Rediscovery of the Mind (Cambridge, MA: MIT Press, 1992). Kahn concludes, “Although it is an open question, our sense is that because computerized robots are formal systems, with syntax but not semantics, they will never be capable of engaging in full social relationships or of engendering full moral development in human beings.”
8
The philosopher Emmanuel Lévinas writes that the presence of a face initiates the human ethical compact, which binds us before we know what lies behind a face. The face itself communicates, “You shall not commit murder.” We seem to be summoned by the face even if we are looking at the face of a machine, something that cannot be killed. The robot’s face certainly announces, as Lévinas puts it, “Thou shalt not abandon me.” See Emmanuel Lévinas, “Ethics and the Face,” in Totality and Infinity: An Essay on Exteriority, trans. Alphonso Lingis (Pittsburgh, PA: Duquesne University Press, 1969), 199. Lévinas notes that the capacity to put ourselves in the place of another, alterity, is one of the defining characteristics of the human. I speak of complicity because for a human being to feel that current robots are “others,” the human must construct them as capable of alterity. See Emmanuel Lévinas, Alterity and Transcendence, trans. Michael Smith (New York: Columbia University Press, 1999).
9
See Sherry Turkle et al., “First Encounters with Kismet and Cog: Children Respond to Relational Artifacts,” in Digital Media: Transformations in Human Communication, ed. Paul Messaris and Lee Humphreys (New York: Peter Lang Publishing, 2006). I owe a special debt to Jennifer Audley for her contribution to the design and implementation of this study and to Olivia Dasté and Robert Briscoe for work on the analysis of transcripts.
10
Plato, The Republic, Book Two: The Individual, the State, and Education.
11
J. K. Rowling, Harry Potter and the Chamber of Secrets (New York: Scholastic, 1999), 329.
12
One twelve-year-old is troubled to learn about Scassellati’s imminent departure. She pleads with him, “But Cog has seen you so much. Won’t it miss you? I think Cog sees you as its dad. . . . It is easier for you to teach it than for any of us.” She imagines the worst for the robot’s future. “What if someone was trying to do something bad to the robot and you won’t be there to protect it anymore?”
13
Brian Aldiss, Supertoys Last All Summer Long and Other Stories of Future Time (New York: St. Martin’s Griffin, 2001).
14
See Takayuki Kanda et al., “Interactive Humanoid Robots and Androids in Children’s Lives,” Children, Youth and Environments 19, no. 1 (2009): 12-33, www.colorado.edu/journals/cye (accessed July 4, 2009).
CHAPTER 6: LOVE’S LABOR LOST
 
1
For Paro’s Guinness citation, see “Seal-Type Robot ‘PARO’ to Be Marketed with Best Healing Effect in the World,” Paro Robots, January 4, 2005, www.parorobots.com/pdf/pressreleases/PARO%20to%20be%20marketed%202004-9.pdf (accessed July 27, 2010).
2
Publicity films for Paro show older men and women who live with Paro having breakfast with it, watching television with it, taking it to the supermarket and out to dinner. Sometimes Paro is adopted by a couple and sometimes by a younger person who simply does not enjoy living alone. In interviews, people say they are happy to have company that is easier to take care of than a real pet, company that will not die. See the Paro website at www.parorobots.com (accessed August 10, 2010).
3
Over the years, I have become convinced that in nursing homes, seniors become fascinated by relationships with robots because, among other reasons, the robots bring to the surface tensions about seniors’ autonomy in their institutions. A robot that needs you promotes a fantasy of autonomy: seniors feel competent because something depends on them. Yet, robots can also disrupt fictions of autonomy. They can send the message, “Now you know that you are wholly dependent. Play with this toy. You are like a child.” This push and pull makes the robots compelling even as they disturb. I owe this insight to my research assistant William Taggart. See Sherry Turkle et al., “Relational Artifacts with Children and Elders: The Complexities of Cybercompanionship,” Connection Science 18, no. 4 (December 2006): 347-361, and Cory D. Kidd, William Taggart, and Sherry Turkle, “A Sociable Robot to Encourage Social Interaction Among the Elderly,” Proceedings of the 2006 IEEE International Conference on Robotics and Automation, Orlando, Florida, May 15-19, 2006.
4
See, for example, Toshiyo Tamura et al., “Is an Entertainment Robot Useful in the Care of Elderly People with Severe Dementia?” The Journals of Gerontology Series A: Biological Sciences and Medical Sciences 59 (January 2004): M83-M85, http://biomed.gerontologyjournals.org/cgi/content/full/59/1/M83 (accessed August 15, 2009).
5
Suvendrini Kakuchi, “Robot Lovin’,” Asia Week Magazine Online, November 9, 2001, www.asiaweek.com/asiaweek/magazine/life/0,8782,182326,00.html (accessed September 15, 2006).
6
It is standard for presentations about the “need” for sociable robots to begin with a slide demonstrating the inability to staff service jobs with people because of demographic trends. This slide is often drawn from the 2002 United Nations report “UNPD World Population Ageing: 1950-2050,” United Nations, www.un.org/esa/population/publications/worldageing19502050 (accessed July 8, 2009). The slides in this report dramatize (particularly for the developed world) that there are more and more older people and fewer and fewer younger people to take care of them. Nothing in my argument disputes the slide. I do question the leap from this slide to the inevitability of robots to care for people.
7
The meeting was sponsored by the Association for the Advancement of Artificial Intelligence. There were also presentations on relational agents that dwell within the machine—for example, an “affective” health and weight-loss coach, developed by the chair of the symposium, Timothy W. Bickmore. See Timothy W. Bickmore, “Relational Agents: Effecting Change Through Human-Computer Relationships” (PhD diss., Massachusetts Institute of Technology, 2003), and Timothy W. Bickmore and Rosalind W. Picard, “Towards Caring Machines,” in CHI ’04 Extended Abstracts on Human Factors in Computing Systems (New York: ACM Press, 2004). Another presentation discussed a robotic cat, Max, named one of Time magazine’s “Inventions of the Year” for 2003. If you stroke Max or call it by name, the robot responds. See “Best Inventions 2003: Lap Cat,” Time.com, www.time.com/time/2003/inventions/invcat.html (accessed September 23, 2009). Max is brought to the conference by Elena and Alexander Lubin, who are proponents of “robotherapy.” For an overview of their work, see their entry in the Encyclopedia of Applied Psychology (Oxford: Elsevier, 2004), 289-293.
8
An estimated 26.2 percent of Americans ages 18 and older (and one in five children) suffer from a diagnosable mental disorder in a given year. See National Institute of Mental Health, “Statistics,” http://www.nimh.nih.gov/health/topics/statistics/index.shtml (accessed August 10, 2010), and “The Numbers Count: Mental Disorders in America,” http://www.nimh.nih.gov/health/publications/the-numbers-count-mental-disorders-in-america/index.shtml (accessed August 10, 2010).
9
The movement to use robots in therapeutic situations is encouraged by the observed therapeutic potential of pets and the power of a gaze. See, for example, K. Allen et al., “Presence of Human Friends and Pet Dogs As Moderators of Autonomic Responses to Stress in Women,” Journal of Personality and Social Psychology 61, no. 4 (1991): 582-589, and Michael Argyle and Mark Cook, Gaze and Mutual Gaze (Cambridge: Cambridge University Press, 1976).
10
We now have years of experience of people using games and the Internet as places to, in their words, “say what they couldn’t say in their real lives.” We see people, especially adolescents, using the anonymity and new opportunities of online life to experiment with identity. They try out new things, in safety—never a bad idea. When we see what we do in our lives on the screen, we can learn what we feel we are missing and use this information to enhance our lives in the “real.” Over years of study I have learned that the people who do best with their lives on the screen are those who use them as material for self-reflection. This can be the focus of work between a therapist and a patient. In Sherry Turkle, ed., The Inner History of Devices (Cambridge, MA: MIT Press, 2008), see especially the contributions of therapists John Hamilton, Kimberlyn Leary, and Marcia Levy-Warren.
11
Cory D. Kidd, “Designing for Long-Term Human-Robot Interaction and Application to Weight Loss” (PhD diss., Massachusetts Institute of Technology, 2008). Rose (and Gordon) are pseudonyms that Kidd provided for his subjects.
12
Cory D. Kidd, “Sociable Robots: The Role of Presence and Task in Human-Robot Interaction” (master’s thesis, Massachusetts Institute of Technology, 2003).
13
For an early experiment using a small, plush robot as a trigger for memories, see Marina Bers and Justine Cassell, “Interactive Storytelling Systems for Children: Using Technology to Explore Language and Identity,” Journal of Interactive Learning Research 9, no. 2 (1999): 603-609.
14
See, for example, Erving Goffman, The Presentation of Self in Everyday Life (Garden City, NY: Doubleday Anchor, 1959).
15
The Intel Corporation joins with the Universities of Michigan and Pittsburgh, Carnegie Mellon, and Stanford on the Nursebot project. Nursebot tests a range of ideas for assisting elderly people, such as reminding elderly patients to visit the bathroom, take medicine, drink, or see the doctor; connecting patients with caregivers through the Internet; collecting data and monitoring the well-being of patients; manipulating objects around the home, such as the refrigerator, washing machine, or microwave; and taking over certain social functions such as game playing and simple conversation. For a 2002 film produced by the National Science Foundation on Nursebot, see “Nursebot,” YouTube, May 10, 2008, www.youtube.com/watch?v=6T8yhouPolo (accessed August 13, 2009).
16
Chung Chan Lee, “Robot Nurse Escorts and Schmoozes the Elderly,” Robots—Because Humans Deserve Better, May 17, 2006, http://i-heart-robots.blogspot.com/2006/03/robot-nurse-escorts-and-schmooze.html (accessed August 13, 2009). This blog’s main title is telling: “Because Humans Deserve Better.”
17
See Lee, “Robot Nurse Escorts.”
18
See comments about “In the Hands of Robots—Japan,” YouTube, June 16, 2008, www.youtube.com/watch?v=697FJZnFvJs&NR=1 (accessed August 13, 2009).
19
Amy Harmon, “Discovering a Soft Spot for Circuitry,” New York Times, July 5, 2010, www.nytimes.com/2010/07/05/science/05robot.html?pagewanted=all (accessed July 5, 2010). The story cites Timothy Hornyak, a student of robots in Japan, on our new challenge to process synthetic emotions.
CHAPTER 7: COMMUNION
 
1
See “Kismet and Rich,” MIT Computer Science and Artificial Intelligence Laboratory, www.ai.mit.edu/projects/sociable/movies/kismet-and-rich.mov (accessed November 14, 2009).
2
I have been fortunate to have colleagues who have both inspired and challenged my readings of complicity and communion. I owe a special debt to Margaret Boden, Linnda R. Caporael, and Lucy Suchman.
For a discussion of the construction of meaning behind what I am terming complicity in human-robot interactions, see Margaret Boden, Mind As Machine: A History of Cognitive Science, vol. 1 (London: Oxford University Press, 2006). Prior to these constructions of meaning is the general question of why humans anthropomorphize. See, for example, Linnda R. Caporael, “Anthropomorphism and Mechanomorphism: Two Faces of the Human Machine,” Computers in Human Behavior 2 (1986): 215-234, and Linnda R. Caporael and Cecilia M. Heyes, “Why Anthropomorphize? Folk Psychology and Other Stories,” in Anthropomorphism, Anecdotes, and Animals, ed. Robert W. Mitchell, Nicholas S. Thompson, and Lyn Miles (Albany: State University of New York Press, 1997), 59-75. The literature on anthropomorphism is large. I signal two particularly useful volumes: Mitchell, Thompson, and Miles, eds., Anthropomorphism, Anecdotes, and Animals, and John Stodart Kennedy, The New Anthropomorphism (Cambridge: Cambridge University Press, 1992).
For a critical study of the constructions of meaning in human-robot interactions, see Lucy Suchman, Human-Machine Reconfigurations: Plans and Situated Actions (1987; Cambridge: Cambridge University Press, 2007), especially ch. 13. See also Lucy Suchman, “Affiliative Objects,” Organization 12, no. 2 (2005): 379-399. Suchman and I both participated in panels on computers and society at the Society for the Social Studies of Science (August 2007) and at the Harvard Graduate School of Design (March 2009). At both panels, Suchman eloquently examined human-robot interactions as social constructs. Most recently, Suchman has persuasively argued for a return to “innocence” in how we approach sociable robots, a tonic dialing down of what we are willing to project onto them. See Lucy Suchman, “Subject Objects,” accepted for a special issue of Feminist Theory devoted to “nonhuman feminisms,” edited by Myra Hird and Celia Roberts.
