Notes for Sherry Turkle Alone Together: Why We Expect More from Technology and Less from Each Other
Key concepts: anxiety of always, community, complex wonder, evocative objects, flow state, interface value, liminal places, panopticon, postfamilial families, protean self, psychosocial moratorium, quandary thinking, realtechnik, robotic moment, sacred space, the zone.
Related theorists: Kwame Anthony Appiah, Gordon Bell, Niels Bohr, Rodney Brooks, Martin Buber, Mihaly Csikszentmihalyi, Aaron Edsinger, Erik Erikson, Michel Foucault, Kevin Kelly, Ray Kurzweil, John Lester, Emmanuel Levinas, Robert Jay Lifton, Pia Lindman, Stanley Milgram, Marvin Minsky, Anthony Storr, Henry David Thoreau, Victor Turner, Joseph Weizenbaum.
Innocence of working with early computers builds justification for learning programming by first studying ancient technologies.
The truth is that the trajectory of the dumbest generation defaults on the hobbyist relationship to technology, opening a place for philosophical thought; whether geared toward engineering or ethics topics, both, when done well, involve the same systems.
(ix) The first home computers were being bought by people called hobbyists. The people who bought or built them experimented with programming, often making their own simple games. No one knew to what further uses home computers might be put.
Raises questions of comportment defined from anthropological research.
(ix-x) Now I was among them and, like any anthropologist, something of a stranger in a strange land. . . . Computational metaphors, such as “debugging” and “programming,” were starting to be used to think about politics, social life, and – most central to the analogy with psychoanalysis – about the self. . . . We are shaped by our tools. And now, the computer, a machine on the border of becoming a mind, was changing and shaping us.
(x) I asked questions of scientists, home computer owners, and children, but mostly I listened to how they talked and watched how they behaved among their new “thinking” machines.
Importance of the computer as evocative object, philosophical work space, cybersage workstation: a point missed by Maner but clear to Hayles. Turkle clearly reflects more on her ambivalence than proposing software projects, leaving that task to students of texts and technology, digital media studies, and digital humanities; she is concerned with human implications, so far offering little steering direction for rhetorics of machine intelligence.
Computers brought philosophy into everyday life; in particular, they turned children into philosophers.
(x-xi) The computer was an evocative object that provoked self-reflection. . . . In 1984, thinking about Deborah (and in homage as well to Simone de Beauvoir), I called my first book on computers and people The Second Self.
Hope perhaps tied to a generic learning of programming that did not flourish.
(xi) I find it ironic that my own 1984 book, about the technology that in many a science fiction novel makes possible such a dystopian world, was by contrast full of hope and optimism.
Compare to shift from focus on author writing to operation of writing within networks.
(xi) In the decade following the publication of The Second Self, people's relationships with computers changed. . . . By then, the computer had become a portal that enabled people to lead parallel lives in virtual worlds. . . . My focus shifted from the one-on-one with a computer to the relationships people formed with each other using the computer as an intermediary.
Consider SCA as real world virtual environment where people can take on different identities and engage in behavior incongruent with mundane social norms (it is often referred to as the Society of Consenting Adults).
(xi) I reported on this work in the 1995 Life on the Screen, which offered, on balance, a positive view of new opportunities for exploring identity online.
Two futures: fully networked life and evolution in robotics; compare to Kitchin and Dodge distinctions.
Two avenues forward became apparent by the mid-1990s. The first was the development of a fully networked life. . . . The network was with us, on us, all the time. So, we could be with each other all the time. Second, there was an evolution in robotics. . . . by the late 1990s, children were presented with digital “creatures” that made demands for attention and seemed to pay attention to them.
(xii) Alone Together picks up these two strands in the story of digital culture over the past fifteen years, with a focus on the young, those from five through their early twenties - “digital natives” growing up with cell phones and toys that ask for love. If, by the end of researching Life on the Screen, I was troubled about the costs of life with simulation, in the course of researching this book, my concerns have grown. These days, insecure in our relationships and anxious about intimacy, we look to technology for ways to be in relationships and protect ourselves from them at the same time. . . . I feel witness for a third time to a turning point in our expectations of technology and ourselves. We bend to the inanimate with new solicitude. We fear the risks and disappointments of relationship with our fellow humans. We expect more from technology and less from each other.
She focuses on young people, I on adults who grew up in the age of innocent computing; moreover, I consider machine intelligences as well.
(xii-xiii) To tell the story of artifacts that encourage relationship, I begin with the ELIZA program in the 1970s and take the story through to the “sociable” humanoid robots, such as Domo and Mertz, built at MIT in the 2000s. . . . Children and their families were asked to keep “robot diaries,” accounts of home life with an AIBO, My Real Baby, or Furby.
(xiii) In the story of computer-mediated communication, I began my investigations in the 1980s and early 1990s with e-mail, bulletin boards, Internet Relay Chat, and America Online and went on from there to the first virtual communities and multiuser online role-playing games.
Dual focus on social robots and computer-mediated communication.
(xiii) The focus of my research on networking was the young, and so I did most of my observations in high schools and on college campuses.
Not inner history of devices as theorized by texts and technology studies or Latour.
(xiii) The work reported on here, as all of my work, includes field research and clinical studies. . . . I teach courses about the computer culture and the psychology of computation, and some of my material comes from the give-and-take of the classroom. . . . I call these studies clinical, but of course my role in them is as a researcher, not a therapist. My interest in the “inner history” of technology means that I try to bring together the sensibility of ethnographer and clinician in all my work.
Imagine a participatory, ethnographic approach like hers in programming cultures, as recommended by Kitchin and Dodge.
(xiii) In my studies of robots, I provided the artifacts. . . . In the research on the networked life, I did not distribute any technology.
Different philosophical trajectory for my study of adults and computers.
(xiv) It seems right that Zhu Zhu pets and Chatroulette are the final “objects” I report on in this book: the Zhu Zhus are designed to be loved; in Chatroulette, people are objectified and quickly discarded. I leave my story at a point of disturbing symmetry: we seem determined to give human qualities to objects and content to treat each other as things.
(xv) I did much of the work reported here under the auspices of the MIT Initiative on Technology and Self.
She mentions Rodney Brooks, also important to Hayles, and Anita Say Chan, who wrote on Slashdot users in Inner History of Devices.
(xvi) My MIT colleague Hal Abelson sent me an e-mail in 1997, suggesting that I “study those dolls,” and I always take his advice. In the late 1970s, he was the first to introduce me to the special hopes of personal computer owners who were not content until they understood the “innards” of their machines. In the late 1980s, he introduced me to the first generation of virtual communities, known at the time as “MUDs.”
(xvi) My work on robotics has been funded by the Intel Corporation, the Mitchell Kapor Foundation, the Kurzweil Foundation, and the National Science Foundation (NSF Grant # SES-0115668, “Relational Artifacts”).
(xvii) I have worked on the themes of this book for decades.
(xvii) Rebecca calls our basement storage room “the robot cemetery” and doesn't much like to go down there. . . . The story of digital culture has been the story of Rebecca's life. The book is written as a letter to her about how her mother sees the conversations in her future.
(1) Technology proposes itself as the architect of our intimacies. These days, it suggests substitutions that put the real on the run. . . . On Second Life, a lot of people, as represented by their avatars, are richer than they are in first life and a lot younger, thinner, and better dressed. And we are smitten with the idea of sociable robots, which most people first meet in the guise of artificial pets.
(1) Technology is seductive when what it offers meets our human vulnerabilities. And as it turns out, we are very vulnerable indeed. We are lonely but fearful of intimacy. Digital connections and the sociable robot may offer the illusion of companionship without the demands of friendship.
(2) Computers no longer wait for humans to project meaning onto them. Now, sociable robots meet our gaze, speak to us, and learn to recognize us. They ask us to take care of them; in response, we imagine that they might care for us in return.
(3) As sociable robots propose themselves as substitutes for people, new networked devices offer us machine-mediated relationships with each other, another kind of substitution. . . . The network is seductive. But if we are always on, we may deny ourselves the rewards of solitude.
THE ROBOTIC MOMENT
Authenticity in the culture of simulation is what sex was for the Victorians: threat, obsession, taboo, fascination.
I believe that in our culture of simulation, the notion of authenticity is for us what sex was for the Victorians – threat and obsession, taboo and fascination. I have lived with this idea for many years; yet, at the museum [exhibit featuring live Galapagos tortoise], I found the children's position strangely unsettling. For them, in this context, aliveness seemed to have no intrinsic value. Rather, it is useful only if needed for a specific purpose.
(5) Always impressed with [David] Levy's inventiveness, I found myself underwhelmed by the message of this latest book, Love and Sex with Robots. . . . Levy argues that robots will teach us to be better friends and lovers because we will be able to practice on them. Beyond this, they will substitute where people fail.
(6) Authenticity, for me, follows from the ability to put oneself in the place of another, to relate to the other because of a shared store of human experiences: we are born, have families, and know loss and the reality of death. A robot, however sophisticated, is patently out of this loop.
Inauthentic as new aesthetic: why not make the same argument for intellectual dumbing down of human-machine relationships?
(6) Love and Sex seems to celebrate an emotional dumbing down, a willful turning away from the complexities of human partnerships – the inauthentic as a new aesthetic.
(7) I did not see marriage to a machine as a welcome evolution in human relationships. And so I was taken aback when the [Scientific American] reporter suggested that I was no better than bigots who deny gays and lesbians the right to marry.
(7) More than this, the reporter was insisting that machines would bring their own special qualities to an intimate partnership that needed to be honored in its own right. . . . The machine could be preferable – for any number of reasons – to what we currently experience in the sometimes messy, often frustrating, and always complex world of people.
(9) The comparison with pets sharpens the question of what it means to have a relationship with a robot. I do not know whether a pet could sense Miriam's unhappiness, her feelings of loss. I do know that in the moment of apparent connection between Miriam and her Paro, a moment that comforted her, the robot understood nothing.
Robotic moment as state of emotional and philosophical readiness to consider robots as pets, friends, confidants, romantic partners.
(9) Experiences such as these . . . have caused me to think of our time as the “robotic moment.” This does not mean that companionate robots are common among us; it refers to our state of emotional – and I would say philosophical – readiness. I find people willing to seriously consider robots not only as pets but as potential friends, confidants, and even romantic partners.
Could she be wrong about this trend as she was about the programming moment in which people seriously consider learning programming not only as a possible occupation but as an everyday practice like writing and operating motor vehicles?
(10) The idea of sociable robots suggests that we might navigate intimacy by skirting it. People seem comforted by the belief that if we alienate or fail each other, robots will be there, programmed to provide simulations of love.
Compare to work by Monica Florence on vampire stories from the PCA conference.
(10) (endnote 11) I have said that rampant fantasies of vampire lovers (closeness with constraints on sexuality) bear a family resemblance to ideas about robot lovers (sex without intimacy, perfect). And closeness without the possibility of physical intimacy and eroticized encounters that can be switched off in an instant – these are the affordances of online encounters. Online romance expresses the aesthetic of the robotic moment.
(11) Sociable robots serve as both symptom and dream: as a symptom, they promise a way to sidestep conflicts about intimacy; as a dream, they express a wish for relationships with limits, a way to be both together and alone.
(11) If the problem is that too much technology has made us busy and anxious, the solution will be another technology that will organize, amuse, and relax us. . . . Putting hope in robots expresses an enduring technological optimism, a belief that as other things go wrong, science will go right.
What if this is part of a larger attitude related to Žižek's chocolate laxative, so that technological comportment is not cause but symptom?
(11-12) But this is not a book about robots. Rather, it is about how we are changed as technology offers us substitutes for connecting with each other face-to-face. . . . Does virtual intimacy degrade our experience of the other kind and, indeed, of all encounters, of any kind?
(12) The blurring of intimacy and solitude may reach its starkest expression when a robot is proposed as a romantic partner. But for most people it begins when one creates a profile on a social-networking site or builds a persona or avatar for a game or virtual world. Over time, such performances of identity may feel like identity itself. And this is where robotics and the networked life first intersect. For the performance of caring is all that robots, no matter how sociable, know how to do.
(12) Our new media are well suited for accomplishing the rudimentary. And because this is what technology serves up, we reduce our expectations of each other.
(13) Our first embrace of sociable robotics (both the idea of it and its first exemplars) is a window onto what we want from technology and what we are willing to do to accommodate it. . . . We celebrate its “weak ties,” the bonds of acquaintance with people we may never meet. But that does not mean we prosper in them.
CONNECTIVITY AND ITS DISCONTENTS
(14) I have often observed this distinctive confusion: these days, whether you are online or not, it is easy for people to end up unsure if they are closer together or further apart. I remember my own sense of disorientation the first time I realized that I was “alone together.”
(14) But at this conference, it was clear that what people mostly want from public space is to be alone with their personal networks.
(15) The world is now full of modern Goldilockses, people who take comfort in being in touch with a lot of people whom they also keep at bay.
(16) Whether or not our devices are in use, without them we feel disconnected, adrift.
Balance Turkle critical evaluation of posthuman cyborg identity against Hayles (not in index but cited).
(16) I once described the computer as a second self, a mirror of mind. Now the metaphor no longer goes far enough. Our new devices provide space for the emergence of a new state of the self, itself, split between the screen and the physical real, wired into existence through technology.
Necessarily enter ethics.
(17) Technology reshapes the landscape of our emotional lives, but is it offering us the lives we want to lead? Many roboticists are enthusiastic about having robots tend to our children and our aging parents, for instance. Are these psychologically, socially, and ethically acceptable propositions? What are our responsibilities here? And are we comfortable with virtual environments that propose themselves not as places for recreation but as new worlds to live in?
ROMANCING THE MACHINE: TWO STORIES
(17) I tell two stories in Alone Together: today's story of the network, with its promise to give us more control over human relationships, and tomorrow's story of sociable robots, which promise relationships where we will be in control, even if that means not being in relationships at all.
(18) Part One, “The Robotic Moment,” moves from the sociable robots in children's playrooms to the more advanced ones in the laboratory and those being developed and deployed for elder care. . . . If something asks for your care, you don't want to analyze it but take it “at interface value.” It becomes “alive enough” for relationship.
(18) And with this, the heightened expectations begin.
(18-19) Part Two, “Networked,” turns to the online life as it reshapes the self. . . . We are increasingly connected to each other but oddly more alone: in intimacy, new solitudes.
In intimacy, new solitudes not a Luddite outcome, nor optimistic like Hayles.
(19) But when we ask what we “miss,” we may discover what we care about, what we believe to be worth protecting. We prepare ourselves not necessarily to reject technology but to shape it in ways that honor what we hold dear.
(19) We are on the verge of seeking the company and counsel of sociable robots as a natural part of life. Before we cross this threshold, we should ask why we are doing so.
Roboticists have learned triggers that help us fool ourselves, perhaps making us stupider, more gullible, or more striated.
Can TAPOC help inoculate against shallow human-computer symbiosis by critically foregrounding awareness of how these systems work and through the practice of programming them instead of being programmed by them, or is my position likewise damned?
(20) The attachments I describe do not follow from whether computational objects really have emotion or intelligence, because they do not. The attachments follow from what they evoke in their users. Our new objects don't so much “fool us” into thinking they are communicating with us; roboticists have learned those few triggers that help us fool ourselves. We don't need much. We are ready to enter the romance.
The Robotic Moment
In Solitude, New Intimacies
(24) Soon after, Weizenbaum and I were coteaching a course on computers and society at MIT.
(24) I came to think of this human complicity in a digital fantasy as the “ELIZA effect.” . . . At the robotic moment, more than ever, our willingness to engage with the inanimate does not depend on being deceived but on wanting to fill in the blanks.
Link coteaching with Weizenbaum to Hayles's narrative of how we became posthuman and Latour's narrative of why we have never been modern.
(25) The arc of this story does not reflect new abilities of machines to understand people, but people's changing ideas about psychotherapy and the workings of their own minds, both seen in more mechanistic terms.
(26) A premium on performance is the cornerstone of the robotic moment. We live the robotic moment not because we have companionate robots in our lives but because the way we contemplate them on the horizon says much about who we are and who we are willing to become.
How to combat the apparent default trajectory of who we are willing to become: try rejuvenating learning programming by adults who shall become philosophers of computing.
(26) How did we get to this place? The answer to that question is hidden in plain sight, in the rough-and-tumble of the playroom, in children's reactions to robot toys. As adults, we can develop and change our opinions. In childhood, we establish the truth in our hearts.
(27) In the 1980s, faced with computational objects, children began to think through the question of aliveness in a new way, shifting from physics to psychology.
(28) As criteria for life, everything pales in comparison to a robot's capacity to care.
In speaking about sociable robots, children use the phrase “alive enough” as a measure not of biological readiness but of relational readiness. . . . Wilson's way of keeping in mind the dual aspects of the Furby's nature seems to me a philosophical version of multitasking, so central to our twenty-first-century attentional ecology. His attitude is pragmatic. If something that seems to have a self is before him, he deals with the aspect of self he finds most relevant to the context.
(30) These robots are evocative: understanding how people think about them provides a view onto how we think about ourselves. . . . It all began when children met the seductive Tamagotchis and Furbies, the first computers that asked for love.
THE TAMAGOTCHI PRIMER
Crucial connection to transition from deep understanding to taking things at interface value that can be basis for rejuvenating adult interest in learning programming.
Stakes are higher now that technology is more advanced and robots are designed to help humans fool themselves into artificial relationships.
(30) (endnote 17) In the early 1980s, children's notion of people as “emotional machines” seemed to me an unstable category. I anticipated that later generations of children would find other formulations as they learned more about computers. They might, for example, see through the apparent “intelligence” of the machines by developing a greater understanding of how they were created and operated. As a result, children might be less inclined to see computers as kin. However, in only a few years, things moved in a very different direction. Children did not endeavor to make computation more transparent. Like the rest of the culture, they accepted it as opaque, a behaving system. Children taking sociable robots “at interface value” are part of a larger trend. . . . The Macintosh version of “transparency” stood the traditional meaning of the word on its head. . . . In other words, transparency had come to mean being able to make a simulation work without knowing how it worked.
Build comparison to how children and adults alike are acculturated to taking care of their electronic objects, such as charging batteries, updating applications, running virus scans, and making backups: as she says, nurturance is the killer app.
(31) The earliest electronic toys and games of thirty years ago – such as Merlin, Simon, Speak & Spell – encouraged children to consider the proposition that something smart might be “sort of alive.” With Tamagotchis, needy objects asked for care, and children took further . . .
(32) Nurturance is the “killer app.”
(33-34) The new creature, a kind of imposter, is a classic case of Sigmund Freud's uncanny – it's familiar, yet somehow not.
(34) With the Tamagotchi, we see the beginning of mourning for artificial life. It is not mourned as one would mourn a doll. The Tamagotchi has crossed a threshold. Children breathe life into their dolls. With the Tamagotchi, we are in a realm of objects that children see as having their own agendas, needs, and desires. Children mourn the life the Tamagotchi has led.
(34) A child's mourning for a Tamagotchi is not always a solitary matter. When a Tamagotchi dies, it can be buried in an online Tamagotchi graveyard. . . . These online places of mourning do more than give children a way to express their feelings. They sanction the idea that it is appropriate to mourn the digital – indeed, that there is something “there” to mourn.
(36) From the very first, the children make it clear that the Furby is a machine but alive enough to need care.
Cites Haraway and Hayles.
(37) Ideas about the human as machine or as joined to a machine are played out in classroom games. In their own way, toy robots prepare a bionic sensibility.
WHAT DOES A FURBY WANT?
(38) But as children interact with sociable robots like Furbies, they move beyond a psychology of projection to a new psychology of engagement.
Furby as primitive exemplar of sociable robots.
This mutuality is at the heart of what makes the Furby, a primitive exemplar of sociable robotics, different from traditional dolls. . . . For decades computers have asked us to think with them; these days, computers and robots, deemed sociable, affective, and relational, ask us to feel for and with them.
(41) Designed to give users a sense of progress in teaching it, when the Furby evolves over time, it becomes the irreplaceable repository and proof of its owner's care.
(45) A half century later, [Freedom] Baird asks under what conditions a creature is deemed alive enough for people to experience an ethical dilemma if it is distressed. She designs a Turing test not for the head but for the heart and calls it the “upside-down test.” A person is asked to invert three creatures: a Barbie doll, a Furby, and a biological gerbil. Baird's question is simple: “How long can you hold the object upside down before your emotions make you turn it back?”
Damasio physical response to painful situation versus associated emotion.
(45) The work of neuroscientist Antonio Damasio offers insight into the origins of this guilt.
(46) They are experiencing something new: you can feel bad about yourself for how you behave with a computer program.
(47) No matter what position one takes, sociable robots have taught us that we do not shirk from harming realistic simulations of life.
(48) All of this – the Furbies that complain of pain, the My Real Babies that do not – creates a new ethical landscape.
Programmed assertions of boundaries disturbing due to accompanying somatic reaction.
(49) I feel even more uncomfortable when I learn about a beautiful “female” robot, Aiko, now on sale, that says, “Please let go . . . you are hurting me,” when its artificial skin is pressed too hard. The robot also protests when its breast is touched: “I do not like it when you touch my breasts.” I find these programmed assertions of boundaries and modesty disturbing because it is almost impossible to hear them without imagining an erotic body braced for assault.
THE ROMANTIC REACTION TO THE ROBOTIC MOMENT
(50) Of course, elements of this romantic reaction are still around us. But a new sensibility emphasizes what we share with our technologies. With psychopharmacology, we approach the mind as a bioengineerable machine.
(52) Life? Romantic matters? Problems of friendship? These were the sacred spaces of the romantic reaction. Only people were allowed there. Howard thinks that all of these can be boiled down to information so that a robot can be both expert resource and companion. We are at the robotic moment.
Updating OS, apps versus wide spectrum care of motorcycle maintenance.
(55) These days, what is new is that an off-the-shelf technology as simple as an AIBO provides an experience of shaping one's own companion. But the robots are shaping us as well, teaching us how to behave so that they can flourish. Again, there is psychological risk in the robotic moment. Logan's comment about talking with the AIBO to “get thoughts out” suggests using technology to know oneself better. But it also suggests a fantasy in which we cheapen the notion of companionship to a baseline of “interacting with something.” We reduce relationship and come to see this reduction as the norm.
Lose alterity with robot companion as it is a selfobject.
The first thing missing if you take a robot as a companion is the ability to see the world through the eyes of another.
(56) Selfobjects are “part” objects. When we fall back on them, we are not taking in a whole person. Those who can only deal with others as part objects are highly vulnerable to the seductions of a robot companion.
(56) With a price tag of $1,300 to $2,000, AIBO is meant for grown-ups. But the robot dog is a harbinger of the digital pets of the future, and so I present it to children from age four to thirteen as well as to adults. . . . What is it like living with an AIBO?
(57) In the early days of digital culture, when they met their first electronic toys and games, children of this age would remain preoccupied with such questions of categories. But now, faced with this sociable machine, children address them and let them drop, taken up with the business of a new relationship.
(58) AIBO also “wants” attention in order to learn. And here children become invested. Children don't just grow up with AIBO around; they grow AIBO up.
(58) The fact that AIBO can develop new skills is very important to children; it means that their time and teaching make a difference.
Physical autonomy of robots removes question of historical determination: why we need to dive back into the depths and cultivate an informed, practical stance toward computer technology.
(endnote 7) In the 1980s, the presence of a “programmer” figured in children's conversations about computer toys and games. The physical autonomy of robots seems to make the question of their historical determination fall out of the conversation. . . . is crucial in people's relating to robots as alive on their own.
(59) A particular AIBO is irreplaceable because it calls back memories not only of one's younger self but of the robot's younger self as well, something we already saw as children connected to their Tamagotchis and Furbies.
FROM BETTER THAN NOTHING TO BETTER THAN ANYTHING
(60) Here, AIBO is not practice for the real. It offers an alternative, one that sidesteps the necessity of death. For Paige, simulation is not necessarily second best.
(60 endnote 9) (318) One hears echoes of a “transhuman perspective” (the idea that we will move into a new realm by merging with our machines) in children's asides as they play with AIBO.
Robot and Frank ideal movie for considering aggression toward sociable robots, their relation to memory, and new social bonds they may foster.
(61) Aggression toward sociable robots is more complex because children are trying to manage more significant attachments. . . . Whether we have permission to hurt or kill an object influences how we think about its life.
VISIONS AND COLD COMFORTS
(63) Artificial intelligence is often described as the art and science of “getting machines to do things that would be considered intelligent if done by people.” We are coming to a parallel definition of artificial emotion as the art of “getting machines to express things that would be considered feelings if expressed by people.”
(64) [John] Lester has a cubist view of AIBO; he is aware of it as machine, bodily creature, and mind.
(64) But this understanding does not interfere with his attachment, just as knowing that infants draw him in with their big, wide eyes does not threaten his connection with babies.
(64) It is a big step from accepting AIBO as a companion, and even a solace, to the proposals of David Levy, the computer scientist who imagines robots as intimate partners. . . . From watching children play with objects designed as “amusements,” we come to a new place, a place of cold comforts.
(65) In the company of their robots, Jane and Harry are alone in a way that encourages them to give voice to their feelings. Is there harm here?
Compare Kurzweil robotic incarnation of dead father to Descartes mythical female automaton mentioned by Sterne.
(66) Like AI scientist and inventor Raymond Kurzweil, who dreams of a robotic incarnation of his father who died tragically young, [Douglas] Hines committed himself to the project of building an artificial personality. At first, he considered building a home health aide for the elderly but decided to begin with sex robots, a decision that he calls “only marketing.” His long-term goal is to take artificial personalities into the mainstream. He still wants to recreate his lost friend.
Robotic companionship like living in books, lost in music?
(66) Robotic companionship may seem a sweet deal, but it consigns us to a closed world – the loveable as safe and made to measure.
(68) To put it too simply, conversations about My Real Baby easily lead to musing about a future in which My Real Baby becomes My Real Babysitter. In this, My Real Baby and AIBO are evocative objects—they give people a way to talk about their disappointments with the people around them—parents and babysitters and nursing home attendants—and imagine being served more effectively by robots.
Portrayals of inanimate coming to life range from horrifying to gratifying; compare to Heim seeing computer as component versus opponent.
(68) Traditional science fiction, from Frankenstein to the Chucky movies, portrays the inanimate coming to life as terrifying. Recently, however, it has also been portrayed as gratifying, nearly redemptive.
Levinas alterity mixed with Buber I and thou.
(85) Over the years, some of my students have even spoken of time with Cog and Kismet by referring to a robotic “I and thou.” . . . A robotic face is an enabler; it encourages us to imagine that robots can put themselves in our place and that we can put ourselves in theirs.
Robotic experiments of questionable ethical character hint at Milgram experiments.
(101) In one published experiment, two young children are asked to spend time with a man and a robot designed to be his clone. The experiment has a significant backstory. Japanese roboticist Hiroshi Ishiguro built androids that duplicate himself, his wife, and his five-year-old daughter.
love's labor lost
MIT AgeLab creating technologies for helping the elderly like the robot Paro.
(103) The [MIT] AgeLab's mission is to create technologies for helping the elderly with their physical and emotional needs, and already Paro had carved out a major role on this terrain.
Opacity of robot programming forces us to deal with its behavior as we would that of a likewise opaque human, at interface level.
(111) The programming of My Real Baby lies beyond his reach. The robot is an opaque behaving system that he is left to deal with as he would that other opaque behaving system, a person.
DO ROBOTS CURE CONSCIENCE?
Democratization of the sense of connection originally noticed among programmers now extends to everyone taking programs at interface value; programs are now designed to convince us they are adequate companions.
(124) When I first began studying people and computers, I saw programmers relating one-to-one with their machines, and it was clear that they felt intimately connected. The computer's reactivity and interactivity—it seemed an almost-mind—made them feel they had “company,” even as they wrote code. Over time, that sense of connection became “democratized.” Programs became opaque: when we are at our computers, most of us only deal with surfaces. We summon screen icons to act as agents. We are pleased to lose track of the mechanisms behind them and take them “at interface value.” But as we summon them to life, our programs come to seem almost companions. Now, “almost” has almost left the equation. Online agents and sociable robots are explicitly designed to convince us that they are adequate companions.
Concern that robotic solutions are becoming the default, and that our practice interacting with robots is accustoming us to a reduced emotional range.
(124) Do plans to provide companion robots to the elderly make us less likely to look for other solutions for their care?
(125) If you practice sharing “feelings” with robot “creatures,” you become accustomed to the reduced “emotional” range that machines can offer. As we learn to get the “most” out of robots, we may lower our expectations of all relationships, including those with people. In the process, we betray ourselves.
Edsinger complex wonder not dissolved by knowing how the robot works.
Thrilled by moments when the “creature” seems to escape, unbidden, from the machine, [Aaron] Edsinger begins to think of Domo's preferences not as things he has programmed but as the robot's own likes and dislikes.
(132) In these moments, there is no deception. Edsinger knows how Domo “works.” Edsinger experiences a connection where knowledge does not interfere with wonder. This is the intimacy presaged by the children for whom Cog was demystified but who wanted it to love them all the same.
Lindman trying to experience machine cognition by embodying its facial expressions after failing to go into alien temporalities of the program level with her brain.
(138) [Pia] Lindman soon discovers that a person cannot make her brain into the output device for a robot intelligence. So, she modifies her plan. Her new goal is to “wear” Mertz's facial expressions by hooking up her face rather than her brain to the Mertz computer, to “become the tool for the expression of the artificial intelligence.” After working with Domo, Lindman anticipates that she will experience a gap between who she is and what she will feel as she tries to be the robot.
(139) If some day she does hook herself up to a robot's program, she believes she will have knowledge of herself that no human has ever had. She will have the experience of what it feels to be “taken over” by an alien intelligence.
Turkle limits artificial comprehension for lack of human life cycle, as did Lyotard.
(139) In my own work, I argue the limits of artificial comprehension because neither computer agents nor robots have a human life cycle.
People project affect onto computers.
(139) Computer scientists who work in the field known as “affective computing” feel supported by the work of social scientists who underscore that people always project affect onto computers, which helps them to work more constructively with them.
Affective computing attempting to steer technological evolution by adding winning personality to ease of use, threatening reduction of affect like intelligence.
(140) For me, these secondary consequences are the heart of the matter. Making a machine easy to use is one thing. Giving it a winning personality is another. Yet, this is one of the directions taken by affective computing (and sociable robotics).
(140-141) Giving machines a bit of “affect” to make them easier to use sounds like common sense, more a user interface strategy than a philosophical position. . . . The word “intelligence” underwent a similar reduction in meaning when we began to apply it to machines.
THROUGH THE EYES OF THE ROBOT
Computer scientist John Lester optimistically predicts that humans will fill robots with the same personal history as their phones; like the robots of Tony Stark in the Iron Man movies, the animated butler, prosthetic suit, and code space of the room itself will create true cyborgs.
on AIBO, [John] Lester imagines that robots will change the course of
(141) Like Lindman and Edsinger, Lester sees a world of creature-objects burnished by our emotional attachments. . . . Lester sees a future in which something like an AIBO will develop into a prosthetic device, extending human reach and vision. It will allow people to interact with real, physical space in new ways.
(142) We will animate our robots with what we have poured into our phones: the story of our lives. When the brain in your phone marries the body of your robot, document preparation meets therapeutic massage. Here is a happy fantasy of security, intellectual companionship, and nurturing connection. How can one not feel tempted?
(143) Our rooms will be our friends and companions.
In Intimacy, New Solitudes
THE NEW STATE OF THE SELF: MULTITASKING AND THE ALCHEMY OF TIME
Social judgment of multitasking has shifted from blight to virtue, in spite of psychological research, due to the neurochemical high it produces.
(162-163) Subtly, over time, multitasking, once seen as something of a blight, was recast as a virtue. . . . When psychologists study multitasking, they do not find a story of new efficiencies. Rather, multitaskers don't perform as well on any of the tasks they are attempting. But multitasking feels good because the body rewards it with neurochemicals that induce a multitasking “high.”
Sociable robots imagined as people, and people online imagined as objects; bring in Latin origins of the word computer in computarat.
(168) With sociable robots, we imagine objects as people. Online, we invent ways of being with people that turn them into something close to objects.
growing up tethered
no need to call
reduction and betrayal
Kurzweil's Ramona experiment foreshadowed the everyday practice of avatar identity in computer games like The Beatles: Rock Band.
(211) In the real, Kurzweil wore high-tech gear that captured his every gesture and turned them into Ramona's movements. His own voice was transformed into Ramona's female voice. Watching Kurzweil perform as Ramona was mesmerizing. And Kurzweil himself was mesmerized. It was an occasion, he said, for him to reflect on the difficulties of inhabiting another body and on how he had to retrain his movements—the way he held his head, the shape of his gestures—to become an avatar of another gender. These days, certain aspects of that experience, once so revolutionary, have become banal. We have turned them into games.
Turner identity explored most freely in liminal places; boundary of things well describes early experience with personal computers and now everyday experience with virtual realities.
(213) Anthropologist Victor Turner writes that we are most free to explore identity in places outside of our normal life routines, places that are in some way “betwixt and between.” Turner calls them liminal, from the Latin word for “threshold.” They are literally on the boundaries of things.
Example of slipping away into games rather than online accomplishment improving character or providing practice for accomplishing mundane tasks.
(223) In Adam's case, there is no evidence that online accomplishment is making him feel better about himself in the real. He says he is letting other things “slip away”—Erin, the girl he liked on the word game; his job; his hopes of singing and writing songs and screenplays. None of these can compete with his favorite simulations, which he describes as familiar and comforting, as places where he feels “special,” both masterful and good.
Feeling of creation in simulation games, not creation or its pressures, is the sweet spot of simulation.
(223) But this is creation where someone has already been. Like playing the guitar in The Beatles: Rock Band, it is not creation but the feeling of creation.
(224) This is the sweet spot of simulation: the exhilaration of creativity without its pressures, the excitement of exploration without its risks.
In the zone, the flow state of full immersion in a focused activity, there are clear expectations and attainable goals, allowing action without self-consciousness; worlds that compel through constraints create a pure space. This is the source of Weizenbaum's computer bum imagery. Flow space seems comparable to the draft of thinking Heidegger praised, so it remains ambiguous like the pharmakon.
As I have said, with robots, we are alone and imagine ourselves together. On networks, including game worlds, we are together but so lessen our expectations of other people that we can feel utterly alone. In both cases, our devices keep us distracted. They provide a sense of safety in a place of excluding concentration. Some call it the zone.
(226) Psychologist Mihaly Csikszentmihalyi examines the idea of “zone” through the prism of what he calls “flow,” the mental state in which a person is fully immersed in an activity with focus and involvement. In the flow state, you have clear expectations and attainable goals.
(226) In the flow state, you are able to act without self-consciousness. Overstimulated, we seek constrained worlds. You can have this experience at a Las Vegas gambling machine or on a ski slope. And now, you can have it during a game of Civilization or World of Warcraft. . . . All of these are worlds that compel through their constraints, creating a pure space where there is nothing but its demands.
Neurochemical response stimulated by connectivity to answer the seeking drive, which resembles addiction; substitution for archive fever of print era?
(227) Our neurochemical response to every ping and ring tone seems to be the one elicited by the “seeking” drive, a deep motivation of the human psyche. Connectivity becomes a craving; when we receive a text or an email, our nervous system responds by giving us a shot of dopamine. We are stimulated by connectivity itself.
(228) For many people, the metaphor of addiction feels like the only possible way to describe what is going on.
(228) Simulation offers the warmth of a technological cocoon.
Online confessional sites as symptoms visited to relieve anxieties; Turkle does not blame technology for creating myths people believe that it does not matter they are disappointing each other.
(237) It is useful to think of a symptom as something you “love to hate” because it offers relief even as it takes you away from addressing an underlying problem. To me, online confessional sites can be like symptoms—a shot of feeling good that can divert attention from what a person really needs.
(237) But if we use these sites to relieve our anxieties by getting them “out there,” we are not necessarily closer to understanding what stands behind them. And we have not used our emotional resources to build sustaining relationships that might help. We cannot blame technology for this state of affairs. It is people who are disappointing each other. Technology merely enables us to create a mythology in which this does not matter.
Counter broadening definition of community to include virtual places by foregrounding physical proximity, shared concerns, real consequences and common responsibilities.
(238) Those who run online confessional sites suggest that it is time to “broaden our definition of community” to include these virtual places. But this strips language of its meaning. If we start to call online spaces where we are with other people “communities,” it is easy to forget what that word used to mean. From its derivation, it literally means “to give among each other.”
(239) Communities are constituted by physical proximity, shared concerns, real consequences, and common responsibilities. Its members help each other in the most practical ways.
AFTER CONFESSION, WHAT?
Internet gives us new ways not to think by keeping us busy externalizing problems, recalling Weizenbaum absent mind.
(240) We did not need the invention of online confessional sites to keep us busy with ways to externalize our problems instead of looking at them. But among all of its bounties, here the Internet has given us a new way not to think.
Arguments that disparage books as disconnected appeal to idealized online reading practices, ignoring the daydreaming and introspection that used to attend reading books; like multitasking, online reading often inspires not heroic narratives but new anxieties.
(242) Anxiety is part of the new connectivity. Yet, it is often the missing term when we talk about the revolution in mobile communications. Our habitual narratives about technology begin with respectful disparagement of what came before and move on to idealize the new. So, for example, online reading, with its links and hypertext possibilities, often receives a heroic, triumphalist narrative, while the book is disparaged as “disconnected.” That narrative goes something like this: the old reading was linear and exclusionary; the new reading is democratic as every text opens out to linked pages—chains of new ideas. But this, of course, is only one story, the one technology wants to tell. There is another story. The book is connected to daydreams and personal associations as readers look within themselves. Online reading—at least for the high school and college students I have studied—always invites you elsewhere. And it is only sometimes interrupted by linking to reference works and associated commentaries. More often, it is broken up by messaging, shopping, Facebook, MySpace, and YouTube. This “other story” is complex and human. But it is not part of the triumphalist narrative in which every new technological affordance meets an opportunity, never a vulnerability, never an anxiety.
(242) There were similar idealizations when it became clear that networked computers facilitated human multitasking. . . . But online multitasking, like online reading, can be a useful choice without inspiring a heroic narrative.
Realtechnik perspective examines problems and dislocations in addition to possibilities and fulfillment.
(243) The realtechnik of connectivity culture is about possibilities and fulfillment, but it also is about the problems and dislocations of the tethered self.
Young people believe digital memory will create a more tolerant society and that their favorite websites are run by good people of their generation, ignoring their actual corporate governance; her insight connects well with Lanier's critique of these siren servers.
(255) Some teenagers say that their privacy concerns are not as bad as they might seem because, in the future, everyone running for office, everyone about to get a judicial appointment or an important corporate job, will have an accessible Internet past with significant indiscretions. In this narrative, implacable digital memory will not be punishing but will create a more tolerant society. Others come up with a generational argument: “Facebook is owned by young people.” This idea confuses investors, owners, managers, inventors, spokespeople, and shareholders. It is innocent of any understanding of how corporations work or are governed. But it is not a surprising response. If your life is on Facebook or MySpace or Google, you want to feel that these companies are controlled by good people. Good people are defined as those who share what you feel is your most salient characteristic. For the young, that characteristic is being young.
(255) In fact, from the very beginning, Facebook has been in something of a tug-of-war with its users about how much control it has over their data. The pattern, predictably, is that Facebook declares ownership of all of it and tries to put it to commercial use. Then, there is resistance and Facebook retreats. This is followed by another advance, usually with subtler contours.
PRIVACY AND THE ANXIETIES OF ALWAYS
Extreme self-policing aims for a precorrected self, new regime of self-surveillance; connect to Foucault on panopticism and Heim on word processing.
(256) This is life in the world of cut and paste. Worse, this is life in the world of cut, edit, and paste.
(258) Here, we see self-policing to the point of trying to achieve a precorrected self.
(258) Brad and Audrey both experience the paradox of electronic messaging. . . . You feel in a place that is private and ephemeral. But your communications are public and forever.
(259) one sees a new regime of self-surveillance at work. . . . We see a first generation going through adolescence knowing that their every misstep, all the awkward gestures of their youth, are being frozen in a computer's memory.
Anxiety of always replacing protean self of earlier Internet.
With the persistence of data, there is, too, the persistence of people. . . . In principle, everyone wants to stay in touch with the people they grew up with, but social networking makes the idea of “people from one's past” close to an anachronism.
(260) This is the anxiety of always. A decade ago I argued that the fluidity, flexibility, and multiplicity of our lives on the screen encouraged the kind of self that Robert Jay Lifton called “protean.” I still think it is a useful metaphor. But the protean self is challenged by the persistence of people and data. The sense of being protean is sustained by an illusion with an uncertain future. The experience of being at one's computer or cell phone feels so private that we easily forget our true circumstance: with every connection we leave an electronic trace.
Persistence of people and data leaves no psychosocial moratorium or separation from the past, leading to fictional Peter Pan beliefs that there is no electronic shadow; real consequences of loss of privacy for intimacy and democracy.
Similarly, I have argued that the Internet provided spaces for adolescents to experiment with identity relatively free of consequences, as Erik Erikson said they must have. The persistence of data and people undermines this possibility as well. . . . The idea of the moratorium does not easily mesh with a life that generates its own electronic shadow.
(261) The need for a moratorium space is so compelling that if they must, they are willing to find it in a fiction.
(261) Some say this issue is a nonissue; they point out that privacy is a historically new idea. This is true. But although historically new, privacy has well served our modern notions of intimacy and democracy. Without privacy, the borders of intimacy blur. And, of course, when all information is collected, everyone can be turned into an informer.
PRIVACY HAS A POLITICS
Foucault panopticon metaphor for pervasive electronic monitoring results in extreme self-policing in addition to disturbing, confused distinction between embarrassing behavior and political behavior.
(261) When they talk about the Internet, young people make a disturbing distinction between embarrassing behavior that will be forgiven and political behavior that might get you into trouble.
(262-263) The panopticon serves as a metaphor for how, in the modern state, every citizen becomes his or her own policeman. . . . As long as you are not doing anything wrong, you are safe. Foucault's critical take on disciplinary society had, in the hands of the technology guru, become a justification for the U.S. government to use the Internet to spy on its citizens. All around us at the cocktail party, there were nods of assent. We have seen that variants of this way of thinking, very common in the technology community, are gaining popularity among high school and college students.
Turkle recounts her own childhood memories of McCarthyism and pride in civil liberties like the privacy of the mailbox.
(263) When I talk to teenagers about the certainty that their privacy will be invaded, I think of my very different experience growing up in Brooklyn in the 1950s.
(263) As the McCarthy era swirled about them, my grandparents were frightened. . . . From the earliest age, my civics lessons at the mailbox linked privacy and civil liberties. I think of how different things are today for children who learn to live with the idea that their email and messages are shareable and unprotected. And I think of the Internet guru at the Webby awards who, citing Foucault with no apparent irony, accepted the idea that the Internet has fulfilled the dream of the panopticon and summed up his political position about the Net as follows: “The way to deal is to just be good.”
Need technical and mental space for dissent; not nostalgic or Luddite, the conversation is about democracy defining its sacred spaces.
(263) You have to leave space for dissent, real dissent. There needs to be technical space (a sacrosanct mailbox) and mental space.
(264) In democracy, perhaps we all need to begin with the assumption that everyone has something to hide, a zone of private action and reflection, one that must be protected no matter what our techno-enthusiasms.
(264) To me, opening up a conversation about technology, privacy, and civil society is not romantically nostalgic, not Luddite in the least. It seems like part of democracy defining its sacred spaces.
the nostalgia of the young
Shared attention of parents a new challenge for children.
(267) From the youngest ages, these teenagers have associated technology with shared attention. Phones, before they become an essential element in a child's own life, were the competition, one that children didn't necessarily feel they could best.
(267) Previously, children had to deal with parents being off with work, friends, or each other. Today, children contend with parents who are physically close, tantalizingly so, but mentally elsewhere.
Storr agrees with Erikson that a space of solitude is needed for the creative process, which Turkle argues is lost in the din of the Internet bazaar.
Erik Erikson writes that in their search for identity, adolescents need a place of stillness, a place to gather themselves. Psychiatrist Anthony Storr writes of solitude in much the same way. Storr says that in accounts of the creative process, “by far the greater number of new ideas occur during a state of reverie, intermediate between waking and sleeping.”
(272) Online we are jarred by the din of the Internet bazaar.
Eerie loneliness of the disconnected also noted by Boltanski and Chiapello.
(276) For those not connected, there can be an eerie loneliness, even on the streets of one's hometown.
Transformation goal for online life: deliberation, living without resignation, and preserving inviolate sacred spaces by reweighting privacy concerns, versus Lanier's plan for monetization of personal data and contributions.
(277) There are no simple answers as to whether the Net is a place to be deliberate, to commit to life, and live without resignation. But these are good terms with which to start a conversation. That conversation would have us ask if these are the values by which we want to judge our lives. If they are, and if we are living in a technological culture that does not support them, how can that culture be rebuilt to specifications that respect what we treasure—our sacred spaces? Could we, for example, build a Net that reweights privacy concerns, acknowledging that these, as much as information, are central to democratic life?
(277) The phrase “sacred spaces” became important to me in the 1980s when I studied a cohort of scientists, engineers, and designers newly immersed in simulation. Members of each group held certain aspects of their professional life to be inviolate. These were places they wanted to hold apart from simulation because, in that space, they felt most fully themselves in their discipline. For architects, it was hand drawing. . . . A sacred space is not a place to hide out. It is a place where we recognize ourselves and our commitments.
Asking Thoreau questions about lives on the screen, where do we live, and what do we live for?
(277) When Thoreau considered “where I live and what I live for,” he tied together location and values. Where we live doesn't just change how we live; it informs who we become. Most recently, technology promises us lives on the screen. What values, Thoreau would ask, follow from this new location? Immersed in simulation, where do we live, and what do we live for?
Recounts the struggle computing pioneers had to come up with uses for personal computers, suggesting instead that humans have become the killer app for keeping them busy; the no less profound opposite of that deep truth is that our time online is busywork.
Some of the most brilliant computer scientists in the world—such pioneers of information processing and artificial intelligence as Robert Fano, J.C.R. Licklider, Marvin Minsky, and Seymour Papert—were asked to brainstorm on the question [what everyday people would do with home computers]. My notes from this meeting show suggestions on tax preparation and teaching children to program. No one thought that anyone except academics would really want to write on computers. Several people suggested a calendar; others thought that was a dumb idea. There would be games.
(279-280) Now we know that once computers connected us with each other, once we became tethered to the network, we really didn't need to keep computers busy. They keep us busy. It is as though we have become their killer app. . . . Niels Bohr suggests that the opposite of a “deep truth” is a truth no less profound. As we contemplate online life, it helps to keep this in mind.
Postfamilial families assemble alone together with their devices.
(280) The ties we form through the Internet are not, in the end, the ties that bind. But they are the ties that preoccupy.
(280-281) In the evening, when sensibilities such as these come together, they are likely to form what have been called “postfamilial families.” Their members are alone together, each in their own rooms, each on a networked computer or mobile device.
Technology, to express its unconscious, wants to be a symptom.
(282) Kevin Kelly asks, “What does technology want?” and insists that, whatever it is, technology is going to get it. Accepting his premise, what if one of the things technology wants is to exploit our disappointments and emotional vulnerabilities? When this is what technology wants, it wants to be a symptom.
SYMPTOMS AND DREAMS
Idea of robotic companion serves as symptom exploiting disappointments with other humans, and dream for relationships we can control; connect to Descartes automaton.
(283) As we live the flowering of connectivity culture, we dream of sociable robots. Lonely despite our connections, we send ourselves a technological Valentine. If online life is harsh and judgmental, the robot will always be on our side. The idea of a robot companion serves as both symptom and dream. . . . As dreams, robots reveal our wish for relationships we can control.
Psychoanalytic approach to technology, noting that all creativity has a cost and that we get into trouble by thinking invention will solve everything; self-reflectively disturbing the field for long-term gain frees us from unbending narratives of optimism or despair.
psychoanalytic tradition teaches that all creativity has a cost, a caution that applies to psychoanalysis itself. . . . A parallel with technology is clear: we transgress not because we try to build the new but because we don't allow ourselves to consider what it disrupts or diminishes. We are not in trouble because of invention but because we think it will solve everything.
(284) A successful analysis disturbs the field in the interest of long-term gain; it learns to repair along the way. One moves forward in a chastened, self-reflective spirit. Acknowledging limits, stopping to make the corrections, doubling back—these are at the heart of the ethic of psychoanalysis. A similar approach to technology frees us from unbending narratives of technological optimism or despair.
Conventional wisdom dangerously inadequate, taking performance of emotion by caring machines as emotion enough.
(286) Indeed, roboticists want us to know that the point of affective machines is that they will take care of us. This narrative—that we are on our way to being tended by “caring” machines—is now cited as conventional wisdom. We have entered a realm in which conventional wisdom, always inadequate, is dangerously inadequate. That it has become so commonplace reveals our willingness to take the performance of emotion as emotion enough.
Addiction to habits of mind technology allows us to practice.
(288) Sometimes people try to make life with others resemble simulation. They try to heighten real-life drama or control those around them. It would be fair to say that such efforts do not often end well. Then, in failure, many are tempted to return to what they do well: living their lives on the screen. If there is an addiction here, it is not to a technology. It is to the habits of mind that technology allows us to practice.
Slowness practices and other means of seeking solitude, which is the ability to summon yourself by yourself, seen as a backlash to social media like the 1980s romantic reaction against computation as a model of mind.
(288-289) Loneliness is failed solitude. To experience solitude you must be able to summon yourself by yourself; otherwise, you will only know how to be lonely. . . . I see the beginnings of a backlash as some young people become disillusioned with social media. There is, too, the renewed interest in yoga, Eastern religions, meditating, and "slowness."
(289) These new practices bear a family resemblance to what I have described as the romantic reaction of the 1980s.
(289) The romantic reaction of the 1980s made a statement about computation as a model of mind; today we struggle with who we have become in the presence of computers.
Appiah challenges quandary thinking: moral reasoning is accomplished by questioning how quandaries are posed.
(291) For [Kwame Anthony] Appiah, moral reasoning is best accomplished not by responding to quandaries but by questioning how they are posed, continually reminding ourselves that we are the ones choosing how to frame things.
Realtechnik is skeptical about linear progress, encouraging humility and recognizing that the Net is still immature and correctable.
skeptical about linear progress. It encourages humility, a state of mind in which we are most open to facing problems and reconsidering decisions. It helps us acknowledge costs and recognize the things we
(294) Because we grew up with the Net, we assume that the Net is grown-up. We tend to see it as a technology in its maturity. But in fact, we are in early days. There is time to make the corrections.
At the center of a perfect storm, we are tempted by sociable robots to complete the arc begun by overwhelming social media technologies, leading not only to Chun's programmed visions but to programmed emotions: expectations of simplified and reduced relationships with each other.
(295) The narrative of Alone Together describes an arc: we expect more from technology and less from each other. This puts us at the still center of a perfect storm. Overwhelmed, we have been drawn to connections that seem low risk and always at hand: Facebook friends, avatars, IRC chat partners. If convenience and control continue to be our priorities, we shall be tempted by sociable robots, where like gamblers at their slot machines, we are promised excitement programmed in, just enough to keep us in the game. At the robotic moment, we have to be concerned that the simplification and reduction of relationship is no longer something we complain about. It may become what we expect, even desire.
Her solution is to pursue reclaiming good manners, privacy, and concentration, putting us at war with ourselves due to synaptogenesis.
(296) We will begin with very simple things. Some will seem like just reclaiming good manners. Talk to colleagues down the hall, no cell phones at dinner, on the playground, in the car, or in company. There will be more complicated things: to name only one, nascent efforts to reclaim privacy would be supported across the generations. . . . We now know that our brains are rewired every time we use a phone to search or surf or multitask. As we try to reclaim our concentration, we are literally at war with ourselves.
Needs served by MyLifeBits recall Sontag on photography and Derrida on archive fever: what becomes of recollection in the fully archived life, and does life become a strategy for supplying content to an archive?
(300) MyLifeBits is a way for people to “tell their life stories to their descendants.” His [Gordon Bell] program aspires to be the ultimate tool for life collection. But what of recollection in the fully archived life? If technology remembers for us, will we remember less? Will we approach our own lives from a greater distance? Bell talks about how satisfying it is to “get rid” of memories, to get them into the computer. Speaking of photography, Susan Sontag writes that under its influence, “travel becomes a strategy for accumulating photographs.” In digital culture, does life become a strategy for establishing an archive?
Following Kelly, technology wants to ponder our memories as well as be a symptom.
(304) The memex and MyLifeBits both grew out of the idea that technology has developed capacities that should be put to use. There is an implied compact with technology in which we agree not to waste its potential. Kevin Kelly reframes this understanding in language that gives technology even greater volition: as technology develops, it shows us what it “wants.” To live peacefully with technology, we must do our best to accommodate these wants. By this logic, it would seem that right now, one of the things technology “wants” to do is ponder our memories.
Turkle, Sherry. Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic Books, 2011. Print.