Emotional Fantasy: AI Can Pretend to Love Us, but Should We Love It Back?
Creators of artificial intelligence measure how well machines can imitate human qualities like empathy, listening, affirmation, and love. Don't reciprocate, says Sherry Turkle.
Sherry Turkle: I have very strong feelings about a future in which robots become the kind of conversational agents that pretend to have emotional lives. Shortly after I finished Reclaiming Conversation, I was interviewed for an article in the New York Times about Hello Barbie. So Hello Barbie comes out of the box and says, and now I'm just paraphrasing the gist: "Hi, I'm Hello Barbie. I have a sister. You have a sister. I kind of hate my sister. I'm jealous of your sister. Do you hate your sister? Let's talk about how we feel about our sisters." In other words, it just kind of knows stuff about you and is ready to talk about the arc of a human life and sibling rivalry as though it had a life, a mother, feelings of jealousy about a sister, and was ready to relate to you on that basis. And it doesn't. It's teaching pretend empathy. It's asking you to relate to an object that has pretend empathy.
And this is really not, in my view, a good direction for AI to go. There are so many wonderful things for robots to do. Having pretend empathy, having pretend conversations about caring and love and things that a robot can feel about its body and its life and its mother and its sisters, gets children, and gets elders, the other target group for these robots, into a kind of fantasy miasma that is not good for anybody. Children don't need to learn pretend empathy; they need to learn real empathy, which they get from having real conversations with real people who do have sisters, who do have mothers. And I think this is a very dangerous and indeed very toxic direction.
We worry so much about whether we can get people to talk to robots. You know, can you get a child to talk to this Hello Barbie? Can you get an elderly person to talk to a sociable robot? But what about who's listening? There's nobody listening. These robots don't know how to listen and understand what you're saying. They know how to respond. They're programmed to make something of what you say and respond, but they don't know what it means if you say, "My sister makes me feel depressed because she's more beautiful than I am, and I feel that my mother loves her more." That robot really does not do anything useful for you with that information. That's not empathy. And children need to know that they're being heard by a human being who can do this empathy game with them, this empathy dance with them. So in the middle of a time when we're having this crisis in empathy, to imagine that now we're going to throw in some robots that will do some pretend empathy, I have to say that in all the optimism of my book, this is the pessimistic part, and I really end the book with a kind of call to arms. I call it "What do we forget when we talk to machines?" And I mean it to be literally a call to arms: this is not a good direction. We don't need to take this direction; we just need to not buy these products. This doesn't take a social revolution, this just takes consumers saying that they're not going to buy these products, that they're not bringing them into their homes.
A professor at the Massachusetts Institute of Technology, Sherry Turkle is constantly questioning the role that technology plays in our lives. From personal computers and medical technology to children's toys that now include sophisticated artificial intelligence, the pace of technological progress has accelerated rapidly over the last several decades. But as has often been the case in the past, our emotional and ethical progress lags substantially behind the advance of technology, and this is what principally concerns Turkle.
As we devote fewer hours of the day to face-to-face human interactions, sometimes substituting an online social experience, are we adversely affecting our deep evolutionary need to be social — to be an integral part of a real human community? Creators of artificial intelligence measure its effectiveness against how well human qualities like empathy, listening, affirmation, and love can be imitated. The famous "Turing Test," a contest which tests a person's ability to distinguish between having a conversation with another human and a conversation with a machine, is merely one example.
But there is something deeper to empathizing, listening, and loving than giving the convincing appearance of having emotion. Indeed the risks of living in an emotional fantasy, where we believe we have genuine relationships with machines that necessarily lack worldly human experience, outweigh the potential benefits, says Turkle. Here she issues a "call to arms" against giving our emotional lives over to robots.