
“Can’t Buy Me Love?” How About for Your Parents?

If your mother is elderly, requires 24-hour attention, and has Alzheimer's, would you care for her yourself at home, hire a nurse, or move her into a nursing home? These are questions that plague many couples who must decide how to care for an aging parent. Now there's a new option to complicate the mix: would you use a robot?


To appreciate the question "would you use a robot?" you first have to stop thinking of robots as the intelligent, emotionally sensitive, and witty creations of movies like WALL-E and Transformers. Robotics will improve dramatically in the 21st century, but the road from Roomba, the self-directed vacuum cleaner, to WALL-E is long, and society will have to grapple with many ethical dilemmas along the way. Take the $6,000 robot Paro, for instance. A big fluffy white seal that nods its head and makes baby-seal noises when you touch it, Paro has spurred quite a controversy since some nursing homes in the US imported it from Japan to provide emotional comfort to patients.

Paro is a robot with very narrow artificial intelligence: it has built-in sensors for sound, light, and touch, and it is programmed to "respond" to human contact, which it does consistently and well. The Wall Street Journal reports that evidence for its effectiveness has been mixed: some patients with dementia and other age-related ailments have found Paro comforting, while others have not responded to it at all. If you gave it to your elderly mother, and she seemed happy to have it around to stroke and snuggle, would you buy it for her?
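
To make concrete what "very narrow" means here, the toy sketch below is purely illustrative (Paro's actual control software is not public, and the event names and behaviors are invented for this example): a fixed lookup from sensor events to canned behaviors, with nothing resembling understanding or emotion behind it.

```python
# Illustrative toy only -- not Paro's real software.
# "Narrow AI" in this sense is just a fixed mapping from sensor events
# to pre-programmed behaviors; the event names here are hypothetical.

RESPONSES = {
    "stroke": "nuzzle and make soft seal noises",
    "loud_noise": "turn head toward the sound",
    "bright_light": "blink and open eyes wider",
}

def react(sensor_event: str) -> str:
    """Return the canned behavior for a sensor event.

    There is no emotion or comprehension here: an unrecognized event
    simply falls back to a neutral idle behavior.
    """
    return RESPONSES.get(sensor_event, "sit quietly")

if __name__ == "__main__":
    for event in ["stroke", "loud_noise", "something_unexpected"]:
        print(event, "->", react(event))
```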

For most people, the instinctive reaction is: "Why not? It's just another toy." But on deeper reflection, one appreciates the robo-ethicists' argument that the issue deserves closer attention. For example, is it ethical for society to delegate to robots its responsibility to nurture the elderly and provide emotional support? What are the feedback effects on people of forming emotional bonds with robots? And aren't you deceiving your loved ones by placing them in a situation where they believe they have an emotional bond with an object that has no emotions and is simply programmed to respond a certain way?

Sherry Turkle of MIT has been studying the consequences of our interactions with robots for several years and strongly believes that we need a national debate on how we incorporate robots of any level of intelligence into our society. Yes, that includes soft, cuddly seals with sweet wide eyes.

Ayesha and Parag Khanna explore human-technology co-evolution and its implications for society, business and politics at The Hybrid Reality Institute.

