
The one thing A.I. needs is the one thing we likely cannot program

Silicon Valley might just be missing the most important aspect of being human: the ability to feel.

Bacteria are likely not the first thing to come to mind when contemplating human cultures. Yet biologists cultivate bacteria in a culture in order to watch the community of bacteria interact. This is more than a simple analogy or clever wordplay. In fact, this idea forms the basis of neuroscientist Antonio Damasio’s latest book, The Strange Order of Things: Life, Feeling, and the Making of Cultures.


Unicellular organisms rely on chemical molecules to sense and respond to their environments. They need to know where to seek nutrients and when to flee from danger. This begins with an ability to feel their way around their surroundings; Damasio argues that humans practice this all the time. Feelings form the basis of consciousness, an essential step in the construction of cultures.

Feelings are also how we know when something is wrong inside the body, which is less one unified system than a complex network of interacting systems. Indigestion likely won’t cause my foot to hurt, and stubbing my toe is not going to make me massage my forearm. The feeling that something is wrong signals a system out of balance; conscious attention then seeks the problem locally. Damasio continues:

Feelings are the mental expressions of homeostasis, while homeostasis, acting under the cover of feeling, is the functional thread that links early life-forms to the extraordinary partnership of bodies and nervous systems. This partnership is responsible for the emergence of conscious, feeling minds that are, in turn, responsible for what is most distinctive about humanity: cultures and civilizations.

Bacterial cultures are not that dissimilar from those created by humans. Bacteria navigate dangerous terrain and compete with other groups for resources. Most importantly, each bacterium needs to cooperate with its neighbors to ensure the success of the group. Estimates of the ratio of bacterial cells to human cells in our bodies range from ten to one down to roughly one to one, so it makes sense that we’d act in a manner similar to bacterial cultures, considering they comprise a sizable portion of what we are.

While Damasio’s thesis on the formation of cultures is fascinating, it also raises another question. If feelings underlie what we term consciousness, and a specific form of consciousness is required to create intelligence, is artificial intelligence possible if we can’t replicate the feelings necessary for its formation? Or will the “artificial” aspect always mean we’ve created a computer simulation devoid of true expression?


Image: Renaud Meyer, United Nations Development Programme (UNDP) Country Director for Nepal, introduces the humanoid robot Sophia at a conference on using technology for public services in Kathmandu on March 21, 2018. Sophia, created by Hanson Robotics, was named by UNDP as its first non-human Innovation Champion in November 2017. (Photo by Prakash Mathema/AFP/Getty Images)

This scenario has not escaped futurists or philosophers. In a recent New York Times editorial on the HBO series Westworld, Paul Bloom and Sam Harris note the emotional investment of viewers watching robots get murdered and raped over and over again. Indeed, watching visitors find glee in these acts is one of the more disturbing facets of the show, a sort of transhumanist Stanford Prison Experiment. As Bloom and Harris ask, as we create robots that are ever closer approximations of humans, will we program in a moral code, in us as much as in them?

After all, if we do manage to create machines as smart as or smarter than we are—and, more important, machines that can feel—it’s hardly clear that it would be ethical for us to use them to do our bidding, even if they were programmed to enjoy such drudgery. The notion of genetically engineering a race of willing slaves is a standard trope of science fiction, wherein humankind is revealed to have done something terrible. Why would the production of sentient robot slaves be any different?

The type of robot we’re discussing matters. Damasio resists a popular metaphor originating in science fiction and now spread across popular culture: the notion you can “download” consciousness into a machine. He continues:

It reveals a limited notion of what life really is and also betrays a lack of understanding of the conditions under which real humans construct mental experiences.

Mental experiences do not result from brains alone, but from the interaction of brains and bodies, as well as the feedback those bodies receive from their environment. Without downloading the body too, scientists cannot replicate consciousness in any way similar to what we experience.

Of course, in Westworld there are plenty of manufactured bodies that bleed and fight and are raped. That viewers find this disturbing is not surprising; it’s akin to breeding dogs to slip into handbags. We display empathy for other sentient beings, and if non-sentient beings look and act like us, the leap to feeling for them is not far, even if they cannot feel for us. A dog is truly happy to see us. Dolores Abernathy sure can fake it, which might be enough.

Creating an emotional experience in a robot, while great fodder for the imagination, is still beyond us. Reverse engineering the very skill bacteria mastered billions of years ago will not be without its challenges. Consciousness is an emergent phenomenon. It results from the interaction of numerous systems, and it may, as neuroscientist Michael Gazzaniga argues in his new book, be part of a layered system: not one consciousness, but several, depending on which system takes control at any given moment.

Damasio is equally skeptical of championing one form of consciousness as representative of the species. He concludes:

There is plenty of evidence that artificial organisms can be designed so as to operate intelligently and even surpass the intelligence of human organisms. But there is no evidence that such artificial organisms, designed for the sole purpose of being intelligent, can generate feelings just because they are behaving intelligently.

How one defines intelligence will be of primary importance. As Gazzaniga notes, the human brain may have evolved to control our motor systems. The neuroscientist Rodolfo Llinas argues the same; he speculates that thinking might well be the symbolic internalization of movement, as thoughts fire motor neurons. We are, first and foremost, animals that move, and that movement is dependent upon feeling our way around our environment.

Without that ability to feel, it is unlikely A.I. will ever truly mimic human beings. Its intelligence may far surpass our organic computing power, but life is not a series of algorithms. The terrain we live within is meant to be grappled with and seduced. Missing that primary skill, an A.I. is hard to imagine ever being truly functional.

That’s not to claim that someone won’t figure it out. It’s just a reminder—an important one in a tech culture that generally loathes the body and wishes to transcend it at every turn—that without incorporating the fundamental importance of physicality, we’re unlikely to ever cross that gap.

Stay in touch with Derek on Facebook and Twitter.

