AI Is Evolving on Its Own. Does That Make It Dangerous?

Philosopher Daniel Dennett believes AI should never become conscious — and no, it's not because of the robopocalypse.

Daniel C. Dennett: I think a lot of people just assume that the way to make AIs more intelligent is to make them more human. But I think that's a very dubious assumption.

We're much better off with tools than with colleagues. We can make tools that are smart as the dickens, use them, and understand what their limitations are without giving them ulterior motives, purposes, a drive to exist and to compete and to beat the others. Those are features that don't play any crucial role in the competences of artificial intelligence, so for heaven's sake don't bother putting them in.

Leave all that out, and what we have is very smart “thingies” that we can treat like slaves, and it's quite all right to treat them as slaves because they don't have feelings; they're not conscious. You can turn them off; you can tear them apart the same way you can with an automobile, and that's the way we should keep it.

Now that we're in the age of intelligent design—lots of intelligent designers around—a lot of them are intelligent enough to realize that Orgel's Second Rule is true: "Evolution is cleverer than you are." That's Francis Crick’s famous quip. And so what they're doing is harnessing evolutionary processes to do the heavy lifting without human help. So we have all these deep learning systems, and they come in varieties: there are Bayesian networks, reinforcement learning of various sorts, deep learning neural networks… And what these computer systems have in common is that they are competent without comprehension. Google Translate doesn't know what it's talking about when it translates a bit of Turkish into a bit of English. It doesn't have to. It's not as good as the translation that a bilingual can do, but it's good enough for most purposes.
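To make "competence without comprehension" concrete, here is a minimal, self-contained sketch of the kind of evolutionary search Dennett is gesturing at (plain Python; the target phrase and all parameters are illustrative choices, not anything from the interview). Random variation plus selection does the heavy lifting, and the winning string ends up matching the target even though nothing in the process understands what the phrase means.

```python
import random

TARGET = "competence without comprehension"   # illustrative target, not from the source
ALPHABET = "abcdefghijklmnopqrstuvwxyz "
POP_SIZE = 200
MUTATION_RATE = 0.02

def fitness(candidate: str) -> int:
    """Score a candidate by how many characters already match the target."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate: str) -> str:
    """Copy a candidate, flipping each character with a small probability."""
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
        for c in candidate
    )

# Start from pure noise: no designer writes the answer in anywhere.
population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(POP_SIZE)]

for generation in range(5_000):
    population.sort(key=fitness, reverse=True)
    best = population[0]
    if best == TARGET:
        print(f"Matched the target in generation {generation}: {best!r}")
        break
    # Selection: the fitter half survives and breeds mutated copies; the rest is discarded.
    survivors = population[: POP_SIZE // 2]
    population = survivors + [mutate(random.choice(survivors)) for _ in survivors]
else:
    print(f"Best after 5,000 generations: {best!r} (fitness {fitness(best)})")
```

The selection rule is trivially simple and blind, yet it reliably produces a "competent" result; the same logic, scaled up enormously, is what lets modern learning systems find solutions their designers never wrote down.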

And what's happening in many fields in this new wave of AI is the creation of systems, black boxes, where you know that the probability of getting the right answer is very high; they are extremely good, they're better than human beings at churning through the data and coming up with the right answer. But they don't understand how they do it. Nobody understands in detail how they do it and nobody has to. 

So we've created entities which are as inscrutable to us as a bird or a mammal, considered as a collection of cells, is inscrutable; there's still a lot we don't understand about what makes them tick.

But these entities, instead of being excellent flyers or fish catchers or whatever, are excellent pattern detectors, excellent statistical analysts. We can use these products, these intellectual products, without knowing quite how they're generated, but having good, responsible reasons for believing that they will generate the truth most of the time.
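A rough sketch of what those "good, responsible reasons" can look like in practice: treat the system purely as a black box, query it on held-out examples, and estimate how often it gets the right answer, with a margin of error. The black_box function below is an invented stand-in for an opaque learned model; any classifier or translator whose internals we never inspect could take its place.

```python
import math
import random

def black_box(x: float) -> int:
    """Invented stand-in for an opaque learned model: we can query it, not inspect it."""
    return int(x + random.gauss(0.0, 0.3) > 0.5)

def true_label(x: float) -> int:
    """Ground truth for the held-out test inputs."""
    return int(x > 0.5)

# Held-out evaluation: query the black box on fresh inputs and count agreements.
N = 10_000
hits = sum(black_box(x) == true_label(x) for x in (random.random() for _ in range(N)))
accuracy = hits / N

# Normal-approximation 95% confidence interval for the estimated accuracy.
stderr = math.sqrt(accuracy * (1 - accuracy) / N)
low, high = accuracy - 1.96 * stderr, accuracy + 1.96 * stderr

print(f"Estimated accuracy: {accuracy:.3f} (95% CI about {low:.3f} to {high:.3f})")
```

The justification for trusting the system is statistical, not mechanistic: we never learn how it works, only how often it is right.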

No existing computer system, no matter how good it is at answering questions like Watson on Jeopardy or at categorizing pictures, is conscious today, not close. And although I think it's possible in principle to make a conscious android, a conscious robot, I don't think it's desirable; I don't think there would be great benefits to doing this; and there would be some significant harms and dangers too.

You could do it, at tremendous expense, but you'd have to have, in fact, quite a revolution in computer design, one that would take you right down to the very base of the hardware.

If consciousness is ours to give, should we give it to AI? This is the question on the mind of the very sentient Daniel Dennett. The emerging trend in AI and AGI is to humanize our robot creations: they look ever more like us, emote as we do, and even imitate our flaws through machine learning. None of this makes the AI smarter, only more marketable. Dennett suggests remembering what AIs are: tools and systems built to organize our information and streamline our societies. He has no hesitation in saying that they are slaves built for us, and that we can treat them as such because they have no feelings. Even if we eventually understand consciousness well enough to install it in a robot, doing so would be unwise; it won't make them more intelligent, he says, only more anxious. Daniel Dennett's most recent book is From Bacteria to Bach and Back: The Evolution of Minds.

