How to be a good parent to artificial intelligence

Human values evolve. So how will we raise virtuous A.I.s?

BEN GOERTZEL: The way I've instilled my four human children with the human values that I prefer is mostly not by preaching at them about what's right and what's wrong. That's not very effective, especially for people with a contrarian personality, which somehow all my kids ended up with, I don't know how. So the way I inculcated in them some approximation of the values that are important to me is just by spending time with them in various situations. If your children enter into various situations with you and see how you respond to various things, and how you guide them to respond to various things in real life, then your kids pick it up sort of by osmosis. I mean, they pick it up through their desire to imitate and to learn and to follow; they pick up the practicalities of your values and your culture. And, interestingly, this can stick with them on an implicit level even if many values change on the surface. One of my sons became a Sufi Muslim at one point, and some of his values are, on the surface, very different from mine. I'm not Muslim; I'm not religious in any conventional sense. On the other hand, he's very compassionate, a kind-hearted person, he's very intellectual, he's a scholar. So if you look at the practical level, the vast bulk of the values and culture he got from me and his mom while he was growing up is implicit rather than a list of rules, but it's all still there.

For an A.I., I think we need to take an approach somewhat similar to what we do with human children: we need to have A.I.s working and playing side by side with us in real situations. We need to give the A.I.s a desire to imitate us on a basic level, and to understand why we react the way we do in one concrete situation after another. Then the A.I. can build a practical model of our values and our culture as it's manifested in a hundred thousand or a million real-life situations. This doesn't guarantee the A.I. will always respond the way we want, but it gives it a real foundation, which you're not going to get from handing it a list of, say, the three or ten laws of being a good human or a good human-like mind. An important thing to remember when you talk about getting human values into an A.I. is that human values are very much a moving target. Much of what we do in our everyday lives would have been considered horrendously immoral by the average human of the Middle Ages in Europe. For that matter, a lot of the things we take for granted as moral and right now were considered horrendously immoral by almost everyone I went to elementary school with in the 1970s in suburban New Jersey. My mom is gay, and because of that, in the 1970s we got our car turned over and all the windows of our house smashed in; that was completely unacceptable in 1970s Southern New Jersey. Now gay marriage is being legalized everywhere.

Human values even in our lifetime have changed a lot. So when you get to a technological singularity, with brain-computer interfacing and mind uploading and superhuman A.I.s, and we're able to clone our bodies over and over again, our values are not going to remain precisely the same as they are now, even for us humans. It's absurd to think we can start an A.I. with our current human values and it's going to stick with our 2018 human values forever; our own values are going to drift. So all we can ask is that the A.I. starts out with the values we hold dear now, and that as its values grow and evolve, this growth and evolution is coupled with our own growth and evolution, whose direction we cannot foresee at this time.

  • Until we can design a mind that's superhuman and flawless, we'll have to settle for instilling plain old human values into artificial intelligence. But how do we do that in a world where values are constantly evolving?
  • Many of our life choices today would be considered immoral by people in the Middle Ages — or even the 1970s, says Ben Goertzel, whose family personally experienced the sad state of LGBTQ acceptance in Southern New Jersey 50 years ago.
  • Raising an A.I. is a lot like raising kids, says Goertzel. Kids don't learn best from a list of rules, but from lived experience – watching and imitating their parents. A.I.s and humans will have to play and learn side by side, and evolve together as values adapt toward an increasingly technological future.

