A.I. will bring a series of social and financial changes, and it will force us to confront a problem we've been avoiding for much too long, says Joscha Bach.
To know whether or not we should fear A.I., we first have to understand how it will behave in the world. Cognitive scientist Joscha Bach believes A.I. has the potential to mistreat humans, but no worse than big corporations already do. The future won't be filled with Roombas and anthropomorphized house-help robots, he says, so a physical threat is not the main concern. A.I. will take the form of intelligent systems that operate as corporations, and they will adopt the ethics of whatever company builds them. "If we want to have these systems built in such a way that they treat us nicely you have to start right now. And it seems to be a very hard problem to do so," Bach says.

And yet he appears to be optimistic about society's other main A.I. fear: job automation. He frames it like this: if a job means selling the best years of your life to a corporation, then automating as many manual tasks as possible is really a release from that contract. But how will we afford to live, and what will we do with our days? Many point to Universal Basic Income, but Bach sees it a little differently: mass public employment. Pay people to be good humans: good at teaching and at raising their children. Pay them to be good scientists, good philosophers, good architects and chefs, the things that make us most human.

Job automation will also force us to confront one of our most difficult and uncomfortable problems: we are living in an age of abundance, yet we fail to distribute resources so that everyone can live a decent life. "It might turn out to be a very good thing if you are forced... to address this problem," he says. Joscha Bach's latest book is Principles of Synthetic Intelligence.
AI won't really resemble humanoids. Instead, AI will take the form of vast systems built and operated by countries and large companies.
Can AI dream? Can it love? Can it "think" in the same way we do? The short answer is: no. AI doesn't need to bog itself down with simple human concerns like love or dreams or fear. The AI brain operates at a much grander scale first and then works backwards toward more human ways of thinking. Joscha Bach suggests that rather than humanoid robots, we are more likely to see AI super-brains developed by countries and large companies. Imagine a computer brain designed to keep the stock market balanced, or one that detects earthquakes an ocean away and sounds alarms on our shores... that sort of thing.
It's a big concept to wrap our human heads around. But as AI technology develops and grows by the day, it is important to understand where the technology is headed. Think less Rosie The Robot Maid from The Jetsons and more the computer from War Games.
Is our existence base reality—or are we pawns in a matrix? Cognitive scientist Joscha Bach explains how we might be able to tell.
Are we living in a video game? If so, the joke is on us, says cognitive scientist Joscha Bach. When people debate the possibility that human existence is a simulation, it's predominantly assumed that we are the players. Our overlord simulators are watching us, right? Well, that doesn't seem to square with the amount of detail present in our world and the observable universe beyond. Why did our cosmic creators bother to code trillions of galaxies into the viewfinders of our telescopes? The Higgs boson, for example, is not necessary for our existence, so who would have taken the time to add such irrelevant frills just for our amusement (maybe the simulators had a really great intern that summer)? The answer? It's not made for us. According to Bach, if this is a simulation, it's unlikely that we are the main attraction; it's much more realistic that the simulators wanted to build a model of a universe to explore hypothetical physics. That tiny blue dot with primates mixing concrete all over its surface? "We are just a random side effect or an artifact of the fact that evolution is possible in this universe," says Bach.