Reaching the Singularity: It’s More Complicated Than We Think
When will humanity reach the Singularity, that now-famous point in time when artificial intelligence becomes greater than human intelligence? It is aptly called the Singularity by proponents like Ray Kurzweil: like the singularity at the center of a black hole, we have no idea what happens once we reach it. However, the debate today is not what happens after the Singularity, but when it will happen. Presumably, if we better anticipate its timeline, we will carve a path that makes the Singularity era most beneficial to our species.
In light of the Singularity Summit approaching this week in New York, Microsoft co-founder Paul Allen and his colleague Mark Greaves have written a cautionary article in MIT Technology Review regarding the optimistic timeline of 2045 set by Kurzweil. Kurzweil, the father of the Singularity movement, bases his predictions on the Law of Accelerating Returns: technologies become cheaper and their computational power increases at an exponential rate. Thus, he argues, artificial intelligence will also improve exponentially in its ability to mimic the brain, eventually enabling us to create an artificial copy of our "soul" and live forever in robotic bodies that neither age nor die.
But Allen and Greaves challenge both the assumption that computational power will continue to increase exponentially and, more importantly, the claim that such a condition implies an exponential increase in our ability to understand, mimic and surpass the human brain. The authors underscore the complexity of the brain: each time scientists have seen more of it with better imaging, they have been thrown off by new intricacies in its workings, the result of millions of years of evolution. Even if we one day perfectly mimic the structural connections and biology of its neurons, we will not automatically understand all the processes of the brain. "If we wanted to build software to simulate a bird's ability to fly in various conditions," they write, "simply having a complete diagram of bird anatomy isn't sufficient. To fully simulate the flight of an actual bird, we also need to know how everything functions together." This "complexity brake" will significantly slow our progress toward the Singularity unless we have the kind of rare and unexpected insights that change the course of scientific progress.
Allen and Greaves' points of caution should be noted. At the same time, it is imperative that we also devote attention to the "transition" period leading to the Singularity: the decades-long age of deepening human-machine interdependence and how we as a society must come to terms with it. We call this transition phase our new Hybrid Reality.
Ayesha and Parag Khanna are Directors of the Hybrid Reality Institute