The Future of Computing Power (Fast, Cheap, and Invisible)

In just a few years, basic microchips will be so cheap they could be built into virtually every product that we buy, creating an invisible intelligent network that’s hidden in our walls, our furniture, and even our clothing.

Back in the 1960s, the IBM 1401 was both state of the art and an engineering marvel. It filled an entire room, weighed thousands of pounds, and in today's money would cost about $1.5 million. The old 1401 could perform just over 4,000 calculations per second, and at the time it was virtually unmatched. Nowadays, the average mobile phone carries a microchip about the size of your fingernail that can perform about 1 billion calculations per second.
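
To put those two machines side by side, here is a quick back-of-the-envelope sketch in Python; the figures are the round numbers quoted above, not precise benchmarks.

```python
# Back-of-the-envelope comparison using the round figures quoted above.
ibm_1401_ops_per_sec = 4_000         # IBM 1401, 1960s room-sized mainframe
phone_ops_per_sec = 1_000_000_000    # a typical modern mobile phone
ibm_1401_cost_usd = 1_500_000        # 1401 price tag in today's dollars

speedup = phone_ops_per_sec / ibm_1401_ops_per_sec
print(f"speedup: {speedup:,.0f}x")                       # -> speedup: 250,000x

cost_per_op = ibm_1401_cost_usd / ibm_1401_ops_per_sec
print(f"1401 cost per (calc/sec): ${cost_per_op:,.2f}")  # -> $375.00
```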


Here are a few famous predictions about computers that will surely go down in history.

  • Thomas Watson of IBM said in 1943 that he believed there was a world market for maybe five computers.
  • Popular Mechanics predicted in 1949 that computers in the future might weigh no more than 1.5 tons.
  • In 1968, an engineer at IBM's Advanced Computing Systems Division, commenting on the microchip, asked, "But what...is it good for?"
  • Steve Jobs, on trying to get Atari and HP interested in his and Steve Wozniak's personal computer: "So we went to Atari and said, 'Hey, we've got this amazing thing, even built with some of your parts, and what do you think about funding us? Or we'll give it to you because we just want to do it. Pay our salary and we'll come work for you.' And they said, 'No.' So then we went to Hewlett-Packard, and they said, 'Hey, we don't need you. You haven't got through college yet.'"
  • Ken Olsen, president of Digital Equipment Corporation, said in 1977 that there is no reason for any individual to have a computer in their home.

    The exponential growth of computing power is utterly astonishing, and it will profoundly reshape human civilization. The most spectacular thing about it is that our generation sits smack dab in the middle of these transitions. By the year 2020, a chip with today's processing power will cost about a penny, the price of the scrap paper we throw in the garbage. Children will look back and wonder how we could possibly have lived in such a meager world, much as we wonder how our own parents managed without the luxuries, like the cell phone and the Internet, that we all seem to take for granted. Our world is already much, much smarter than it was 10 years ago, and as computing power doubles every 18 months, it is propelling us toward a radically different future.
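
To make the arithmetic behind that penny-chip claim concrete, here is a minimal sketch, assuming price-performance simply doubles every 18 months (the period quoted above), so the cost of a fixed amount of processing power halves on the same schedule; the $100 starting price is an arbitrary round number, not a figure from the text.

```python
# Illustrative only: if computing power doubles every 18 months at constant
# cost, the price of a fixed amount of processing power halves on the same
# schedule. The $100 starting price is an arbitrary round number.

def chip_cost(initial_cost_usd: float, years: float,
              doubling_period_years: float = 1.5) -> float:
    """Projected cost of a fixed amount of computing power after `years`."""
    halvings = years / doubling_period_years
    return initial_cost_usd / (2 ** halvings)

for years in (0, 10, 20):
    print(f"after {years:2d} years: ${chip_cost(100, years):,.2f}")
# after  0 years: $100.00
# after 10 years: $0.98
# after 20 years: $0.01
```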

    Engineers are already designing driverless cars that rely on the Global Positioning System (GPS) and laser sensors to avoid obstacles autonomously. Some driverless car prototypes, already functional to an extent, achieve these feats with the processing power of roughly 8-10 ordinary desktop computers. I actually had a chance to ride in one of these autonomous vehicles in North Carolina while filming with the BBC.

    Similarly, in the not-too-distant future, our streets and highways will contain embedded computer chips that manage traffic, turning what once seemed a futuristic fantasy into reality. (If you've ever seen the movie "Minority Report," set in the year 2054, you'll remember the cars that simply drove themselves, allowing Tom Cruise to multitask without having to watch the road.) All the vehicles on the road will essentially speak with one another, and I believe that in the future the words "traffic accident" and "traffic jam" will simply disappear from the English language.

    By 2020, computer intelligence will be everywhere: not just in the cars and the roads, but in practically every object you see around you. The past three decades alone have brought a tremendous amount of change in this regard. Microprocessors have, in essence, been around since the early 1970s, but it wasn't until the 1980s that the microprocessor wars really started to accelerate; some of the first 32-bit designs appeared in Apple's Lisa and Macintosh, and even in the Commodore Amiga. The 1990s, as you know, were the decade when computing power really started to take shape. This was the decade of networking, when we began to hook all of our computers together, breathing life into the World Wide Web on which you are reading this blog post. We are now at a point where computers are everywhere: in our phones, televisions, stereos, thermostats, wrist watches, refrigerators, and even our dishwashers. In just a few years, basic microchips will be so cheap they could be built into virtually every product we buy, creating an invisible intelligent network hidden in our walls, our furniture, and even our clothing. Some of you may already have a microchip in your dog or cat, acting as a digital collar in case they become lost.

    You may have heard the term RFID (Radio Frequency Identification): a tag incorporated into a product, animal, or person for the purpose of identification and tracking using radio waves. RFID tags are starting to pop up virtually everywhere, and you may not even have noticed them. Grocery stores, for example, have already begun deploying RFID technology in a large array of pilot stores. When you reach the checkout, there will be no more taking out each item and placing it on the conveyor belt; the RFID tags on the items instantly transmit the necessary data about your cart full of groceries, doing away with the checkout line entirely. This technology is of course still in its proof-of-concept phase, but I wouldn't be surprised to see it perfected within the next few years. In the past decade or so, there has been an explosion of RFID technologies: a report from market intelligence firm ABI Research predicts that the overall RFID market will exceed $8.25 billion in 2014, or approximately $7.46 billion with automobile immobilization excluded. That would represent a 14% compound annual growth rate (CAGR) over the next five years.
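
Those growth figures are easy to sanity-check. Below is a minimal sketch, assuming the 14% CAGR applies to the overall market over the five years ending in 2014; the roughly $4.28 billion 2009 baseline is back-calculated from the numbers above, not a figure taken from the report.

```python
# Sanity-checking the ABI Research projection: a 14% CAGR over five years
# ending at $8.25B implies a baseline of roughly $4.28B. The baseline is
# back-calculated from the quoted figures, not taken from the report.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values over `years` years."""
    return (end_value / start_value) ** (1 / years) - 1

projected_2014 = 8.25e9
implied_2009 = projected_2014 / 1.14 ** 5
print(f"implied 2009 market: ${implied_2009 / 1e9:.2f}B")          # -> $4.28B
print(f"CAGR check: {cagr(implied_2009, projected_2014, 5):.1%}")  # -> 14.0%
```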

    We are moving from an age of scientific discovery to an age of scientific mastery, and the generation now alive is the most important in all of history. People tend to forget that we have a front-row seat as all of these changes unfold and reshape the future. Hold on to your hats, because the next decade is going to be a thrill ride! The advancement of computing intelligence is just the beginning; it's what we choose to do with it that will mold the future.

    New fossils suggest human ancestors evolved in Europe, not Africa

    Experts argue the jaws of an ancient European ape reveal a key human ancestor.

    • The jaw bones of an 8-million-year-old ape were discovered at Nikiti, Greece, in the '90s.
    • Researchers speculate it could be a previously unknown species and one of humanity's earliest evolutionary ancestors.
    • These fossils may change how we view the evolution of our species.

    Homo sapiens have been on earth for 200,000 years — give or take a few ten-thousand-year stretches. Much of that time is shrouded in the fog of prehistory. What we do know has been pieced together by deciphering the fossil record through the principles of evolutionary theory. Yet new discoveries contain the potential to refashion that knowledge and lead scientists to new, previously unconsidered conclusions.

    A set of 8-million-year-old teeth may have done just that. Researchers recently inspected the upper and lower jaw of an ancient European ape. Their conclusions suggest that humanity's forebears may have arisen in Europe before migrating to Africa, potentially upending a scientific consensus that has stood since Darwin's day.

    Rethinking humanity's origin story

    The frontispiece of Thomas Huxley's Evidence as to Man's Place in Nature (1863) sketched by natural history artist Benjamin Waterhouse Hawkins. (Photo: Wikimedia Commons)

    As reported in New Scientist, the 8- to 9-million-year-old hominin jaw bones were found at Nikiti, northern Greece, in the '90s. Scientists originally pegged the chompers as belonging to a member of Ouranopithecus, a genus of extinct Eurasian ape.

    David Begun, an anthropologist at the University of Toronto, and his team recently reexamined the jaw bones. They argue that the original identification was incorrect: based on the fossil's hominin-like canines and premolar roots, they conclude that the ape belongs to a previously unknown proto-hominin.

    The researchers hypothesize that these proto-hominins were the evolutionary ancestors of another European great ape, Graecopithecus, which the same team tentatively identified as an early hominin in 2017. Graecopithecus lived in south-east Europe 7.2 million years ago. If the premise is correct, these hominins would have migrated to Africa 7 million years ago, after undergoing much of their evolutionary development in Europe.

    Begun points out that south-east Europe was once occupied by the ancestors of animals like the giraffe and rhino, too. "It's widely agreed that this was the founding fauna of most of what we see in Africa today," he told New Scientist. "If the antelopes and giraffes could get into Africa 7 million years ago, why not the apes?"

    He recently outlined this idea at a conference of the American Association of Physical Anthropologists.

    It's worth noting that Begun has made similar hypotheses before. Writing in the Journal of Human Evolution in 2002, Begun and Elmar Heizmann of the Natural History Museum of Stuttgart discussed a great ape fossil found in Germany that they argued could be the ancestor (broadly speaking) of all living great apes and humans.

    "Found in Germany 20 years ago, this specimen is about 16.5 million years old, some 1.5 million years older than similar species from East Africa," Begun said in a statement then. "It suggests that the great ape and human lineage first appeared in Eurasia and not Africa."

    Migrating out of Africa

    In The Descent of Man, Charles Darwin proposed that hominins descended out of Africa. Considering the relatively few fossils available at the time, it is a testament to Darwin's astuteness that his hypothesis remains the leading theory.

    Since Darwin's time, we have unearthed many more fossils and discovered new evidence in genetics. As such, our African-origin story has undergone many updates and revisions since 1871. Today, it has splintered into two theories: the "out of Africa" theory and the "multi-regional" theory.

    The out of Africa theory suggests that the cradle of all humanity was Africa. Homo sapiens evolved exclusively and recently on that continent. At some point in prehistory, our ancestors migrated from Africa to Eurasia and replaced other subspecies of the genus Homo, such as Neanderthals. This is the dominant theory among scientists, and current evidence seems to support it best — though, say that in some circles and be prepared for a late-night debate that goes well past last call.

    The multi-regional theory suggests that humans evolved in parallel across various regions. According to this model, the hominin Homo erectus left Africa to settle across Eurasia and (maybe) Australia. These disparate populations eventually evolved into modern humans thanks to a helping dollop of gene flow.

    Of course, these are the broad strokes of very nuanced models, and we're leaving a lot of the discussion out. There is, for example, a debate as to whether African Homo erectus fossils should be considered alongside Asian ones or should be labeled as a different subspecies, Homo ergaster.

    Proponents of the out-of-Africa model aren't sure whether non-African humans descended from a single migration out of Africa or from at least two major waves of migration, followed by a lot of interbreeding.

    Did we head east or south of Eden?

    Not all anthropologists agree with Begun and his team's conclusions. As noted by New Scientist, it is possible that the Nikiti ape is not related to hominins at all. It may have evolved similar features independently, developing teeth to eat similar foods or to chew in a similar manner to early hominins.

    Ultimately, the Nikiti ape alone doesn't offer enough evidence to upend the out-of-Africa model, which is supported by a more robust fossil record and DNA evidence. But additional evidence may be uncovered that lends further credence to Begun's hypothesis, or that leads us to as-yet-unconsidered ideas about humanity's evolution.