Humans Are the World's Best Pattern-Recognition Machines, But for How Long?

Not only are machines rapidly catching up to – and exceeding – humans in terms of raw computing power, they are also starting to do things we used to consider inherently human. They can feel emotions like regret. They can daydream. So what is it, exactly, that humans still do better than machines?


Quite simply, humans are amazing pattern-recognition machines. They can recognize many different types of patterns – and then transform these "recursive probabilistic fractals" into concrete, actionable steps. If you've ever watched a toddler learn words and concepts, you can almost see the neurons firing as the child starts to recognize the patterns that differentiate one object from another. Intelligence, by this view, is largely a matter of storing more patterns than anyone else. Once IBM built a machine that could recognize as many chessboard patterns as a chess grandmaster, that machine became "smarter" than a human at chess.

Artificial intelligence pioneer Ray Kurzweil was among the first to recognize how the link between pattern recognition and human intelligence could be used to build the next generation of artificially intelligent machines. In his latest book, How to Create a Mind: The Secret of Human Thought Revealed, Kurzweil describes how he is teaching artificially intelligent machines to think, based on the stepwise refinement of patterns. According to Kurzweil, all learning results from massive, hierarchical and recursive processes taking place in the brain. Take the act of reading – you first recognize the patterns of individual letters, then the patterns of individual words, then groups of words together, then paragraphs, then entire chapters and books. Once a computer can recognize all of these patterns, it can read and "learn."
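To make the hierarchy concrete, here is a minimal, hypothetical sketch in Python – not Kurzweil's actual implementation, which uses statistical models such as hierarchical hidden Markov models. Each level recognizes patterns over the output of the level below, so characters become words and words become a phrase; the sets and function names are invented for illustration.

```python
# Toy hierarchical pattern recognizer: each level matches patterns
# over the output of the level below. Illustrative only; Kurzweil's
# real recognizers are learned and statistical, not hand-written
# lookup tables like these.

WORDS = {"the", "cat", "sat"}        # level 1: stored word patterns
PHRASES = {("the", "cat", "sat")}    # level 2: stored phrase patterns

def recognize_words(characters: str) -> list[str]:
    """Level 1: group raw characters into stored word patterns."""
    return [token for token in characters.split() if token in WORDS]

def recognize_phrase(words: list[str]) -> bool:
    """Level 2: check the word sequence against stored phrase patterns."""
    return tuple(words) in PHRASES

stream = "the cat sat"
words = recognize_words(stream)   # characters -> words
print(recognize_phrase(words))    # words -> phrase: prints True
```

A real system would replace each hand-coded set with a learned, probabilistic recognizer, but the flow is the same: each level passes the patterns it recognizes up to the next.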

The same is true for other fields of endeavor where human "expertise" has always trumped machine "expertise." In a brilliant piece for Medium, Kevin Ashton recently analyzed "how experts think." It turns out patterns matter, and they matter a lot. A star football quarterback needs to recognize all kinds of patterns – from the type of defense he's facing, to the routes his receivers are running, to the typical reactions of defenders. All of this, of course, has to happen in a fraction of a second, as a 300-pound lineman bears down on him, intent on ripping him limb from limb.

The more you think about it, the more you see patterns all around you. Getting to work on time in the morning is a matter of recognizing patterns in your daily commute and responding to changes in schedule and traffic – and here come Google's driverless cars, which can recognize those traffic changes faster than humans can. Diagnosing an illness is a matter of recognizing patterns in symptoms and test results – and now that IBM's Watson is getting into medical diagnosis, machines will do it better. The same goes for just about any field of expert endeavor: it is largely a matter of recognizing the right patterns faster than anyone else, and machines have so much processing power these days that it is easy to see them becoming the future doctors and lawyers of the world.

The future of intelligence lies in making our patterns better and our heuristics stronger. In his Medium article, Kevin Ashton calls this "selective attention" – focusing on what really matters, so that poor options are discarded before they ever reach the conscious brain. While some critics – Gary Marcus in The New Yorker, Colin McGinn in The New York Review of Books – may be skeptical of Kurzweil's Pattern Recognition Theory of Mind, even they grudgingly admit that Kurzweil is a genius. And, if all goes according to plan, Kurzweil really will be able to create a mind that does more than just recognize a lot of words.

One thing is clear – the ability to recognize patterns is what gave humans their evolutionary edge over other animals. How we refine, shape and improve our pattern recognition will determine how much longer we keep that edge over machines.
