
Glitches in the Human Brain Also Appear in Computers

The rise of facial recognition technology has resulted in computers seeing faces in nature where there are none. Does this mean computers are prone to the same flaws as humans?

What’s the Latest Development?


Like humans, computers may occasionally see the face of Jesus in a potato chip or a bowl of oatmeal. How is this possible? The emergence of facial recognition technology has again demonstrated the computational nature of the human mind: computers, it turns out, detect faces in plenty of inanimate objects. In humans, this ability is theorized to serve an evolutionary role. As Carl Sagan once wrote: “As soon as the infant can see, it recognizes faces, and we now know that this skill is hardwired in our brains. Those infants who a million years ago were unable to recognize a face smiled back less, were less likely to win the hearts of their parents, and less likely to prosper.”

What’s the Big Idea?

Can this possibly mean that computers are subject to evolutionary pressures just as humans are? Not exactly. Seeing faces in a random collection of angles and shadows, a phenomenon known as pareidolia, is a mistake that humans make. For a computer, it is not strictly a mistake: face-detection software is programmed to piece together disparate visual features into what it judges to be a face, so a false positive is the program doing exactly what it was built to do. While humans know the shapes they see in clouds are not actually what they appear to be, computers do not. In other words, a computer’s flaws are still very machine-like. And ours, such as experiencing Jesus in some fried food, are still very human.
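To make the parallel concrete, here is a minimal sketch of how a standard face detector can “see” faces where none exist. It uses OpenCV’s stock Haar-cascade classifier; the input file clouds.jpg is a placeholder for any photo of clouds, foliage, or food. Lowering the minNeighbors threshold makes the detector more eager to report face-like regions, producing exactly the kind of machine pareidolia described above.

```python
# A minimal sketch of computer "pareidolia" using OpenCV's Haar-cascade
# face detector. The image path is a placeholder; any photo of clouds,
# foliage, or food will do.
import cv2

# Load the frontal-face cascade that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("clouds.jpg")  # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# minNeighbors controls how many overlapping detections must agree
# before a region counts as a face. Lowering it makes the detector
# more permissive, so face-like patterns in clouds or toast start
# to match, much like the human tendency the article describes.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=1)

# Draw a box around each "face" and save the result for inspection.
for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

print(f"Found {len(faces)} face-like regions")
cv2.imwrite("detections.jpg", image)
```

The point of the sketch is the design trade-off: the detector has no concept of “this is only a cloud,” so the only lever against false positives is how much corroborating evidence it demands before declaring a face.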


