Did we evolve to see reality as it exists? No, says cognitive psychologist Donald Hoffman.
Cognitive psychologist Donald Hoffman hypothesizes we evolved to experience a collective delusion — not objective reality.
- Donald Hoffman theorizes that perceiving objective reality as it is would be disadvantageous to evolutionary fitness.
- His hypothesis calls for ditching the objectivity of matter and space-time and replacing them with a mathematical theory of consciousness.
- If correct, it could help us make progress on such intractable questions as the mind-body problem and the conflict between general relativity and quantum mechanics.
What is reality and how do we know? For many the answer is simple: What you see — hear, feel, touch, and taste — is what you get.
Your skin feels warm on a summer day because the sun exists. That apple you just ate tasted sweet and left juice on your fingers, so it must have existed. Our senses tell us that reality is there, and we use reason to fill in the blanks — that is, we know the sun doesn't cease to exist at night even if we can't see it.
But cognitive psychologist Donald Hoffman says we're misunderstanding our relationship with objective reality. In fact, he argues that evolution has cloaked us in a perceptual virtual reality. For our own good.
Experiencing a virtual interface
Donald Hoffman says that what we perceive as reality is an interface of symbols hiding vastly more complex interactions. He likens this to how desktop icons represent software. Image source: Pixabay
The idea that we can't perceive objective reality in totality isn't new. We know everyone comes installed with cognitive biases and ego defense mechanisms. Our senses can be tricked by mirages and magicians. And for every person who sees a duck, another sees a rabbit.
But Hoffman's hypothesis, which he wrote about in a recent issue of New Scientist, takes it a step further. He argues our perceptions don't contain the slightest approximation of reality; rather, they evolved to feed us a collective delusion to improve our fitness.
Using evolutionary game theory, Hoffman and his collaborators created computer simulations to observe how "truth strategies" (which see objective reality as it is) compared with "pay-off strategies" (which focus on survival value). The simulations placed organisms in an environment with a resource necessary for survival, but only in Goldilocks proportions.
Consider water. Too much water, the organism drowns. Too little, it dies of thirst. Between these extremes, the organism slakes its thirst and lives on to breed another day.
Truth-strategy organisms that see the water level on a color scale — from red for low to green for high — perceive the reality of the water level. However, they don't know whether that level is high enough to kill them. Pay-off-strategy organisms, conversely, simply see red when water levels would kill them and green for levels that won't. They are better equipped to survive.
"[E]volution ruthlessly selects against truth strategies and for pay-off strategies," writes Hoffman. "An organism that sees objective reality is always less fit than an organism of equal complexity that sees fitness pay-offs. Seeing objective reality will make you extinct."
Since humans aren't extinct, the simulation suggests we see an approximation of reality that shows us what we need to see, not how things really are.
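The thrust of these simulations can be sketched in a few lines of Python. To be clear, everything below — the range of water levels, the survivable band, and the rule that each organism picks the better-looking of two random territories — is an illustrative assumption, not Hoffman's published model:

```python
import random

LEVELS = 100        # possible water levels, 0..99
LOW, HIGH = 20, 80  # the Goldilocks band: survivable water levels

def payoff(level):
    """Fitness payoff: 1 inside the survivable band, 0 (death) outside it."""
    return 1 if LOW <= level <= HIGH else 0

def truth_strategy(level):
    """Sees the true level on a fine-grained scale, but has no idea
    which levels are lethal; simply prefers more water."""
    return level

def payoff_strategy(level):
    """Sees only the fitness payoff: green for survivable, red for lethal."""
    return payoff(level)

def survival_rate(strategy, trials=10_000, seed=0):
    """Each organism picks the better-looking of two random territories;
    return the fraction of organisms that survive their choice."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        a, b = rng.randrange(LEVELS), rng.randrange(LEVELS)
        choice = a if strategy(a) >= strategy(b) else b
        survived += payoff(choice)
    return survived / trials

print("truth strategy :", survival_rate(truth_strategy))   # roughly 0.62
print("payoff strategy:", survival_rate(payoff_strategy))  # roughly 0.85
```

Even in this toy version, the pattern Hoffman describes emerges: the truth-seeing organisms faithfully track the water level and still die more often, because accuracy about the world is not the same thing as accuracy about fitness.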
Hoffman likens this approximation to a desktop interface. When a novelist boots up their computer, they see an icon on their desktop that represents their novel. It's green, rectangular, and sits on the screen, but the document has none of those qualities intrinsically. It's a complex string of 1s and 0s that manifests as software running as an electric current through a circuit board.
If writers had to manipulate binary to write a novel, or hunter-gatherers had to perceive physics to throw a spear, chances are both would have gone extinct a long time ago.
"In like manner, we create an apple when we look, and destroy it when we look away. Something exists when we don't look, but it isn't an apple, and is probably nothing like an apple," Hoffman writes. "The human perception of an apple is a data structure that indicates something edible (a fitness pay-off) and how to eat it. We create these data structures with a glance, and erase them with a blink. Physical objects, and indeed the space and time they exist in, are evolution's way of presenting fitness pay-offs in a compact and usable form."
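Hoffman's metaphor of perception as a data structure can be made literal, if playfully. The class and fields below are invented purely for illustration — nothing like them appears in his work — but they show what he means by an interface that exposes only fitness-relevant information:

```python
from dataclasses import dataclass

@dataclass
class PerceivedApple:
    """The 'icon' perception hands us: fitness-relevant fields only.
    Nothing here describes whatever exists independently of the observer."""
    edible: bool      # the fitness payoff
    how_to_eat: str   # the affordance the interface presents

def glance():
    """Perception constructs the data structure on demand."""
    return PerceivedApple(edible=True, how_to_eat="bite")

apple = glance()
print(apple)   # PerceivedApple(edible=True, how_to_eat='bite')
del apple      # "erased with a blink"
```

The point of the metaphor is that the data structure is useful precisely because of what it leaves out, just as a desktop icon is useful because it hides the binary underneath.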
Consciousness all the way down
At this point, you are likely wondering, "Well, then what is reality? If my dog is only a data structure indicating a furry creature that enjoys fetch and hates baths, then what lies beneath that representation?"
For Hoffman the answer is consciousness.
When neuroscientists and philosophers develop theories of consciousness, they traditionally look at the brain. If Hoffman is correct, they can't completely understand consciousness via brain activity, because they are looking at an icon of a material organ that exists in space and time. Not reality.
Hoffman wants to start with a mathematical theory of consciousness as a baseline — looking at consciousness outside of matter and the space-time it may not inhabit. His theory further calls for a potentially infinite interaction of conscious agents, from the simple to the complex. In this formulation, consciousness may even exist beyond the organic world, all the way down to electrons and protons.
"I'm denying that there is such a thing in objective reality as an electron with a position. I'm saying that the very framework of space and time and matter and spin is the wrong framework, it's the wrong language to describe reality," Hoffman told journalist Robert Wright in an interview. "I'm saying let's go all the way: It's consciousness, and only consciousness, all the way down."
Hoffman calls this view "conscious realism." He argues that, if proven correct, it could make headway on such intractable quandaries as the mind-body problem, the odd nature of the quantum world, and the much sought-after "theory of everything."
"Reality may never seem the same again," Hoffman writes.
Simulation tested, science approved?
Hoffman's hypothesis is fascinating, and if you need a subject for a bar-side bull session, you could do worse. But before anybody suffers an existential meltdown, it's worth noting that the hypothesis is just that. A hypothesis. It has a way to go before overturning the hypothesis that the brain manifests consciousness, and its detractors have thrown down a few gauntlets.
One such critique argues that while we may not perceive reality exactly as it is, that doesn't mean our perception is not reasonably accurate. Hoffman would argue we see an icon that represents a snake, not a snake. But then why do nonvenomous snakes evolve colorings to match venomous ones? If there is no objective reality to mimic, why would mimicry prove a useful adaptation, and why would the interfaces of multiple species be fooled by such tricks?
Another concern is a chicken-and-egg problem, as Wright pointed out in their discussion. Current orthodoxy argues the universe existed for billions of years before life emerged. This means the first living organisms began their evolutionary tracks by responding to a preexistent inorganic, unconscious environment.
If Hoffman's argument is correct and consciousness is primary, then why develop life and the illusion of reality? Why are some of these unreal symbols ultimately so harmful to consciousness? The network of consciousnesses, one assumes, got along without life for billions of years.
This is why Michael Shermer likens Hoffman's argument to the "God of the gaps." He writes:
"No one denies that consciousness is a hard problem. But before we reify consciousness to the level of an independent agency capable of creating its own reality, let's give the hypotheses we do have for how brains create mind more time. Because we know for a fact that measurable consciousness dies when the brain dies, until proved otherwise, the default hypothesis must be that brains cause consciousness. I am, therefore I think."
Then there's the issue of whether Hoffman's hypothesis is self-defeating. If our perceptions of reality are merely species-specific interfaces overlaid upon reality, how do we know consciousness is not simply another such icon? Maybe the "I" of everyday experience is a useful fantasy adapted to benefit the survival and reproduction of the gene and not part of the operating system of reality.
None of this is to say that Hoffman and others can't meet these challenges with further research. We'll see. It's just to say that there's a lot of room for exploration into some fascinating ideas. As Hoffman would agree:
"[This theory] has made life far more interesting," he told Wright. "There's lots to explore, a lot I don't know, and things that I thought I knew I had to give up. And so, it makes life far more interesting for me."
Evolution doesn't clean up after itself very well.
- An evolutionary biologist got people swapping ideas about our lingering vestigia.
- Basically, this is the stuff that served some evolutionary purpose at some point, but now is kind of, well, extra.
- Here are the six traits that inaugurated the fun.
The plica semilunaris
The human eye in alarming detail. Image source: Henry Gray / Wikimedia commons<p>At the inner corner of our eyes, closest to the nasal ridge, sits a little pink structure called the caruncula. Next to it is the plica semilunaris, and it's what's left of a third eyelid that used to — ready for this? — blink horizontally. It's thought to have offered protection for our eyes, and some birds, reptiles, and fish still have such a membrane.</p>
Palmaris longus
Palmaris longus muscle. Image source: Wikimedia commons<p>Most of us don't have much need these days to swing from tree branch to tree branch, yet about 86 percent of us still have the wrist muscle that used to help us do it. To see if you have it, place the back of your hand on a flat surface and touch your thumb to your pinkie. If a muscle becomes visible in your wrist, that's the palmaris longus. If you don't have one, consider yourself more evolved (just joking).</p>
Darwin's tubercle
Darwin's tubercle. Image source: Wikimedia commons<p>Yes, maybe the shell of your ear does feel like a dried apricot. Maybe not. But there's a ridge in that swirly structure, a leftover from a time when we could swivel our ears in the direction of interesting sounds. These days, we just turn our heads, but there it is.</p>
Goosebumps
Goosebumps. Photo credit: Tyler Olson via Shutterstock<p>It's not entirely clear what purpose made goosebumps worth retaining evolutionarily, but there are two circumstances in which they appear: fear and cold. For fear, they may have been a way of making body hair stand up so we'd appear larger to predators, much the way a cat's tail puffs up — numerous creatures exaggerate their size when threatened. In the cold, they may have trapped additional heat for warmth.</p>
Tailbone
Image source: Decade3d-anatomy online via Shutterstock<p>Way back, we had tails that probably helped us balance upright and were useful for moving through trees. We still have the stump of one as embryos, from weeks four to six, and then the body mostly dissolves it during weeks six to eight. What's left is the coccyx.</p>
The palmar grasp reflex
Palmar reflex activated! Photo credit: Raul Luna on Flickr<p>You've probably seen how non-human primate babies grab onto their parents' hands to be carried around. We used to do this, too. Even now, if you touch your finger to a baby's palm, or touch the sole of their foot, the palmar grasp reflex will cause the hand or foot to try to close around your finger.</p>
Other people's suggestions<p>Amir's followers dove right in, offering both cool and questionable additions to her list. </p>
Fangs?<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Lower mouth plate behind your teeth. Some have protruding bone under the skin which is a throw back to large fangs. Almost like an upsidedown Sabre Tooth.</p>— neil crud (@neilcrud66) <a href="https://twitter.com/neilcrud66/status/1085606005000601600?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Hiccups<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Sure: <a href="https://t.co/DjMZB1XidG">https://t.co/DjMZB1XidG</a></p>— Stephen Roughley (@SteBobRoughley) <a href="https://twitter.com/SteBobRoughley/status/1085529239556968448?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Hypnic jerk as you fall asleep<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">What about when you “jump” just as you’re drifting off to sleep, I heard that was a reflex to prevent falling from heights.</p>— Bann face (@thebanns) <a href="https://twitter.com/thebanns/status/1085554171879788545?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script> <p> This thing, often called the "alpha jerk" as you drop into alpha sleep, is properly called the hypnic jerk. It may actually be a carryover from our arboreal days. The <a href="https://www.livescience.com/39225-why-people-twitch-falling-asleep.html" target="_blank">hypothesis</a> is that you suddenly jerk awake to avoid falling out of your tree.</p>
Nails screeching on a blackboard response?<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Everyone hate the sound of fingernails on a blackboard. It's _speculated_ that this is a vestigial wiring in our head, because the sound is similar to the shrill warning call of a chimp. <a href="https://t.co/ReyZBy6XNN">https://t.co/ReyZBy6XNN</a></p>— Pet Rock (@eclogiter) <a href="https://twitter.com/eclogiter/status/1085587006258888706?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Ear hair<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Ok what is Hair in the ears for? I think cuz as we get older it filters out the BS.</p>— Sarah21 (@mimix3) <a href="https://twitter.com/mimix3/status/1085684393593561088?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Nervous laughter<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">You may be onto something. Tooth-bearing with the jaw clenched is generally recognized as a signal of submission or non-threatening in primates. Involuntary smiling or laughing in tense situations might have signaled that you weren’t a threat.</p>— Jager Tusk (@JagerTusk) <a href="https://twitter.com/JagerTusk/status/1085316201104912384?ref_src=twsrc%5Etfw">January 15, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
Um, yipes.<blockquote class="twitter-tweet" data-conversation="none" data-lang="en"><p lang="en" dir="ltr">Sometimes it feels like my big toe should be on the side of my foot, was that ever a thing?</p>— B033? K@($ (@whimbrel17) <a href="https://twitter.com/whimbrel17/status/1085559016011563009?ref_src=twsrc%5Etfw">January 16, 2019</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>