Whether it’s something as trivial as pineapple on pizza or as complex as religion, human beings have always been passionately divided by polarizing viewpoints.

David Eagleman, a neuroscientist at Stanford University and the internationally bestselling author of *Incognito: The Secret Lives of the Brain*, offers groundbreaking insights into why we struggle to see eye-to-eye and, more importantly, how we can bridge these divides.

Us vs them

“We are born on a particular spot on the planet, and we have a thin little trajectory of experience,” Eagleman says about our internal model of reality. “We construct what we believe the world is made up of from there.” The result? We are each convinced that our unique view of reality is accurate.

The brain’s tendency to categorize people into ingroups and outgroups further complicates matters. Eagleman’s research reveals a startling truth: our brains process members of perceived outgroups differently. In one experiment, participants showed reduced empathy-related brain activity when witnessing pain inflicted on someone they considered part of an outgroup.

This is a slippery slope, as extreme outgroup categorization can lead to dehumanization. “When you look at any conflict in the world, the two sides don’t think of each other like a human. They think of each other like an object,” Eagleman explains. This cognitive shortcut makes it easier for people to justify harmful actions against those they’ve mentally classified as “other.”

Understanding others

So, how do we break free from these neurological constraints and foster a more understanding society? Eagleman offers several strategies:

  1. Blind our biases: Recognizing that we all have biases is the first step. From there, we can implement systems to counteract them. Eagleman cites examples like orchestras conducting blind auditions or tech companies evaluating code samples without demographic information.
  2. Resist dehumanization tactics: By understanding common propaganda techniques, such as associating outgroups with repulsive imagery (moral pollution), we can become more resistant to manipulation and better able to evaluate arguments on their merits.
  3. Entangle group memberships: Eagleman is particularly excited about this approach, which involves creating cross-cutting allegiances. He recounts the story of the Iroquois Confederacy, whose leader assigned members of warring tribes to shared clans, effectively complicating their loyalties and reducing conflict. “If we can strengthen the bonds between all different groups of people, that’s one solution that makes it more difficult to go and attack your neighbor,” Eagleman explains.

These strategies aim to expand our internal models of reality, allowing us to see the world – and each other – more clearly. Eagleman envisions applying these principles to social media algorithms, prioritizing content that highlights shared interests before revealing divisive differences.

Changing for the better

The Albuquerque-born neuroscientist argues that the stakes for implementing such changes are higher now than ever. “It’s very easy to kill an enormous number of people with the push of a button,” Eagleman warns. “Given the current context of our technology, this is why we need to figure out this problem and reach a new level of maturation for our species.”

Eagleman’s work in social neuroscience offers hope for a more harmonious future. By recognizing the limitations of our perceptions and actively working to expand them, we can create a society better equipped to handle complex challenges.

This research reminds us that while we may be constrained by our brains’ tendencies, we’re not imprisoned by them. Through science, philosophy and a willingness to challenge our assumptions, we can expand our views and recognize that our way of seeing the world isn’t the only truth. It’s one of many.

We spoke to David Eagleman for The Science of Perception Box, a Big Think interview series created in partnership with Unlikely Collaborators. As a creative non-profit organization, they’re on a mission to help people challenge their perceptions and expand their thinking. This series dives into the science behind our thought patterns. Watch Eagleman’s full interview above, and visit Perception Box to see more in this series.

DAVID EAGLEMAN: Why do we accept our reality as the uncontested truth?

You are a data collection machine that moves through the world, vacuuming up your little bits of experience, and in the end, whatever you have, that's what you assume to be true. But our experiences are limited. We're born in a particular spot on the planet, and we have a thin, little trajectory of experience. We construct what we believe the world is made up of from there. As a result, we all have a very limited view of what's going on out there.

The interesting thing about being a human is that we're stuck inside our internal model—it's all we ever see. But with the endeavors of science, literature, and philosophy, we're able to step outside of ourselves and understand, "Hey, the way I see the world isn't the only way to see the world. It's not the only truth." The more we get good at that, the more we can try to build a better society.

My name's David Eagleman. I'm a neuroscientist at Stanford, and I run the podcast *Inner Cosmos*. The interesting thing about the human brain is that we drop into the world half-baked with a certain set of genetics, and then experience wires up our brains. That means our brains are extremely flexible. So, whatever moment in time you're born in, whatever culture you're born into, whatever deities your culture believes in, whoever your parents are, and your neighborhood, all that crafts who you become.

As a result of different genetics, your brain wires up in slightly different ways. My interest in searching out the genetics here is to define a new field called 'perceptual genomics,' which is understanding how slight tweaks in your genome lead us to see the world in different ways. In other words, how do the genes you come with change your perception of reality?

For example, how clearly do you visualize something inside? If I ask you to picture an ant crawling on a red and white tablecloth towards a jar of purple jelly, you might see it as a movie in your head, or you might not see any picture but just the concept of it. People have completely different internal lives. Your genetics and life experiences might be different from mine, which makes our models different from each other—and that's true for all 8 billion of us.

Our brains are predisposed to form ingroups and outgroups. We trust and care about our ingroup, but not so much about the outgroup. Presumably, this has an evolutionary basis because we grew up in small tribes. You knew your own people, but the group over the hill? You didn't know whether they were enemies. We constantly form groups based on our country, religion, or even favorite sports teams. We care more about those who agree with us and are suspicious of those in outgroups.

One of the amazing parts about human brains is our sense of empathy. But it turns out that when you're dealing with somebody in your outgroup, you have less empathy. In my lab, we did an experiment where people saw six hands on a screen, and the computer would randomly pick one. The hand either got touched with a Q-tip or stabbed with a syringe. When it gets stabbed, pain-related networks in your brain come online.

We labeled each hand with one word: Christian, Jewish, Muslim, Hindu, Scientologist, Atheist. And the question was: Does your brain care as much if it's an outgroup member? The answer: It doesn’t. If your ingroup member gets stabbed, you have a big response in the pain matrix. If an outgroup member gets stabbed, you have a smaller response. This happened across all groups we measured. This isn't an indictment of religion; it’s just about ingroups and outgroups.

When you look at conflicts in the world, neither side sees the other as human. They think of the other as an object, and the parts of the brain that recognize personhood don’t come online. The internal model you form growing up determines who's in your ingroup and who's in your outgroup.

As people travel and expand their internal models, their ingroups grow. But for those who haven't traveled everywhere, it's easy to feel like certain groups or cultures are foreign.

The first step to expanding our narrow models is recognizing our biases. We can't help but have biases, but the question is: What can we do about it? Many orchestras now hold auditions behind a screen to avoid biases related to gender or race, allowing them to judge purely on the music. We can blind our biases in other ways, too.

The second strategy is learning about dehumanization tactics so we can resist them. For example, there’s 'moral pollution,' where outgroup members are associated with something repulsive. This makes people less eager to hear their perspectives. If we understand these tactics, we can be immune to them.

The third strategy is entangling group membership or complexifying our allegiances. If we find things in common with someone—like shared interests or hobbies—we form a bond. Later, if we discover a disagreement, we approach it with curiosity because we’ve already established a connection. This improves communication and helps us bridge the gaps between different perspectives.

It’s crucial to recognize that not everyone experiences reality the same way on the inside. To build a better future, we need deeper bonds that help us understand each other as fellow humans.

