Why changing your mind is a feature of evolution, not a bug

If argumentation led to nothing, it would soon be thrown into the evolutionary dustbin.
Key Takeaways
  • Reasoning by yourself is a much weaker tool than contributing your reasoning to a group. 
  • This helps explain why so many of the best things humans have produced result from collaboration, from science to the arts.
  • If people could not change their minds, evolution would have gotten rid of argumentation a long time ago.

Excerpted from HOW MINDS CHANGE: The Surprising Science of Belief, Opinion, and Persuasion by David McRaney, published by Portfolio, an imprint of the Penguin Publishing Group, a division of Penguin Random House, LLC. Copyright © 2022 by David McRaney.

Research shows people are incredibly good at picking apart other people’s reasons. We are just terrible at picking apart our own in the same way. 

In a 2014 study that Mercier helped design, a team of Swiss cognitive scientists led by Emmanuel Trouche tricked people into evaluating their own justifications more thoughtfully by making it seem as if they came from the mind of someone else.

To do this, subjects read a series of questions, reached a series of conclusions, and then wrote arguments defending those conclusions. For instance, subjects read about a grocery store that sold many kinds of fruits and vegetables. Some of its apples were not organic. The scientists then asked: What can you say for sure about whether this store carries organic fruits? The correct conclusion is that you can only say for sure that the store carries some fruits that are not organic. In the study, though, many people incorrectly inferred that none of the fruit was organic, or that there was really nothing conclusive you could say one way or the other.

After subjects reached their conclusions, the scientists asked them to write out their justifications. If at any point they found their own reasoning lacking, they could reach a different conclusion, but the vast majority of people didn’t do that. Right or wrong, most people stuck with their original conclusions and came up with reasons they felt justified them. In the next stage of the experiment, subjects got a chance to see all of the questions a second time along with the reasoning of subjects who disagreed. If it seemed like the others had stronger arguments, they could change their answers. What the experimenters didn’t reveal was that they had hidden some switcheroos in those answers.

For one of the questions, the supposed justifications from another person were actually the subject’s own. Just as Mercier and Sperber had predicted, when subjects thought the justifications weren’t theirs, 69 percent of people rejected their own bad arguments and then switched to the correct answer. When their poor arguments were presented back to them as those of other people, the flaws suddenly became obvious.

“People have been thinking about reasoning in the wrong way,” Mercier told me. “They’ve been thinking about it as a tool for individual cognition. And if that was the function of reasoning, it would be terrible. It would be the least adapted mechanism that ever showed its face. It would be doing the exact opposite of what you’d like it to do.” When reasoning alone, it only looks for reasons for why you’re right, “and it doesn’t really care whether the reasons are good or not. It’s very superficial. It’s very shallow.”

With no one to tell you that there are other points of view to consider, no one to poke holes in your theories, reveal the weakness in your reasoning, produce counterarguments, reveal potential harm, or threaten sanction for violating a norm, you will spin in an epistemic hamster wheel. In short, when you argue with yourself, you win.

Mercier and Sperber call all of this “the interactionist model,” which posits that the function of reasoning is to argue your case in a group setting. In this model, reasoning is an innate behavior that grows more complex as we mature, like crawling before walking upright. We are social animals first and individual reasoners second: one system built on top of another by evolution. Individual reasoning is a psychological mechanism that evolved under selective pressures to facilitate communication between peers in an environment where misinformation is unavoidable. In an environment like that, confirmation bias turns out to be very useful. In fact, bias itself becomes very useful.

As part of a group that can communicate, every perspective has value, even if it is wrong; so it’s best that you produce arguments that don’t run counter to your point of view. And since the effort is best saved for group evaluation, you become free to make snap judgments and quick decisions based on good-enough justifications. If others produce counterarguments, you can then refine your thinking and update your priors.

“If you think of it as something that serves individual purposes, it looks like a really flawed mechanism. If you think of it as something built for argumentation, it all makes sense,” said Mercier. “It becomes something that is extremely well-tailored to the task in a way that I find quite inspiring, and sort of beautiful in a way.”

Reasoning is biased in favor of the reasoner, and that’s important, because each person needs to contribute a strongly biased perspective to the pool. And it is lazy, because we expect to offload the cognitive effort to a group process. Everyone can be a cognitive miser and save their calories for punching bears, because when it comes time to disagree, the group will be smarter than any one person thanks to the division of cognitive labor.

This is why so many of the best things we have produced have come from collaboration, people working together to solve a problem or create a work of art. Math, logic, science, art—the people who see the correct path from moment to moment are able to guide the others and vice versa. With a shared goal, in an atmosphere of trust, arguing eventually leads to the truth. Basically, all culture is 12 Angry Men at scale.

Cognitive psychologist Tom Stafford calls this the “truth wins scenario,” and in his book For Argument’s Sake, he details dozens of studies in which group reasoning arrives at the correct answer in situations where individual reasoning fails.

In studies in which people work on puzzles from the Cognitive Reflection Test, a tool for measuring people’s tendency to favor intuitive reasoning over active processing, people almost always get the wrong answers when reasoning alone. In groups, however, they tend to settle on the correct answers in seconds.

Here are some example questions from the exam:

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

In the widget problem, the answer is five minutes. Each machine makes one widget every five minutes, so a hundred machines working together will make a hundred widgets in five minutes. In the lily pad problem, the answer is forty-seven days. The patch covers half the lake on day forty-seven, then doubles to cover the entire lake on day forty-eight.
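For readers who want to check the arithmetic, here is a minimal Python sketch that works through both puzzles; the variable names are mine, not part of the test.

```python
# Widget problem: 5 machines make 5 widgets in 5 minutes,
# so each machine makes 1 widget every 5 minutes.
minutes_per_widget = 5                              # time for one machine to make one widget
machines, widgets_needed = 100, 100
widgets_per_machine = widgets_needed / machines     # 1.0 widget each
print(widgets_per_machine * minutes_per_widget)     # 5.0 minutes

# Lily pad problem: the patch doubles daily and covers the whole lake on day 48.
# Walk backward one halving at a time to find the day it covers half the lake.
coverage, day = 1.0, 48
while coverage > 0.5:
    coverage /= 2
    day -= 1
print(day)                                          # 47
```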

Reasoning alone, 83 percent of people who have taken this test under laboratory conditions answer at least one of these sorts of questions incorrectly, and a third get all of them wrong. But in groups of three or more, no one gets any wrong. At least one member always sees the correct answer, and the resulting debate leads those who are wrong to change their minds—lazy reasoning, disagreement, evaluation, argumentation, truth.
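As a rough back-of-the-envelope sketch of that division of labor (my own illustration, not an analysis from the studies Stafford describes), suppose each person working alone solves a given item with probability p. The chance that at least one member of a three-person group sees the correct answer is 1 - (1 - p)^3, which climbs quickly even for modest values of p; the values of p below are assumed, not measured.

```python
# Hypothetical illustration: if each member, reasoning alone, solves an item
# with independent probability p, how often does a group of n contain at
# least one member who sees the correct answer?
def chance_someone_is_right(p: float, n: int = 3) -> float:
    return 1 - (1 - p) ** n

for p in (0.3, 0.5, 0.7):   # assumed individual success rates, for illustration only
    print(f"p={p}: {chance_someone_is_right(p):.2f}")
# p=0.3: 0.66   p=0.5: 0.88   p=0.7: 0.97
# Argumentation then has to carry that one correct answer to the rest of the group.
```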

“If people couldn’t change their minds there would be no point in bringing arguments forward,” said Mercier, adding that if a disease were to run rampant across humanity, causing everyone afterward to be born deaf, then spoken language would soon fade out of the human brain, because there would be no one to hear it—like deep-ocean shrimp that no longer have eyes because no light has reached them for thousands of years. If people just endlessly exchanged arguments with no side ever gaining any ground, no one admitting they were wrong or accepting the propositions of others, then argumentation would have long ago been tossed into the evolutionary dustbin.
