
The Backfire Effect: Why Facts Don’t Win Arguments

Why are people sometimes more emboldened in their beliefs after being exposed to corrective information?

Let’s say you’re having an argument with a friend about, oh, Obamacare, or even who the best quarterback in the NFL is. You present your friend with a set of facts that you think should clinch the argument. And yet, while those facts clearly contradict your friend’s position, presenting them does nothing to correct his or her false or unsubstantiated belief. In fact, your friend is even more emboldened in that belief after being exposed to corrective information.


A group of Dartmouth researchers has studied the problem of the so-called “backfire effect,” defined as the effect in which “corrections actually increase misperceptions among the group in question.”

The problem here may be the way your friend is receiving these facts. Since your friend knows you and your opinions well, he or she does not view you as an “omniscient” source of information. When it comes to receiving corrective information about a public policy issue, the authors of the Dartmouth study note:

people typically receive corrective information within “objective” news reports pitting two sides of an argument against each other, which is significantly more ambiguous than receiving a correct answer from an omniscient source. In such cases, citizens are likely to resist or reject arguments and evidence contradicting their opinions – a view that is consistent with a wide array of research. 

So when we read a news story that presents both sides of an issue, we simply pick the side we happen to agree with and it reinforces our viewpoint. But what of those individuals who don’t simply resist challenges to their views, but who actually come to hold their original opinion even more strongly?

The authors describe the “backfire effect” as a possible result of

the process by which people counterargue preference-incongruent information and bolster their preexisting views. If people counterargue unwelcome information vigorously enough, they may end up with ‘more attitudinally congruent information in mind than before the debate,’ which in turn leads them to report opinions that are more extreme than they otherwise would have had.

This study goes a long way toward explaining the state of rational discourse in the country right now. So what can be done? How can you have a more effective discussion with your friend about Obamacare or Peyton Manning?

Think of an argument more as a partnership, says Julia Galef, President of the Center for Applied Rationality. Read about that here.

