It is easier to fool a person than it is to convince a person that they’ve been fooled. This is one of the great curses of humanity.

Given the incredible amount of information we process each day, it is difficult for any of us to critically analyze all of it. This is made even more difficult by the natural tendency to be overly critical of any information that threatens our worldview and under-critical of information that supports it.

The menace of misinformation can plague a society with grave consequences. For instance, the failure of people to understand that HIV causes AIDS killed an estimated 300,000 people in South Africa at the turn of the millennium. The state of Minnesota is battling a measles outbreak caused by anti-vaccination propaganda. And discussion over the effects of misinformation on recent elections in Austria, Germany, and the United States is still ongoing.

If only we had a way to prevent our seduction by misinformation. A vaccine of some kind perhaps…

A recent set of experiments shows us that there is a way to help reduce the effects of misinformation on people: the authors amusingly call it the “inoculation.”


In two experiments, groups of test subjects were exposed to misinformation after first receiving an "inoculation." This inoculation took the form of either a warning that misinformation was coming or an explanation of why the misinformation they were about to read was fallacious, with an additional group receiving both. The control group received only the misinformation.

The tests showed that this "pre-bunking" was remarkably effective. While the control group saw a significant decrease in acceptance of the scientific consensus on climate change, the inoculated groups saw minor drops at worst, and even those were heavily influenced by pre-existing worldviews.

The most effective method was an explanation of how the misinformation would be presented and how it would attempt to mislead. It not only slowed the rate at which false information took hold, but also worked across all worldviews and even reduced polarization among the test subjects.

So, we can help prevent rampant misinformation now? Where do I sign up?

Research into how this works is still ongoing, though it is well established that suspicious people are less likely to be taken in by fraud. The researchers also pointed out that several studies support the notion that teaching about misconceptions leads to greater learning overall than simply telling somebody the truth. While the topic used in the study was the climate change consensus, the researchers saw no reason to suppose these methods would function differently with other subjects.

We all know somebody who has been taken in by a bad argument or by information they wanted to believe was true. Sometimes we are the ones being fooled. A method to help prevent being taken in by bad arguments and false narratives could be a powerful tool for educating ourselves and others.

The full study can be read here.

--