Don't read this blog post. Definitely don't read it to the end. Didn't I tell you not to read this blog post? You're still doing it... We can laugh at our inherent contrariness, but unfortunately something similar can happen when we give a human being scientific evidence that debunks misinformation. One of the most depressing paradoxes of science communication is that not only can misinformation often spread faster and wider than the truth (just take the uber-successful but often not-so-factual "UberFacts", or the success of that paragon of science misinformation, Natural News, if you need examples); even worse, combating misinformation with evidence can often have the complete and utter opposite of the desired effect. This horrifying phenomenon, known as the backfire effect, was demonstrated once again recently by a study of parents' responses to various forms of evidence that vaccines are not dangerous. The randomized controlled trial drew on four CDC sources, all designed to use scientific evidence to demonstrate why children must be vaccinated:
Image Credit: U.S. Centers for Disease Control and Prevention
In all four cases, none of the materials increased parents' intentions to vaccinate their children. The effect of the straightforward information about measles, mumps and rubella was fairly neutral. The images of children with measles, mumps and rubella, and the mother's narrative about her hospitalized child, both had the unintended effect of strengthening beliefs in vaccine side effects. The images also somehow increased false beliefs that vaccines cause autism. The material refuting the MMR-autism link successfully reduced false beliefs that vaccines cause autism, but astoundingly it actually reduced the intent to vaccinate among parents with the most anti-vaccine beliefs.
This isn't the first time we've seen depressing findings from studies attempting to refute vaccine myths. A study described in a paper by Schwarz et al. found that a CDC flyer containing "facts and myths" about vaccines increased intentions to vaccinate immediately, but had the opposite effect after only half an hour - when the participants began remembering the myths as facts. It seems we really are glorified goldfish when it comes to keeping fact and fiction separate. When the experimenters created a version of the flyer in which the myths were rephrased as facts, the flyer successfully increased intention to vaccinate - in contrast with the original CDC flyer, which left participants worse off than when they started. Avoiding any reference to myths is far from a perfect solution, however, because it fails to directly address the myths that are in circulation.
As if things couldn't get any more depressing, Norbert Schwarz, co-author of the "facts and myths" paper, suggests that when a respected institution such as the CDC weighs in to debunk a claim, this can actually end up lending credence to the claim in people's minds. Schwarz cites as an example an internet rumor about flesh-eating bananas that became so prolific it was debunked on the CDC website. When this happened, the flesh-eating banana scare grew - and began actually being attributed to the CDC!
In another study, a similar backfire effect was found in conservative voters who believed that Iraq possessed weapons of mass destruction. After receiving a correction stating that Iraq did not have weapons of mass destruction, they became more likely than controls to believe that Iraq had such weapons. The very same thing happened when conservatives were presented with evidence that Bush's tax cuts failed to stimulate economic growth - in this case the percentage agreeing with the statement that Bush's tax cuts increased government revenue leapt from 36% to 67%, while the same evidence moved the views of non-conservatives in the opposite direction (from 31% to 28%).
Worryingly, the backfire effect has been shown to be particularly profound in older people, who, it is believed, may remember a statement but forget the contextual information that the statement is untrue. Worse still, repeating that a claim is false can actually leave an even stronger impression that the claim is true. In one study, "the more often older adults were told that a claim was false, the more likely they were to remember it erroneously after a 3 day delay. The size of this effect is far from negligible. After 3 days, older adults misremembered 28% of false statements as true when they were told once that the statement was false but 40% when told three times that the statement was false". Interestingly, in this study the effect was the precise opposite in younger people - reinforcing that the claim was false made them less likely to believe it.
While younger adults became less likely to misremember a false claim as true after being told three times that it was false, older adults became more likely to misremember the claim as true. (Skurnik et al, 2005)
It seems that unless we are extremely careful, by trying to convince the most hardened cynics with evidence we may end up doing more harm than good. The dire and seemingly growing need to combat misinformation on the MMR issue is one I've discussed at length on this blog. The intuitive and somewhat cliché response is that we need to combat misinformation with better education. It seems, however, that at present some views are so entrenched that education alone just isn't cutting it. One study of views about global warming found that education seems to be a less important factor than political beliefs in determining agreement or disagreement with the scientific consensus. The study concluded that "cultural worldviews explain more variance than science literacy and numeracy". In those with a "hierarchical individualist" worldview, scientific literacy was actually correlated with decreased belief in climate change, while scientific literacy was correlated with an increased belief in climate change amongst those with an "egalitarian communitarian" worldview.
The perniciousness of this problem shouldn't be underestimated, and we will without a doubt see much research in the field of tackling misinformation over the coming years. It's an area I myself have become particularly interested in, and I would genuinely like to hear your ideas, as I'm currently researching mass delusions. Hopefully, if we can understand how we've gone wrong in the past, we can have a better idea of how to stop ourselves going wrong again in the future. For now, the best simple resource that I have come across for understanding how best to handle misinformation is the Debunking Handbook (PDF) by John Cook and Stephan Lewandowsky - a five-minute rollercoaster that (if you're anything like me) will leave you thinking long and hard.
Kahan D.M., Peters E., Wittlin M., Slovic P., Ouellette L.L., Braman D. & Mandel G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks, Nature Climate Change, 2 (10) 732-735. DOI: 10.1038/nclimate1547
Nyhan B., Reifler J., Richey S. & Freed G.L. (2014). Effective Messages in Vaccine Promotion: A Randomized Trial, Pediatrics, PMID: 24590751
Nyhan B. & Reifler J. (2010). When Corrections Fail: The Persistence of Political Misperceptions, Political Behavior, 32 (2) 303-330. DOI: 10.1007/s11109-010-9112-2
Skurnik I., Yoon C., Park D. & Schwarz N. (2005). How Warnings about False Claims Become Recommendations, Journal of Consumer Research, 31 (4) 713-724. DOI: 10.1086/426605
Schwarz N., Sanna L.J., Skurnik I. & Yoon C. (2007). Metacognitive Experiences and the Intricacies of Setting People Straight: Implications for Debiasing and Public Information Campaigns, Advances in Experimental Social Psychology, 39 127-161. DOI: 10.1016/S0065-2601(06)39003-X