The Backfire Effect: When Correcting False Beliefs Has the Opposite of the Intended Effect
How providing people with evidence about the safety and effectiveness of vaccines can backfire.
Simon Oxenham covers the best and the worst from the world of psychology and neuroscience. Formerly writing under the pseudonym "Neurobonkers", Simon has a history of debunking dodgy scientific research and tearing apart questionable science journalism in an irreverent style. He has written and blogged for publishers including The Psychologist, Nature, Scientific American, and The Guardian. His work has been praised in the New York Times and The Guardian and described in Pearson's Textbook of Psychology as "excoriating reviews of bad science/studies".
According to a new study, 43 percent of the U.S. population wrongly believes that the flu vaccine can give you flu. In reality, the injected flu vaccine contains no live virus and cannot cause the illness; apart from a short-lived fever and aching muscles, adverse reactions are rare. It stands to reason that correcting this misconception would be a good move for public health, but the study by Brendan Nyhan and Jason Reifler, published in Vaccine, found that debunking this false belief had a seriously counterproductive effect.
The researchers looked at 822 American adults who were selected to reflect the general population in terms of their mix of age, gender, race, and education. About a quarter of this sample was unduly concerned about the side effects of the flu vaccine, and it was among these individuals that attempting to correct the myth backfired. The researchers showed participants information from the Centers for Disease Control and Prevention (CDC) designed to debunk the myth that the flu vaccine can give you flu. This did reduce people's false beliefs, but, among those concerned with vaccine side effects, it also produced a paradoxical decline in their intentions to actually get vaccinated, from 46 percent to 28 percent. The intervention had no effect on intentions to get vaccinated among people who didn't have high levels of concern about vaccine side effects in the first place.
Why is it that, as false beliefs went down, so did intentions to vaccinate? The explanation suggested by the researchers is that the participants who had "high concerns about vaccine side effects brought other concerns to mind in an attempt to maintain their prior attitude when presented with corrective information." A psychological principle that might explain this behavior is motivated reasoning: We are often open to persuasion when it comes to information that fits with our beliefs, while we are more critical or even outright reject information that contradicts our worldview.
This is not the first time that vaccine safety information has been found to backfire. Last year, the same team of researchers conducted a randomized controlled trial comparing messages from the CDC aiming to promote the measles, mumps, and rubella (MMR) vaccine. The researchers found that debunking myths about MMR and autism had a similarly counterproductive result — reducing some false beliefs, but also ironically reducing intentions to vaccinate.
Taken together, the results suggest that in terms of directly improving vaccination rates, we may be better off doing nothing than using the current boilerplate CDC information on misconceptions about vaccines to debunk false beliefs. If this is the case, then the ramifications for public health are huge, but before we can decide whether this conclusion is accurate, we'll have to wait to see if the finding can be replicated elsewhere. History has taught us that when it comes to vaccines, acting on scant evidence can have catastrophic consequences.
The studies do have their limitations: Both looked at intentions to vaccinate rather than actual vaccination rates, which may be different in practice. Furthermore, in both sets of experiments, only the official US CDC vaccine safety messages were used. It is possible that if the experiments were repeated with other wordings, perhaps those used by the NHS in the UK for example, we would see different results.
If the backfire effect is replicated in future studies, how should we proceed? Research into the backfire effect offers some tentative suggestions. To begin with, we should probably avoid restating myths wherever possible, and when we must restate them, we should precede the myth with a warning that misleading information is coming up. This can help prevent myths from growing in our minds through mere familiarity. When we debunk myths, we should also try to offer an alternative explanation, to fill the gap left by the misinformation. And we should keep our explanations brief, which can help counter the imbalance that often exists between simple, memorable myths and the more complicated reality. What is clear from the recent findings on beliefs about vaccines, and from the recent outbreaks of vaccine-preventable diseases in the UK, the US, and elsewhere, is that what we are currently doing to try to convince people to get vaccinated may no longer be working.
Nyhan, B., & Reifler, J. (2015). Does correcting myths about the flu vaccine work? An experimental evaluation of the effects of corrective information. Vaccine, 33(3), 459-464. DOI: 10.1016/j.vaccine.2014.11.017
Nyhan, B., Reifler, J., Richey, S., & Freed, G. L. (2014). Effective messages in vaccine promotion: A randomized trial. Pediatrics, 133(4). DOI: 10.1542/peds.2013-2365d
Lewandowsky, S., Ecker, U., Seifert, C., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131. DOI: 10.1177/1529100612451018