- A recent study compared the public's scientific literacy with their attitudes on GM foods.
- The results showed that "as the extremity of opposition increased, objective knowledge went down, but self-assessed knowledge went up."
- The results also suggest that, in terms of policy efforts to boost scientific literacy, education about a given topic alone isn't going to be enough.
In 1999, the social psychologists David Dunning and Justin Kruger published a study that uncovered a darkly comical cognitive bias. It describes how, to put it crudely, dumb people tend to incorrectly believe they’re smarter than others. Why? Because they’re too stupid to realize they’re stupid. Dubbed the Dunning-Kruger effect, the bias instills a sense of illusory superiority, one that calls to mind the adage “ignorance is bliss.”
Now, a new study on public opinion about genetically modified foods doesn’t quite show that ignorance is bliss, but it does suggest that ignorance is the fuel that empowers people to hold and voice strongly anti-scientific beliefs.
The findings, published in Nature Human Behaviour, come from public surveys issued in France, Germany and the U.S. that measured scientific literacy and attitudes about GM foods. (Genetic engineering, by the way, involves selectively introducing genes into a crop in order to create a new crop with desired characteristics. Despite supermarket labels that say “No G.M.O.s,” decades of scientific research have found no evidence that GM foods are harmful, and they’re viewed as safe by the American Medical Association, the National Academy of Sciences, the American Association for the Advancement of Science and the World Health Organization.)
In the surveys, more than 2,500 people answered true-false statements such as “Electrons are smaller than atoms” (true) and “Ordinary tomatoes do not have genes, whereas genetically modified tomatoes do” (false).
The results revealed a troubling trend.
“What we found is that as the extremity of opposition increased, objective knowledge went down, but self-assessed knowledge went up,” study author Philip Fernbach told The Guardian. “The extremists are more poorly calibrated. If you don’t know much, it’s hard to assess how much you know … The feeling of understanding that they have then stops them from learning the truth. Extremism can be perverse in that way.”
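Fernbach’s notion of “calibration” — how well self-assessed knowledge tracks measured knowledge — can be made concrete with a toy calculation. The numbers below are invented for illustration and are not the study’s data; the pattern simply mirrors the reported finding that more extreme opponents rate themselves higher while scoring lower.

```python
# Toy sketch of knowledge "calibration": the gap between self-assessed
# and objective knowledge. All data here are hypothetical.

def calibration_gap(self_assessed: float, objective: float,
                    max_score: float = 10) -> float:
    """Return (self-assessed - objective) knowledge, scaled to [-1, 1].
    Positive = overconfident; negative = underconfident; 0 = well calibrated."""
    return (self_assessed - objective) / max_score

# Each tuple: (opposition level 1-7, self-rated knowledge 0-10, quiz score 0-10)
respondents = [
    (1, 6, 8),  # mild opposition: modest self-rating, high quiz score
    (4, 7, 5),
    (7, 9, 3),  # extreme opposition: high self-rating, low quiz score
]

for opposition, self_rated, quiz in respondents:
    gap = calibration_gap(self_rated, quiz)
    print(f"opposition={opposition}  overconfidence gap={gap:+.1f}")
```

In this invented sample, the gap grows with opposition level: the most extreme respondent is the most overconfident, which is the pattern the study describes as being “poorly calibrated.”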
In terms of policy implications, the findings suggest that simply educating the public about a given topic isn’t going to change many minds.
“Our research shows that you need to add something else to the equation,” Fernbach told The Guardian. “Extremists think they understand this stuff already, so they are not going to be very receptive to education. You first need to get them to appreciate the gaps in their knowledge.”
Cognitive biases and scientific literacy
The Dunning-Kruger effect is one of many cognitive biases that make it difficult for us to interpret reality. Another prominent obstacle to scientific literacy is cognitive dissonance, the mental conflict we experience when confronted with information that contradicts our current worldview. This inner conflict can prevent people from accepting new ideas, as Bill Nye once described on his Netflix show:
“[So] instead of changing your worldview, which you may have held your entire life, you dismiss the evidence — and along with that you dismiss the authorities that may have provided the evidence.”
In 2016, Business Insider put together a great infographic that provides a quick overview of 20 cognitive biases that can subtly steer our thinking — often in a bad direction.