Supporting climate science increases skepticism of out-groups
A study finds people are more influenced by what the other party says than their own. What gives?
- A new study has found evidence suggesting that conservative climate skepticism is driven by reactions to liberal support for science.
- This was determined both by comparing polling data to records of cues given by leaders, and through a survey.
- The findings could lead to new methods of influencing public opinion.
Among citizens of major Western nations, Americans' acceptance of the science of climate change is relatively low. It is also highly tied to their party affiliation. While members of the Democratic Party accept the scientific consensus on climate change at rates similar to citizens in other countries, members of the Republican Party express an unusual skepticism towards that consensus and generally do not believe that human-made climate change is an important issue.
But it wasn't always this way. Polling data from the late 20th century shows that conservative Americans once agreed with the scientific consensus at rates similar to their more liberal peers; the shift towards skepticism is more recent.

A new study published in The British Journal of Political Science suggests that this decline stems not from a barrage of denialists on television or an inability to access the data, but from a backlash: when members of one party voice support for the science, members of the other react against them.
Mind the cues
The gulf in accepting the science behind climate change also exists among party elites. It is well known to any American who is attentive to the news, as party leaders are often more than willing to share their take with journalists.
Using polling data going back to the 1980s, the researchers created a chart of aggregate climate skepticism among the general population. A similar series tracking Republican skepticism back to 2001, sourced from a previous study, proved highly correlated with the one produced here.
These charts were compared with content from prominent newspapers that included implicit or explicit stances on climate change by significant political figures. Thousands of articles were classified using key terms and by noting which major political figures were quoted or referenced. The researchers then compared the number of cues over time with measured skepticism, looking for "Granger causality": the tendency of one variable's past values to predict the future value of another.
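The intuition behind a Granger-causality test can be sketched in a few lines: fit one model that predicts a series from its own past, fit a second that also includes the other series' past, and check (via an F-statistic) whether the extra lags help. This is a minimal illustration on synthetic data, not the study's actual model; the variable names, two-lag setup, and coefficients are all assumptions for the example.

```python
import numpy as np

def granger_f_stat(y, x, lag=2):
    """F-statistic for whether past values of x improve prediction of y
    beyond y's own past -- the core idea of a Granger-causality test."""
    n = len(y)
    rows = n - lag
    Y = y[lag:]
    # Restricted model: intercept plus y's own lags.
    Xr = np.column_stack([np.ones(rows)] +
                         [y[lag - i:n - i] for i in range(1, lag + 1)])
    # Unrestricted model: additionally include x's lags.
    Xu = np.column_stack([Xr] +
                         [x[lag - i:n - i] for i in range(1, lag + 1)])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    # F-test on the improvement from adding x's lags.
    return ((rss_r - rss_u) / lag) / (rss_u / (rows - Xu.shape[1]))

# Synthetic example: "cues" drive next period's "skepticism", not vice versa.
rng = np.random.default_rng(0)
n = 500
cues = rng.normal(size=n)
skepticism = np.zeros(n)
for t in range(1, n):
    skepticism[t] = 0.3 * skepticism[t - 1] + 0.8 * cues[t - 1] + rng.normal()

f_forward = granger_f_stat(skepticism, cues)   # cues -> skepticism: large F
f_reverse = granger_f_stat(cues, skepticism)   # skepticism -> cues: small F
```

In this toy setup the forward F-statistic dwarfs the reverse one, mirroring the study's finding that cues led opinion rather than the other way around.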
The model shows evidence of both in-group and out-group cue effects, though the backlash against out-group cues was much more pronounced. A significant increase in Democratic cues in favor of climate science was followed by a rise in skepticism among Republican voters. Importantly, the cues consistently led opinion rather than followed it: changes in view did not predict changes in the number or direction of cues.
The researchers also surveyed nearly 3,000 adults to test the effect directly. Respondents were shown a statement on the scientific consensus around climate change alongside a cue from either a Republican or a Democrat. The experiment confirmed the observational findings and provided further support for the notion that signals from leaders increase skepticism among some respondents.
Before my left-leaning and Democratic readers get too smug, this research cites previous studies demonstrating a similar effect in the lead-up to the Iraq War. In that case, however, the Democratic Party elites' mixed messages were countered by a Republican Party united behind the invasion, and the effect on the Democratic rank and file was similar to the one observed here.
Several other studies have examined similar effects for other issues. This study's contribution is its focus on out-group cues and the effort put into demonstrating a causal relationship between the statements of party elites and public opinion; most previous work focused purely on in-group cues or failed to differentiate between the two.
Okay, so I might rely on a cognitive bias that assumes certain other people are wrong because of what club they're in. How do I fix that?
The mechanism the authors attribute this result to, the tendency to use cues from party leaders as shortcuts in decision making, is something anybody can find themselves falling back on. Like most of the logical fallacies people fall victim to, it is a time- and energy-saving shortcut that seems useful when gathering more information is costly and you don't feel like working things out for yourself.
Also, like many other logical fallacies, knowing about it is half the battle in defeating it.
The next time you find yourself forming an opinion about a complex issue, do what you can to consult primary sources rather than politicians. Remember that insisting a stance is correct or incorrect because somebody important holds it is simply the appeal-to-authority fallacy. Lastly, remember that whatever political club you're in isn't infallible: sometimes people on your team can be mistaken, just as people on the other side can be correct.
For anybody trying to increase acceptance of the scientific consensus in the United States, the authors suggest a few takeaways. They warn that party elites must weigh the positives and negatives of taking public stances, since any such stance will generate at least some backlash.
On a more positive note, they point out that ideology was less important in decision making than many people suppose, suggesting that routes towards influencing popular opinion may face less difficulty when going against doctrine than is often feared. Similarly, they suggest that investment in forming consensus among party elites from both the left and right may be the best way to bring public opinion in line with the science. It would both benefit from the power of cues and work around the strong reaction against out-group stances.
The polarization of science in the United States has drastic consequences on scientists' ability to convince large portions of the population that climate change is real and that something should be done about it. A similar problem exists with the science around COVID-19. However, an improved understanding of the problem may allow us to solve it. By being aware of the biases that hinder our thinking, we may triumph over them.
Now we just have to be sure that the presentation of this information isn't done by someone who is considered to be part of an out-group.