People remember when governments lie to them, and it lowers their satisfaction with government officials.
- A recent study measured how the public's trust in government differs when exposed to rumors, government denials, and subsequent verification of the initial rumors.
- The study, conducted in China, also examined whether any changes in trust lasted over a three-week period.
- The results suggest that governments that deem negative information as "fake news" may persuade some people, but over the long term it can cost them in credibility and public satisfaction.
In late 2019, Dr. Li Wenliang, an ophthalmologist at the Wuhan Central Hospital in China, sent messages to colleagues warning of a potential SARS-like outbreak. Soon after, screenshots of the messages appeared elsewhere online. Li was reprimanded by Chinese officials for "spreading rumors," and he signed a police document publicly stating that his comments "were not factual and broke the law."
It didn't take long for the public to learn the truth. By early February, at least 600 people in China had died of COVID-19, including Li. The Chinese government later declared him a martyr.
How much did this reversal cost the Chinese government, in terms of public trust? More broadly: Do governments lose credibility over the long term when what they call "fake news" later proves to be real?
That's the key question of a new study titled "When 'Fake News' Becomes Real: The Consequences of False Government Denials in an Authoritarian Country."
Both democratic and authoritarian governments have been known to label certain negative information as "fake," the researchers noted. But it's especially concerning when authoritarian governments do it, because their citizens tend to have fewer fact-checking resources, and because the label can be a way for repressive regimes to silence dissenters.
Credit: Anthony Kwan/Getty Images
"The ability to label claims and explanations that the authorities deem objectionable as fake has long been regarded as a power," the researchers wrote. "Because the revelation of the falsehood of government denials could erode the government's power, it is important to investigate its consequences, particularly in the authoritarian setting."
In the study, the researchers surveyed three groups of participants. Each group was shown different information regarding a new automobile registration policy and was also asked general questions about demographics and political interests. The study explains:
"The first group was exposed to a rumor regarding the government's automobile registration policy (rumor group), the second group was exposed to the government's denial of the rumor (denial group), and the third group was exposed to an event in which the rumor initially denied by the government was verified as true (verification group)."
Each group then reported how much they believed in the initial rumor and the government denial. The denial and verification groups were also asked to rate their satisfaction with the government's handling of automobile registration.
The results showed that government denial effectively decreased belief in the rumor, compared to the group that was exposed only to the rumor. Meanwhile, being exposed to a verification of the rumor increased belief in the rumor and decreased belief in the denial. Also, the verification group reported being slightly less satisfied with the government.
Design of survey 1
Credit: Wang et al.
But do these effects last? After all, past research suggests that the effects of persuasive communication — say, a negative political ad smearing a candidate — tend to disappear within days.
To find out, the researchers conducted a follow-up survey three weeks after the first. This time, the survey included only two groups: the verification group from the first survey, and a group of new participants. Both groups were exposed to a rumor and then a government denial.
"The difference between the two groups was simply that one of them had previously experienced the revelation of the government's false denial of an online rumor, while the other group did not have such an experience," the researchers wrote.
The results showed that the verification group — that is, people who had weeks earlier been shown that the government had lied to them — was much less likely to believe in the government's denial. What's more, the verification group was also less satisfied with the government.
Design of survey 2
Credit: Wang et al.
The findings suggest that governments can lose credibility over the long term when they call something "fake news" but it later proves true.
"As discussed earlier, while authoritarian countries can be awash with rumors and fake news, it is less frequent for the government's false denials to be caught due to the lack of independent news media and fact-checking organizations," the researchers wrote.
"It is therefore a vivid and memorable experience to see the government's denial bluntly shown to be false. Unsurprisingly, such an experience would make people less willing to believe a new denial from the government, especially if it is somewhat similar to the one that had been shown to be false."
Ultimately, calling "fake news" on negative information does seem to persuade some people. But it appears to be a short-term strategy, one that carries long-term costs in credibility and public satisfaction.
This is what happens when the fringe becomes mainstream.
- New research finds that YouTube is the worst disseminator of coronavirus misinformation.
- People who rely on social media for their news are more likely to believe coronavirus conspiracy theories.
- With only 50 percent of Americans willing to get a vaccination, conspiracy theories are fueling a public health crisis.
As Florida becomes the global epicenter of the novel coronavirus pandemic, reporting over 15,000 cases in a single day as U.S. cases surpass 3.3 million overall, a Windermere restaurant celebrated by offering free grilled cheese sandwiches to any customer showing up without a mask.
The region, and the entire state, has become a hotbed for coronavirus conspiracy theories, a microcosm of America. The nation leads the world in coronavirus infections by a wide margin. We likely lead the world in conspiracy theories about the virus as well. These phenomena are not separate.
While it's easy to roll your eyes at anti-mask crusades, new research, published in the journal Psychological Medicine, investigates the real-world effects of coronavirus conspiracy theories.
Corresponding author Daniel Allington of King's College London analyzed data from three surveys regarding varying aspects of COVID-19 conspiracy beliefs. His team looked at responses from 5,453 UK residents and was especially interested in how social media disseminates health misinformation. The findings will not surprise anyone with a social media account:
"All three studies found a negative relationship between COVID-19 conspiracy beliefs and COVID-19 health-protective behaviours, and a positive relationship between COVID-19 conspiracy beliefs and use of social media as a source of information about COVID-19."
Allington identifies YouTube and Facebook as the primary drivers of misinformation. The unregulated nature of social media is cause for concern. In fact, the team discovered that people who consume regulated media, such as broadcast or print, are more likely to engage in protective health measures, such as wearing a mask and social distancing. The opposite is true for people who get most of their health guidance from social media.
Coronavirus: Conspiracy Theories: Last Week Tonight with John Oliver (HBO)
Allington's team used data collected from partnerships with CitizenMe (Study 1) and Ipsos-MORI (Studies 2 and 3). In the first study, respondents had to identify the truth behind three conspiracy beliefs:
- The virus that causes COVID-19 was probably created in a laboratory
- The symptoms of COVID-19 seem to be connected to 5G mobile network radiation
- The COVID-19 pandemic was planned by certain pharmaceutical corporations and government agencies
Among their findings: younger people tend to buy into one or more conspiracy beliefs, while older respondents are more likely to engage in protective behaviors. Women follow public health guidance more than men, though there is no gender difference among those who believe in conspiracy theories.
Study 2 also asked about the possibility of the novel coronavirus being created in a laboratory, while Study 3 looked more deeply into respondents' social media usage. In each case, the results were clear: people who rely on social media for news are more likely to believe conspiracy theories.
YouTube appears to be the most problematic source of misinformation. Slickly produced shows, such as London Real, feature prominent anti-vaxxers like Del Bigtree. The anti-vaxx propaganda film, "Plandemic," was viewed over eight million times on YouTube before it was removed; the producer, Mikki Willis, is using this spotlight to raise funds for Part 2. These are only two examples in a deluge of anti-vaxx videos driving a dangerous narrative.
The danger is especially acute as a coronavirus vaccine becomes a possibility. Oxford University researchers have developed a strong candidate. Still, a June poll found that only 50 percent of Americans plan on getting a coronavirus vaccine. If anti-vaxx organizations continue to influence the public, fewer than half of the country could receive a vaccination.
In America, conspiracy beliefs are not only spread on social media. A recent study found Fox News pushing coronavirus misinformation 253 times over a five-day period. The going narrative is that vaccination is a question of "individual liberty," and that the vaccinated have nothing to fear from the unvaccinated. That claim is false, and it exposes the real-world danger of coronavirus misinformation.
People participate in a Reopen New Jersey protest on May 25, 2020 in Point Pleasant, New Jersey.
Photo by Michael Loccisano/Getty Images
In her book, "On Immunity," writer Eula Biss asks readers to imagine vaccination "as a kind of banking of immunity." When getting vaccinated, you contribute to a collective bank, ensuring those who cannot or will not get vaccinated are protected. Herd immunity only occurs when a population reaches a certain threshold; that threshold is well over 50 percent.
"The unvaccinated person is protected by the bodies around her, bodies through which disease is not circulating," writes Biss. "But a vaccinated person surrounded by bodies that host disease is left vulnerable to vaccine failure or fading immunity. We are protected not so much by our own skin, but by what is beyond it. The boundaries between our bodies begin to dissolve here."
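Biss's "banking" metaphor maps onto the classic herd-immunity threshold formula, HIT = 1 - 1/R0, where R0 is the basic reproduction number. A minimal sketch (the R0 values below are illustrative assumptions, not measured values for COVID-19):

```python
# Herd-immunity threshold: the fraction of a population that must be
# immune so each infection causes, on average, fewer than one new case.
# Classic formula: HIT = 1 - 1/R0.
def herd_immunity_threshold(r0: float) -> float:
    if r0 <= 1:
        return 0.0  # an outbreak with R0 <= 1 dies out on its own
    return 1 - 1 / r0

# Illustrative R0 values (assumptions for this sketch):
for r0 in (2.0, 3.0, 4.0):
    print(f"R0 = {r0}: threshold ~ {herd_immunity_threshold(r0):.0%}")
```

For any R0 above 2, the threshold exceeds 50 percent, which is why a 50 percent vaccination rate would fall short of herd immunity.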
The prevalence of immunosuppressed individuals unable to get vaccinated is left out of this conversation. This is a growing concern in countries like America, where obesity has led to increasing numbers of immunosuppressed citizens.
While the myth persists that children are protected against the ravages of the coronavirus, the long-term complications of this multi-system disease are still becoming known, leaving anti-vaxx parents accountable for potential harm to their children.
Every citizen should be wary of a rushed vaccine. Researchers are attempting to create a vaccine faster than ever, and there are inherent dangers in such a pursuit. But the costs associated with rejecting any vaccine on the grounds of perceived "sovereignty" are even greater. The price we'll pay for this misinformation is higher than any society can bear.
An international study finds the vast majority of 15-year-olds can't tell when they're being manipulated.
- International reading tests administered in 79 countries find most teens to be gullible when consuming information.
- As learning has moved online, absolutely reliable sources have become scarce.
- Most teens can't detect the validity of supposed "facts" from contextual clues.
Teen: Dad! This site has those $159 earbuds I want for just $49.99!
Parent: That doesn't sound right.
Teen: I know! Isn't it a great deal?
Parent: I don't see an actual brand name here.
Teen: But they're $49.99!! Less than $50!!
Parent: And the seller is halfway across the world. Um…
The takeaway from this actual conversation is this: The teen hasn't yet learned that if something seems too good to be true, it probably is. A study from the OECD Programme for International Student Assessment (PISA) says this is by no means a rare phenomenon. It finds that just 9 percent of 15-year-olds can actually tell when facts are really facts and not just opinions.
In our current climate of rampant disinformation and "fake news," the implications are troubling.
About OECD and the PISA survey
OECD stands for "Organisation for Economic Co-operation and Development." It's an international organization that's dedicated to the identification and implementation of policies for tackling the world's social, economic, and environmental challenges. Thirty-six countries are members, and the impact of the group's work is felt in more than 100 countries. The OECD collects data and develops reports, including recommendations, for its worldwide audience.
The Programme for International Student Assessment, PISA, is one such report. Subtitled, "What Students Know and Can Do," it's the product of reading, mathematics, and science tests administered to 15-year-olds in 79 countries. The focus in the 2018 tests was reading. The tests were given on computer screens in recognition that this is where most of today's teens do most of their reading.
The very best readers — better than in any other country — come from four provinces in China: Beijing, Shanghai, Jiangsu and Zhejiang. Though these areas are exceptional within the country, China overall still sits at the top of the list of the world's most advanced readers.
Image source: boreala/Shutterstock/Big Think
Reading is one thing, understanding is another
The world, as the report notes, has changed. Reading used to be about the straightforward extraction and absorption of information from reliable sources. Not so for today's learners, says the report:
"Today, they will find hundreds of thousands of answers to their questions online, and it is up to them to figure out what is true and what is false, what is right and what is wrong."
Alarmingly, the PISA research finds that less than 1 in 10 of students tested are "able to distinguish between fact and opinion, based on implicit cues pertaining to the content or source of the information."
In only six nations did more than 1 in 7 students successfully identify actual facts: China, Canada, Estonia, Finland, Singapore, and the United States.
Image source: OECD
The report summarizes reading proficiency by focusing on two of the six levels in its reading skill assessment scale.
At Level 2, "students are able to identify the main idea in a text of moderate length, find information based on explicit, though sometimes complex, criteria, and reflect on the purpose and form of texts when explicitly directed to do so." About 77 percent of students on average achieved Level 2 proficiency.
The best readers, comprising just 8.7 percent of the tested teens, performed at Levels 5 or 6 where "students are able to comprehend lengthy texts, deal with concepts that are abstract or counterintuitive, and establish distinctions between fact and opinion, based on implicit cues pertaining to the content or source of the information."
Image source: fizkes/Shutterstock
Hope in AI
The authors of the PISA report see promise in the leveraging of A.I. for helping young people develop a stronger sense of context that would allow them to more accurately assess informational sources.
They suggest "we need to think harder about how to develop first-class humans, and how we can pair the A.I. of computers with the cognitive, social and emotional skills, and values of people." They do note with caution that A.I. itself is ethically neutral while the humans who program it are typically not. This is one of the concerns being studied for the OECD's upcoming Education 2030 project.
Overall, the PISA findings serve as clear notice that we need to be smarter about teaching. "That is why education in the future is not just about teaching people, but also about helping them develop a reliable compass to navigate an increasingly complex, ambiguous, and volatile world."
A large new study uses an online game to inoculate people against fake news.
- Researchers from the University of Cambridge use an online game to inoculate people against fake news.
- The study sample included 15,000 players.
- The scientists hope to use such tactics to protect whole societies against disinformation.
A large new University of Cambridge study suggests that it's possible to teach regular people to spot fake news. By analyzing the responses of 15,000 participants, the researchers found that "psychological resistance" to fake news could be increased by having the subjects play an online game.
In the browser game, called Bad News and launched in February 2018, players become propaganda producers. They are allowed to manipulate the news and social media, invoking anger and fear. Tactics at their disposal include Twitter bots, conspiracy theories, impersonation, and photoshopped evidence. But while they use such Machiavellian approaches to attract followers, players must maintain a "credibility score" to remain persuasive.
Dr. Sander van der Linden, Director of the Cambridge Social Decision-Making Lab, explained that the task before the researchers was not easy, since fake news spreads very fast and can go "deeper than the truth." That makes it harder and harder to stand up to misinformation.
"We wanted to see if we could pre-emptively debunk, or 'pre-bunk', fake news by exposing people to a weak dose of the methods used to create and spread disinformation, so they have a better understanding of how they might be deceived," shared the scientist.
He called their game "a psychological vaccination." This work builds on the so-called "inoculation theory," which maintains that beliefs can be guarded against influence the same way you can protect a body against diseases – by being exposed to smaller doses of them over time to build up immunity.
To see how well the participants were inoculated against fake news, they were told to rate how trustworthy various tweets and headlines were. They had to do this before and after playing the game for at least 15 minutes.
The researchers discovered that subjects became better at picking out fake news, rating fake headlines 21 percent less reliable after the game. Playing made no difference in how they rated real news.
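To make the 21 percent figure concrete, here is a hedged sketch of how such a relative drop in mean reliability ratings could be computed. The ratings below are invented for illustration; they are not the study's data:

```python
# Hypothetical pre- and post-game reliability ratings of the same fake
# headlines (e.g., on a 1-7 scale); these numbers are made up.
pre_ratings = [4.1, 3.8, 4.5, 3.9]
post_ratings = [3.2, 3.0, 3.6, 3.1]

mean_pre = sum(pre_ratings) / len(pre_ratings)
mean_post = sum(post_ratings) / len(post_ratings)

# Relative drop in perceived reliability after playing.
drop = (mean_pre - mean_post) / mean_pre
print(f"relative drop: {drop:.0%}")  # ~21% with these made-up numbers
```

A drop computed this way compares the same headlines before and after the intervention, which is what lets the study attribute the change to playing the game rather than to the headlines themselves.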
What's more, the scientists saw that those who were most vulnerable to fake news before the game showed the strongest inoculation effect.
The scientists note that those who chose to play the game were generally younger, male, liberal, and educated, but they built a nonpartisan mechanism into the game to avoid bias: subjects could choose fake news from either the left or the right.
Dr. van der Linden expressed excitement at using such methods across whole populations to build "societal resistance to fake news".
His colleague and co-author Jon Roozenbeek, also of Cambridge, saw the value of their investigation in uncovering proactive measures against bad information. He hoped such tactics could create "a general 'vaccine' against fake news."
The game, created by the scientists in conjunction with the Dutch media collective DROG and the design agency Gusmanson, has been translated into nine languages. It is also being developed for WhatsApp and has a "junior version" for children aged 8-10, which the researchers hope to use to develop early media literacy.
Check out the study published in the journal Palgrave Communications.
Preserving truth: How to confront and correct fake news
The truth is a messy business, but an information revolution is coming. Danny Hillis and Peter Hopkins discuss knowledge, fake news and disruption at NeueHouse in Manhattan.
- In 2005, Danny Hillis co-founded Freebase, an open-source knowledge database that was acquired by Google in 2010. Freebase formed the foundation of Google's famous Knowledge Graph, which enhances its search engine results and powers Google Assistant and Google Home.
- Hillis is now building The Underlay, a new knowledge database and future search engine app that is meant to serve the common good rather than private enterprise. He calls it his "penance for having sold the other one to Google."
- Powerful collections of machine-readable knowledge are becoming exceedingly important, but most are privatized and serve commercial goals.
- Decentralizing knowledge and making information provenance transparent will be a revolution in the so-called "post-truth age". The Underlay is being developed at MIT by Danny Hillis, SJ Klein, and Travis Rich.