Counterintuitively, directly combating misinformation online can spread it further. A different approach is needed.
- Like the coronavirus, misinformation spreads when we engage with it, even inadvertently.
- Social media has a business model based on getting users to spend increasing amounts of time on their platforms, which is why they are hesitant to remove engaging content.
- The best way to fight online misinformation is to drown it out with the truth.
A year ago, the Center for Countering Digital Hate warned of the parallel pandemics — the biological contagion of COVID-19 and the social contagion of misinformation, aiding the spread of the disease. Since the outbreak of COVID-19, anti-vaccine accounts have gained 10 million new social media followers, while we have witnessed arson attacks against 5G masts, hospital staff abused for treating COVID patients, and conspiracists addressing crowds of thousands.
Many have refused to follow guidance issued to control the spread of the virus, motivated by beliefs in falsehoods about its origins and effects. The reluctance we see in some to get the COVID vaccine is greater amongst those who rely on social media rather than traditional media for their information. In a pandemic, lies cost lives, and it has felt like a new conspiracy theory has sprung up online every day.
How we, as social media users, behave in response to misinformation can either enable or prevent it from being seen and believed by more people.
The rules are different online
If a colleague mentions in the office that Bill Gates planned the pandemic, or a friend at dinner tells the table that the COVID vaccine could make them infertile, the right thing to do is often to challenge their claims. We don't want anyone to be left believing these falsehoods.
But digital is different. The rules of physics online are not the same as they are in the offline world. We need new solutions for the problems we face online.
Now, imagine that in order to reply to your friend, you must first hand him a megaphone so that everyone within a five-block radius can hear what he has to say. It would do more damage than good, but this is essentially what we do when we engage with misinformation online.
Think about misinformation as being like the coronavirus — when we engage with it, we help to spread it to everyone else with whom we come into contact. If a public figure with a large following responds to a post containing misinformation, they ensure the post is seen by hundreds of thousands or even millions of people with one click. Social media algorithms also push content into more users' newsfeeds if it appears to be engaging, so lots of interactions from users with relatively small followings can still have unintended negative consequences.
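The dynamic described above can be illustrated with a toy model (entirely hypothetical; real platform ranking systems are proprietary and vastly more complex): posts are ordered by an engagement score rather than by recency, so every interaction, including an angry rebuttal, pushes a post higher in the feed.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    age_hours: float
    likes: int = 0
    replies: int = 0
    shares: int = 0

    def engagement_score(self) -> float:
        # Toy weighting: every interaction counts, regardless of intent;
        # an angry rebuttal increments `replies` just like a supportive comment.
        raw = self.likes + 2 * self.replies + 3 * self.shares
        # Mild recency decay so newer posts get some chance.
        return raw / (1 + self.age_hours) ** 0.5

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts by engagement score, not chronologically."""
    return sorted(posts, key=Post.engagement_score, reverse=True)

feed = rank_feed([
    Post("Vaccine myth", age_hours=10, likes=5, replies=40, shares=10),  # heavily argued over
    Post("WHO guidance", age_hours=1, likes=20, replies=2, shares=5),    # calmly received
])
```

In this sketch, the heavily argued-over myth outranks the fresher, calmly received guidance, which is exactly the mechanism that makes well-meaning rebuttals counterproductive.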
Additionally, whereas we know our friend from the office or the dinner table, most of the misinformation we see online comes from strangers. They tend to fall into one of two groups: true believers, whose minds are made up, and professional propagandists, who profit from building large online audiences and selling them products (including false cures). Both groups use trolling tactics: they seek to trigger people into angry responses, which helps them reach new audiences and game the algorithm.
On the day the COVID vaccine was approved in the UK, anti-vaccine activists were able to provoke pro-vaccine voices into posting about thalidomide, exposing new audiences to a reason to distrust the medical establishment. Those who spread misinformation understand the rules of the game online; it's time those of us on the side of enlightenment values of truth and science did too.
How to fight online misinformation
Of course, it is much easier for social media companies to take on this issue than for us citizens. Research from the Center for Countering Digital Hate and Anti-Vax Watch last month found that 65% of anti-vaccine content on social media is linked to just twelve individuals and their organizations. Were the platforms to simply remove the accounts of these superspreaders, it would do a huge amount to reduce harmful misinformation.
The problem is that social media platforms are reluctant to do so. These businesses are built on constantly increasing the amount of time users spend on their platforms. Removing the creators of engaging content that keeps millions of people hooked is antithetical to that business model. It will take government intervention to force tech companies to finally protect their users and society as a whole.
So, what can the rest of us do, while we await state regulation?
Instead of engaging, we should be outweighing the bad with the good. Every time you see a piece of harmful misinformation, share advice or information from a trusted source, like the WHO or BBC, on the same subject. The trend of people celebrating and posting photos of themselves or loved ones receiving the vaccine has been far more effective than any attempt to disprove a baseless claim about Bill Gates or 5G mobile technology. In the attention economy that governs tech platforms, drowning out is a better strategy than rebuttal.
Imran Ahmed is CEO of the Center for Countering Digital Hate.
Fear-mongering is now a billion-dollar industry.
- The Center for Countering Digital Hate found that anti-vaxx groups reach 58 million users on social media, earning the platforms roughly $1B in revenue.
- The Center's founder, Imran Ahmed, says giving anti-vaxxers attention feeds the algorithms, further perpetuating the noise.
- In this interview with Big Think, Ahmed says the best thing we can do is offer credible information to change the algorithms.
Disinformation is rampant. On Monday, a group called "America's Frontline Doctors" held a press conference outside the U.S. Supreme Court to tout the miracles of hydroxychloroquine, despite evidence to the contrary. Videos from the event are flooding social media, accompanied by the predictable "these doctors are being silenced" rhetoric. The "conspiracy" narrative got a boost when Twitter restricted Donald Trump Jr.'s account after he shared the doctors' claim that masks are unnecessary for preventing the spread of COVID-19.
Who are these doctors wearing branded white coats?
The group's founder is Simone Gold, a Burbank-based physician with ties to right-wing groups such as ALEC, the FreedomWorks Foundation, and Tea Party Patriots (which backed the event). The group's website was launched 12 days ago and has since been taken down. Then there's Stella Immanuel, a Houston doctor who believes that world leaders are secretly lizards, that people have sex with witches and demons in their dreams, and that dreaming has negative consequences. Also speaking was James Todaro, who, despite claiming to be on the front line, hasn't seen a patient since 2018. An event of this magnitude wouldn't be complete without one of the Bakersfield doctors, Daniel Erickson.
The follow-up question: How do you even begin to counter such disinformation?
"If you see misinformation, ignore it, because engaging with it helps the platforms accomplish the goal of further spreading it," says Imran Ahmed, founder of the Center for Countering Digital Hate. "Block the person that sent it, then find some good information and share it to try to balance out the algorithmic logic that underpins it."
Ahmed knows the dangers of internet rabbit holes. He founded the CCDH in 2017 to study the proliferation of identity-based hate in digital spaces, though he recently told me they began to focus solely on the coronavirus in early March. The stakes during a pandemic are too high to ignore.
The first result of that effort is the publication of a 34-page report, "The Anti-Vaxx Industry: Big Tech powers and profits from vaccine misinformation." After months of investigation, the team discovered anti-vaxx organizations reach 58 million people on social media. Facebook, Instagram, and YouTube have earned nearly $1 billion in revenue from these groups—and Ahmed was lowballing that sum.
"If we were wrong and our calculations were bad, they would have gone after us. I suspect that because we're incredibly conservative, we may have underestimated it. If they challenged it, they would have to give a real number, and that real number could be substantially higher."
Anti-vaxx organizations earn social media platforms $1B
When I ask Ahmed why these groups spend so much money promoting anti-vaxx disinformation, he laughs while claiming he's not a psychologist. Though he attended medical school, he focuses on the dangers that platforms pose to society. Right now, Big Tech has found a strange bedfellow in the anti-vaxx movement.
"These platforms were not designed for free speech. The timeline is not about reading the most recent thing. It's an algorithmic list of content which prioritizes that information which is most engaging."
The report does reveal interesting clues on the men behind these efforts. The most influential anti-vaxx organizations are funded by osteopath Joseph Mercola, who runs a dietary supplement and medical device company and gives financial support to the National Vaccine Information Center and the Organic Consumers Association, as well as by fund manager Bernard Selz, who ponies up three-fourths of the money that supports the Informed Action Consent Network.
Mercola is easy to figure out: he uses fear-mongering to sell supplements, a business that has put over $100 million into his bank account. Since the start of the pandemic, Mercola has claimed that at least 22 vitamins and supplements prevent or cure COVID-19. Vaccine misinformation is just one of his techniques. He has previously claimed that microwaving alters the chemistry of food, that mobile phones cause cancer, and that pasteurized milk is harmful.
Selz is harder to figure out. His philanthropic work is extensive, thanks to his management of a $500 million fund. His anti-vaxx efforts appear to be a passion project; they include $1.6 million given to discredited physician Andrew Wakefield, money used to fund the movement's opus, "Vaxxed." Since the Selz family avoids media contact, any other motives remain obscure.
Anti-vaxx sentiment is not new, but social media has given it steroids. As Ahmed notes, anti-vaxxers use the same tactics as other hate groups: don't trust authorities; disseminate conspiracy theories to create confusion; claim to be the sole authority on a topic.
Shortly after quarantine began, health misinformation actors merged with a hardcore group of committed anti-vaxxers to create what Ahmed calls "a coalition of chaos." In the months since, this coalition has tested a number of ideas: that 5G causes COVID-19, which had a moment and then faded; that track and trace is part of a global effort to microchip you, which never really caught on; and that coronavirus vaccines are part of an elite capitalist conspiracy. The last of these has proven persistent and is having real-world consequences.
Dr. Francis Collins, Director of the National Institutes of Health (NIH), holds up a model of COVID-19, known as coronavirus, during a US Senate Appropriations subcommittee hearing on the plan to research, manufacture and distribute a coronavirus vaccine, known as Operation Warp Speed, July 2, 2020 on Capitol Hill in Washington, DC.
Photo by Saul Loeb-Pool/Getty Images
Vaccine hesitancy in the UK is around 30 percent, according to Ahmed. In the U.S., he pegs it at 40 percent, though one poll found that only half of Americans are confident that they'll get a vaccine (if one is created). Here lies the danger: the threshold for herd immunity differs from virus to virus, but it is certainly over 50 percent. Ahmed says that confusion over a COVID-19 vaccine could cost tens of thousands of lives.
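As a back-of-envelope illustration of why the herd immunity threshold varies by virus, the classic epidemiological approximation sets the threshold at 1 - 1/R0, where R0 is the basic reproduction number. The R0 values below are rough illustrative estimates, not figures from the article:

```python
def herd_immunity_threshold(r0: float) -> float:
    """Classic approximation: the fraction of the population that must be
    immune so that each infected person infects, on average, fewer than one
    other person."""
    return 1 - 1 / r0

# Illustrative R0 values; published estimates vary widely by study and setting.
for virus, r0 in [("Seasonal flu", 1.5), ("Early COVID-19 estimate", 3.0), ("Measles", 15.0)]:
    print(f"{virus}: ~{herd_immunity_threshold(r0):.0%} of population")
```

Under these assumptions, an R0 of 3 implies roughly two-thirds of the population must be immune, which is why widespread vaccine refusal can keep a population below the threshold indefinitely.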
As more people turn to social media for medical advice, Ahmed reminds us that platforms are part of the problem. You might think you're doing a public service by debating your anti-vaxxer friend. In reality, you're confirming algorithmic bias.
"The biggest mistake we've made is thinking that public opinion will change their views. Facebook and Twitter and Instagram and Google don't care about your opinion, because you're not their customer. You're their product."
Change agents target weak points, such as advertisers. Ahmed suggests a ruthless, sustained push, similar to the orchestrated effort that led hundreds of brands to pull advertising from Facebook and Instagram. That month-long boycott was a response to unenforced hate speech policies.
Far from bucking the system, anti-vaxxers are fueling the capitalist greed they claim to decry. Discussing anti-vaxx sentiments, Eula Biss writes in "On Immunity" that "wealthier nations have the luxury of entertaining fears the rest of the world cannot afford." She likens vaccine refusal, often framed as civil disobedience against the trappings of capitalism, to elitism: anti-vaxxers are more like the 1 percent than the 99 percent. They're looking out for their own self-interest instead of the good of the herd, relying on propaganda promoted by wealthy donors with vested interests as their "research."
This coalition of chaos, in cahoots with the platforms they fund, is capitalizing on vaccine disinformation. The farther from science they lead us, the better. The more enraged we become, the more attention they capture, which is where this new economy thrives.
Elitism has come under fire since the recent wave of populist politics. But when we don't listen to experts, we end up listening to politicians' lies, says Richard Dawkins.
You want expert pilots to fly your planes, top doctors to perform your surgeries, the finest musicians in your orchestra, and for the same reason, you should want experts leading the nation, says Richard Dawkins. There has been a backlash against expert knowledge amid the rising wave of populist politics, but Dawkins doesn't think elitism is the dirty word that people are implying. He contends that not all opinions are equal, and that the leaders of the UK were profoundly misguided in allowing a referendum on Brexit to occur. No average citizen—not even Dawkins himself—was fit to decide on whether to leave a federation of states with so much economic and political importance, and decades of complex history attached to it. And much like the 2016 US presidential election, it was a political movement fueled by misinformation. A representative democracy is one thing, where citizens entrust experts to make national and local decisions, but a referendum democracy seems to Dawkins extremely ill-advised, particularly given that the top Google search in the UK the day after the Brexit vote was 'What is the European Union?'. Dawkins isn't shy: he's an elitist, but a rational one. He affirms he would never want a world where your IQ determines how many votes you get, but he sees the clear benefit of making political decisions based on knowledge rather than emotion or misinformation, deliberate or otherwise. Richard Dawkins' newest book is Science in the Soul: Selected Writings of a Passionate Rationalist.
Your brain stops at the most comforting thought. The truth is somewhere beyond that. Using scientific skepticism as a guide, astrophysicist Lawrence Krauss outlines the questions that critical thinkers ask themselves.
Strange answers aren't inherently wrong, and satisfying answers aren't inherently right, says Lawrence Krauss in this critical thinking crash course. The astrophysicist explains how principles of scientific skepticism can be applied beyond the laboratory; they can serve as a filter for the nonsense and misinformation we encounter each and every day. Here, he establishes a handful of core questions that critical thinkers ask themselves, which can be used to challenge your misconceptions and sense of comfort, question inconsistency, and think past your brain's evolved biases. Piece by piece, you can systematically remove nonsense from your life. Lawrence Krauss' most recent book is The Greatest Story Ever Told -- So Far: Why Are We Here?
Can democracy remain vibrant if the public, and especially children, don't have the tools to distinguish sense from nonsense?
"You can get more information in your cell phone now than you can in any school, but you can also get more misinformation," says American-Canadian theoretical physicist Lawrence Krauss. And he's right: we're in an era where any human can access a previously unimaginable wealth of knowledge. This access has grown faster than our ability to process it critically, however, and what we lack is any decent filter to weed out erroneous or partisan information. Children are the most susceptible to this, and Krauss argues that teaching children how to question information—essentially, how to make children skeptics—may save humanity from a dumbing-down. Lawrence Krauss' most recent book is The Greatest Story Ever Told -- So Far: Why Are We Here?