A 2020 study published in the journal Psychological Science explores the idea that fake news can actually help you remember real facts better.
- In 2019, researchers at Stanford Engineering analyzed the spread of fake news as if it were a strain of Ebola, adapting a model for diseases that can infect a person more than once to better understand how fake news spreads and gains traction (a minimal illustration of that idea follows this list).
- A new study published in 2020 explores the idea that fake news can actually help you remember real facts better.
- "These findings demonstrate one situation in which misinformation reminders can diminish the negative effects of fake-news exposure in the short term," researchers on the project explained.
Previous studies on misinformation have already paved the way to a better understanding of how it spreads and why it sticks.
How does misinformation spread?
What is the "continued-influence" effect?

A challenge in using corrections effectively is that repeating the misinformation can have negative consequences. Research on this effect (referred to as "continued influence") has shown that information presented as factual and later deemed false can still contaminate memory and reasoning. The persistence of the continued-influence effect has led researchers to generally recommend against repeating misinformation.

"Repetition increases familiarity and believability of misinformation," the study explains.

What is the "familiarity-backfire" effect?

Studies of this effect have shown that increasing the familiarity of misinformation through extra exposure to it leads to misattributions of fluency when the context of that information cannot be recalled. A 2017 study examined this effect in myth correction: subjects rated their belief in facts and myths of unclear veracity, then the facts were affirmed and the myths corrected, and subjects made belief ratings again. The results suggested a role for familiarity, but belief in the myths remained below pre-manipulation levels.
New research into fake news has uncovered something interesting about misinformation

A 2020 study published in the journal Psychological Science explores the idea that fake news can actually help you remember real facts better.

Fake news exposure can cause misinformation to be mistakenly remembered and believed. In two experiments, the team (led by Christopher N. Wahlheim) examined whether reminders of misinformation could do the opposite: improve memory for, and belief in, corrections to that fake news.

The study had subjects read factual statements and then separate misinformation statements taken from news websites. Next, the subjects read statements that corrected the misinformation; reminders of the misinformation appeared before some of the corrections but not others. Finally, subjects were asked to recall the facts, indicate their belief in what they recalled, and indicate whether they remembered the corrections and the misinformation.

The results showed that reminders increased recall and belief accuracy. These benefits were greater both when the misinformation was recalled and when subjects remembered that corrections had occurred.

Researchers on the project explained: "These findings demonstrate one situation in which misinformation reminders can diminish the negative effects of fake-news exposure in the short term."

The conclusion: fake-news misinformation that is corrected by fact-checked information can improve both memory for and belief in the real information.

"We examined the effects of providing misinformation reminders before fake-news corrections on memory and belief accuracy. Our study included everyday fake-news misinformation that was corrected by fact-check-verified statements. Building on research using fictional, yet naturalistic, event narratives to show that reminders can counteract misinformation reliance in memory reports," the researchers explained.

"It suggests that there may be benefits to learning how someone was being misleading. This knowledge may inform strategies that people use to counteract high exposure to misinformation spread for political gain," Wahlheim said.
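To make the two conditions described above concrete, here is a minimal sketch of how recall and belief responses might be tabulated per condition (misinformation reminder before the correction versus correction alone). The Trial fields, condition labels, and example data are hypothetical scaffolding for illustration, not the authors' materials or analysis code.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Trial:
    subject_id: int
    condition: str          # "reminder_then_correction" or "correction_only"
    recalled_correct: bool  # subject recalled the corrected (true) detail
    believed_recall: bool   # subject reported believing what they recalled

def accuracy_by_condition(trials):
    """Average recall accuracy and belief accuracy for each condition."""
    summary = {}
    for condition in sorted({t.condition for t in trials}):
        subset = [t for t in trials if t.condition == condition]
        summary[condition] = {
            "recall_accuracy": mean(t.recalled_correct for t in subset),
            "belief_accuracy": mean(t.believed_recall for t in subset),
        }
    return summary

# Made-up example data: two subjects, one trial per condition each.
trials = [
    Trial(1, "reminder_then_correction", True, True),
    Trial(1, "correction_only", False, False),
    Trial(2, "reminder_then_correction", True, True),
    Trial(2, "correction_only", True, False),
]
print(accuracy_by_condition(trials))
```

A pattern like the study's result would show higher recall and belief accuracy in the reminder condition than in the correction-only condition.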
MIT Professor Sinan Aral's new book, "The Hype Machine," explores the perils and promise of social media in a time of discord.
Are you on social media a lot? When was the last time you checked Twitter, Facebook, or Instagram? Last night? Before breakfast? Five minutes ago?
What responsibility should government authorities and Big Tech bear for policing the spread of seditious content?
- It can be hard to believe that comical images online can rile people up to the point that they'll actually attack.
- Originating in the darker corners of the internet, Boogaloo is now prominent on mainstream online platforms like Facebook and Instagram.
- The Network Contagion Research Institute's recent series of Contagion and Ideology Reports uses machine learning to examine how memes spread.
What's in a meme?

The Network Contagion Research Institute's recent series of Contagion and Ideology Reports leverages machine learning to examine how memes spread. The idea is to better understand the role memes play in encouraging real-world violence.

So what exactly is a meme, and how did it become a tool for weaponizing the web? Originally coined by Oxford biologist Richard Dawkins in his 1976 book "The Selfish Gene," the term is defined there as "a unit of cultural transmission" that spreads like a virus from host to host and conveys an idea that changes the host's worldview.

Dawkins echoes "Naked Lunch" author William Burroughs's concept of written language as a virus that infects the reader and builds realities that are brought to fruition through the act of speech. In this sense, a meme is an idea that spreads by iterative, collaborative imitation from person to person within a culture and carries symbolic meaning.

The militant alt-right's use of internet memes follows this pattern. Memes are carefully designed on underground social media using coded language and imagery, then approved by like-minded users who disseminate the messages on mainstream platforms like Twitter or Facebook.

Sometimes these messages are lifted from innocuous sources and altered to convey hate. Take, for example, Pepe the Frog. Initially created by cartoonist Matt Furie as a mascot for slackers, Pepe was appropriated and altered by racists and homophobes until the Anti-Defamation League branded the frog a hate symbol in 2016. This, of course, hasn't stopped public figures from sharing variations of Pepe's likeness in subversive social posts.
Slenderman and the Boogaloo Bois

Indeed, these types of memes have a tricky way of moving from the shadows of the internet to the mainstream, picking up supporters along the way, including those who are willing to commit horrific crimes.

This phenomenon extends well beyond politics. Take, for example, the Slenderman. Created as a shadowy figure for online horror stories, Slenderman achieved mainstream popularity and a cult-like following. When, in 2014, two girls tried to stab their friend to death to prove that Slenderman was real, the power of the meme to prompt violence became a national conversation.

Slenderman creator Victor Surge told Know Your Meme that he never thought the character would spread beyond the fringe Something Awful forums. "An urban legend requires an audience ignorant of the origin of the legend. It needs unverifiable third and forth [sic] hand (or more) accounts to perpetuate the myth," he explained. "On the Internet, anyone is privy to its origins as evidenced by the very public Somethingawful thread. But what is funny is that despite this, it still spread. Internet memes are finicky things and by making something at the right place and time it can swell into an 'Internet Urban Legend.'"

While Slenderman is obviously a fiction, political memes walk many of the same fine lines, and they have galvanized similarly obsessed followers to commit attacks in the real world.
Understanding digital indoctrination

The Network Contagion Research Institute (NCRI) uses advanced data analysis to expose hate on social media. The institute's scientists work with experts at the ADL's Center on Extremism (COE) to track hateful propaganda online and offer strategies to combat it. By understanding how memes spread hateful propaganda, the NCRI is helping law enforcement, social media companies, and citizens prevent online chatter about violence from becoming real action.

Founded by Princeton's Dr. Joel Finkelstein, the NCRI is finding that the spread of hate online can be examined using epidemiological models of how viruses spread, only applied to language and ideas, much as Burroughs imagined decades ago.

Now that the internet has obviated the need for people to meet in person or communicate directly, recruiting and deploying members of violent groups is easier than ever before.

Before social media, as Finkelstein reminds us, organizing underground hate groups was harder. "You would have to have an interpersonal organization to train and cultivate those people," he said. "With the advance of the internet, those people can find each other; what it lacks in depth it makes up in reach. The entire phenomenon can happen by itself without a trace of anyone being groomed."
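The NCRI's actual pipeline isn't described here, so the following is only a rough sketch of the flavor of such cross-platform tracking: count daily mentions of a slogan or meme name on a fringe forum and on a mainstream platform, then flag fringe spikes that are followed by mainstream pickup. The data layout, spike threshold, and one-week window are all assumptions, not NCRI's methodology.

```python
from collections import Counter
import datetime as dt

def daily_counts(posts, term):
    """posts: iterable of (datetime, text) pairs -> {date: mention count}."""
    counts = Counter()
    for timestamp, text in posts:
        if term.lower() in text.lower():
            counts[timestamp.date()] += 1
    return counts

def fringe_to_mainstream(fringe_counts, mainstream_counts, spike_factor=3):
    """Flag dates where a fringe-platform spike is followed within a week by
    increased mainstream mentions (a crude signal of fringe-to-mainstream spread)."""
    flagged = []
    for day, count in sorted(fringe_counts.items()):
        prior_week = [fringe_counts.get(day - dt.timedelta(days=d), 0) for d in range(1, 8)]
        baseline = max(1, sum(prior_week) / 7)
        if count >= spike_factor * baseline:
            mainstream_before = sum(
                mainstream_counts.get(day - dt.timedelta(days=d), 0) for d in range(1, 8))
            mainstream_after = sum(
                mainstream_counts.get(day + dt.timedelta(days=d), 0) for d in range(1, 8))
            if mainstream_after > mainstream_before:
                flagged.append(day)
    return flagged
```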
Can anything be done?

Memes generate hysteria in part by being outrageous enough that people, and even the mainstream media, share them. This is a deliberate strategy called amplification. The extreme alt-right manages to reach mainstream audiences with racialist and supremacist memes, even as these messages are being denounced.

Content spreads virally whether those spreading it support the embedded ideology or not, making it difficult to intervene and prevent meme-incited violence.

Some people may unwittingly amplify messaging designed to spread violence, mistaking memes like ACAB ("all cops are bastards") for harmless, if somewhat dark, humor.
So what can be done about the meme wars, and what onus is on law enforcement and Big Tech?

Finkelstein recommends a three-pronged approach to combating the violence that subversive memes incite:

1. Push for more stringent boundaries defining civility online, using technology.
2. Educate courts, lawmakers, and civil institutions about how extreme online communities operate, and create better industry standards to regulate these groups.
3. Bring the federal government, including the FBI and Homeland Security, up to speed on the new reality so they can intervene as necessary.

But not everyone thinks Big Tech or the law have much ground to stand on in combating viral hate. Tech companies haven't gone the extra mile in truly enforcing stricter standards for online content, though some are closing down accounts associated with extremist leaders' websites and their movements.

But this is largely tilting at windmills. The proliferation of memes that spread ideology virally might be too big a problem to combat without dramatically limiting freedom of speech online. Facebook and YouTube have banned thousands of profiles, but there's no way of knowing how many remain, or how many are being added every day. This leaves the danger of meme radicalization and weaponization lurking beneath the surface even as we head into elections.
What we stand to lose

Boogaloo and other seditious groups have been empowered by recent crises, including the pandemic and the anti-racist protests sweeping the country. Greater numbers of their members are showing up at anti-quarantine protests and disrupting other gatherings, and the press they're getting, this article included, is only bringing more attention to their violent messages.

Extremist memes continue to circulate as America heads to the polls amidst a global pandemic and widespread civil unrest, and it stands to reason that more blood will be spilled. Getting a handle on the Twitter handles that spread hate, and shutting down groups whose sole purpose is sowing the seeds of brutality, is vital. The only question that remains is how.
New research reveals the extent to which groupthink bias is increasingly being built into the content we consume.
- When ownership of news sources is concentrated in the hands of just a handful of corporations, the kind of reporting that audiences get to see is limited and all the more likely to be slanted by corporate interests.
- Newsroom employment has declined dramatically over the past decade, and this has only been exacerbated by the COVID-19 pandemic.
- The findings of a new University of Illinois study suggest that Washington journalists operate in insular microbubbles that are vulnerable to consensus seeking. If the reporters on the Hill are feeding America copycat news, we are all at risk of succumbing to groupthink.
Deregulation and the rise of new media

Up until the 1980s, the federal government worked with the FCC to prevent media consolidation. But under Reagan, many of the existing regulations were shelved, giving corporations greater leeway in acquiring local news outlets.

The deregulatory trend persisted, arguably culminating with Clinton's 1996 Telecommunications Act. A watershed moment for news media homogeneity, the law essentially permitted corporations to amass large numbers of local newspapers and news stations, granting hegemons access to almost every household in America.

Traditional news outlets have been suffering for years with the rise of cable networks and the advent of web publishing. With free content constantly available online, many outlets have given up the ghost and shut down their print and broadcast operations. Newsroom employment has declined dramatically over the past decade, according to Pew Research Center, and this has only been exacerbated by the COVID-19 pandemic.
Record distrust in the media industry

There's never been a time in American history when the sources of information were so doubted. Even after Watergate, trust in the media stood at 74 percent. At last count, Gallup found that just 20 percent of Americans have confidence in print journalism, two percentage points more than TV news received in the same poll.

There is a growing concern that news media is biased, that reporters don't just report but curate and editorialize, and that the money behind the news has an impact on what is reported and how. This suspicion is fodder for conspiracy theorists who vilify the mainstream media and offer "alternative facts" in place of what is reported. Playing on people's fears, alternative outlets online are picking up steam and spreading misinformation (and deliberate disinformation).

For example, although many leading news outlets, including The Washington Post, The Independent, The New York Times, and even Fox News, independently debunked the "Pizzagate" conspiracy as soon as it began to spread in 2016, media coverage of the story has steadily risen throughout the past year.
Fewer journalists means fewer voices

One factor in Americans' diminishing trust in the news is that there are fewer journalists, especially local journalists, whom viewers can turn to as distinct voices. The lack of local coverage and the rise of homogeneous, sensationalist journalism are perpetuating distrust and driving many Americans to look for news elsewhere, leaving them susceptible to manipulation.

As mentioned earlier, print media has been hit hard, and broadcast journalism is also feeling the pain. Amid widespread newsroom layoffs and closures, fewer journalists means exposure to fewer perspectives. The result is less original reporting, more repurposing of others' stories, and less fact-checking, all of which contributes to the spread of misinformation.

The lack of local news has far-reaching effects on democracy. One study from King's College London found that communities without local news outlets have less public engagement and greater distrust of public institutions.

"We can all have our own social media account, but when local papers are depleted or in some cases simply don't exist, people lose a communal voice," Martin Moore, the author of the study, remarked. "They feel angry, not listened to and more likely to believe malicious rumour."
Mainstream media and fake news

Ironically, while the erosion of mainstream media is contributing to the rise of misinformation and alternative news, when outlets attempt to expose fake news, it often backfires, propelling its dissemination. Plenty of news consumers first encounter conspiracies and disinformation through the news itself, and rather than building trust, that coverage feeds suspicion: according to one poll, 72 percent of Americans believe that traditional outlets report news they know to be false.

And who can blame them? The parroting of identical headlines across consolidated newsrooms doesn't help instill confidence. Take, for example, this compilation of "local news" talking heads repeating the same script:

Video: https://www.youtube.com/embed/_fHfgU8oMSo

All of these reporters are part of the Sinclair Broadcast Group. It's hard to deny the dangers of corporate consolidation of news media when confronted with damning clips like this, and Sinclair is out for even more control. An attempted acquisition in 2017 would have put Sinclair stations in 72 percent of households with a television, but the deal collapsed when Tribune walked away.

This is a huge amount of influence for one company, or person, to have. In an election year, it is even more pertinent.
Social media algorithms and information bubbles

Even as more Americans distrust mainstream news, the majority now get their news on social media. This wouldn't be a problem per se, but the way online news is delivered to consumers perpetuates echo chambers and information bubbles.

Social media platforms deliberately surface content that confirms an individual's views and echoes previously viewed or shared content. The algorithms amplify biases and screen out dissenting opinions. Before you know it, other voices are blocked from your feed, leaving you in an echo chamber. This doesn't just apply to news, but also to targeted ads and campaigns designed for microcommunities with shared attributes.

It has never been easier to convince so many people to believe stories that aren't necessarily true. Lack of trust, consolidation of news outlets, the contraction of journalism, and the pervasiveness of web news are creating isolated information bubbles that many of us now find ourselves stuck in. People naturally want to read news that confirms their beliefs.

When infotainment is commoditized and served up for quick and easy consumption, critical thinking takes a back seat.
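The feedback loop described above can be illustrated with a toy ranking model: the feed samples what to show in proportion to its current weights, bumps a topic's weight when the user engages, and decays it when they don't. The topics, engagement probabilities, and update rule are illustrative assumptions, not any platform's actual algorithm.

```python
import random

def run_feed(user_affinity, rounds=200, learning_rate=0.1):
    """user_affinity: {topic: probability the user engages when shown it}.
    Over time the feed converges on whatever the user already agrees with."""
    weights = {topic: 1.0 for topic in user_affinity}
    for _ in range(rounds):
        topics = list(weights)
        # Show a topic in proportion to its current ranking weight.
        shown = random.choices(topics, weights=[weights[t] for t in topics])[0]
        engaged = random.random() < user_affinity[shown]
        # Reinforce on engagement, decay on a skip (floored so it never hits zero).
        weights[shown] = max(0.05, weights[shown] + (learning_rate if engaged else -learning_rate))
    return weights

if __name__ == "__main__":
    random.seed(0)
    print(run_feed({"agreeable": 0.9, "neutral": 0.5, "dissenting": 0.1}))
```

Even in this crude setup, topics the user disagrees with are shown less and less often, which is the screening-out effect described above.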
Finding the facts on your own

With quantified evidence of journalistic groupthink and information bubbles among those who consume political information, is there hope for open dialogue and a variety of perspectives?

Ultimately, yes. However, the change likely won't come from the news media. Choosing not to be misled and seeking out a variety of opinions and perspectives is something each individual will have to do on their own, even if it means questioning their fundamental beliefs. That entails verifying the information you read, actively engaging with people outside of your comfortable echo chamber, and even changing your mind when confronted with hard evidence.

Finding the facts on your own can be tough, but if we can't rely on the news to give us the news, there's no other choice.
Will nefarious players use social media to sway public opinion again this November?
- The effective subversion of social media during the 2016 U.S. presidential election was unprecedented and highlighted the major role social media plays in politics.
- Today, it's harder to repurpose private social data than it was four years ago, but paid and organic audience microtargeting continues.
- Fake news and disinformation still spread freely. Networks of fake accounts are being taken down, but there's no way to know what percentage continue to operate. Meanwhile, the same principles power the news feed algorithms, surfacing partisan content and reaffirming audience biases.
Microtargeting will be a strategy

It didn't take long for pundits to recognize how much manipulative electioneering had changed with the rise of social media. Soon after the results were announced, it became clear that the 2016 election was a watershed moment in how targeted propaganda can be disseminated using advanced computational techniques.

Consulting firm Cambridge Analytica, for example, bypassed Facebook's rules through an app that required a Facebook account login and, in turn, mined vast amounts of personal information about the users and their friends. This information was then shared with Cambridge Analytica's network of contacts, despite a ban on downloading private data from Facebook's ecosystem and sharing it with third parties.
The firm then leveraged the data to generate microtargeted political ad campaigns, according to former Cambridge Analytica staffer and whistleblower Christopher Wylie. Facebook suspended Cambridge Analytica, but it won't stop microtargeting campaigns on its platform. To make things worse, Cambridge Analytica analysts are already back at work.

One trend to watch out for in microtargeting is the rise of nanoinfluencers. These small-time influencers have far fewer followers, but they reach a highly tailored audience. Political marketers will leverage nanoinfluencers in tandem with other forms of social media manipulation to digitally knock on the doors of those most likely to be swayed by their canvassing. But knocking on the right doors requires data.

This ease of access to data, along with the continued popularity of psychographic segmentation, means that unethical use of user information will likely still play a role in the 2020 elections.
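As a deliberately simplified illustration of psychographic segmentation, the sketch below buckets a user by two invented trait scores and picks a tailored message variant per bucket. The trait names, thresholds, and message table are hypothetical placeholders; reported campaigns built far richer models from harvested profile data.

```python
# Hypothetical message variants keyed by (mood, focus) segment.
MESSAGES = {
    ("anxious", "local"): "Your neighborhood deserves protection. Vote for change.",
    ("anxious", "national"): "The country is headed in the wrong direction. Act now.",
    ("optimistic", "local"): "Help keep your community thriving. Make your voice heard.",
    ("optimistic", "national"): "Together we can build a stronger future. Join us.",
}

def segment(user):
    """user: dict of trait scores in [0, 1], inferred from platform behavior."""
    mood = "anxious" if user.get("neuroticism", 0.5) > 0.6 else "optimistic"
    focus = "local" if user.get("community_engagement", 0.5) > 0.5 else "national"
    return mood, focus

def pick_message(user):
    """Choose the ad variant targeted at this user's psychographic segment."""
    return MESSAGES[segment(user)]

print(pick_message({"neuroticism": 0.8, "community_engagement": 0.7}))
```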
Foreign influence and disinformation are still a threat

Dividing voters into narrow segments and then whispering targeted messages into their ears was also central to Russian trolls' strategy of spreading disinformation via social media in an effort to influence the outcome of the 2016 elections. It's estimated that 126 million American Facebook users were reached by Russian content over the course of that subversive campaign.

Aside from fake news, hackers also skewed the elections by illegally obtaining and then releasing private information and documents amid enormous hype. The WikiLeaks scandal gave hackers the opportunity to discredit Clinton and the DNC leadership by leaking emails days before the party's convention. Likewise, the full circumstances surrounding FBI Director James Comey's October 28 letter to Congress, discussion of which dominated social media news feeds for weeks, will likely never be known.

Social media heavy hitters have stated before Congress that they are taking active steps to prevent the spread of disinformation and ensure protection from foreign influence, but stopping deceptive networks is a constant battle. In 2019, Facebook removed 50 networks run by foreign actors, including Iran and Russia, that were actively spreading fake information, and from January to June of this year, another 18 were removed. Just this month, Facebook removed dozens of troll accounts based out of Romania for coordinated inauthentic behavior.

Both Twitter and Facebook have begun flagging posts from public figures when those posts contain disinformation, although the flags look considerably different on each platform.
Facebook, which eventually instituted new ad transparency policies, has also banned ads from state-controlled media outlets, such as those run by Russia or China, from its platform. This won't stop governments from finding more illicit means of spreading propaganda on Facebook and into the minds of America's voters, since identifying proxies can be tricky. Plus, they pretty much got away with it last time and even convinced the mainstream media to pick up some of the fake stories.

Well into election season, the spread of disinformation, and potentially foreign influence along with it, has already begun. And it isn't limited to the election. These tactics are also being used to spread lies about the coronavirus pandemic and the racial justice protests to incite division and unrest. Even with constant vigilance, it's likely that countries inclined toward information warfare, with the tech know-how and the will to do harm, could influence the upcoming election, and that should concern us all.
It's still too easy to weaponize algorithms

After the Cambridge Analytica scandal, social platforms made efforts to change their algorithms and policies to prevent manipulation. But it just isn't enough.

On Twitter, complete anonymity and the proliferation of automated bots and fake accounts were integral to the 2016 campaign and continue to outpace any efforts the platform makes to curb disinformation. Just last month, a Twitter hack into blue-check accounts, including those of Obama and Biden, showed that this year's election is still at risk.

Hackers have a rapt audience if they manage to make it in. Close to 70 percent of American adults are on Facebook and millions are on Twitter, most of them every day. As part of the 2016 campaign, foreign agents published more than 131,000 tweets and uploaded over 1,100 videos to YouTube. Now, with the global domination of TikTok, there are even more ways to target voters.

Digital propaganda has only improved with time, and despite the valiant efforts of Dr. Frankenstein, the monster social media created is not easily subdued. This month, YouTube banned thousands of accounts tied to coordinated influence operations, and Facebook shut down even more; Facebook has also implemented additional encryption and privacy policies since 2016. On Snapchat, Reddit, Instagram, and more, malicious manipulation is just a click away, and there is only so much that community standards and terms-of-service agreements can do to stop it.
Echo chambers and growing mistrust

Since 2016, more Americans than ever mistrust mainstream news and get their facts from social media, making misinformation and deliberate disinformation a real concern in 2020. By 2016, only half of Americans watched TV for news, while the share who found their news online had reached 43 percent, up 7 percent from the year before.

The problem isn't that people are getting their news from the internet; it's that the internet is the perfect forum for spreading fake news. And a rapidly growing number of Americans will take at least some of this fake news to be fact. Furthermore, believing misinformation is actually linked to a decreased likelihood of being receptive to actual information.
This demonstrates how the spread of partisan messaging is amplified by the proliferation of echo chambers online. Americans who engage with partisan content are likely choosing to do so because the story confirms their existing ideologies. In turn, social media algorithms exacerbate this tendency by showing only content that is similar to what we already engage with. This algorithmic amplification of people's confirmation biases screens out dissenting opinions and reinforces even the most marginal viewpoints.

Extreme online groups leverage this tendency to market to homogeneous networks. Recent research demonstrates that social networking sites such as Facebook and Twitter can facilitate this selection into homogeneous networks, increasing polarization and solidifying misinformed beliefs. The fundamental principles that inform these algorithms haven't changed since 2016, and we're already seeing them at play in 2020, fomenting polarization as civil unrest and the pandemic propel divisiveness and discontent.