What responsibility should government authorities and Big Tech take in policing the spread of sedition-oriented content?
- It can be hard to believe that comical images online can rile people up to the point of actual attacks.
- Originating in the darker corners of the internet, Boogaloo is now prominent on mainstream online platforms like Facebook and Instagram.
- The Network Contagion Research Institute's recent series of Contagion and Ideology Reports uses machine learning to examine how memes spread.
What’s in a meme?<p>The Network Contagion Research Institute's recent series of <a href="https://ncri.io/reports/" target="_blank" rel="noopener noreferrer">Contagion and Ideology Reports </a>leverages machine learning to examine how memes spread. The goal is to better understand the role memes play in encouraging real-world violence. </p><p>So what exactly is a meme, and how did it become a tool for weaponizing the web? Originally coined by Oxford biologist Richard Dawkins in his 1976 book "<a href="https://www.theguardian.com/science/2016/may/29/selfish-gene-40-years-richard-dawkins-do-ideas-stand-up-adam-rutherford" target="_blank" rel="noopener noreferrer">The Selfish Gene</a>," the term is defined there as "a unit of cultural transmission" that spreads like a virus from host to host and conveys an idea that changes the host's worldview. </p><p>Dawkins echoes "Naked Lunch" author William Burroughs's concept of <a href="https://www.twisttheknife.com/william-s-burroughs-playback-from-eden-to-watergate/" target="_blank" rel="noopener noreferrer">written language as a virus</a> that infects the reader and consequently builds realities that are brought to fruition through the act of speech. In this sense, a meme is an idea that spreads by iterative, collaborative imitation from person to person within a culture and carries symbolic meaning.</p><p>The militant alt-right's use of internet memes <a href="https://journals.openedition.org/angles/369" target="_blank" rel="noopener noreferrer">follows this pattern</a>. They are carefully designed on underground platforms using coded language, then gain approval from like-minded users who disseminate these messages on mainstream platforms like Twitter or Facebook.</p><p>Sometimes these messages are lifted from innocuous sources and altered to convey hate. Take for example Pepe the Frog. 
Initially created by cartoonist Matt Furie as a mascot for slackers, Pepe was appropriated and altered by racists and homophobes until the Anti-Defamation League branded the frog as a <a href="https://www.adl.org/education/references/hate-symbols/pepe-the-frog" target="_blank">hate symbol</a> in 2016. This, of course, hasn't stopped public figures from sharing variations of Pepe's likeness in subversive social posts.</p>
Slenderman and the Boogaloo Bois<p>Indeed, these types of memes have a tricky way of moving from the shadows of the internet to the mainstream, picking up supporters along the way – including those who are willing to commit horrific crimes.</p><p>This phenomenon extends well beyond politics. Take, for example, the Slenderman. Created as a shadowy figure for online horror stories, Slenderman achieved mainstream popularity and a cult-like following. When two girls tried to <a href="https://www.washingtonpost.com/news/the-intersect/wp/2014/06/03/the-complete-terrifying-history-of-slender-man-the-internet-meme-that-compelled-two-12-year-olds-to-stab-their-friend/" target="_blank">stab their friend to death</a> to prove that Slenderman was real in 2014, the power of the meme to prompt violence became a national conversation. </p><p>Slenderman creator Victor Surge told Know Your Meme that <a href="https://knowyourmeme.com/memes/slender-man" target="_blank" rel="noopener noreferrer">he never thought</a> the character would spread beyond the fringe Something Awful forums. "An urban legend requires an audience ignorant of the origin of the legend. It needs unverifiable third and forth [sic] hand (or more) accounts to perpetuate the myth," he explained. "On the Internet, anyone is privy to its origins as evidenced by the very public Somethingawful thread. But what is funny is that despite this, it still spread. Internet memes are finicky things and by making something at the right place and time it can swell into an 'Internet Urban Legend.'"</p><p>While Slenderman is obviously a fiction, political memes walk many of the same fine lines – and have galvanized similarly obsessed followers to commit attacks in the real world.</p>
Understanding digital indoctrination<p><a href="http://www.ncri.io/" target="_blank" rel="noopener noreferrer">The Network Contagion Research Institute</a> (NCRI) uses advanced data analysis to expose hate on social media. The institute's scientists work with experts at the ADL's <a href="https://www.adl.org/who-we-are/our-organization/advocacy-centers/center-on-extremism" target="_blank">Center on Extremism </a>(COE) to track hateful propaganda online and offer strategies to combat this phenomenon. By better understanding how memes spread hateful propaganda, the NCRI is helping law enforcement, social media companies and citizens get better at preventing online chatter about violence from becoming real action.</p><p>Founded by Princeton's Dr. Joel Finkelstein, the NCRI is finding that the spread of hate online can be examined using epidemiological models of how viruses spread, only applied to language and ideas, much as Burroughs imagined decades ago. </p><p>Now that the internet has obviated the need for people to meet in person or communicate directly, recruiting and deploying members of violent groups is easier than ever before.</p><p>Before social media, as <a href="https://njjewishnews.timesofisrael.com/princeton-scientist-connects-web-hate-and-the-acts-it-spawns/" target="_blank">Finkelstein reminds us</a>, organizing underground hate groups was harder. "You would have to have an interpersonal organization to train and cultivate those people," he said. "With the advance of the internet, those people can find each other; what it lacks in depth it makes up in reach. The entire phenomenon can happen by itself without a trace of anyone being groomed."</p>
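<p>To make the epidemiological analogy concrete, here is a minimal, purely illustrative sketch in Python of how meme adoption can be modeled like a disease outbreak. This is an assumption-laden toy, not NCRI's actual methodology: it uses a textbook SIR (susceptible-infected-recovered) model, with "infection" standing in for sharing a meme and "recovery" for losing interest, and all rates are invented for the example.</p>

```python
# Illustrative only: a discrete-time SIR model applied to meme sharing.
# None of these parameters come from NCRI's research; they are assumptions
# chosen to show the characteristic rise-and-fall of a viral idea.
def simulate_meme_spread(population=10_000, initially_sharing=10,
                         exposure_rate=0.3, fatigue_rate=0.1, days=120):
    """Simulate meme adoption with SIR dynamics (rates are per day)."""
    s = population - initially_sharing  # susceptible: haven't seen the meme
    i = initially_sharing               # "infected": actively sharing it
    r = 0.0                             # "recovered": seen it, lost interest
    history = []
    for _ in range(days):
        # New sharers depend on contact between sharers and the unexposed.
        new_infections = exposure_rate * s * i / population
        # Existing sharers drift away at a constant fatigue rate.
        new_recoveries = fatigue_rate * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

history = simulate_meme_spread()
# Day on which active sharing peaks before the meme burns out.
peak_day = max(range(len(history)), key=lambda d: history[d][1])
```

<p>Because the assumed exposure rate exceeds the fatigue rate, the model produces the familiar epidemic curve: sharing grows explosively from a handful of seed accounts, peaks, then collapses as the susceptible pool is exhausted, which is the same qualitative pattern the contagion framing predicts for viral hate content.</p>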
Can anything be done?<p>Memes generate hysteria in part by being outrageous enough for people to share – even the mainstream media. This is a deliberate strategy known as <a href="https://www.technologyreview.com/2019/10/24/132228/political-war-memes-disinformation/" target="_blank">amplification</a>. The extreme alt-right manages to reach mainstream audiences with racialist and supremacist memes, even as these messages are being denounced.</p><p>Content spreads virally whether those spreading it support the embedded ideology or not, making it difficult to intervene and prevent meme-incited violence.</p><p>Some people may unwittingly amplify messaging designed to spread violence, mistaking memes like ACAB (all cops are bastards) for harmless if somewhat dark humor.</p>
NCRI<p>So what can be done about the meme wars, and what onus is on law enforcement and Big Tech?</p><p>Finkelstein recommends a three-pronged approach to combating the violent outcomes of subversive memes.</p><ol><li>Push for more stringent boundaries defining civility online using technology.</li><li>Educate courts, lawmakers and civil institutions about how extreme online communities operate and create better industry standards to regulate these groups.</li><li>Bring the federal government, including the FBI and Homeland Security, up to speed on the new reality so they can intervene as necessary.</li></ol><p>But not everyone thinks Big Tech or the law have much ground to stand on in combating viral hate. Tech companies haven't gone the extra mile in truly enforcing stricter standards for online content, though some are closing down accounts associated with <a href="https://www.voanews.com/silicon-valley-technology/can-shutting-down-online-hate-sites-curb-violence" target="_blank">extremist leaders' websites</a> and their movements. </p><p>But this is largely tilting at windmills. The proliferation of memes spreading ideology virally may be too big to combat without dramatically limiting freedom of speech online. Facebook and YouTube have <a href="https://bigthink.com/politics-current-affairs/social-media-2020-us-election?rebelltitem=4#rebelltitem4" target="_self">banned thousands of profiles</a>, but there's no way of knowing how many remain – or are being added every day. This leaves the danger of meme radicalization and weaponization lurking beneath the surface even as we head into elections. </p>
What we stand to lose<p>Boogaloo and other seditious groups have been empowered by recent crises, including the pandemic and the anti-racist protests sweeping the country. Greater numbers of these community members are showing up at anti-quarantine protests and are disrupting other gatherings, and the press they're getting – this article included – is only bringing more attention to their violent messages.</p><p>Extremist memes continue to circulate, as America heads to the polls amidst a global pandemic and widespread civil unrest. It stands to reason that more blood will be spilled. Getting a handle on Twitter handles that spread hate and shutting down groups whose sole purpose is sowing the seeds of brutality is vital – the only question that remains is how. </p>