Both social media companies plan to implement special protocols on Tuesday as election results begin rolling in.
- Twitter says it will remove or add a warning to tweets that declare election wins before the results are officially called by state election officials or major national media outlets.
- When Twitter users try to retweet, the company will show them a prompt encouraging them to "quote tweet" (and thereby add their own commentary) instead, a move designed to slow the spread of misinformation.
- Facebook plans to display election results, as determined by national media outlets, on posts from candidates who contest the results or declare early wins.
As the results of the U.S. presidential election start rolling in Tuesday evening, Facebook and Twitter plan to remove or modify posts they deem misleading.
Twitter first announced the plans in October, but elaborated on them Monday in a blog post.
The company says it will remove or add a warning to tweets that declare election wins before they're "authoritatively called" by a state election official, or by at least two of the following national media outlets: ABC News, The Associated Press, CBS News, CNN, Decision Desk HQ, Fox News, and NBC News.
"We do not allow anyone to use Twitter to manipulate or interfere in elections or other civic processes, and recently expanded our civic integrity policy to address how we'll handle misleading information surrounding these events. Under this policy, we will label Tweets that falsely claim a win for any candidate and will remove Tweets that encourage violence or call for people to interfere with election results or the smooth operation of polling places."
Twitter also says that tweets "meant to incite interference with the election process or with the implementation of election results, such as through violent action, will be subject to removal."
Expect to see Twitter attach warnings — which users must tap through — on "misleading" tweets from candidates, campaign accounts, and accounts with more than 100,000 followers.
Twitter is also trying to make it harder for the average user to retweet misleading tweets by prompting them to "quote tweet" when they click the retweet button. The goal is to add a layer of "friction" that slows the spread of posts deemed misleading.
"We hope it will encourage everyone to not only consider why they are amplifying a Tweet, but also increase the likelihood that people add their own thoughts, reactions and perspectives to the conversation."
Another change: Twitter will prevent "liked by" and "followed by" recommendations from appearing in users' timelines. The company notes that, while this feature can help people access viewpoints outside their network, it doesn't "believe the 'Like' button provides sufficient, thoughtful consideration prior to amplifying Tweets to people who don't follow the author of the Tweet, or the relevant topic that the Tweet is about."
"This will likely slow down how quickly Tweets from accounts and topics you don't follow can reach you, which we believe is a worthwhile sacrifice to encourage more thoughtful and explicit amplification."
Twitter's policy changes are the latest in a series that aim to minimize the influence of misinformation on U.S. elections. Of course, Twitter's policies are also likely designed to shield the company from accusations that it's eroding the quality of American political discourse.
Timeline of Twitter policy changes
Twitter listed some of its recent policy changes, the most impactful of which was its decision to ban political ads in late 2019:
- 1/2019 - Issued a comprehensive review of our efforts to protect the 2018 U.S. midterms
- 6/2019 - Launched public interest notice and defined our approach on public interest
- 10/2019 - Banned all political ads on Twitter, including ads from state-controlled media
- 12/2019 - Added Election Labels to candidates' accounts
- 2/2020 - Introduced our rules on and labels for synthetic and manipulated media
- 3/2020 - Held planning exercises to prepare for a variety of Election Day scenarios
- 5/2020 - Added labels and warnings to potentially harmful misleading information
- 8/2020 - Deployed labels on government and state-affiliated media accounts
- 9/2020 - Implemented account security requirements for high-profile political accounts
- 9/2020 - Built a U.S. Election hub containing credible news and voting resources
- 9/2020 - Encouraged voter registration and emphasized safe voting options
- 9/2020 - Expanded our civic integrity policy to include specifics around pre- and post-Election Day
Similar to Twitter, Facebook wrote in a blog post that it will label potentially misleading posts with election results, as determined by national media outlets.
"If a candidate or party declares premature victory before a race is called by major media outlets, we will add more specific information in the notifications that counting is still in progress and no winner has been determined."
"If the candidate that is declared the winner by major media outlets is contested by another candidate or party, we will show the name of the declared winning candidate with notifications at the top of Facebook and Instagram, as well as label posts from presidential candidates, with the declared winner's name and a link to the Voting Information Center."
Instagram, which is owned by Facebook, plans to temporarily hide hashtags on all "recent" posts, the company wrote on its website:
"Recent posts from all hashtags may be temporarily hidden to help prevent the spread of possible false information and harmful content related to the 2020 US election. Instagram is committed to reducing the spread of false information and giving people accurate information about voting."
After the election, Facebook and Instagram plan to stop running all political ads in an effort to block misinformation about the outcome. The company said the ban should last about a week.
Why virtue signaling does nothing.
"A big problem with moral outrage on the Internet is that it leads people to think they’ve done something when in fact they haven’t done something," says author Alice Dreger. Sure, you might get a little rush out of updating your status to say something, but all you're really doing is virtue signaling. Alice's latest book is Galileo's Middle Finger: Heretics, Activists, and One Scholar's Search for Justice.
The internet and social media have made persuasive appeals more powerful than ever before.
The Police song “Every Breath You Take” has been popular for decades. For a hit from the early ‘80s, it’s shown surprising longevity. In an interview during its heyday, Police frontman Sting said he was stupefied that people had turned what he termed a “creepy” and “ugly” song into a love ballad. “It's about jealousy and surveillance and ownership," he told the New Musical Express in 1983. Today, it’s played at weddings.
The song has much in common with our new era of ubiquitous social media. On the one hand, it gives us so much joy. We find social media a convenient way to stay in touch with friends and family, stay on top of the latest news, laugh at and share memes, and just enjoy the rich, bizarre pageantry of life—right at our fingertips.
The drawback: almost everything you do online, from the biggest purchase to a single, solitary “Like,” registers. It leaves a trail and builds a profile of you that companies and others can mine and develop strategies around. Much like the song, on the surface it seems to be all about love. Delve deeper and a more sinister picture emerges.
What we click on, what we search for, and even what we “Like” on social media reveals a lot about us, far more than we assume. And the more we use it, the more we reveal. The music you listen to, the articles you read, and what you post all lend insight into your motivations and behavior, patterns that are collectively called your digital footprint.
Previous studies have shown that persuasive appeals are more successful when coupled with an approach that matches a person’s personality traits. New research out of Columbia University goes one step further. It shows how one simple “Like” can reveal a key aspect of your personality, which can be used to influence your outlook and even your behavior.
The more we use social media, the more data we generate that can be mined for profit and perhaps even used to move us in certain directions. Credit: Getty Images.
So besides social media sites, who else has access to your digital footprint? A surprising number of companies, including search engines, web browsers, the maker of your smartphone, and your internet service provider (ISP). And it isn’t only companies: governments, political parties, and even foreign agents use this data, for good or ill. Consider that Russian operatives knew exactly whom to place certain fake news stories in front of during the last U.S. presidential election. And all this data may be making organizations and agents more persuasive than ever before.
In a recent study, Columbia Business School researchers, led by Sandra Matz, wanted to see what effects psychological persuasion had in a social media setting. “Recent research…shows that people's psychological characteristics can be accurately predicted from their digital footprints,” researchers write, “such as their Facebook Likes or Tweets.”
Matz and colleagues tailored ads which employed persuasive appeals, according to a person’s social media activity, specifically on whether or not they liked something. The experiment included over 3.7 million users. Researchers evaluated how successful the efforts were on whether or not the participant clicked or purchased an item. They wrote in the study, “…with psychologically tailored advertising, we find that matching the content of persuasive appeals to individuals' psychological characteristics significantly altered their behavior as measured by clicks and purchases.”
We often forget the business model of social media companies is to turn your “Likes” into profit. Credit: Getty Images.
To select targets based on Facebook likes, researchers turned to the myPersonality.org database, which contains the Facebook likes of millions of users. These were correlated with a 100-item IPIP questionnaire, which is considered an accurate personality assessment tool. Researchers isolated 10 likes in particular associated with either the highest or lowest levels of extroversion.
Most popular with extroverts were “making people laugh” and the music of Slightly Stoopid. For introverts, these were Stargate and computers. Researchers also looked at openness to new experiences. Those with the greatest openness liked philosophy and the docufiction movie Waking Life, while those with the lowest levels liked Uncle Kracker and the video game Farm Town.
Facebook currently has rules against ads targeting users directly through psychological traits. However, marketers are allowed to do so indirectly, based on likes and other activity. Once they had a good handle on how to identify introverts and extroverts, Matz and colleagues created two makeup ads, one targeted toward each type. The one for extroverts showed three smiling women dressed to the nines, grouped together to take a photo. The tagline read, “Love the spotlight and feel the moment.”
The other showed one woman cheekily applying makeup with a tagline that read, “Beauty doesn’t have to shout.” A second ad series targeted those open to new experiences and those who weren’t. Persuasive appeals matched to people’s extraversion level (or openness to experience) resulted in up to 40% more clicks and up to 50% more purchases than their mismatched counterparts. “This suggests that psychological targeting can influence large groups of people,” researchers wrote.
Such targeting, coupled with internet history and social media activity, has the potential to influence people to lead healthier lives, save more money, and even make better decisions. But it also allows for greater exploitation of weaknesses for profit, say, targeting the highly impulsive with online gambling ads. Such powers should be more robustly studied and common-sense regulations put in place, so that we can all make our decisions free of undue influence.
The social media behemoth wants you to use its platform less than before, not more.
There’s a good chance you accessed this article from Facebook.
The social media giant has, after all, made it easier than ever for companies to bring their content to users’ news feeds. It’s revolutionized the way media create, package, and publish articles and videos. It’s also allowed for the flourishing of massive amounts of “junk food” content, as The New York Times called it. (Think “prank” videos, actual fake news, and relentless legions of meme pages.)
But all of that could change soon. On January 11, Mark Zuckerberg published a statement on his Facebook page that outlines upcoming tweaks to the website’s algorithms. Basically, you can expect your feed to begin showing you less news and promotional content, and more posts from friends and family. It’s a decision based on research into the effects social media has on well-being, Zuckerberg said:
“We feel a responsibility to make sure our services aren’t just fun to use, but also good for people's well-being. So we've studied this trend carefully by looking at the academic research and doing our own research with leading experts at universities.
The research shows that when we use social media to connect with people we care about, it can be good for our well-being. We can feel more connected and less lonely, and that correlates with long term measures of happiness and health. On the other hand, passively reading articles or watching videos -- even if they're entertaining or informative -- may not be as good.”
It’s not the first time the company has addressed the adverse effects of social media use. In December 2017, Facebook issued a blog post outlining the pros and cons of using social media. On the positive side, the post said that interacting with close friends and family—reminiscing on past events, sharing photos, catching up with people—“brings us joy and strengthens our sense of community.”
On the negative, it highlighted research that suggests even small amounts of Facebook use can worsen users’ mood and mental health, and lead to unhealthy social comparison. Other research suggests that social media use can reduce face-to-face interaction, contribute to a sedentary lifestyle, and reduce investment in meaningful activities.
Facebook’s December blog post came just days after ex-Facebook executive Chamath Palihapitiya told CNBC that Facebook was starting to “erode the social fabric of how society works.” Palihapitiya’s main arguments were that Facebook is creating a society that confuses truth with popularity—basically, whichever ideological message has more money behind it wins. He also criticized the company for intentionally causing users to become addicted to its platform by providing a never-ending loop of social feedback.
“That feedback, chemically speaking, is the release of dopamine in your brain,” Palihapitiya said. “I think if you get too desensitized and you need it over and over and over again, then you become actually detached from the world in which you live.”
Tristan Harris, an entrepreneur and computer scientist, echoed a similar sentiment in an interview with Big Think:
“...We find ourselves in this kind of wormhole and then we say, ‘Oh man, like, I should really have more self-control.’ And that's partially true, but what we forget when we talk about it that way is that there's a thousand engineers on the other side of the screen whose job it was to get my finger to do that the next time. And there's this whole playbook of techniques that they use to get us to keep using the software more.”
Sean Parker, an early Facebook investor and the founder of Napster, said Facebook was designed to be addictive from the start.
“The inventors, creators—it's me, it's Mark [Zuckerberg], it's Kevin Systrom on Instagram, it's all of these people—understood this consciously,” Parker said at an Axios event. “And we did it anyway.”
Facebook’s long game
Facebook’s stock dropped about 4 percent the day after the announcement. Still, some analysts feel it will prove to be a smart move in the big picture.
“We see this as the right long-term decision for the platform and, over the near-term, doubt that this will have a material impact on revenue,” said Samuel Kemp, senior Internet research analyst at asset management firm Piper Jaffray, to CNBC.
In an interview with The New York Times, Zuckerberg said that Facebook and its users will likely benefit in the long run—even if some users start looking elsewhere for viral content.
“I expect the time people spend on Facebook and some measures of engagement will go down,” he said in his post about the changes. “But I also expect the time you do spend on Facebook will be more valuable.”
Despite the changes, users will still be able to customize the kinds of content that appear in their news feed. But left to default settings, posts from friends and family will rise to the top. A video published by Facebook explains how the new algorithms will prioritize content.