How Facebook helps members of ISIS and other extremists find friends
A new study highlights the role Facebook algorithms unwittingly played in ISIS recruitment and points to continuing problems.
For over a billion people, Facebook has been the place in the digital universe to connect to new friends and keep up with what the faraway ones are doing. It has also been the site where thousands of ISIS members and potential sympathizers found each other, according to a new study.
The social media giant’s “suggested friends” or "people you may know" feature has unwittingly helped terrorists build networks, says a report from the U.S.-based nonprofit Counter Extremism Project (CEP).
CEP researchers looked at the online life of 1,000 ISIS supporters in 96 countries and found that radical Islamists were often introduced to each other by Facebook. After CEP staff viewed Islamist profiles, they were offered dozens of extremist connections by the ever-helpful site.
What is the real-world effect of this? CEP thinks that it helped jihadists around the world connect, develop new terror networks, and recruit new members.
It’s important to note that the “suggested friends” feature uses sophisticated algorithms to connect people based on their common interests among other criteria. So if your interests happen to include radical Islamist content, then that’s what Facebook uses to find you new friends.
Gregory Waters, one of the authors of the CEP report, was himself inundated with suggestions for pro-ISIS friends after contacting one active extremist for his research. His colleague Robert Postings was barraged with friend suggestions for dozens of extremists after clicking on several pages about an Islamist uprising in the Philippines.
"Facebook, in their desire to connect as many people as possible have inadvertently created a system which helps connect extremists and terrorists,” Postings told the Telegraph.
There have been numerous documented cases of extremists radicalizing people over the social media network in just a matter of months after the initial contact.
Facebook is also doing a poor job of policing extremist material once it’s identified: it often fails to react for long periods of time, leaves offending profiles up, or allows removed users to re-register.
"This project has laid bare Facebook's inability or unwillingness to efficiently address extremist content on their site,” explained Mr. Waters to the Telegraph. “The failure to effectively police its platform has allowed Facebook to become a place where extensive IS supporting networks exist, propaganda is disseminated, people are radicalized and new supporters are recruited."
Lest you think this is not a big deal, Mr. Waters points out:
“The fact that Facebook's own recommended friends algorithm is directly facilitating the spread of this terrorist group on its site is beyond unacceptable."
For its part, Facebook maintains that its approach is working: its automated systems reportedly remove 99% of ISIS- and Al Qaeda-themed content from the site. It concedes, however, that “there is no easy technical fix to fight online extremism.”
While Facebook may not know or be responsible for everything its users do on the site (even if it certainly knows enough to sell them hyper-targeted advertising), it has been under fire from all sides of the political spectrum for not focusing on security and for the lack of oversight that aided Russian interference in the 2016 elections. Congress even released thousands of Facebook ads bought by Russians to target U.S. voters. The new study is sure to put more pressure on Facebook to institute meaningful reforms in how it codes and regulates the content on its platform.