How Facebook helps members of ISIS and other extremists find friends

A new study highlights the role Facebook algorithms unwittingly played in ISIS recruitment and points to continuing problems.
Facebook co-founder, chairman, and CEO Mark Zuckerberg testifies before the House Energy and Commerce Committee in the Rayburn House Office Building on Capitol Hill, April 11, 2018, in Washington, DC. (Photo by Chip Somodevilla/Getty Images)

For over a billion people, Facebook has been the place in the digital universe to connect to new friends and keep up with what the faraway ones are doing. It has also been the site where thousands of ISIS members and potential sympathizers found each other, according to a new study.

The social media giant’s “suggested friends” or “people you may know” feature has unwittingly helped terrorists build networks, says a report from the U.S.-based nonprofit Counter Extremism Project (CEP).

CEP researchers looked at the online life of 1,000 ISIS supporters in 96 countries and found that radical Islamists were often introduced to each other by Facebook. After CEP staff viewed Islamist profiles, they were offered dozens of extremist connections by the ever-helpful site.

What is the real-world effect of this? CEP believes the feature has helped jihadists around the world connect, develop new terror networks, and recruit new members.

It’s important to note that the “suggested friends” feature uses sophisticated algorithms to connect people based on their common interests among other criteria. So if your interests happen to include radical Islamist content, then that’s what Facebook uses to find you new friends.
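To make the mechanism concrete, here is a minimal, hypothetical sketch of how an interest-overlap recommender might rank candidate friends. This is not Facebook's actual system; the `Profile` structure, the Jaccard similarity scoring, and the mutual-friend weight are illustrative assumptions only, meant to show why shared interests of any kind pull accounts toward each other.

```python
# A minimal, hypothetical sketch of interest-based friend suggestion.
# NOT Facebook's actual algorithm; the data model and scoring below
# are illustrative assumptions only.

from dataclasses import dataclass, field


@dataclass
class Profile:
    name: str
    interests: set[str] = field(default_factory=set)
    friends: set[str] = field(default_factory=set)


def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap between two interest sets, from 0.0 (none) to 1.0 (identical)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)


def suggest_friends(user: Profile, candidates: list[Profile], top_n: int = 3) -> list[str]:
    """Rank non-friends by shared interests plus a small bonus for mutual friends."""
    scored = []
    for other in candidates:
        if other.name == user.name or other.name in user.friends:
            continue  # skip self and existing friends
        score = jaccard(user.interests, other.interests)
        score += 0.1 * len(user.friends & other.friends)  # arbitrary mutual-friend weight
        scored.append((score, other.name))
    scored.sort(reverse=True)
    return [name for _, name in scored[:top_n]]


if __name__ == "__main__":
    alice = Profile("alice", {"hiking", "chess", "history"}, {"bob"})
    bob = Profile("bob", {"chess", "history"}, {"alice", "carol"})
    carol = Profile("carol", {"history", "cooking"}, {"bob"})
    dave = Profile("dave", {"gardening"}, set())
    print(suggest_friends(alice, [bob, carol, dave]))  # ['carol', 'dave']
```

The point of the sketch is that such a system is content-agnostic: it ranks candidates purely on overlap, so if the shared interest happens to be extremist material, the recommendations follow it just the same.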

Gregory Waters, one of the report’s authors, was himself inundated with suggestions for pro-ISIS friends after contacting a single active extremist for his research. His colleague Robert Postings was barraged with friend suggestions for dozens of extremists after clicking on several pages about an Islamist uprising in the Philippines.

“Facebook, in their desire to connect as many people as possible have inadvertently created a system which helps connect extremists and terrorists,” Postings told the Telegraph.

There have been numerous documented cases of extremists radicalizing people over the social media network in just a matter of months after the initial contact.

Facebook co-founder, Chairman and CEO Mark Zuckerberg (R) arrives to testify before a combined Senate Judiciary and Commerce committee hearing in the Hart Senate Office Building on Capitol Hill April 10, 2018 in Washington, DC. Zuckerberg, 33, was called to testify after it was reported that 87 million Facebook users had their personal information harvested by Cambridge Analytica, a British political consulting firm linked to the Trump campaign. (Photo by Win McNamee/Getty Images)

Facebook is also doing a poor job of policing extremist material once it is identified: it often fails to react for long periods of time, leaves offending profiles up, or allows removed users to re-register.

“This project has laid bare Facebook’s inability or unwillingness to efficiently address extremist content on their site,” explained Mr. Waters to the Telegraph. “The failure to effectively police its platform has allowed Facebook to become a place where extensive IS supporting networks exist, propaganda is disseminated, people are radicalized and new supporters are recruited.”

Lest you think this is not a big deal, Mr. Waters points out:

“The fact that Facebook’s own recommended friends algorithm is directly facilitating the spread of this terrorist group on its site is beyond unacceptable.”

For its part, Facebook maintains that its approach is working: its automated systems reportedly remove 99% of ISIS- and Al Qaeda-themed content from the site. It also cautions that “there is no easy technical fix to fight online extremism.”

While Facebook may not know about or be responsible for everything its users do on the site (even if it certainly knows enough to sell them hyper-targeted advertising), it has been under fire from all sides of the political spectrum for lax security and oversight that aided Russian interference in the 2016 elections. Congress even released thousands of Facebook ads bought by Russian operatives to target U.S. voters. The new study is sure to put more pressure on Facebook to institute meaningful reforms in how it codes and regulates content on its platform.
