Report: Facebook accused of helping extremists

Facebook (FB) is being accused of inadvertently helping Islamist extremists connect and recruit new members. A new report in The Telegraph cites research suggesting that the social media giant connected and introduced thousands of extremists through its “suggested friends” feature. One writer who spoke to CBSN says “it’s cause for concern.”

The research was conducted by the Counter Extremism Project, a non-profit organization that pressures companies to remove extremist content online. It plans to release its findings in an extensive report later this month.

“The failure to effectively police its platform has allowed Facebook to become a place where extensive ISIS (Islamic State of Iraq and Syria)-supporting networks exist, propaganda is disseminated, people are radicalized and new supporters are recruited,” researcher Gregory Waters told The Telegraph.

Facebook is already facing criticism for failing to remove terrorist material from its platform. The platform has also been blamed for spreading disinformation that stokes violence in Myanmar.

“There is no place for terrorists on Facebook,” a Facebook spokesperson said in a statement. “We work aggressively to ensure that we do not have terrorists or terror groups using the site, and we also remove any content that praises or supports terrorism. 99 percent of ISIS and Al Qaeda-related content we remove is found by our automated systems.”

J.M. Berger, author of “Extremism” and a fellow with the Counter-Terrorism Strategic Communications program, told CBSN’s Elaine Quijano that the issue has been known for some time and that while “it’s cause for concern,” further analysis of the research is needed. Berger noted that “the online environment for ISIS and other jihadist extremists is much more difficult than it was just a couple of years ago.”

“It’s a problem we’ve known about for a long time … I first wrote about it in 2013,” Berger said. “All of the social media platforms use algorithms that allow them to suggest content that you might be interested in. It’s a key, integral part of their functioning and what we’ve seen is that these algorithms will recommend whatever kind of content … whether it’s extremist content or normal content. Managing that is a slightly different problem than managing extremist content where you go in and look for keywords.”

“You can be on Facebook and be an ISIS supporter and not post content that would get you suspended — if you don’t put anything publicly then you’re not going to get caught,” Berger explained. “But if you’re part of a social network that supports ISIS, then once a person becomes friends with you — Facebook is going to suggest that they all become friends.”

Berger elaborated: “It used to be that it was extraordinarily easy to find this content — to find other people doing active recruiting who are being open supporters — now that is no longer the case. We can’t realistically hope for 100 percent elimination of this content on these platforms, but now the question is how much is left?”