Facebook confirms tests of a new anti-extremism warning prompt
Facebook is testing prompts that link users to anti-extremism support and resources when the company believes a user knows someone who may be on a path toward extremism, or when the user has been exposed to extremist content themselves, according to a report by CNN Business.
In a statement to The Verge, a Facebook spokesperson said that the test is part of the company’s “larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk.”
Facebook says it will continue to remove extremist content that violates its rules, though the company has struggled to track down and remove such content, even from groups it has actively tried to kick off the platform. Facebook has long faced scrutiny from the public and lawmakers, many of whom say its algorithms divide people and push them toward extreme ideologies, something the company itself has acknowledged.
Facebook says the tests are part of its Redirect Initiative, which “helps combat violent extremism and dangerous organizations” in several countries. According to its webpage, the program (as the name implies) redirects users to educational resources instead of further hateful content. Facebook also says the test is part of its response to the Christchurch Call to Action, and that it identifies both users who may have seen extremist content and those who have had enforcement actions taken against them by Facebook in the past.
The test links to resources intended to help someone intervene if they’re concerned about a loved one joining a violent extremist group or movement. On a Facebook support page titled “What can I do to prevent radicalization,” Facebook links to Life After Hate’s ExitUSA program, which Facebook says helps people find “a way out of hate and violence.” The support page also gives tips on engaging with someone who’s trying to leave a hate group.
Facebook, like other platforms, has grappled with extremism for a long time, and while it’s good that the company is trying to combat it, some of its efforts feel like they should have been implemented long ago. The same is true of many forms of bad behavior online, as platforms are still struggling to get a handle on users who harass women or engage in other toxic behavior.