Study: White Supremacist Groups Are ‘Thriving’ On Facebook, Despite Extremist Ban

Christopher Mathias

A new study has found that white supremacist groups are “thriving” on Facebook, despite the company’s repeated assurances that it doesn’t allow extremists on its platform.

The watchdog group Tech Transparency Project released a study Thursday that found more than 100 white supremacist groups had a presence on Facebook.  

Project researchers identified 221 white supremacist groups — using information collected by the Southern Poverty Law Center and the Anti-Defamation League, two of America’s most prominent anti-hate organizations — and searched for those groups on Facebook.

About half of the groups were present on the platform, the study said.

Of the 113 white supremacist groups the project found on Facebook, 36% had pages or groups created by active users. The remaining 64% had a page auto-generated by Facebook itself. 

“With millions of people now quarantining at home and vulnerable to ideologies that seek to exploit people’s fears and resentments about COVID-19, Facebook’s failure to remove white supremacist groups could give these organizations fertile new ground to attract followers,” TTP’s study said.  

A screenshot of a white supremacist group's Facebook page from the study by the Tech Transparency Project. (Tech Transparency Project)

The study comes after years of rising white nationalism in the U.S. and heightened scrutiny over social media companies’ role in providing online spaces for hate groups to spread their propaganda, to organize and to recruit.  

There was a Facebook event page, after all, for the deadly 2017 white supremacist rally in Charlottesville, Virginia. And a white supremacist gunman live-streamed on Facebook as he massacred 51 people at two New Zealand mosques early last year. (The company later said it had removed 1.5 million videos of the mass shooting.) 

Facebook has since taken a more aggressive stance in banning white supremacist activity but has been criticized for what extremism researchers describe as a whack-a-mole approach to hate on the platform. The company often takes down content only after inquiries from...

Continue reading on HuffPost