Deplatforming Works, Just Ask David Icke

Katie Hopkins, who was removed from Twitter for breaching the platform's rules on hate speech.

How should online hate and misinformation be dealt with? When Wiley launched a tirade of anti-Semitism, his social media accounts were removed, as were those of hate actors such as Katie Hopkins and Alex Jones in recent years.

In response to the growing following of the QAnon conspiracy theory, Twitter has deleted over 7,000 accounts dedicated to it. Some deride this as a sign of an emerging “cancel culture” and an attack on free speech, but we should also ask: does deplatforming actually work?

Earlier this year, David Icke was the king of a profitable conspiracy empire. Within weeks of the pandemic reaching the US and UK, he had become the single greatest producer of coronavirus misinformation anywhere in the world.

As an organisation focussed on tackling the growing online myths about coronavirus, we could see Icke was a huge problem, but targeting him for deplatforming was not an easy decision. We asked ourselves in advance: “Is this going to work, or might it backfire?”

After all, publicly challenging those who spread hate and lies comes with the risk that you might just give them fresh exposure, making the problem worse, not better. The Center for Countering Digital Hate’s own research on online trolling shows that engaging with a claim in order to refute it can often help amplify and entrench it, both as a result of the technology and human psychology.

Freedom of speech does not mean freedom of reach.

That said, there are consequences to inaction, too. Icke’s poisonous misinformation about coronavirus had already been viewed 30 million times.

Every week his social media accounts attracted another 22,000 followers. And all of this was helping make Icke...
