After Jan. 6, Twitter banned 70,000 right-wing accounts. Lies plummeted.


In the week after the Jan. 6, 2021, insurrection, Twitter suspended some 70,000 accounts associated with QAnon, the radicalized right-wing movement, citing their role in spreading misinformation that was fueling real-world violence.

A new study finds the move had an immediate and widespread impact on the overall spread of bogus information on the social media site, which has since been purchased by Elon Musk and renamed X.


The study, published in the journal Nature on Tuesday, suggests that if social media companies want to reduce misinformation, banning habitual spreaders may be more effective than trying to suppress individual posts.

The mass suspension significantly reduced the sharing of links to “low credibility” websites among Twitter users who followed the suspended accounts. It also led a number of other misinformation purveyors to leave the site voluntarily.

Social media content moderation has fallen out of favor in some circles, especially at X, where Musk has reinstated numerous banned accounts, including former president Donald Trump’s. But with the 2024 election approaching, the study shows that it is possible to rein in the spread of online lies, if platforms have the will to do so.

“There was a spillover effect,” said Kevin M. Esterling, a professor of political science and public policy at University of California at Riverside and a co-author of the study. “It wasn’t just a reduction from the de-platformed users themselves, but it reduced circulation on the platform as a whole.”

Twitter also famously suspended Trump on Jan. 8, 2021, citing the risk that his tweets could incite further violence, a move that Facebook and YouTube soon followed. While suspending Trump may have reduced misinformation by itself, the study's findings hold up even if you remove his account from the equation, said co-author David Lazer, professor of political science and computer and information science at Northeastern University.

The study drew on a sample of some 500,000 Twitter users who were active at the time. It focused in particular on 44,734 of those users who had tweeted at least one link to a website that was included on lists of fake news or low-credibility news sources. Of those users, the ones who followed accounts banned in the QAnon purge were less likely to share such links after the deplatforming than those who didn’t follow them.

Some of the websites the study considered low-quality were Gateway Pundit, Breitbart and Judicial Watch. The study’s other co-authors were Stefan McCabe of George Washington University, Diogo Ferrari of University of California at Riverside and Jon Green of Duke University.

Musk has touted X’s “Community Notes” fact-checking feature as an alternative to enforcing online speech rules. He has said he prefers to limit the reach of problematic posts rather than to remove them or ban accounts altogether.

A study published last year in the journal Science Advances found that attempts to remove anti-vaccine content on Facebook did not reduce overall engagement with it on the platform.

Trying to moderate misinformation by targeting specific posts is “like putting your finger in a dike,” Esterling said. Because there are so many of them, by the time you suppress or remove one, it may have already been seen by millions.

Lazer added, “I’m not advocating deplatforming, but it does have potential efficacy in the sense that identifying people who are repeated sharers of misinformation is much easier than going after individual pieces of content.”

It’s still unclear whether misinformation is a major driver of political attitudes or election outcomes. Another paper published in Nature on Tuesday argues that most social media users don’t actually see a lot of misinformation, which is instead “concentrated among a narrow fringe with strong motivations to seek out such information.”

Lazer agreed that misinformation tends to be concentrated in a “seedy neighborhood” of larger online platforms, rather than pervading “the whole city.” But, he added, those fringe groups “sometimes gather and storm the Capitol.”

Anika Collier Navaroli, a senior fellow at Columbia’s Tow Center for Digital Journalism and a former senior Twitter policy official, said the findings support the case she tried to make to Twitter’s leaders at the time.

Navaroli noted that the company had compiled the list of QAnon-affiliated accounts before Jan. 6.

“We already knew who they were,” she said. “People just needed to die for the harm to be [seen as] real.”
