Facebook Doesn't Understand How Dangerous Its Megaphone Is

(Bloomberg Opinion) -- Mark Zuckerberg on Thursday gave a passionate defense of Facebook Inc. and the rest of the internet as essential tools for the free expression a healthy democracy requires. I agree with this in principle, as I imagine most Americans would.

The dark side of Facebook and the mass-market internet is not necessarily the principles behind them. It is how those principles can be subverted, wittingly or unwittingly, when they meet reality. The question is whether the good that comes from anything, be it Facebook, the automobile or electricity, outweighs the inevitable negative effects, and whether it's possible to mitigate the latter while accentuating the former.

While I’m glad that Zuckerberg is articulating his values and, by extension, those of the company that he controls with absolute authority, the principle of Facebook matters far less than what happens when lofty ideals collide with more than 2.4 billion people around the world using Facebook or its Messenger chat app. (That number grows to more than 2.7 billion when you throw in WhatsApp and Instagram.)

On Thursday, Facebook’s CEO used a speech at Georgetown University to argue for an optimistic view of the internet. It was a good speech and worth watching. He said the spread of prevalent internet hangouts like Facebook is giving more people chances to be heard in ways that weren’t possible when, for example, a handful of rich people controlled printing presses or television airwaves.

Over the long arc of history, Zuckerberg said, more speech from more kinds of people is healthy, within some reasonable limits like prohibiting the proverbial false cries of “fire” in a crowded theater.

How could anyone disagree with that? As always, however, the devil is in the details. Where people disagree is on what counts as shouting “fire” to cause a stampede. And, more important, can Facebook at its scale effectively stamp out the truly harmful speech — terrorist propaganda, incitements to violence, dangerous hoaxes — when there might be thousands of people falsely yelling “fire” every minute?

Zuckerberg did not mention Myanmar in his speech, but to me it is the crucible of Facebook’s free-speech principles. In that country, people spread hoaxes, false claims and calls for violence toward the Rohingya Muslim minority. Some of the people spreading those hateful messages on Facebook were politicians, members of the military or other authority figures. There were groups in Myanmar that begged Facebook to stop what many people — and indeed, Facebook’s own rules — regarded as the kind of speech that should be impermissible.

Facebook last year agreed with the United Nations, which said the company didn’t do enough to prevent its platform from facilitating ethnically based violence. It wasn’t Facebook’s principles that helped cause a genocide in Myanmar. It was the reality of Facebook. A company that is home to 2.7 billion people didn’t pay enough attention to the downsides of free expression in Myanmar and couldn’t or wouldn’t do enough about it until it was too late.

Facebook has said it was “too slow to act” in Myanmar — one of the company’s stock lines when it gets caught being a launchpad for violence, propaganda or other ills, when its principles collide with the reality of the world and the limits of its capacity to understand the harm it can cause. Facebook made things worse with computerized systems that rewarded the most outlandish ideas with greater distribution.

Again and again — with Russia-backed groups sowing contentious ideas around the U.S. election, in Myanmar, in Sri Lanka and elsewhere — Facebook has failed to recognize the harmful effects of its amplification of dangerous or divisive views until it was far too late, and then it has often tried to play down the damage it caused.

This pattern would be funny if the consequences weren’t so dire. Even when Facebook recognizes the problem, it’s not clear it or any company of its size and scope has the attention and resources to assess all the cries of “fire” taking place all over the world.

There will be natural disagreements about where, or whether, Facebook should weed out some kinds of information or speech on its internet hangouts. But when Facebook can’t identify or stamp out the worst abuses that come from handing a megaphone to 2.7 billion people, it’s reasonable to ask whether its principles mean very much.

To contact the author of this story: Shira Ovide at sovide@bloomberg.net

To contact the editor responsible for this story: Daniel Niemi at dniemi1@bloomberg.net

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Shira Ovide is a Bloomberg Opinion columnist covering technology. She previously was a reporter for the Wall Street Journal.

For more articles like this, please visit us at bloomberg.com/opinion

©2019 Bloomberg L.P.