Facebook defends algorithm-driven reality


Accused of being a doomsday machine that fuels hate and manipulation, Facebook has posted a defence arguing it is a global force for democracy.

People are not powerless victims or playthings of technology, Facebook's spin doctor Nick Clegg has told an audience of 170 million readers on Medium.

"This is the magic of social media, the thing that differentiates it from older forms of media," he said.

"Machines have not taken over, but they are here to stay."

There is no editor dictating the front page headline that millions of people might read on Facebook.

Instead there's a "rich feedback loop" and billions of front pages, each personalised to individual tastes and preferences, and each reflecting a unique network of friends, pages, and groups.

"This is a dramatic and historic democratisation of speech," he said.

"Political and cultural elites are confronting a raucous online conversation that they can't control, and many are understandably anxious about it."

But, feeling the heat from regulators in Australia and elsewhere, Mr Clegg conceded ground rules are needed.

He acknowledged social media companies must also come clean about how their algorithms work.

"And tech companies need to know the parameters within which society is comfortable for them to operate, so that they have permission to continue to innovate."

Facebook's recent decision to stop recommending civic and political groups to users in the United States is now being expanded globally.

Facebook is also considering how to reduce the amount of political content in news feeds in response to "strong feedback" from users that they want to see less of it.

People, not machines, make the algorithms that shape our online reality.

Facebook says it is mindful of potential bias and harm.

"The reality is, it's not in Facebook's interest - financially or reputationally - to continually turn up the temperature and push users towards ever more extreme content," Mr Clegg said.

Last year's #StopHateForProfit boycott, organised by civil rights groups, saw more than 1000 companies stop paying for ads on Facebook.

It was an expensive lesson. Mr Clegg said the vast majority of Facebook's revenue comes from advertising, and advertisers don't want their brands and products displayed next to extreme or hateful content.

Facebook was forced to curb its live streaming feature in response to global criticism after footage of the Christchurch mosque attacks in New Zealand was freely shared.

At the time, world leaders said the fact Facebook had to remove 1.5 million copies of footage of a gunman killing 51 people was a stark reminder that it must do more to stop such material spreading.

Mr Clegg said it would be better if these decisions were made according to frameworks agreed by democratically accountable lawmakers.

But in the absence of such laws, there are decisions that need to be made in real time.

"The internet needs new rules for the road that can command broad public consent."