Charities urge Government to target smaller websites under Online Safety Act

A group of charities and online safety campaigners have written to the Prime Minister, urging him to ignore advice from Ofcom around which websites to categorise as the most dangerous under the Online Safety Act.

The group of campaigners said the regulator’s advice that smaller websites should not be designated Category 1 – the rating which gives Ofcom the greatest scope of powers for oversight and regulation of that platform – left a number of “the most dangerous online forums” not fully in scope of the regulation.

In guidance to the previous Conservative government, published in March, Ofcom proposed setting the threshold for what should be considered a Category 1 service under the new rules as those which disseminate content easily, quickly and most widely, proposing, among other things, that the designation should apply to sites with more than seven million UK users.

But, in an open letter to the Prime Minister, the campaigners argue that this approach would leave a number of smaller, but dangerous “suicide forums” free of the most stringent rules, and urged the Technology Secretary Peter Kyle to use powers that enable him to determine which sites should be placed in Category 1 “based on functionality and other characteristics alone rather than requiring that they also be of a certain size”.

Ofcom recommended applying the rules to websites with more than seven million UK users (PA)

“This would allow a limited number of small but exceptionally dangerous forums to be regulated to the fullest extent possible,” the letter says.

“These include forums that are permissive of dangerous and hateful content as well as forums that explicitly share detailed or instructional information about methods of suicide or dangerous eating disorder content.

“Given the cross-party support for such an approach to regulation of these platforms, we were dismayed to see that Ofcom, in its recently published advice to the previous Secretary of State on categorisation, explicitly recommended not using this power to address these extremely dangerous sites.”

The open letter has been signed by leaders of charities including Samaritans, Mind, the Mental Health Foundation and the Molly Rose Foundation, by online safety groups such as the Centre for Countering Digital Hate, and by bereaved families.

The letter highlights a report which links one such forum to “at least 50 UK deaths”, adding “we understand that the National Crime Agency is investigating 97 deaths in the UK thought to be related” to the site in question.

The group argues that this “highly dangerous suicide forum” should be regulated “at the same level as sites like Facebook and Instagram” in order to make them “accountable” for the content they allow to appear on their platform.

The letter also notes that there are similar issues around sites hosting antisemitic and Islamophobic content, as well as smaller platforms being used to “stoke this summer’s racist riots”.

“We would argue that the events of the summer, in tandem with the ongoing human cost of a growing number of suicides, are sufficient evidence in themselves to justify the Secretary of State deciding to divert from Ofcom’s advice and set the categorisation thresholds for the regime in the most robust and expansive way the Act allows,” the letter says.

“Ofcom’s current recommendations, which involve services having content recommendation systems, and having the functionality for users to forward or re-share content, in addition to having a large size, would do nothing at all to address the services we are concerned about.

“We hope that you will be able to take action on addressing this major oversight in the advice that the government has been given by Ofcom.”

The Online Safety Act, which is due to start coming fully into force next year, will place new duties on social media sites for the first time, with the largest and most popular, as well as those which count children among their users, set to face the strictest rules.

Platforms will be required to put in place and enforce safety measures to ensure that users, and in particular young people, do not encounter illegal or harmful content, and that any such content is quickly removed. Those which do not adhere to the rules face large fines.

An Ofcom spokesperson said: “There should be no doubt that these sorts of harmful websites will be tightly regulated.

“From next year, any sites that don’t comply with their illegal content and child safety duties will be in breach of our regulations, and we will use the full extent of our powers to take action against them.

“Additional duties such as producing transparency reports will be a powerful tool in making larger platforms safer. But they would do little to tackle the harm done by smaller, riskier sites – and could even attract attention to them.”

A Government spokesperson said: “Too many people are affected by the tragedy of suicide, which is so often preventable.

“The Secretary of State is working steadfastly to deliver the Online Safety Act, which will stop children seeing material that promotes self-harm and suicide.

“He recently wrote to Ofcom to request an update on how it intends to monitor such services, using the full force of their enforcement powers.”