Online Safety Bill ‘might fail to protect young people from harmful content’

The Samaritans urges the Government to tighten proposed restrictions on social media sites circulating self-harm material

Social media platforms such as TikTok have been criticised for not doing enough to filter out graphic and harmful content Credit: Reuters/Dado Ruvic

Graphic depictions of self-harm and suicide online will continue to put young people at risk of harm despite the Government’s new Online Safety Bill, the Samaritans said.

In its first major intervention over the Bill, the charity said that information, depictions, instructions and advice on methods of self-harm and suicide would not reach the threshold to be classed as illegal content for anyone over 18.

Research showed that vulnerable young people in their late teens and 20s would, as a result, be put at increased risk of self-harming or taking their own lives, according to Julie Bentley, the chief executive of the Samaritans.

Her charity is urging the Government to reverse its decision to abandon plans to regulate legal but harmful content for adults in relation to self-harm and suicide.

In an exclusive interview with The Telegraph, Ms Bentley said that turning 18 and becoming an adult did not mean that young people stopped being vulnerable. 

Julie Bentley, the chief executive of the Samaritans, said that turning 18 did not mean that young people stopped being vulnerable Credit: Facebook/Samaritans

“We have heard from young people over 18, 19, 20, 21 who are feeling suicidal and describe how easy it was to find information to end their life online,” she said.

“That makes them more determined to commit suicide. They reported having automated emails that suggested further content for them to look at.”

New research commissioned by the Samaritans from Swansea University showed that 83 per cent of social media users surveyed were recommended self-harm content on their personalised feeds, such as Instagram’s “Explore” and TikTok’s “For You” pages, without searching for it.

More than three-quarters, 76 per cent, of those who had self-harmed said they had done so “more severely” as a result of viewing self-harm content online.

Freedom of speech concerns

Ms Bentley said: “One of the reasons we have heard that the legislation has been watered down by the Government is around issues of freedom of speech. We very much want to encourage safe and open conversations about suicide because talking about it can protect people.

“But this is not about freedom of speech. It is about protecting lives by restricting access to this content. When they are vulnerable, they are not able to protect themselves from this content.

“When people are in a vulnerable position, they are not able to make those safe choices for themselves that bring in that protection. Whereas if it wasn’t accessible to them, it would keep people safe.”

The Samaritans said that Westminster’s latest proposal, a filter that users could request to block out legal but harmful content, would not be effective for someone who might be in the throes of depression.

“The point is that when someone is vulnerable and feeling that they are considering self-harm or suicide, they are not going to be in a mental space to realise that they should not be looking at that content. They are too vulnerable to be able to switch off that content,” said Ms Bentley.

She warned that the Government risked going against public opinion by dropping its plans to place a duty on social media firms to protect adults from legal but harmful content.

Her views were backed by an omnibus survey, commissioned by the Samaritans, of 2,000 UK adults, which showed that two-thirds, 66 per cent, were concerned about how easy it currently is to access suicide and self-harm content on the internet.

More than four in five, 83 per cent, agreed that suicide and self-harm content can have a damaging effect on adults, not just children. 

Almost two-thirds, 63 per cent, believed that access to potentially harmful content on the internet should be restricted for everyone, including adults.

Three-quarters, 75 per cent, agreed that tech companies should be required by law to prevent suicide and self-harm content being shown to users of all ages. 

Labour has proposed an amendment which would reinstate the requirement to regulate legal but harmful content. It is expected to be debated when the Online Safety Bill returns to the House of Commons on Tuesday.