Encouraging self-harm online will become criminal offence

Encouraging self-harm online is to become a criminal offence in a move to prevent a repeat of the Molly Russell tragedy, the Government announced on Saturday.

Social media companies will be required by law to remove material promoting self-harm and prevent it from spreading, and will face multi-million pound fines if they fail to do so.

The new offence, to be part of the Online Safety Bill, puts self-harm on a par with encouraging or assisting suicide, which carries a maximum jail sentence of 14 years.

Ministers said the change was designed to combat “avoidable and tragic” cases such as that of Molly, 14, who took her own life after being bombarded with self-harm and suicide content on social media. An inquest found that social media had “more than minimally contributed” to her death.

An investigation by The Telegraph found Instagram posts on which the schoolgirl “binged” before ending her life were still available on the social media platform even after the inquest in September.

Social media giants face fines

Michelle Donelan, the Culture Secretary, said: “I am determined that the abhorrent trolls encouraging the young and vulnerable to self-harm are brought to justice. So I am strengthening our online safety laws to make sure these vile acts are stamped out and the perpetrators face jail time.

“Social media firms can no longer remain silent bystanders either and they’ll face fines for allowing this abusive and destructive behaviour to continue on their platforms under our laws.”

The announcement coincides with what would have been Molly’s 20th birthday. The Molly Rose Foundation, a charity set up by her family and friends, said it appeared to be a “significant” move.

However, it added that the Bill needed to cover all the legal but harmful content that damaged Molly’s mental health, including posts such as “Who would love a suicidal girl?”

“It’s therefore important that other ‘harmful but legal’ content, of the type we know was harmful to Molly, is also within scope of the Bill. These are complex and vital matters we need to get right for the sake of young people in the future,” said the foundation.

Assisting self-harm in the real world, by providing the instruments to do it, will become an offence under a separate law. By also criminalising it online, encouraging self-harm becomes one of a dozen “illegal” priority harms in the Bill, alongside terrorism, child abuse, fraud, revenge porn, harassment and cyberstalking.

It will cover any posts, videos, images and other messages that encourage the self-infliction of wounds. Social media firms will be legally obliged to remove or limit such content, with fines of up to 10 per cent of their global turnover and the prospect of having their services blocked in the UK if they fail to do so.

Online protection divided into three stages

The inclusion of self-harm as an “illegal” harm is the first of three stages of online protection that ministers propose as part of the Bill, which is due to return to the Commons for its final stages on December 5.

The clause protecting adults from legal but harmful content, which previously contained self-harm, is to be scrapped after criticism from free speech campaigners that it could lead to “woke” social media firms removing controversial content that upsets but does not harm.

Instead, companies will be held to account for “legal but harmful” sexist, racist or abusive content through their terms and conditions of service. This would mean that if they failed to deliver what they promised in their terms of service in protecting people from abuse or harassment, they could face multi-million pound fines.

Under the third stage, social media firms will be required to offer their adult users an option to filter out abuse or other “harmful” content that is neither illegal nor in their terms and conditions.

Ms Donelan has also pledged to increase protections for children, which is expected to mean retaining a list of “primary priority” legal but harmful content that children must be prevented from encountering, such as material promoting eating disorders, and pornography.

For a second list of “priority” content, companies would be expected to ensure it is appropriate for the age of the child. This could include misinformation about health or vaccines and material depicting or encouraging violence.

Ministers are also considering tougher age verification requirements for social media platforms to ensure children only see content appropriate to their age.