How social media killed Molly Russell

Molly Russell became sucked into ‘the bleakest of worlds’, one in which she was bombarded with nightmarish images and videos by social media’s powerful algorithms

In the final six months of her life, Molly Russell had just 12 days in which she was not subjected to the creeping horrors the internet could push her way. Those were the only days, her social media records show, on which the 14-year-old did not engage with content on Instagram relating to depression, self-harm or suicide.

Now, after years of battling for answers following the death of their beloved daughter, her family finally have the damning truth. Following a two-week inquest and a five-year fight urging the social media giants to acknowledge the insidious content on their platforms and better protect children, senior coroner Andrew Walker concluded that the online material viewed by Molly was a contributing factor in her death in November 2017.

In an excoriating ruling that has widespread implications for future regulation of social media and the responsibility of tech companies, Walker concluded the material the schoolgirl had secretly accessed in the months leading up to her death “was not safe” and “shouldn’t have been available for a child to see”.

Walker said he did not feel comfortable recording her death as suicide, instead recording: “Molly Rose Russell died from an act of self-harm while suffering from depression and the negative effects of online content.”

He added that she was “exposed to material that may have influenced her in a negative way and, in addition, what had started as a depression had become a more serious depressive illness”.

In the words of her father, Ian Russell, she became sucked into “the bleakest of worlds”, one in which she was bombarded with nightmarish images and videos by social media’s powerful algorithms.

Molly’s parents, Ian and Janet, arriving at the inquest - Jeff Gilbert

Ever since his daughter’s death, the 59-year-old and his wife, Janet, have battled against the might of Silicon Valley to introduce proper regulation. But five years on, urgent concerns remain over the extent to which these algorithms – designed to push ever more related content at a user (from cat videos to gruesome images of self-harm) – played a role in fuelling Molly’s depression.

Her death also raises a more fundamental question: whether the business model of tech giants, built on monetising the attention of smartphone users, means any child can be truly safe online.

Following the ruling, NSPCC chief executive Sir Peter Wanless condemned Meta (the parent company of Instagram) and Pinterest’s “abject failure” to protect Molly from content no child should ever see. The Children’s Commissioner, Dame Rachel de Souza, also urged the tech giants to “get a moral compass and step up”.

Readers should note that most of the shocking social media posts in question have – rightly – not been reported by this newspaper or others, because of UK media regulatory obligations, reporting restrictions and concerns that they could spark further harm.

Pressure is also building on the Government. Despite years of promises to “legislate to make the UK the safest place in the world to be online”, progress has been slow. Ministers have tabled a flagship Online Safety Bill (something the Telegraph has campaigned for since 2018) but even at this late stage there are fears it may be abandoned or watered down.

In recent days, five former culture secretaries have publicly called on the Government to urgently put the proposed Bill into law. “There are, tragically, too many stark examples of suicide and death amongst our young people, where social media has played a role in algorithmically promoting content that promotes violence or self-harm,” they said in a Telegraph article. They added that the tech firms themselves could not be trusted to combat such “devastating” content.

During the inquest, Molly’s mother described in heartbreaking detail the moment she discovered her daughter on the morning of November 21, 2017, at their home in Harrow, north London. She had been busy with chores and waving off one of her two older daughters to school when she realised she had not heard Molly. As she walked into Molly’s bedroom she noticed a pile of clothes on the floor before seeing her daughter’s body and running out screaming. When asked by her daughter what had happened, all she could reply was: “It’s Molly, it’s Molly.”

It is a discovery which will strike fear into every parent. And yet, the often terrifying and manipulative world which sucked Molly in remains just a few clicks away for most children.

‘A dark, exclusive club’

In many respects Molly Russell was like any other teenage girl. She liked Harry Potter, the musical Hamilton and boy bands. She enjoyed horse riding and vintage fashion. She adored her older sisters.

At primary school she was a teacher’s pet and her father told the inquest she was relaxed about moving to secondary school, Hatch End High. The youngster had a keen desire to learn, he said, and “never caused anyone any concerns”.

But like many children, Molly was also exploring a secretive online world. It is now known that she set up an Instagram account at the age of 12. The minimum age for users is 13, but she faced no age verification checks. Meta admitted last week that “age-verification is an industry-wide issue”.

At first her online persona reflected the seemingly happy and settled girl she was. She would regularly browse her interests on Instagram, Pinterest and YouTube. Her father says he did not particularly view her as “much of a social media person at all”.

During the final year of her life, though, all of Molly’s closest family were said to have noticed a change in her behaviour. She spent an increasing amount of time in her room and often appeared to be the last of the household awake at night.

On the outside, at least, the inquest heard she still “happily contributed” to family life. However, there were warning signs. In August 2017, Ian recalled, Molly had refused to swim with friends and wore a long-sleeved top despite it being a hot day. She also gave up horse riding, “something she had loved doing”.

Ian has set up the Molly Rose Foundation to help suicidal young people, and vows he will continue to battle to protect children from the harms of social media - Andrew Crowley

Internet records from the last six months of Molly’s life provided unprecedented insight into how a teenager can become consumed by the dark recesses of these platforms.

Oliver Sanders KC, representing the Russell family, told the court that during this period, of the 16,300 images she saved or liked on her Instagram account, 2,100 were depression, self-harm or suicide-related. Molly also engaged with 138 videos during this time that contained these same themes.

Following her death, her family also discovered she had a secret Twitter account under a pseudonym, which she used to voice her thoughts about her worsening depression. Using this account, she would occasionally reach out to public figures for help, saying her “mind was full of suicidal thoughts”.

The first known suicide-related image Molly saved on Instagram was at 10am on May 21, 2017. It showed a man apparently about to take his own life with the caption: “Why would things not be alright?”

Over the course of the next 13 hours, she saved or liked a further 12 posts relating to depression and mental health. That same day, Molly looked at 17 posts related to suicide and depression on Pinterest.

As mentioned, reporting restrictions mean some of the most gruesome content she viewed cannot be made public. As Molly’s anguished father discovered when he trawled through some of the content, these are images and words designed to alienate a person viewing them from normal life.

“The process as I see it is one of normalisation,” he told the inquest. “You might think yourself a remote individual but you can join their club. It is a place of discouraging getting help and encouraging further self-harm.”

Molly Russell - Family handout/PA

Psychiatrist Dr Navin Venugopal, who was asked to give evidence, said that after viewing some of the suicidal content Molly had watched, he was left unable to sleep.

Many were bleak quotes promoting self-harm under a veil of concern. They romanticised or even glamourised suicide, the inquest heard, and made those viewing them feel like members of a “dark, exclusive club”.

Some of the clips we can report included quotes such as: “Maybe I should kill myself.” Others depict people taking their own lives. Some show people engaged in acts of self-harm, while others depict skeletal bodies on weighing scales and tape measures around tiny waists.

It was in the small hours when her family were sleeping, particularly towards the end of her life, that Molly went on a “series of binges” of deeply distressing videos.

In his ruling, Walker concluded: “The platform operated in such a way using algorithms as to result, in some circumstances, in binge periods of images, video clips and text – some of which were selected and provided without Molly requesting them.” He added: “These binge periods, if involving this content, are likely to have had a negative effect on Molly.”

Perhaps the most direct link between Molly’s death and the content she consumed on social media emerged on the night of November 11. Over the course of that day Molly saved 148 videos or pictures of all varieties on Instagram.

At 10.16pm, the videos started. First a one-minute, black-and-white montage of distress including the captions “if you dont feel pain…you dont really feel anything”. Another followed three minutes later, then another, then another, escalating from self-harm to death. At 12.56am, Molly switched over to Google where she started to research suicide methods.

Instagram was the final app she used on the night she died. At 12.43am on November 21, she saved an image which read: “The worst thing depression did to me was take my intelligence. Feeling yourself getting dumber and dumber is absolutely painful.”

‘A false world full of illusion’

Originally the Silicon Valley executives who gave evidence at the inquest had hoped to do so remotely, with their lawyers citing concerns over Covid-19 and “significant travel” required as reasons to prevent them attending. This was dismissed by the coroner. Instead, under the gaze of Molly’s family, they faced a bruising appearance at North London Coroner’s Court.

Judson Hoffman, Pinterest’s head of community operations, apologised to Molly’s family after being shown some of the content she viewed. Pinterest said it has now taken steps to remove or hide the majority of suicide and self-harm content from users, aided by artificial intelligence that can scan pictures as they are uploaded. However, Mr Russell pointed out that even this summer he had still been able to find such images by searching for a depression-related term. Pinterest said in response that this was an oversight which has since been corrected.

Meanwhile, Elizabeth Lagone, Meta’s head of health and wellbeing, told the inquest that moderating such material involves “a very delicate balance” between the needs of the viewer and the poster. “We do not allow content that encourages suicide or self injury and when we are aware of it we remove it,” she said, before adding that posts relating to suicidal thoughts should not be removed because “it’s important for people to express themselves” and not to silence what she deemed “cries for help”.

During the inquest, Walker occasionally made scathing interventions. In one exchange between Russell’s barrister and Lagone, he interjected: “You wouldn’t send a child out on the edge of a busy road without holding their arm.” He also described social media as “a false world”, one “full of illusion and lacking in any kind of reality”.

After hearing closing submissions, Walker acknowledged he could not make any formal recommendations, but instead urged the companies not to let the chance to make children safer online “slip away”.

Despite tech giants’ claims that they are tackling the problem, this week the Children’s Commissioner published research revealing young people are still being bombarded with content promoting self-harm.

The research, based on a survey of 2,000 children aged eight to 17 and covering seven online platforms, showed that 45 per cent had been subjected to “upsetting and worrying” harmful content – including material promoting self-harm – without even seeking it out.

It is more than five years since the Government took its first tentative legislative steps to crack down on online harms with the green paper that preceded the Online Safety Bill in October 2017, yet the Bill is still to reach the statute book.

Internet records from the last six months of Molly’s life provided unprecedented insight into how a teenager can become consumed by the dark recesses of social media - Yui Mok/PA Wire

The proposed new regime will be policed by Ofcom, with powers to fine firms that fail to protect children up to 10 per cent of their global turnover. Ofcom will also be able to investigate a company’s systems and, critically, the algorithms blamed for driving “harmful” content.

If the firms fail to provide Ofcom with the information it needs to investigate potential breaches of their duty to protect children, the Bill proposes their directors could be prosecuted and jailed for up to two years.

But the Online Safety Bill was about to complete its passage through the Commons when it was pulled at the last minute before the summer recess amid a growing backlash, including from senior Tory MPs, over its impact on freedom of speech.

At the centre of the criticism are provisions to tackle “legal but harmful” content for adults which critics fear could allow “woke” social media firms to censor controversial comments that offend some people.

Michelle Donelan, the seventh Culture Secretary to take charge of the legislation, is currently consulting on what she describes as “tweaks” to protect free speech. But she has pledged that children will be exempt from these “edits” and that the Bill will be back in the Commons “as soon as possible” to prevent a repeat of “horrendous incidents” like that of Molly Russell.

Even if the entire clause 13 of the Bill regulating “legal but harmful” content for adults is removed, there are separate provisions requiring firms to prevent children from encountering “primary priority content” altogether, including material promoting self-harm or eating disorders, “legal” suicide material and pornography.

Many experts believe this is likely to be the compromise but there is still nervousness. Powerful voices have spoken up this week against dilution and urged the Government not to delay the legislation any further.

The Russells, too, will continue their fight. Ian has set up the Molly Rose Foundation to help suicidal young people and vows he will continue to battle to protect children from the harms of social media. Nothing will ever bring back their beloved daughter, the “positive, bright and happy” girl Ian described at the inquest, as she was before the algorithms had her in their grip. But at least they might be able to prevent another youngster from being lost to the void.