‘Rather than block harmful material online, we have to tackle the causes’

Lord Grade - David Rose for The Telegraph

The inquest into the death of Molly Russell, the teenager who took her own life after viewing images promoting suicide on Instagram, is serving as a “graphic reminder” that “it is time for serious regulation” online, the chairman of Ofcom has said.

Lord Grade, the former television executive, said that the ongoing hearing underscores the importance of the Online Safety Bill currently making its way through Parliament.

He said: “It is a graphic reminder that there is – for all the fabulous benefits that the internet provides – a little toxic bit of it. It has been a Wild West, and it is time for serious regulation.”

Molly, from Harrow, north-west London, was 14 when she died nearly five years ago. In the months before, she had browsed disturbing images on social media platforms including Instagram and Pinterest.

Last week, a senior executive from Meta, the company that owns Instagram and Facebook, gave evidence to the inquest and defended suicide-related content, claiming it helps people to “share feelings and express themselves”. Lord Grade said Meta’s position raised a “huge question” but warned against pre-judging the coroner’s findings.

The 79-year-old, who has held senior leadership roles at the BBC, ITV and Channel 4, and was appointed as chairman of the media regulator in April, spoke after Liz Truss said the legislation required some “tweaks” to avoid damaging free speech.

Critics have attacked parts of the proposed laws that would require tech giants to remove “legal but harmful” content such as cyberbullying, arguing they would usher in a new form of censorship.

Lord Grade, a Conservative peer, said that regardless of where the lines are drawn, what Ofcom most needs is legal clarity, so that it can stand up to the might of Silicon Valley and its lawyers.

He said: “These tech companies are not short of a few bob for my learned friends. So all this stuff will get tested pretty quickly in court. So we must have absolute clarity on where the law stands.”

He spoke ahead of a speech to the media industry on Tuesday at the Royal Television Society Convention in London. In it, he will emphasise that Ofcom aims to use the powers it will gain from the Online Safety Bill to tackle the causes of harmful material online rather than to censor it, in part because of the sheer volume of material posted on social media.

He will say: “Ofcom is not a regulator that intervenes in legitimate debate.

“Even if we could regulate every piece of harmful content, we’d only be treating the symptoms of online harm. Parliament correctly wants us to get to the cause, by shining a light on the decisions companies take in designing and operating their services.”

He will reject attempts by the Left and Right to use Ofcom’s powers as a weapon in the “culture wars” that are waged online over issues of politics, race, gender and other emotive topics.

“I want to be very clear: Ofcom does not, and should not, regulate the culture wars. Some try to conscript us to their cause. But we’re not interested. That is not our job. Whether we are judging that Piers Morgan’s comments about the Duchess of Sussex were justified by freedom of expression, or that Diversity’s tribute to the Black Lives Matter movement was, too – we never make decisions based on personal preference, political pressure, fear or favour.”

Last year, the Duchess of Sussex and more than 50,000 viewers complained to the regulator after Mr Morgan cast doubt over claims she made about her mental health in an interview with the American broadcaster Oprah Winfrey.

Ofcom found Mr Morgan’s comments had been “potentially harmful and offensive” but amounted to legitimate debate in the public interest, and rejected the complaints. The dance troupe Diversity attracted 24,500 complaints over their tribute to the Black Lives Matter movement on Britain’s Got Talent.

The regulator also found in their favour. As he seeks to apply similar principles online, Lord Grade will demand a shift in culture from the likes of Google, Meta and TikTok, so that tackling harmful material becomes part of front-line activity rather than a matter left to public relations or lobbying teams.

He will say: “Like bankers who think their compliance department belongs to a galaxy far, far away, those who design and operate the tech platforms are not routinely touched by safety concerns.

“Ofcom will have powers to summon people with day-to-day responsibility for users’ safety on the sites and apps themselves. This represents a very meaningful, overdue shift in the regulatory culture of big tech.”

And he said: “We can send for these people and get them to give us the information on how their algorithms work, and how their AI [artificial intelligence] works, or whatever. These are huge powers and the world is watching.”

Lord Grade meanwhile warned the BBC and other public service broadcasters against seeking to compete with “the shrill zealotry of social media”.

He will tell the Royal Television Society Convention: “Of course, broadcasters need to find an audience for content on social media.

“In trying to reach younger people, who watch seven times less linear TV than over-65s, clicks might matter more than viewership. But in the fight for attention, traditional broadcasters will never match social media’s capacity for the shrill and the shocking. Nor should they be trying to. Instead, we look to them for calm, forensic analysis and interrogation.”

Lord Grade, who began his television career nearly half a century ago at London Weekend Television, said that, although he does not use social media platforms himself, he was taking “deep dive” lessons in how their algorithms work and was “pretty good” at fixing his computer.

Having initially ruled out coming out of retirement to chair Ofcom, he changed his mind after being persuaded of its critical importance. He said: “I don’t mind rolling up my sleeves if they’ve got nobody else.”