
Social media giants face multi-million-pound fines if they fail to ban child accounts

Social media firms will be forced to bar underage children or face multi-million-pound fines under a new law intended to protect young users from harm online.

The Government will unveil the revamped Online Safety Bill on Tuesday, which will compel companies by law to publish how they enforce age limits so that parents, as well as the watchdog Ofcom, can test the credibility of those measures.

Firms that do not follow their own terms and conditions, including on age limits, will face fines of up to 10 per cent of their global turnover. For Meta, the parent company of Facebook and Instagram, that would be up to $12 billion.

The move follows a four-year campaign by The Telegraph for new “duty of care” laws to better protect children from online harms.

The new bill also addresses concerns over freedom of speech by dropping plans for restrictions on content described as “legal but harmful”. Instead, it requires social media companies to give adult users more control over the material they see online.

'Nonsense' of underage child accounts

In an exclusive article for The Telegraph, Michelle Donelan, the Culture Secretary, says that the revised bill will strengthen age verification to end the “nonsense” of technology firms claiming they do not allow underage children on their social media platforms, when any parent could tell you that they do.

Ofcom research suggests a third of children aged five to seven and 60 per cent of eight to 11-year-olds have their own social media profile. This suggests that as many as 1.8 million children under 13 in the UK have a social media account.

In her article, Ms Donelan writes that stronger child protection was a “red line” for her when she took charge of the bill, to ensure technology firms shield children from harmful content - including sexual abuse and cyberbullying.

Under the revised bill, where social media firms specify a minimum age for users, they will now have to clearly set out and explain in their terms of service the measures they use to enforce this. This could include the use of age verification technology that requires users to provide official identities authenticated by a third party to maintain security and privacy.

The technology firms will also have to publish risk assessments on the dangers their sites pose to children. This follows leaked internal Facebook research that showed the technology giant knew about the toxic risks of Instagram to teenage girls’ mental health, as well as the prevalence of drug cartels and traffickers on its app.

Ms Donelan writes that such changes will reinforce the original intention of the bill to protect young people like Molly Russell, 14, who took her own life after being bombarded with self-harm content on Instagram and Pinterest.

The bill is designed to protect young people like Molly Russell, who took her own life after being bombarded with self-harm content online - Family handout/PA

Ministers intend that the revamped bill, which is due to return to the House of Commons on Dec 5, will offer a “triple shield” of protection for users.

The first level will mean technology firms have to prevent and remove illegal content such as fraud, assisting suicide, threats to kill, harassment and stalking, the sale of illegal weapons and revenge porn.

As revealed at the weekend, encouraging self-harm online will become a new criminal offence.

Under the second level, the firms’ terms and conditions of service - many of which already prohibit abuse and racism - will have to be enforced, with Ofcom empowered to fine social media platforms that fail to do so.

Free speech requirements

The third level will mean technology firms are required to provide their adult users with tools to filter out “legal but harmful” content that they do not want to see, such as the glorification of eating disorders, racism, anti-Semitism or misogyny. This could be done through moderation, blocking content or warning screens.

The bill will retain protections for children against “legal but harmful” material by requiring social media firms to prevent them from encountering “primary priority” content on self-harm, suicide, eating disorders and pornography that falls below the criminal threshold but is nevertheless deemed inappropriate or harmful.

Ms Donelan has sought to balance the tougher safeguards for children with increased protections for free speech by scrapping plans to regulate legal but harmful content for adults, after critics claimed it would give “woke” technology firms too much power to determine what was published online.

This move will be allied with tougher “free speech” requirements that will prohibit firms from removing or restricting users’ posts, or suspending or banning them, where they have not breached the sites’ terms of service or the law. If a user or post is removed, firms will have to offer an effective right of appeal.

In other moves contained in the bill, controlling or coercive behaviour will be included as priority illegal content alongside fraud and other crimes. That means social media firms will have to provide measures so that victims can block such behaviour or prevent it from happening.

Adults will also have the right to block anonymous trolls via tools to control whether they can be contacted by unverified social media users.

The children’s commissioner, as well as the victims’ and domestic abuse commissioners, will be added as statutory consultees to the bill, so that they are involved in drawing up the codes the technology firms will have to follow to comply with the law.

The bill is not expected to be on the statute book until spring next year, from which point it could take up to 18 months to fully enact its regulations.


Michelle Donelan, the Culture Secretary, says the Online Safety Bill 'will genuinely change lives for the better' - Rasid Necati Aslim/Anadolu Agency via Getty Images

Our values should be dictated by us, not Silicon Valley

By Michelle Donelan, Culture Secretary

British values are family values. We protect our children from those who wish to do them harm. We defend the most vulnerable. And we all know that whether it is in a family or in society, free speech and the right to disagree is the bedrock of a healthy community. So why should we allow the online world to be any different?

I don’t think it is too much to ask that these basic British values are reflected online. But that is difficult when social media companies have the financial clout of small countries, and when chief executives have all the power of presidents with none of the accountability. So we must make it clear that our values and our way of life will be determined by us, not Silicon Valley.

Next week, one of the most important building blocks of a safer, freer, more user-friendly online world returns to Parliament. I have carefully amended the Online Safety Bill to ensure it reflects the values of our way of life - protecting children, safeguarding the vulnerable, preserving legal free speech and defending consumer choice.

Protecting children is the fundamental reason why the Online Safety Bill was created, and so the changes I have made strengthen the child protection elements of the bill significantly. Though debates around free speech have dominated the conversation, its original purpose was to protect young people like Molly Russell. In 2017, the 14-year-old took her life after being bombarded with self-harm content on Instagram and Pinterest.

So when I became Digital Secretary, I set a red line on the child protection measures in the bill, vowing to protect and strengthen them. When it returns, the bill will contain even stronger protections for children, combined with user rights protections for adults that inject genuine choice for users.

Tech companies will have to shield children from a whole range of harmful content - including child sexual abuse, pornography and cyberbullying. If they fail, they will face huge fines of up to 10 per cent of their annual global turnover. For Meta, that would currently be up to $12 billion.

I have also strengthened the legislation to help tackle the absurd situation we have with age limits. Some platforms claim they don’t allow anyone under 13 - any parent will tell you that is nonsense. Some platforms claim not to allow children, but simultaneously have adverts targeting children. The legislation now compels companies to be much clearer about how they enforce their own age limits.

This is completely separate to changes I am making for adults, which I approached with a few simple principles - what is illegal offline should be illegal online, tech giants should abide by their own terms and conditions, and the Government should not be in the business of telling adults what legal content they can see.

The “legal but harmful” clauses in the bill, in my view, violated the rights of adults to choose what legal speech they say and see. So I have removed “legal but harmful” in favour of a new system based on choice and freedom.

Equally, if something is not prohibited in their terms and conditions, tech giants should not be removing it. Platforms will need to be far more transparent about how their algorithms work and, for the first time, users will have the right of appeal. Silicon Valley executives will no longer be able to arbitrarily silence people, nor continue to treat some sections of society differently.

Rather than handing down edicts to tech companies on what legal speech they should and shouldn’t police on their sites, we are putting control back into the hands of users, while also ensuring that social media companies no longer put profit before children’s lives. Alongside this, where it is demonstrably obvious that something should be illegal, we should make it illegal. Thanks to an amendment announced on Tuesday, we will close the legal loopholes that allow the horrific encouragement of self-harm.

These common sense solutions combine to form the basis of a bill that will genuinely change lives for the better, while also protecting the rights and values we hold dear.