TikTok is facing a £27 million fine for endangering children online in a landmark duty of care ruling.
The social media giant has become the first tech company to be issued with a formal notice of intent by John Edwards, the Information Commissioner, for alleged breaches of the children’s code.
TikTok has been accused of allowing children aged under 13 onto its site without "appropriate" parental consent.
It is also alleged to have failed to provide proper information to its users in a concise, transparent and easily understood way, and to have processed "special category data" on ethnic, racial, religious, political or sexual orientation "without legal grounds to do so".
The announcement represents a major victory for The Telegraph’s Duty of Care campaign, which was launched four years ago seeking new laws to protect children from online harms.
The children’s code, which came fully into force last September, was introduced as part of data protection laws to prevent children accessing inappropriate and harmful content unsuitable for their ages.
Mr Edwards can impose fines of up to four per cent of a company's global turnover for breaches.
His latest move comes as the Government is under pressure to reintroduce its Online Safety Bill to Parliament. This will give Ofcom powers to impose fines of up to 10 per cent of global turnover on firms that expose children to illegal or harmful content including child abuse, suicide, revenge porn and drugs.
'Companies providing digital services have a legal duty'
Mr Edwards said: "We all want children to be able to learn and experience the digital world, but with proper data privacy protections. Companies providing digital services have a legal duty to put those protections in place, but our provisional view is that TikTok fell short of meeting that requirement.
"I’ve been clear that our work to better protect children online involves working with organisations but will also involve enforcement action where necessary."
He disclosed that six further investigations into tech firms over potential breaches of the code are under way, while another 50 companies are being reviewed by the commissioner to decide whether they need to be investigated.
A number of companies have changed practices in response to the code, including Instagram, which has disabled targeted adverts for under-18s entirely, moved children’s accounts to private by default and made it harder for potentially suspicious accounts to find young people.
Facebook now asks for the age of people signing up for accounts, restricts access for people who repeatedly enter different birthdays, has enabled reporting of underage accounts and removes accounts that cannot prove they meet these requirements.
'This is clear proof that tech can be held accountable'
Baroness Kidron, the online safety campaigner who was an architect of the code, welcomed the move on TikTok. "This is clear proof that tech can be held accountable for the safety and privacy of children," she said.
"The end goal should be that companies use their creativity and innovation to comply with privacy legislation rather than make the regulator chase them retrospectively."
TikTok now has the right to contest the notice. A company spokesman said the notice, covering May 2018 to July 2020, was provisional and that no conclusions could be drawn.
"While we respect the Information Commissioner's Office's (ICO) role in safeguarding privacy in the UK, we disagree with the preliminary views expressed and intend to formally respond to the ICO in due course," the spokesman said.