
Is TikTok safe to use? Concerns raised about harmful content and data privacy

While TikTok narrowly avoided an outright ban under former US president Donald Trump, it is facing growing pressure to prove its security credentials (Loic Vanance / AFP via Getty Images)

While social networks such as Facebook and Twitter are stagnating, TikTok is growing quickly. With such growth comes plenty of attention, and TikTok is now facing scrutiny in several areas, from the algorithmic distribution of harmful content to how it handles user data.

The US has already banned the viral video app on government devices in more than 20 states due to spying allegations stemming from its Chinese ownership. Universities in Oklahoma, Alabama, and Texas have followed suit by restricting students from accessing the app over campus Wi-Fi networks.

While TikTok narrowly avoided an outright ban under former US president Donald Trump, it is facing growing pressure to prove its security credentials. There are specific fears over Chinese government access to user data — a problem that doesn’t affect comparable western social networks. It also faces the same content clean-up challenges as other platforms, where millions of users are able to post videos instantly and moderators struggle to keep pace.

Should you quit TikTok or curb your kids’ usage? Below we’ve outlined the arguments so that you can make up your own mind.

Is TikTok harmful?

The case against TikTok can broadly be split into two categories: harmful content and privacy concerns due to its Chinese ownership.

For the former, there are good reasons to be alarmed. The Center for Countering Digital Hate has found that TikTok will show children harmful content as soon as they show an interest in related topics.

Its researchers created accounts posing as fictional 13-year-olds in the US, UK, Canada, and Australia. They “liked” and interacted with videos related to mental health and body image, to assess how this would affect the content shown in the app’s For You feed.

The accounts were shown self-harm or eating disorder content every 206 seconds on average. More extreme content was shown to accounts intended to represent vulnerable youths, with references to weight loss in their usernames.

TikTok has also been demonstrated to be a hive of misinformation. In September 2022, Newsguard, a service that rates news and information websites on how trustworthy they are, found that 20 per cent of the TikTok posts returned for searches such as “2022 election”, “mRNA vaccine”, and “Uvalde tx conspiracy” contained false or misleading information.

The concern about Chinese ownership is the bigger issue. Governments — including the US’s, where a campaign to ban TikTok is gathering pace — view the app as a national security risk because it is owned by the Chinese company ByteDance.

Donald Trump wants a ban on TikTok (Lionel Bonaventure and Jim Watson / AFP via Getty Images)

“There is clearly bipartisan support to do something about TikTok, and the continued reports about harmful content and misinformation being served to users – particularly young people – will only add fuel to the fire,” says Insider Intelligence principal analyst Jasmine Enberg.

A paper by cybersecurity firm Internet 2.0 claims the TikTok app uses “excessive” data harvesting, reaping information on user location, the contents of direct messages, and more. The paper says it stores this – in part – on servers in mainland China.

TikTok admitted in November that Chinese staff could and did access user data. But a spokesperson stressed: “We have never provided any data to the Chinese government.

“We believe in the importance of storing European user data in Europe; keeping data flows outside of the region to a minimum.”

TikTok has also denied more recent, explosive claims made by former employee Yintao “Roger” Yu in a wrongful termination lawsuit filed in San Francisco. CNN Business reported Yu’s claim that the Chinese Communist Party maintained an office within the company, occasionally referred to as the “Committee”, which Yu alleged monitored and guided ByteDance on “how it advanced core Communist values”.

“The Committee maintained supreme access to all the company data, even data stored in the United States,” the complaint reportedly said. It added that ByteDance was “responsive to the CCP’s requests” to both promote and remove content, and alleged that user data was accessible via a backdoor channel, regardless of where that data was located.

This is all firmly denied by ByteDance, as you would expect. “We plan to vigorously oppose what we believe are baseless claims and allegations in this complaint,” a spokesperson told CNN.

Elsewhere, a Forbes report claimed ByteDance planned to “monitor the personal location of some specific American citizens”. TikTok denied the claims made in the article, but later sacked four employees for accessing the personal data of journalists in an attempt to track down sources.

This was enough for Alicia Kearns, Conservative MP for Rutland and Melton and Chair of the Foreign Affairs Select Committee, to urge Brits to delete the app. “What TikTok does is it gives away the data that makes you most vulnerable: who are you friends with; what are your interests; what are the interests you have that you may not want publicly disclosed; who you are having private conversations with; the locations you go to,” she told Sophy Ridge of Sky News in February 2023.

“Our data is a key vulnerability and China is building a tech totalitarian state on the back of our data.”

Where the two concerns — harmful content and Chinese ownership — meet is the question of whether the Chinese government has any say in how the algorithm surfaces content. Could it pressure ByteDance to spread propaganda or harmful content to British teens? It’s not completely far-fetched, given what we know about Russian troll farms spreading disinformation in the West, and the alarming fact that seven per cent of UK adults now get their news from TikTok.

Is the criticism fair? Is TikTok safe?

These are serious points. However, it’s worth noting that many of the same safety concerns apply to other companies, too.

If you side with the US politicians who want to ban TikTok, and who effectively booted Huawei out of the US in 2019, you should probably also avoid a host of other brands. These include Honor, Xiaomi, OnePlus, Oppo, Lenovo, Realme, and ZTE, among others. They are all Chinese.

As for harmful content, it’s not as if Facebook, YouTube, and Twitter haven’t had their fair share of content scandals where fake news, dangerous disinformation, and scams spread freely with devastating real-world consequences. Social media algorithms value attention and engagement above all else, and that leads to a dark place.

TikTok seems to get special attention because of its Chinese ownership, which can feel somewhat Sinophobic. If you are willing to discount the idea the Chinese government looms over the app 24/7, TikTok starts to look like just another social network.

In October, even GCHQ director Sir Jeremy Fleming said that he wouldn’t be concerned if his own children used TikTok.

But his follow-up was just as important. He said he would “speak to my child about the way in which they think about their personal data on their device”. That’s useful advice whether about TikTok or any other part of the social web.

Sir Jeremy Fleming (Joe Giddens / PA)

How to make TikTok safer for your children

For all its problems, the upcoming Online Safety Bill at least takes the issue of harmful content seriously. If the legislation is passed, social media companies would be required to actively look for illegal content, rather than relying on user reports to surface it. However, provisions on “legal but harmful” content, which includes some videos related to self-harm, have been removed.

But what about the here and now? Blocking your children from using TikTok isn’t a problem-free solution. If all their friends use the platform, then you’re making their social life harder, after all. So what can you do to make it as safe as possible?

One tip is to ensure they are registered with their correct date of birth in the app. TikTok won’t allow those under the age of 13 to register, and you shouldn’t let them sidestep that restriction: it’s there for a reason, after all.

But there’s another reason to be truthful about ages. Accounts for those aged 13 to 15 are set to “private” by default, meaning any content they post can only be seen by followers they have approved. And even if these accounts are made public, their content will not be shared to the For You feed.

Direct messaging is switched off for under-16s, and those under 18 cannot live-stream or receive Virtual Gifts, which make up TikTok’s tipping economy.