The short-form video app was once billed as a dreamland of music and memeification, but with talk of banning it left, right, and centre – what exactly went wrong? Megan Warren-Lister investigates
By Megan Warren-Lister
For Boo Betts, the same platform that became famous as a happy-go-lucky hub of lip-syncing and dance videos was later responsible for her mental breakdown. TikTok might have its roots in musical content, but Betts’ experience exposes a Silicon Valley fairytale gone wrong.
The 23-year-old downloaded TikTok in December 2019, a year after the globally popular app was released. A review of the platform by The New York Times called it “the only truly pleasant social network in existence”. Yet for Boo, the optimism and positive press the app received around the time of its release now feels inconceivable.
“TikTok ruined my mental health,” she explains over WhatsApp. Chatting to me from her home in Kent, she recalls reaching a new low earlier this year. “It got to the point that I was actually experiencing a mental breakdown. It was affecting my family life and everything and I was like ‘right…I actually have to get off this’.” Well and truly at breaking point, she decided to delete the app in January.
This outcome is far removed from the rose-tinted commentary surrounding TikTok’s early days. A 2019 Vox explainer titled “What is TikTok” described the app’s content offering as a cornucopia of comedy sketches, memeified audio, and “stimulating karaoke”. Originally a song-based app called Musical.ly, it was bought by Chinese company ByteDance in 2017 and subsequently rebranded as TikTok. Since then it has become a social media behemoth with niches spanning astrology, gut health, and Harry Potter fan fiction. The result? A Willy Wonka-style factory of content served up on a personalised platter.
On top of ruthless trolling on her own posts, the constant barrage of mental health content felt overwhelming for Boo. Chris Stokel-Walker, author of TikTok Boom: The Inside Story of the World’s Favourite App, says the platform’s hyper-personalisation is what makes it so addictive. “There’s this idea that it’s the prior generation of social media on steroids,” he explains.
Since its international debut, TikTok has faced myriad accusations around user wellbeing: from misinformation about Polycystic Ovary Syndrome to pro-anorexia content. But are these problems any different to the ones the industry has faced before, and can we do anything about them? More importantly, does anyone want to?
I’m on a quest to investigate exactly what’s making young people so spellbound by the platform (myself included). According to SEO strategy company Backlinko, half of the platform’s users are under 30 and a quarter are under 19. While Boo’s two hours of daily use might seem excessive, data collected by Statista suggests her screen time is bang on average for someone in Gen Z.
For Chris, it’s also unsurprising. “TikTok’s algorithm is designed to maintain user attention at all costs,” he says, but this highlights a fundamental question at the heart of the platform’s operation. Where is the incentive to protect young users, if profits rely on unconditional engagement?
In general terms, the sinister relationship between social media and wellbeing is well-documented. Netflix’s 2020 documentary The Social Dilemma pointed out a correlation between declining mental health in teenagers and the advent of social media – a trend that has been replicated widely.
With the goal of putting the ‘correlation doesn’t equal causation’ argument to bed, New York University professor Jonathan Haidt undertook a meta-review, summarising the results for the New Statesman earlier this year. His conclusion? There is growing evidence that social media is not just linked with, but also a cause of, depression, anxiety and related behaviours.
This adds to research shared by the Academy for Eating Disorders (AED), which suggests that harmful content around eating disorders can exacerbate and even trigger adolescent mental health conditions.
And on TikTok, this kind of content is abundant. To investigate the extent of the problem, researchers for the Centre for Countering Digital Hate (CCDH) created accounts purporting to be teenagers. They found that TikTok recommended eating disorder content in eight minutes, and suicide content in under three. Vulnerable accounts received over ten times as many recommendations for self-harm and suicide content.
In theory, TikTok has moderation guidelines which set out to make the platform a “safe and positive experience for people under the age of 18”, but these findings cast doubt on their efficacy. According to Chris, the findings also fit neatly with what the algorithm is built to do: keep users engaged.
Still, the author says we should look backwards. “It’s not just a uniquely TikTok thing. To say that would overlook 15 years of social media history.” Instead, he argues the issue is more systemic. “All new platforms prioritise growth over making users safe – and we let them,” he explains.
“Growth at all costs does mean growth at all costs – and frequently mental health is the cost.” This is a reminder that TikTok operates within what social psychologist Shoshana Zuboff calls the ‘surveillance capitalism’ economy. Essentially, it profits from trading user data with companies who want to predict (and ultimately nudge) consumer behaviour.
At the end of the day, hiring moderators costs money and reduces content (a double whammy on revenues). A TikTok spokesperson told me the company is “open about the fact that [it] won’t catch every instance of violative content”. But there is a vast difference between some content slipping the net and the swathes of harmful eating disorder content that exists on the app.
In the course of my investigations, I found that substituting letters for numbers in so-called pro-anorexia phrases such as ‘pro ana’ (just one of many coded techniques) yielded hundreds of these kinds of videos. TikTok removed the videos I sent to them, but that was just a sample.
When engagement and consequential data profits are the raison d’etre of social media companies, it’s hard to reconcile these imperatives with content restrictions to protect users. It’s a tension that’s magnified by TikTok’s unique design.
According to research published last year in a Sage academic journal, the factor distinguishing TikTok from rival platforms is its “unprecedented” algorithm. Documents leaked to the New York Times in 2021 revealed that the algorithm manipulates what viewers see by displaying increasingly extreme content with the aim of retaining user attention.
Unlike other platforms, TikTok’s curation is unambiguously driven by the ‘For You’ algorithm, which serves up a personalised feed of videos based on user interests and engagement. “They abandoned the chronological feed in favour of a more tailored one,” says Stokel-Walker. As a result, another issue is transparency. “With TikTok – every user’s experience is different and based on personal engagement.” This, he continues, “makes it much more difficult to identify when people are being sucked into problematic content”.
And we very much are being sucked in. According to statistics published by Backlinko, TikTok has unrivalled levels of engagement, with an average session length of 10.85 minutes – more than double Pinterest’s five.
For Chris, the lack of incentive for moderation is exacerbated by insufficient independent oversight within Europe. “The fundamental challenge is that our regulators are not very good,” he explains. Despite facing a recent fine for data breaches associated with underage users, the European Consumer Organisation reported that TikTok continues to infringe EU law, with violations including “a lack of diligent measures” to prevent users from harmful content.
Currently, TikTok profits as much from pro-anorexia content as it does from attention driven by dance videos. But monetising data – particularly the highly private information often shared on TikTok – raises deeper questions about privacy. As Olivia Yallop concluded in her book on influencer culture, influencing might be sold on a “philosophy of freedom” and as “hashtag good vibes”, but the democratisation of creation obscures a fundamental inequality of power.
According to Dr Rys Farthing, director of data policy at Reset Australia: “The business model comes back to a fundamental tension between profit and privacy – these companies derive their profits by trading user information. By nature, it’s a very extractive industry and there’s no way of getting around that within the current system.”
While algorithms and poor moderation are a cause for concern, they are rooted in the platform’s exploitative nature, which can be traced back to a disregard for privacy. “The loss of privacy itself doesn’t need to be pinned on [these] harms – it should be taken seriously in and of itself,” she adds.
Though the UK’s Online Safety Bill has been lauded as containing measures that will reduce the ‘online harms’ associated with platforms such as TikTok, it has been widely criticised. Although it does include provisions recommending ethical algorithm design, these are non-binding. Accordingly, the NSPCC has called for more stringent accountability mechanisms. According to polls carried out by the organisation, 81% of UK adults want company managers to be appointed and held legally responsible for the prevention of harm to young users.
Whether we like it or not, under surveillance capitalism, teenagers are expected to emerge into adulthood in spaces where they are also fodder for revenues. For Dr Farthing, this is fundamentally unfair. “Young people have a right to be free from this sort of monetization,” she says. Her point feels particularly salient as other social media giants like Instagram start to hop on the bandwagon of non-chronological feeds in what trend agencies have referred to as an industry-wide ‘TikTokification’.
Without a paradigm shift that at least recognises the system that created TikTok’s ecosystem of disempowerment, regulatory intervention can only go so far. If TikTok was meant to be a Hansel and Gretel-esque gingerbread house of wonders, Boo Betts should have been happy, and at minimum safe – not forced to manufacture an escape like her fairytale counterparts.