
Why the TikTok Algorithm might be Pretty Problematic

There is a darker side to the videos popping up on your ‘For You’ page, including bias based on race, body type, surroundings and potentially political affiliation

✏️ Grace Couch

@GCouch99

If you’re not on TikTok already, you either think you’re too cool or haven’t hit peak isolation boredom. All jokes aside, the social media platform is blowing up worldwide, with over 1 billion downloads in its first year and 800 million active users (Oberlo 2020).

From your classic ‘funny cat videos’ to viral dance challenges, lip-syncing comedy to life hacks, every kind of short video can achieve massive success very quickly.

TikToks have become a source of entertainment for all ages, despite the app originally capturing the teen market.

However, there is a darker side to the videos popping up on your ‘For You’ page, including bias based on race, body type, surroundings and potentially political affiliation.

With algorithms running the show, we must understand the social biases they reinforce and how social media is influencing our lives.

Algorithms are the computer programs within applications that process data, such as the posts on your Facebook feed. Algorithms are designed by humans; however, they don’t adapt their responses to different content the way humans do – they simply follow their programmed rules.

The Chinese-created social video platform arguably owes much of its success to its own algorithm, which can quickly propel a video to viral status. This is triggered by engagement – if a video suddenly receives 20% more ‘Likes’ in a single day, it will be pushed to more users (Hypebot 2019).
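To make the reported trigger concrete, here is a minimal sketch of what an engagement-based promotion rule like the one described above could look like. This is purely illustrative – the function name, the threshold, and the logic are assumptions based on the Hypebot figure, not TikTok’s actual code.

```python
# Hypothetical sketch of an engagement-based promotion rule, based on the
# reported "20% more Likes in a day" trigger (Hypebot 2019).
# Names and thresholds are illustrative, not TikTok's real system.

def should_promote(likes_yesterday, likes_today, growth_threshold=0.20):
    """Push a video to more users if its daily Likes grew past the threshold."""
    if likes_yesterday == 0:
        # A brand-new video with any engagement counts as sudden growth.
        return likes_today > 0
    growth = (likes_today - likes_yesterday) / likes_yesterday
    return growth >= growth_threshold

# A video jumping from 1,000 to 1,250 Likes (25% growth) gets promoted;
# one growing only 10% does not.
print(should_promote(1000, 1250))  # True
print(should_promote(1000, 1100))  # False
```

Note how blunt such a rule is: it reacts only to raw engagement numbers, with no sense of *why* a video is suddenly popular – which is exactly why biases in who gets engagement feed straight through to who gets promoted.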

Studies and articles have begun to emerge, questioning how the TikTok algorithm may be filtering which profiles you’re suggested to follow or which videos pop up on your feed.

Leaked documents have suggested that TikTok has policies limiting the exposure of content creators based on their physical appearances and the quality of their surroundings.

The physical appearances ‘flagged’ include: ‘“abnormal body shape,” being “chubby,” “obese,” or “too thin,” missing teeth, the presence of “obvious facial scars” and being an older person with “too many wrinkles”’ (Yahoo 2020).

This is concerning since social media is already criticised for creating an unrealistic perception of reality.

We all hear about how seeing only ‘model-like’ influencers affects the mental health of young people, so it is concerning to know that new apps like TikTok are carrying on this trope. 

Further evidence for the success of affluent content creators on TikTok is ‘The Hype House’: a collection of 20 of the biggest TikTok stars in the world who all film in a beautiful mansion in LA (Forbes 2020). Few of them actually live there permanently but they use the mansion as the backdrop to film their videos. Many of their viewing figures come from filming with the other successful TikTokers that ‘live’ there.

But the idea that the algorithm favours those filming in more appealing surroundings is dangerous, as it may prevent worse-off creators from reaching the same financial success as these users.

TikTok has reportedly defended the above policies, saying they were “an early blunt attempt at preventing bullying” and are no longer in use (Yahoo 2020).

An equally alarming aspect of the TikTok algorithm was discovered by artificial intelligence researcher Marc Faddoul. In his experiment, he found that TikTok recommends accounts based on those you are already following (so far so good – this is the same on many social media platforms).

However, what was interesting was that the physical characteristics of the person’s profile were mimicked in the recommendations, including racial and cultural features (Forbes 2020).

The reason this is so concerning is that if the majority of TikTok users are white and they subconsciously favour TikTokers with similar features to themselves, this can prevent creators of colour with smaller followings from being seen and recommended on the platform (New York Times 2019).
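The feedback loop described above can be sketched in a few lines. This is an illustrative toy recommender, not Faddoul’s experiment or TikTok’s real system – the feature names and ranking rule are assumptions made to show how similarity-based recommendations sideline creators outside the majority group.

```python
# Toy sketch (not TikTok's real system) of why "recommend similar profiles"
# creates a filter bubble: candidates sharing features with already-followed
# accounts rank highest, so dissimilar creators are rarely surfaced.

def recommend(followed, candidates, k=3):
    """Rank candidate creators by feature overlap with followed accounts."""
    features = ("race", "style", "setting")  # illustrative features only

    def overlap(candidate):
        # Count shared feature values across all followed accounts.
        return sum(1 for f in followed for key in features
                   if candidate.get(key) == f.get(key))

    ranked = sorted(candidates, key=overlap, reverse=True)
    return [c["name"] for c in ranked[:k]]

followed = [{"name": "a", "race": "white", "style": "dance", "setting": "mansion"}]
candidates = [
    {"name": "b", "race": "white", "style": "dance", "setting": "mansion"},
    {"name": "c", "race": "black", "style": "dance", "setting": "bedroom"},
]
print(recommend(followed, candidates, k=1))  # ['b'] – the similar profile wins
```

Creator “c” makes the same kind of videos but loses the ranking purely on profile similarity – a small per-recommendation bias that compounds into large gaps in visibility over millions of users.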

Similar to the issue of body type or affluent surroundings, race-based preference in social media can lead to real-life inequality of opportunity for content creators. 

Although this is based on less solid evidence, there are claims that pro-Trump content is doing particularly well on TikTok, despite the app being specifically designed to discourage news-sharing and its ban on political advertisements (Vox 2020).

User reports suggest that #Trump2020 has flooded the platform. TikTok confirmed that it has seen an increase in political content, though it added that the platform has seen an increase in all kinds of videos (Quartz 2019).

#Trump2020 was reaching viewing figures of up to 128 million in October 2019, whereas #Bernie2020 paled in comparison at 6 million (more up-to-date figures would give us better insight).

Whether this is due to support from a biased algorithm, or just that Trump supporters are finding a new outlet in TikTok is up for debate. But it is important to note how new social media platforms are affecting the real world, such as in political campaigns.

We must adjust these algorithms sooner rather than later. Changing algorithms seems easier than changing people, but algorithms are vulnerable to their users’ unconscious biases – left unchecked, they reinforce our racial prejudices.

This is all happening at a time when there’s debate over how necessary it even is to regulate social media. TikTok has a large number of young users, and the workings of its algorithms may be shaping their socio-political outlook – not just their dance moves. 

Thanks for reading our article! We know young people’s opinions matter and really appreciate everyone who reads us.

Give us a follow on Instagram, Twitter and Facebook to stay up to date with what young people think.


