TikTok’s recommended feed, called “For You”, is one of the defining features of the platform and, with around 2 billion app downloads, significant real estate for advertisers. In a blog post last week, the company revealed for the first time how its recommendation system works, and some of the challenges it has to address to make sure that users don’t end up seeing only similar videos.

The bottom line: When a new video is uploaded to TikTok, the platform first shows it to a small set of users it has already determined are likely to engage with it. Based on their actions on the video (sharing it, watching it in full), TikTok then shows it to more people who it thinks share similar interests. If enough people engage with a piece of content favourably, the video can go viral. This explains why TikTok’s For You feed often surfaces videos with few likes and views alongside highly viral ones.
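TikTok’s post doesn’t describe the mechanics of this staged rollout, but the behaviour it outlines can be pictured as a simple expanding-exposure loop. The Python sketch below is purely illustrative: the batch sizes, threshold, and the way engagement is scored are assumptions, not details TikTok has published.

```python
import random

def engagement_score(user_interests, video_tags):
    """Toy proxy for engagement: fraction of the video's tags the user cares about."""
    return len(user_interests & video_tags) / max(len(video_tags), 1)

def seed_and_expand(video_tags, users, batch_size=300, threshold=0.3, max_rounds=10):
    """Hypothetical staged rollout: show the video to a small batch of likely
    engagers, then keep widening exposure only while engagement stays high."""
    # Rank all users by predicted interest and seed with the top batch.
    ranked = sorted(users, key=lambda u: engagement_score(u["interests"], video_tags), reverse=True)
    exposed = ranked[:batch_size]

    for _ in range(max_rounds):
        rates = [engagement_score(u["interests"], video_tags) for u in exposed[-batch_size:]]
        if sum(rates) / len(rates) < threshold:
            break                          # engagement too low: the video stays niche
        batch_size *= 4                    # engagement high enough: widen exposure
        next_batch = ranked[len(exposed):len(exposed) + batch_size]
        if not next_batch:
            break                          # everyone has seen it: the video "went viral"
        exposed += next_batch
    return len(exposed)

# Example: 10,000 users with random interest sets, one video tagged #pets and #comedy
users = [{"interests": set(random.sample(["pets", "comedy", "dance", "food", "diy"], k=2))}
         for _ in range(10_000)]
print(seed_and_expand({"pets", "comedy"}, users))
```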

How does TikTok determine who is likely to engage with a video?

TikTok’s algorithm recommends content by ranking videos based on a combination of factors, including user interests, song clips, hashtags, and language preference, among other things. This explains why there are sound “trends” on the platform, and why so many creators rush to make videos on a particular sound. “A strong indicator of interest, such as whether a user finishes watching a longer video from beginning to end, would receive greater weight than a weak indicator, such as whether the video’s viewer and creator are both in the same country,” TikTok said. (This is why my For You feed is populated by videos from Zach King and others who make optical illusion videos.)
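TikTok doesn’t publish its features or weights, but the weighting it describes, where a strong signal such as finishing a long video counts for more than a weak one such as a shared country, can be pictured as a simple linear score. The feature names and numbers below are assumptions for illustration only.

```python
# Illustrative weights: strong interest indicators dominate weak ones.
# These numbers and feature names are assumptions, not TikTok's real values.
WEIGHTS = {
    "watched_to_end":    5.0,   # strong indicator of interest
    "shared":            4.0,
    "liked":             2.0,
    "matches_interests": 2.0,   # overlaps hashtags/sounds the user engages with
    "same_language":     1.0,
    "same_country":      0.5,   # weak indicator of interest
}

def rank_score(signals: dict) -> float:
    """Linear combination of per-video signals predicted for one user."""
    return sum(weight for name, weight in WEIGHTS.items() if signals.get(name))

candidate_videos = {
    "optical_illusion_clip": {"watched_to_end": True, "matches_interests": True},
    "local_news_clip":       {"same_country": True, "same_language": True},
}
# Higher score => shown earlier in the For You feed
for video, signals in sorted(candidate_videos.items(), key=lambda kv: rank_score(kv[1]), reverse=True):
    print(video, rank_score(signals))
```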

TikTok also asks new users to select categories of videos that they would like to watch, which it uses to determine their preferences. For users who don’t select any category while signing up, the app starts by offering a “generalised feed of popular videos”, and determines their preferences based on their first set of likes, comments, and replays. TikTok also ascertains a user’s preferences from the accounts they follow and the hashtags, sounds, effects, and trending topics they search for.
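The post doesn’t say how these early signals are combined. One way to picture it, purely as an assumption, is an interest profile seeded by the categories picked at sign-up (empty if that step is skipped) and nudged by each like, comment, and replay; the signal weights below are made up for illustration.

```python
from collections import defaultdict

# Assumed signal strengths; the real system's values are not public.
SIGNAL_WEIGHTS = {"like": 1.0, "comment": 2.0, "replay": 3.0, "follow": 4.0}

def build_profile(signup_categories, interactions):
    """Hypothetical cold-start profile: seed with sign-up picks (if any),
    then accumulate weight for the category of each early interaction."""
    profile = defaultdict(float)
    for category in signup_categories:            # empty if the user skipped this step
        profile[category] += 5.0
    for action, video_category in interactions:   # e.g. ("like", "comedy")
        profile[video_category] += SIGNAL_WEIGHTS.get(action, 0.0)
    return dict(sorted(profile.items(), key=lambda kv: kv[1], reverse=True))

# A user who skipped category selection and starts interacting with the popular feed
print(build_profile([], [("like", "comedy"), ("replay", "comedy"), ("comment", "pets")]))
```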

It also revealed that the number of followers a creator has doesn’t directly determine how much engagement their videos will receive. “While a video is likely to receive more views if posted by an account that has more followers, by virtue of that account having built up a larger follower base, neither follower count nor whether the account has had previous high-performing videos are direct factors in the recommendation system,” it said.

‘Filter bubble’: The problems with a personalised recommendation algorithm

TikTok acknowledged that a big blind spot of highly personalised recommendation systems is content homogeneity. “One of the inherent challenges with recommendation engines is that they can inadvertently limit your experience – what is sometimes referred to as a ‘filter bubble’”, TikTok said, adding that by optimising for personalisation and relevance, it runs the risk of presenting an increasingly homogenous stream of videos. To tackle this, TikTok ensures that it doesn’t show two videos in a row made with the same sound or by the same creator. It also doesn’t recommend duplicated content, content users have already seen, or any content that’s considered “spam”.
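TikTok hasn’t said how these diversity rules are enforced, but the constraints it lists can be sketched as a post-ranking pass over an already-ranked candidate list. The function below is a hypothetical illustration of those rules, not TikTok’s implementation.

```python
def diversify_feed(ranked_videos, seen_ids):
    """Hypothetical post-ranking pass over ranked candidates: skip duplicates and
    already-seen videos, and never place two videos with the same creator or the
    same sound back to back."""
    feed, last_creator, last_sound, used_ids = [], None, None, set(seen_ids)
    deferred = []                                  # held back to retry later in the feed
    for video in ranked_videos:
        if video["id"] in used_ids:
            continue                               # duplicate or previously watched
        if video["creator"] == last_creator or video["sound"] == last_sound:
            deferred.append(video)                 # would repeat back to back: try again later
            continue
        feed.append(video)
        used_ids.add(video["id"])
        last_creator, last_sound = video["creator"], video["sound"]
        # Re-check deferred videos now that the immediate neighbour has changed.
        still_deferred = []
        for d in deferred:
            if d["creator"] != last_creator and d["sound"] != last_sound:
                feed.append(d)
                used_ids.add(d["id"])
                last_creator, last_sound = d["creator"], d["sound"]
            else:
                still_deferred.append(d)
        deferred = still_deferred
    return feed

candidates = [
    {"id": 1, "creator": "a", "sound": "s1"},
    {"id": 2, "creator": "a", "sound": "s2"},   # same creator as #1: deferred
    {"id": 3, "creator": "b", "sound": "s1"},   # same sound as #1: deferred
    {"id": 4, "creator": "c", "sound": "s3"},
]
print([v["id"] for v in diversify_feed(candidates, seen_ids=set())])  # -> [1, 4, 2, 3]
```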

TikTok’s content moderation problems

TikTok claimed that it doesn’t recommend dangerous content, such as graphic medical procedures or the legal consumption of regulated goods, as such videos may be “shocking if surfaced as a recommended video” to a “general audience” that hasn’t opted into such content. However, TikTok’s content moderation has been criticised for censoring content that is critical of the Chinese government, and even pro-LGBTQ content in countries where homosexuality has never been illegal. At the same time, it has failed to take down content promoting animal cruelty and violence against women until there is significant public outcry.

TikTok came under fire in India just last month after videos promoting animal cruelty went viral on the platform; it took them down only after several users complained. Similarly, a popular Indian creator’s video allegedly promoting acid attacks on women was taken down only after it drew the ire of the National Commission for Women. TikTok was even banned in India for 20 days last year after child pornography was published on the platform.

TikTok removed a video by popular creator Nazma Aapi that was critical of China, mentioning its handling of the coronavirus and the standoff along the Line of Actual Control. The platform reinstated the video only after it was criticised for removing an “anti-China” video (several lawmakers in India have questioned TikTok’s close relationship with the Chinese government). This wasn’t the first such incident either: TikTok has been found to censor content that mentions Tiananmen Square and Tibetan independence, among other topics. The platform also briefly took down a video by an Asian-American woman last year in which she decried China’s treatment of the Uighur Muslims.

Internal training documents from TikTok, revealed earlier this year, showed that it directed moderators to suppress videos from people deemed too “ugly”, “poor”, or “disabled”. TikTok has also been found to ban content that could be seen as positive towards gay people or gay rights, including videos of same-sex couples holding hands, even in countries where homosexuality has never been illegal.