TikTok is trying to adjust the way its individually curated "For You" pages select content to prevent users from seeing too much of the same thing. The company believes that the type of content it is concerned about is fine in isolation, but could be damaging if viewed excessively.
The news was first reported by the Wall Street Journal, which had previously published a rather damning study on how TikTok's algorithm learns about its users and supplies them with tailored, engaging content that sends them down the rabbit hole.
In that study, bots that the paper had programmed to display depressive tendencies in their interactions with the app were 93% more likely to be shown videos about sadness and depression after just 36 minutes of app use.
In response, the paper reports, TikTok intends to "avoid showing users too much of the same content."
Sure enough, TikTok subsequently published a blog post outlining the planned changes. It is "testing ways to avoid recommending a series of similar content, such as extreme dieting, fitness, grief, or breakups."
"We're also working to recognize if our system may inadvertently be recommending only very limited types of content that, though not violating our policies, could have a negative effect if it makes up a large portion of someone's viewing, such as content about loneliness or weight loss," the company continued. "Our goal is for each person's For You feed to feature a wide range of content, creators, and topics."
While the hope is that this will improve the wellbeing of TikTok users, the algorithm can only go so far. To that end, the company is also working on a feature that would let users block content tied to certain words or hashtags from appearing in their "For You" feed.
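Again, no implementation details have been published. Assuming each video exposes its caption and hashtag list (a hypothetical data shape, not TikTok's API), a per-user blocklist could be as simple as the filter below:

```python
def build_blocklist_filter(blocked_terms):
    """Return a predicate rejecting any video whose hashtags or caption
    words match a user-blocked term (case-insensitive, '#' ignored)."""
    blocked = {term.lstrip("#").lower() for term in blocked_terms}

    def allows(video):
        tags = {tag.lstrip("#").lower() for tag in video.get("hashtags", [])}
        words = set(video.get("caption", "").lower().split())
        return blocked.isdisjoint(tags) and blocked.isdisjoint(words)

    return allows

videos = [
    {"caption": "10 minute ab workout", "hashtags": ["#WeightLoss"]},
    {"caption": "my cat discovers snow", "hashtags": ["#pets"]},
]
keep = build_blocklist_filter(["#weightloss"])
print([v["caption"] for v in videos if keep(v)])  # ['my cat discovers snow']
```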
From YouTube to Facebook, social media companies around the world are under intense scrutiny for the content they surface and the impact their seemingly benign algorithms have on the wellbeing of individuals and society alike. TikTok surpassed one billion users in September, equivalent to almost one-seventh of the world's population.
This problem sounds familiar: just as YouTube's engagement algorithm lengthens watch time by nudging people toward ever more extreme and controversial content, TikTok's success seems to lie in learning the profiles of its users and delivering the kind of content they crave, regardless of whether it is ultimately good for them.
The framing of the change is an interesting one. If the content itself were the problem, stricter moderation would be required and bigger questions would invariably be asked; instead, TikTok is treating this as a quantity issue, not a quality issue.
Ultimately, however, that framing could still prove a sticking point. After all, if one extreme-dieting video is fine but a "cluster" is not, how many can be safely viewed before harm is done: five? 20? 100?
These are seemingly impossible questions, but ones the company must eventually answer. If it fails to answer them satisfactorily, this will not be the last time TikTok is in the news for the wrong reasons.