Popular video-sharing app TikTok said it would adjust its recommendation algorithm to avoid showing users too much of the same content, as social-media platforms globally come under scrutiny for their potential harm to younger users.
TikTok said on Thursday that it is testing ways to avoid pushing too much content from a certain topic, such as extreme dieting, sadness or breakups, to individual users to protect their mental well-being.
The buzzy app, whose monthly user numbers surpassed 1 billion in September, said it was taking such measures to protect against users “viewing too much of a content category that may be fine as a single video but problematic in clusters.”
TikTok, owned by Chinese technology giant ByteDance Ltd., serves up content ranging from viral dance videos to short cooking demonstrations and is wildly popular in the U.S., where it shot to fame during the early days of the pandemic, when many Americans were locked down at home. Since then, U.S. policy makers and their global counterparts have been scrutinizing TikTok and its peers, particularly Meta Platforms Inc.’s Instagram, over data-privacy concerns and the possible psychological damage these platforms may cause to younger users.
In September, The Wall Street Journal published an investigation that illustrated how TikTok’s algorithm could push young users into a rabbit hole of content about sex and drugs when they browsed the app’s For You feed, the highly personalized home page that serves up an endless stream of videos as soon as a user opens the app.
TikTok also said Thursday that it would give people more control over which videos they do or don’t want to see. One measure the app is working on is a feature that would let users pick words or hashtags associated with content they don’t wish to see in their video feed.