
TikTok boosts posts about eating disorders, suicide, says report

AP – TikTok’s algorithms are promoting videos about self-harm and eating disorders to vulnerable teens, according to a report published on Wednesday that highlights concerns about social media and its impact on youth mental health.

Researchers at the non-profit Center for Countering Digital Hate created TikTok accounts for fictional teen personas in the United States (US), United Kingdom (UK), Canada and Australia.

The researchers operating the accounts then “liked” videos about self-harm and eating disorders to see how TikTok’s algorithm would respond.

Within minutes, the wildly popular platform was recommending videos about losing weight and self-harm, including ones featuring pictures of models and idealised body types, images of razor blades and discussions of suicide.

When the researchers created accounts with user names that suggested a particular vulnerability to eating disorders – names that included the words “lose weight”, for example – the accounts were fed even more harmful content.

“It’s like being stuck in a hall of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said the centre’s CEO Imran Ahmed, whose organisation has offices in the US and UK.

“It is pumping the most dangerous possible messages to young people.”

The TikTok logo is seen on a cell phone. PHOTO: AP

Social media algorithms work by identifying topics and content of interest to a user, who is then sent more of the same as a way to maximise their time on the site.

But social media critics said the same algorithms that promote content about a particular sports team, hobby or dance craze can send users down a rabbit hole of harmful content.
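The dynamic the critics describe is, at its core, a feedback loop: engagement signals boost similar content, which in turn invites more engagement. The following minimal Python sketch is purely illustrative of that general idea – the topic names and weighting scheme are assumptions for the example, not a description of TikTok’s actual system.

```python
import random
from collections import Counter

# Hypothetical illustration of an engagement-driven recommender:
# the more a user "likes" a topic, the more of that topic they are shown.
# This is NOT TikTok's algorithm, only a sketch of the general feedback loop.

CATALOG = {
    "sports": ["game highlights", "team news", "trick shots"],
    "dance": ["dance craze", "tutorial", "duet"],
    "dieting": ["weight-loss tips", "what I eat in a day", "body 'goals'"],
}

def recommend(like_history, n=5):
    """Weight topics by past likes, so liked topics come to dominate the feed."""
    likes = Counter(like_history)
    topics = list(CATALOG)
    # Every topic keeps a small baseline weight; liked topics get boosted heavily.
    scores = [1 + 5 * likes[t] for t in topics]
    picks = random.choices(topics, weights=scores, k=n)
    return [(t, random.choice(CATALOG[t])) for t in picks]

# A user who likes a single "dieting" video quickly sees a feed skewed toward it.
print(recommend(like_history=["dieting"]))
```

In this toy version, one “like” is enough to tilt the sampling weights, which mirrors the rabbit-hole effect described above: the recommender has no notion of whether the reinforced topic is a dance craze or harmful content.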

It’s a particular problem for teens and children, who tend to spend more time online and are more vulnerable to bullying, peer pressure or negative content about eating disorders or suicide, said Josh Golin, executive director of Fairplay, a non-profit that supports greater online protections for children.

He added that TikTok is not the only platform failing to protect young users from harmful content and aggressive data collection.

“All of these harms are linked to the business model,” Golin said. “It doesn’t make any difference what the social media platform is.”

In a statement from a company spokesperson, TikTok disputed the findings, saying the researchers did not use the platform like typical users and that the results were therefore skewed.

The company also said a user’s account name shouldn’t affect the kind of content the user receives.
