TikTok can show teenagers suicide content every two minutes

Users as young as 13, the minimum age TikTok requires, are exposed to troubling content.

TikTok is a social network with a growing presence, especially among the youngest users.

According to the Center for Countering Digital Hate (CCDH), two-thirds of teenagers in the United States spend an average of 80 minutes a day on the ByteDance-owned app.

Unfortunately, the content TikTok shows minors can be deeply harmful, as a study by the organization reveals.

What does TikTok show to minors?

CCDH researchers created new TikTok accounts in the United States, the United Kingdom, Canada, and Australia, all registered at the minimum age TikTok requires: 13 years.

These accounts briefly paused on videos about body image and mental health and gave them a like.

The TikTok algorithm served them unpleasant and worrying surprises:

⚠️ In just 2.6 minutes, TikTok recommended suicide-related content.

⚠️ In 8 minutes, TikTok delivered content related to eating disorders.

⚠️ Every 39 seconds, TikTok recommended videos on body image and mental health to teens.

CCDH researchers also created accounts with a weight-loss-related term in the username; these received 12 times more self-harm and suicide recommendations than the "standard" accounts.

As the organization puts it: "The results are every parent's nightmare: young people are bombarded with harmful and heartbreaking content that can have a significant cumulative impact on their understanding of the world around them, as well as their physical and mental health."