France: TikTok still steering vulnerable children and young people towards depressive and suicidal content 

Source: Amnesty International

The report contains sensitive content including references to self-harm and suicide    

New Amnesty International research has found that TikTok’s ‘For You’ feed is pushing French children and young people engaging with mental health content into a cycle of depression, self-harm and suicide content. 

The research, Dragged into the Rabbit Hole, highlights TikTok’s ongoing failure to address its systemic design risks affecting children and young people. 

“Our technical research shows how quickly teenagers who express an interest in mental health-related content can be drawn into toxic rabbit holes. Within just three to four hours of engaging with TikTok’s ‘For You’ feed, teenage test accounts were exposed to videos that romanticized suicide or showed young people expressing intentions to end their lives, including information on suicide methods,” said Lisa Dittmer, Amnesty International’s Researcher on Children and Young People’s Digital Rights.  

“The testimonies of young people and bereaved parents in France reveal how TikTok normalized and exacerbated self-harm and suicidal ideation up to the point of recommending content on ‘suicide challenges’.”

Lisa Dittmer, Amnesty International’s Researcher on Children and Young People’s Digital Rights

TikTok’s ‘For You’ feed is a personalized stream of short videos that recommends content based on each user’s viewing behavior.

Amnesty International researchers set up three teen accounts, two female, one male, registered as 13-year-olds based in France to manually examine the algorithmic amplification of content in TikTok’s ‘For You’ feed. Within five minutes of scrolling and before signaling any preferences, the accounts encountered videos about sadness or disillusionment. 

Watching these videos rapidly increased the amount of content related to sadness and mental health. Within 15 to 20 minutes of starting the experiment, all three feeds were almost exclusively filled with videos related to mental health, with up to half containing depressive content. Two accounts had videos expressing suicidal thoughts within 45 minutes. 

Additional experiments were conducted with the Algorithmic Transparency Institute using automated test accounts registered as 13-year-olds in France. These found that TikTok’s recommender system more than doubled the share of sad or depressive content it recommended once the accounts’ watch histories included varying amounts of such videos.

The research was conducted in France, where TikTok is regulated under the European Union’s Digital Services Act (DSA), which since 2023 has required platforms to identify and mitigate systemic risks to children’s rights.

French lawmakers are currently debating gaps in social media regulation, and this research adds to Amnesty International’s prior evidence that TikTok has not addressed systemic risks tied to its engagement‑based business model.