Source: Amnesty International
Four young digital rights activists from Ireland, Argentina and France will today deliver a petition to TikTok's Dublin office, demanding that the company address the toxic, addictive design that has exposed children and young people to harmful content.
The petition, titled "Make TikTok safer for children and young people," has gathered 170,260 signatures from around the world and will be handed over by Mary Kate Harten and Trinity Kendi from Ireland, Abril Perazzini from Argentina and Noe Hamon from France.
The petition highlights harms linked to platform features that prioritize engagement over user safety.
“These signatures represent a global demand for TikTok to replace its current business model of an app that is addictive by design with one that is safe by design. Its toxic design has caused harm to children in many parts of the world,” said Zahra Asif Razvi, Campaigner at Amnesty International.
“TikTok must make its platform safe for children and young people to socialize, learn and access information and not be harmed.”
Amnesty International’s research shows that TikTok’s business model prioritizes engagement to keep users hooked and relies on extensive data collection to serve targeted advertising.
Amnesty International has also repeatedly found that TikTok’s ‘For You’ feed can push children and young people into a cycle of depression, self-harm and suicide-related content. Young people in France interviewed for recent research by Amnesty International reported streams of videos that normalized and encouraged self-harm and suicide after they engaged with mental health-related content. Parents of children who died by suicide described the horror of discovering the content TikTok had been pushing to their children.
In 2023, Amnesty International published two reports documenting how TikTok’s recommender system and its invasive data collection practices amplified depressive and suicidal content, putting young users of the platform with pre-existing mental health challenges at greater risk. Despite risk mitigation measures announced by TikTok since 2024, the platform continues to expose vulnerable users to content that normalizes self-harm, despair and suicidal thoughts.
