Published: 19 August 2025. The English Chronicle Desk
Recent research has raised serious concerns about TikTok and Instagram, accusing both platforms of exposing teenagers to suicide and self-harm material at an increased rate compared to two years ago. The investigation was commissioned by the Molly Rose Foundation, established by Ian Russell after his 14-year-old daughter Molly tragically took her own life following exposure to harmful content online.
The charity analysed hundreds of posts on both platforms using the account of a 15-year-old girl in the UK, revealing that algorithms consistently recommended videos containing content related to suicide, self-harm, and severe depression to minors who had previously interacted with similar material. Disturbingly, one in ten of these posts had received over a million likes, with the average engagement reaching 226,000 likes per post.
Mr. Russell described the findings as “horrifying,” highlighting the failure of current online safety legislation to protect vulnerable users. He emphasized that eight years after Molly’s death, content capable of influencing suicidal ideation and self-harming behaviours remains widespread on social media. He called for stronger, life-saving legislation to ensure such preventable harm is mitigated.
Researchers at Bright Data examined 300 Instagram Reels and 242 TikTok posts to determine whether the content promoted or glorified self-harm and suicide, referenced methods, or conveyed extreme feelings of hopelessness and despair. On Instagram, 97% of the videos recommended to the teenage account were deemed harmful, with 44% explicitly referencing suicide or self-harm. On TikTok, 96% of recommended content was found to be harmful, with 55% of “For You” posts directly related to suicide or self-harm. The analysis also revealed an increase since 2023 in problematic hashtags and curated playlists promoting risky behaviours.
Both platforms responded to the allegations. Meta, which owns Instagram, defended its safety measures, citing Teen Accounts designed to limit harmful exposure, automated removal of dangerous content, and proactive interventions on 99% of flagged material. TikTok, in turn, highlighted its 50+ safety features for teen accounts and its Family Pairing parental controls, stating that over 99% of violative content is removed before it is reported by users. Both companies insist they do not permit content promoting suicide or self-harm, and say they direct users to appropriate support resources.
The Molly Rose Foundation insists, however, that these measures are insufficient, as harmful content continues to reach young users, demonstrating the urgent need for stronger enforcement and legislative action to prevent further tragedies.



























































































