Published: 09 August 2025. The English Chronicle Desk
A sharp rise in online sextortion cases targeting children has triggered mounting concern among international law enforcement agencies and child protection charities, with new figures revealing the alarming scale of the threat in the United Kingdom and beyond.
Recent data shows that tech companies, including Snapchat and Facebook, reported more than 9,600 instances of adults grooming children in the UK in just the first six months of 2024, equivalent to around 370 incidents every week. Globally, the US-based National Center for Missing and Exploited Children (NCMEC) received over 546,000 reports of adults soliciting minors during the same year, a staggering 192% increase compared with 2023.
Within the UK, Snapchat emerged as the platform most frequently associated with concerning material, appearing in about 20,000 reports in the first half of 2024 alone. That broader tally, which takes in sextortion and the sharing of child sexual abuse images, exceeds the combined total from Facebook, Instagram, TikTok, X (formerly Twitter), Google, and Discord over the same period. The company has since adjusted its reporting policies, which could lead to lower future tallies.
The National Society for the Prevention of Cruelty to Children (NSPCC) described the findings as “shocking” and warned that the actual number of victims is likely to be significantly higher, as many children do not report their abuse due to fear, shame, or threats from perpetrators.
The UK’s National Crime Agency (NCA) has responded with what it calls “unprecedented” public awareness campaigns aimed at parents, teachers, and young people, warning them of the dangers posed by sextortion. This form of abuse typically involves predators coercing victims into sending explicit images, which are then used as leverage to demand further images or money, or to force the victim into harmful acts.
NCMEC reports that since 2021, more than three dozen teenage boys worldwide have taken their own lives after falling victim to sextortion. Rani Govender, a child safety policy expert at the NSPCC, said the psychological toll on victims is “devastating,” often eroding their ability to trust others or seek help.
Adding to the growing concern, investigators have discovered a chilling 101-page manual circulating in certain online communities, offering detailed instructions on how to groom, manipulate, and exploit young internet users. The document allegedly advises predators on the best technology, encryption tools, and fake identities to use, with the explicit goal of turning victims into “modern-day slaves” through sexual blackmail.
Authorities allege the guide was created by 20-year-old Baron Martin of Arizona, who was arrested by the FBI in December 2024. Describing himself as “the king of extortion,” Martin reportedly boasted that his manual had been “the catalyst for thousands of extortions.” The US Department of Justice links the document to a network of so-called “Com communities,” which glorify abuse and encourage criminal behaviour among members.
Milo Comerford of the Institute for Strategic Dialogue warned that these toxic online subcultures, combined with the anonymity of the internet, create an environment where predators can operate across borders and evade detection. He stressed the urgent need for “robust multi-agency safeguards” to raise awareness of the risks, particularly among children and the adults responsible for their care.
The FBI has identified multiple organised groups involved in sextortion schemes, often using fake romantic approaches to gain a victim’s trust before demanding compromising content. Once obtained, this material is used to blackmail victims into providing more explicit images, self-harming, or even committing acts of violence or animal cruelty.
As the threat grows, both US and UK authorities are calling for stronger cross-border cooperation, enhanced technological safeguards, and education initiatives aimed at dismantling the predators’ networks before more young lives are lost.