Published: 25 November 2025. The English Chronicle Desk. The English Chronicle Online
Social media platforms are being urged to introduce stricter measures to limit internet “pile-ons” under new guidance aimed at protecting women and girls online. The recommendations, issued by Ofcom, the UK communications regulator, address misogynist abuse, coercive control, and the non-consensual sharing of intimate images, and are intended to provide practical tools for tech companies to safeguard their users.
The guidance comes as Ofcom warns that women in politics, sport, and other public arenas face significant online abuse every day. The regulator’s new measures suggest platforms impose limits on the number of responses a single post can receive, with the goal of reducing so-called pile-ons, in which individuals are deluged with abusive comments or targeted by coordinated harassment campaigns.
Other recommendations include the use of sophisticated technology to protect women and girls from the circulation of intimate images without consent. The guidance specifically encourages the use of “hash-matching” software, which converts reported images into digital fingerprints, allowing platforms to efficiently detect and remove illicit content such as revenge pornography or explicit deepfakes. By cross-referencing these hashes against a central database, social media companies can identify and eliminate harmful material before it spreads widely.
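To illustrate the mechanism described above, the sketch below shows hash-matching in its simplest form. Production systems (such as Microsoft’s PhotoDNA or Meta’s open-source PDQ) use *perceptual* hashes that survive resizing and re-encoding; this example substitutes a plain SHA-256 digest, which only matches byte-identical copies, and the image data and database here are purely hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Convert an image into a digital fingerprint (hash).

    Simplification: SHA-256 matches only exact byte-for-byte copies.
    Real hash-matching tools use perceptual hashes robust to edits.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical central database of fingerprints of reported images,
# shared across participating platforms.
known_abuse_hashes = {fingerprint(b"reported-intimate-image")}

def should_block(upload: bytes) -> bool:
    """Cross-reference an upload's fingerprint against the database."""
    return fingerprint(upload) in known_abuse_hashes

print(should_block(b"reported-intimate-image"))  # a known image is caught
print(should_block(b"an-unrelated-photo"))       # an unknown image passes
```

The design point is that platforms never need to exchange the images themselves, only their fingerprints, which is what makes a shared central database workable for such sensitive material.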
While the Online Safety Act (OSA), under which the recommendations are framed, already requires platforms to mitigate online harm, the new measures go beyond existing legal obligations. Although technically voluntary, Ofcom has indicated that compliance will be closely monitored, with a review of platform implementation set to be published in 2027. If companies fail to take sufficient action, the regulator has warned it could advise the government on potential amendments to the OSA to strengthen enforcement powers.
Dame Melanie Dawes, Ofcom’s chief executive, highlighted the urgent need for action after encountering multiple “shocking” cases of abuse suffered by women and girls online. “We are sending a clear message to tech firms to step up and act in line with our practical industry guidance, to protect their female users against the very real online risks they face today,” she said. “With the continued support of campaigners, advocacy groups and expert partners, we will hold companies to account and set a new standard for women’s and girls’ online safety in the UK.”
The guidance also proposes several preventative measures to limit the opportunities for abuse to proliferate. Among these are prompts to encourage users to pause and reconsider before posting potentially harmful content, as well as “time-outs” for accounts that repeatedly violate platform rules. Other measures include preventing users who post misogynistic or abusive content from profiting from advertising revenue and allowing victims to block or mute multiple accounts simultaneously to protect themselves from harassment at scale.
These recommendations finalise a process that began in February, when Ofcom launched a consultation including some of the hash-matching measures. However, more than a dozen of the recommendations, including setting “rate limits” on posts to prevent mass pile-ons, are entirely new. They reflect the regulator’s increasing concern that voluntary compliance alone may not be sufficient to protect vulnerable users.
Internet Matters, a UK nonprofit dedicated to children’s online safety, welcomed the guidance but warned that many tech companies were likely to ignore it unless it becomes mandatory. The organisation has called on the government to consider turning the hash-matching recommendation and other key measures into statutory obligations to ensure consistent enforcement.
Rachel Huggins, co-chief executive at Internet Matters, emphasised the risks of inaction. “We know that many companies will not adopt the guidance simply because it is not statutory, meaning the unacceptable levels of online harm which women and girls face today will remain high,” she said. Huggins added that without legally binding requirements, social media platforms may prioritise engagement metrics over user safety, leaving victims vulnerable to repeated harassment and abuse.
The guidelines also reflect broader concerns about the psychological impact of online abuse. Pile-ons and coordinated attacks often create an environment of fear and intimidation, deterring women from participating in online discussions, politics, or public debate. By limiting exposure to mass harassment and providing quick mechanisms for victims to defend themselves, Ofcom aims to reduce the chilling effect that pervasive abuse can have on public discourse.
Additionally, the regulator highlighted that the hash-matching system could be applied to emerging threats such as deepfake pornography and digitally altered images, which increasingly target individuals without consent. By encouraging platforms to adopt such technology, Ofcom hopes to establish a proactive approach to content moderation rather than relying solely on reactive takedown processes.
The guidance further recommends that platforms improve reporting systems, making it easier for victims to flag abuse and ensuring timely intervention. Simplifying privacy controls, enabling users to set their accounts to private quickly, and offering better support for affected users are also emphasised as key measures. Together, these steps are intended to create a safer digital environment for women and girls, where abusive behaviour is swiftly identified and addressed.
Ofcom’s recommendations arrive amid growing public scrutiny of tech companies’ responsibilities in moderating content. Critics have argued that social media platforms have historically prioritised user engagement and growth over safety, allowing misogynist abuse, harassment, and revenge pornography to proliferate. The new guidance represents a concerted effort to hold companies accountable and establish industry-wide standards for preventing online harm.
Dame Melanie Dawes concluded by stressing that the regulator’s work will continue beyond issuing the guidelines. “If their action falls short, we will consider making formal recommendations to government on where the Online Safety Act may need to be strengthened,” she said. “This is about sending a clear signal to all platforms that women and girls must be safe online, and there is zero tolerance for harm.”
The success of these measures will ultimately depend on social media companies’ willingness to implement them comprehensively and in good faith. While voluntary guidelines mark an important step forward, advocacy groups maintain that statutory backing and robust enforcement will be crucial to ensure meaningful change.
By addressing both the technological and behavioural aspects of online abuse, Ofcom hopes to set a precedent for responsible platform management and enhance the safety of women and girls navigating the digital world. These guidelines could mark a pivotal moment in reshaping the online landscape, demonstrating that proactive regulation and ethical platform governance can coexist with free expression, while holding perpetrators of abuse accountable.


























































































