Published: 2 March 2026
The English Chronicle Desk
The English Chronicle Online
The UK government has announced the start of a formal public consultation on proposals to ban social media use for children under the age of 16, aimed at addressing growing concerns over young people's mental health, exposure to online harms and data safety. The consultation, which will gather input from parents, educators, industry experts and platform operators, is expected to shape legislation that could transform how social media platforms operate in the United Kingdom.
Under the proposed plans, leading social media companies would be required to prevent anyone under 16 from accessing their services without robust age verification, or to make platforms off‑limits to that age group entirely. Government officials argue that the ubiquity of platforms such as TikTok, Instagram and Snapchat has contributed to rising levels of anxiety, self‑harm, cyberbullying and other harms among teenagers, and that stronger age‑based protections are overdue.
Children’s charities and health professionals have repeatedly warned that persistent exposure to social media can exacerbate mental‑health problems, disrupt sleep patterns and encourage unhealthy social comparisons. A Department for Science, Innovation and Technology spokesperson said the consultation is a first step in developing legislation that will “ensure children’s safety online while balancing freedom and innovation.”
The consultation will run for 12 weeks and invite responses from a wide range of stakeholders, including tech companies, parents, researchers and civil liberties groups. It will focus on a number of key questions, such as how age verification systems could be implemented, whether exceptions should be made for educational platforms, and how compliance would be enforced. The government says it plans to publish a white paper later in the year outlining draft regulations based on the feedback received.
Social media companies themselves have expressed mixed reactions. Some operators have welcomed engagement with policymakers and say they already invest in child‑safety features, age‑gating and parental controls. Others have raised concerns that a blanket ban could push younger users toward unregulated or unsafe parts of the internet, including anonymous chat apps and virtual spaces where moderation is limited.
Digital rights advocates have also weighed in, stressing that any policy must balance online safety with freedom of expression and avoid overly intrusive verification methods that could compromise privacy. Experts in internet law caution that age‑checking technologies, such as facial recognition or government‑ID checks, would need careful design to avoid discrimination or misuse of personal data.
Opposition politicians have broadly supported the consultation’s aim but urged clarity on enforcement. Some have questioned whether existing laws and regulatory agencies, such as the Information Commissioner’s Office (ICO), have sufficient resources to monitor compliance and protect user rights once new rules are in place. Others argue for greater emphasis on digital literacy education for children and parents to accompany any regulatory changes.
Youth charities have largely welcomed the initiative but also called for broader measures, including limits on addictive design features and better access to mental‑health support for young people affected by online harms. Organisations representing educators say schools should be included in the conversation, given the role social media plays in students’ daily lives and wellbeing.
If legislation ultimately follows the consultation, the UK could become one of the first major jurisdictions to impose a broad age‑based restriction on mainstream social media platforms — a development likely to influence international debates around digital safety and children’s rights online.