UK Considers Social Media Ban and Stronger Age Assurance for Children
The UK government said it is considering restricting or banning social media access for children, with Prime Minister Keir Starmer stating that “no option is off the table.” Reported options include improving age assurance technology, raising the digital age of consent, imposing phone curfews, and limiting platform design practices associated with compulsive use (including “infinite scrolling”). Ministers are also weighing publishing evidence-based screen-time guidance for parents and tightening school enforcement around phone use.
In Parliament, momentum is being driven in part by a proposed amendment to the Children’s Wellbeing and Schools Bill that would require regulated user-to-user services to deploy “highly-effective” age assurance measures to prevent under-16s from becoming users, and would also task the UK’s chief medical officers with publishing age-based advice on children’s social media use. UK officials also indicated they plan to engage with Australia to learn from its under-16 social media restrictions, which Australian authorities said led to millions of account deactivations shortly after implementation.
Timeline
Jan 22, 2026
House of Lords votes to add under-16 social media ban to bill
Britain's House of Lords voted 261 to 150 to amend the Children's Wellbeing and Schools Bill so that children under 16 would be banned from accessing social media within a year. The amendment also requires the UK's chief medical officers to publish guidance for parents on social media's effects at different developmental stages.
Jan 20, 2026
UK government says it is considering child social media restrictions
The UK government said it is considering measures that could effectively ban children from social media, including stronger age assurance requirements, raising the digital age of consent, phone curfews, and limits on addictive platform features. Prime Minister Keir Starmer said “no option is off the table” as ministers also prepared to study Australia's approach.
Jan 19, 2026
Labour MPs publish open letter backing under-16 social media ban
A group of 61 Labour backbench MPs published an open letter urging the UK government to support a ban on social media for under-16s. They argued responsibility should rest with technology platforms rather than parents and said the proposal had strong public support.
Dec 1, 2025
Australia enforces under-16 social media account ban
Australia's ban on social media accounts for children under 16 came into force. UK officials later cited it as a model, saying the first week led to millions of account deactivations, while acknowledging the rules could be circumvented.
Related Stories

Regulatory Push to Strengthen Child Online Safety and Age Assurance on Social Platforms
UK regulators **Ofcom** and the **Information Commissioner’s Office (ICO)** issued warnings to major social media and video platforms (including **Facebook, Instagram, Snapchat, TikTok, and YouTube**) demanding “urgent steps” to implement more robust **age assurance** controls to prevent access by children under 13. Regulators signaled potential enforcement if platforms continue relying primarily on easily bypassed self-declared ages, arguing this enables unlawful collection and use of children’s data and exposes under-13s to services not designed for them; Ofcom requested companies report back on their plans by the end of April. In the US, the House Energy and Commerce Committee advanced the **Kids Internet and Digital Safety (KIDS) Act** on a party-line vote amid Democratic objections that the bill could reduce platform accountability for harms to minors. Criticisms focused on provisions that could **preempt certain state laws**, a **knowledge requirement** that opponents argue may let companies claim ignorance of minors’ presence, and the lack of a proactive **“duty of care”** requirement; proposed amendments to strengthen protections were not adopted.
1 month ago
European Governments Move to Restrict Social Media Use by Minors
The **European Commission** issued preliminary findings that **TikTok’s product design**—including *infinite scroll*, *autoplay*, *push notifications*, and *personalized recommendations*—may breach the EU **Digital Services Act (DSA)** by failing to adequately assess and mitigate risks to users’ physical and mental well-being, particularly for **minors and vulnerable users**. If confirmed, the Commission said the violations could result in penalties of up to **6% of TikTok’s global annual turnover**, and it signaled expected design changes such as **screen-time breaks**, adjustments to recommendation systems, and disabling or reducing features deemed to drive compulsive use. Separately, **Spain** announced plans to **ban social media access for children under 16** and require **age verification** by platforms, aligning with a broader European trend toward statutory restrictions on minors’ social media use. The announcement follows similar initiatives, including Australia’s under-16 restriction (cited as precedent) and European moves such as the Netherlands’ push to bar under-15s, French legislation targeting under-14s, and the UK’s study of a ban for children 15 and under—indicating accelerating regulatory pressure on platforms to implement enforceable child-safety and access controls.
1 month ago
Government Pushes for Age Verification and Content Controls on Digital Platforms
The UK government is urging major technology companies such as Apple and Google to implement nudity-blocking systems on mobile devices, aiming to protect minors by requiring adult users to verify their age before accessing or sharing explicit images. This initiative, which currently stops short of a legal mandate, would leverage nudity-detection algorithms at the operating system level and could be expanded to desktop platforms in the future. The proposed measures would also require child sex offenders to keep such blockers enabled, reflecting a broader governmental effort to enforce age-appropriate content controls across digital ecosystems. Simultaneously, experts are raising concerns about the effectiveness of current age verification technologies, particularly those relying on consumer-grade cameras and AI-powered facial recognition. Research highlights that these systems may provide a false sense of security, as they are susceptible to spoofing and may not reliably authenticate minors. The debate underscores the technical and policy challenges in balancing child safety, privacy, and the practical limitations of available authentication methods on popular platforms like Roblox and other social media or gaming services.
1 month ago