Mallory

Regulatory Push to Strengthen Child Online Safety and Age Assurance on Social Platforms

Tags: privacy-surveillance-policy, cybersecurity-regulation, enforcement-action
Updated April 1, 2026 at 12:04 AM · 4 sources


UK regulators Ofcom and the Information Commissioner’s Office (ICO) issued warnings to major social media and video platforms (including Facebook, Instagram, Snapchat, TikTok, and YouTube) demanding “urgent steps” to implement more robust age assurance controls to prevent access by children under 13. Regulators signaled potential enforcement if platforms continue relying primarily on easily bypassed self-declared ages, arguing this enables unlawful collection and use of children’s data and exposes under-13s to services not designed for them; Ofcom requested companies report back on their plans by the end of April.

In the US, the House Energy and Commerce Committee advanced the Kids Internet and Digital Safety (KIDS) Act on a party-line vote amid Democratic objections that the bill could reduce platform accountability for harms to minors. Criticism focused on provisions that could preempt certain state laws, a knowledge standard that opponents argue may let companies claim ignorance of minors' presence on their services, and the absence of a proactive "duty of care" requirement; proposed amendments to strengthen protections were not adopted.

Timeline

  1. Mar 30, 2026

    Italian police arrest teen over alleged school attack plot linked to Telegram

Italian authorities arrested a 17-year-old in Perugia on March 30, 2026, for allegedly planning a Columbine-style school attack linked to Telegram extremist groups. Investigators searched across four regions and identified seven other minors, intensifying scrutiny of platforms' handling of violent content and youth safety in the EU.

  2. Mar 27, 2026

    French minister refers TikTok child-safety concerns to prosecutors

    French Education Minister Edouard Geffray referred TikTok’s recommendation algorithm to prosecutors after a test allegedly showed self-harm and suicide-related content being recommended to accounts presented as minors. The move marked an escalation in French scrutiny of social media child-safety practices.

  3. Mar 12, 2026

    U.K. regulators order platforms to strengthen child age checks

    The U.K. Information Commissioner’s Office and Ofcom warned major social media and online platforms to urgently implement robust age-assurance measures to make it harder for children under 13 to access services. Ofcom required companies to submit their plans by the end of April, while the ICO said it had begun direct engagement with some high-risk services and expected improvements within two months.

  4. Mar 10, 2026

    House committee advances KIDS Act on party-line vote

    The U.S. House Energy and Commerce Committee advanced the Kids Internet and Digital Safety (KIDS) Act on a party-line vote. Committee Democrats opposed the bill, arguing it could weaken platform accountability, preempt some state laws, and lacked a duty-of-care requirement; their proposed amendments were not adopted.

Related Stories

Debate Over Kids Online Safety Act and Age-Verification Requirements for Minors


Policymakers in multiple jurisdictions are advancing **child online safety** rules that would restrict minors’ access to social media, “addictive” product features, and certain content (including pornography), increasing pressure on platforms to implement **age assurance/age verification** to determine users’ ages before allowing access. The Lawfare analysis highlights that while protecting children online is a widely shared goal, enforcing age-based restrictions at scale effectively requires collecting and validating age signals for *all* users—raising significant implementation, privacy, and governance challenges as governments consider measures such as the **Kids Online Safety Act (KOSA)**, the **Kids Off Social Media Act**, and the **App Store Accountability Act**.

Today
UK Considers Social Media Ban and Stronger Age Assurance for Children


The UK government said it is considering restricting or banning social media access for children, with Prime Minister **Keir Starmer** stating that “no option is off the table.” Reported options include improving **age assurance** technology, raising the digital age of consent, imposing phone curfews, and limiting platform design practices associated with compulsive use (including “infinite scrolling”), alongside publishing evidence-based screen-time guidance for parents and tightening school enforcement around phone use. In Parliament, momentum is being driven in part by a proposed amendment to the **Children’s Wellbeing and Schools Bill** that would require regulated user-to-user services to deploy “highly-effective” age assurance measures to prevent under-16s from becoming users, and would also task the UK’s chief medical officers with publishing age-based advice on children’s social media use. UK officials also indicated they plan to engage with Australia to learn from its under-16 social media restrictions, which Australian authorities said led to millions of account deactivations shortly after implementation.

1 month ago
Regulatory Push for Online Age Verification and Adult-Site Access Restrictions


A growing regulatory push to require **online age verification**—particularly for access to pornography and other age-restricted content—is accelerating in the U.S. and U.K., with policymakers framing it as a child-safety measure and critics warning of privacy and free-speech risks. An **FTC** commissioner publicly endorsed age verification as a tool to protect children online, pointing to widespread state-level adoption in the U.S. and noting that court outcomes have been mixed, including a **U.S. Supreme Court** decision upholding a Texas law requiring pornography sites to verify users’ ages. In the U.K., the **Online Safety Act (OSA)** is driving direct service changes: **Aylo** (parent company of Pornhub and other tube sites) said it will **restrict access in the United Kingdom** rather than implement the OSA’s age-checking approach for all visitors, while allowing continued access for users who have already verified their identity. Aylo argued the framework diverts traffic to unregulated sites and creates privacy risks, while **Ofcom** countered that services can either implement compliant age checks or block U.K. access and urged development of effective device-level solutions.

1 month ago
