European Governments Move to Restrict Social Media Use by Minors
The European Commission issued preliminary findings that TikTok’s product design—including infinite scroll, autoplay, push notifications, and personalized recommendations—may breach the EU Digital Services Act (DSA) by failing to adequately assess and mitigate risks to users’ physical and mental well-being, particularly for minors and vulnerable users. The Commission said that, if the findings are confirmed, the violations could result in penalties of up to 6% of TikTok’s global annual turnover, and it signaled expected design changes such as screen-time breaks, adjustments to recommendation systems, and the disabling or scaling back of features deemed to drive compulsive use.
Separately, Spain announced plans to ban social media access for children under 16 and require age verification by platforms, aligning with a broader European trend toward statutory restrictions on minors’ social media use. The announcement follows similar initiatives across Europe, including Australia’s under-16 restriction (cited as precedent), the Netherlands’ push to bar under-15s, French legislation targeting under-14s, and the UK studying a ban for children 15 and under—indicating accelerating regulatory pressure on platforms to implement enforceable child-safety and access controls.
Timeline
Feb 6, 2026
TikTok rejects EU allegations and prepares defense
TikTok called the Commission's characterization of its platform "categorically false" and said it would challenge the preliminary findings. The company was given the opportunity to review the case file and submit a written response before any final decision.
Feb 6, 2026
EU issues preliminary DSA findings against TikTok over addictive design
The European Commission notified TikTok of preliminary findings that its design features, including infinite scroll, autoplay, push notifications, and recommender systems, likely violate the DSA by failing to adequately protect users' well-being, particularly that of minors. The Commission said TikTok could face fines of up to 6% of global annual turnover and may need to change core product design if the findings are confirmed.
Feb 3, 2026
Spain announces plan to ban social media for children under 16
Spanish Prime Minister Pedro Sánchez announced plans to ban children under 16 from accessing social media and to require platforms to implement age verification, while signaling forthcoming legislation to regulate social media content.
Dec 1, 2025
Australia implements under-16 social media restriction
Australia implemented a comparable restriction barring children under 16 from social media, becoming a reference point for similar youth-access proposals later discussed in Europe.
Feb 1, 2024
European Commission opens DSA investigation into TikTok
The European Commission opened a formal investigation into TikTok under the Digital Services Act, examining whether the platform properly identified and mitigated systemic risks tied to features such as infinite scroll, autoplay, push notifications, and personalized recommendations, especially for minors.
Related Stories

Regulatory Actions Target TikTok in the EU and US
The **European Commission** issued preliminary findings alleging TikTok’s product design violates the EU **Digital Services Act (DSA)**, arguing that features such as *infinite scroll*, *autoplay*, *push notifications*, and highly personalized recommendations can drive addictive use patterns and that protections for **minors** (including parental controls and screen-time tools) are insufficient. TikTok rejected the characterization and said it will contest the findings; potential outcomes include mandated changes to algorithms and interface design and fines of up to **6% of global annual revenue** if violations are confirmed. In the **United States**, TikTok’s continued operation has been tied to a divest-or-ban framework requiring **ByteDance** to divest its U.S. business or face removal from app stores and blocking by service providers, driven by longstanding concerns about data access and Chinese legal jurisdiction. Reporting describes repeated deadline extensions via executive orders after an initial shutdown period, ongoing negotiations and interest from potential investors, and indications that TikTok has explored a separate U.S.-specific app (“**M2**”) amid uncertainty over the platform’s final outcome in the U.S. market.
1 month ago
Regulatory Push to Strengthen Child Online Safety and Age Assurance on Social Platforms
UK regulators **Ofcom** and the **Information Commissioner’s Office (ICO)** issued warnings to major social media and video platforms (including **Facebook, Instagram, Snapchat, TikTok, and YouTube**) demanding “urgent steps” to implement more robust **age assurance** controls to prevent access by children under 13. Regulators signaled potential enforcement if platforms continue relying primarily on easily bypassed self-declared ages, arguing this enables unlawful collection and use of children’s data and exposes under-13s to services not designed for them; Ofcom requested companies report back on their plans by the end of April. In the US, the House Energy and Commerce Committee advanced the **Kids Internet and Digital Safety (KIDS) Act** on a party-line vote amid Democratic objections that the bill could reduce platform accountability for harms to minors. Criticisms focused on provisions that could **preempt certain state laws**, a **knowledge requirement** that opponents argue may let companies claim ignorance of minors’ presence, and the lack of a proactive **“duty of care”** requirement; proposed amendments to strengthen protections were not adopted.
1 month ago
Regulatory-Driven Consumer Privacy and Child Safety Controls in the EU and California
TikTok said it will roll out stronger **age-verification** capabilities across the EU in the coming weeks, following a year-long pilot that analyzes profile details, posted videos, and behavioral signals to estimate whether an account may belong to a user under 13. Flagged accounts are to be reviewed by specialist moderators rather than automatically removed; TikTok said a UK pilot resulted in the removal of thousands of accounts. The move reflects increasing regulatory and public pressure on major platforms to more reliably prevent underage access, particularly where services process significant personal data and use algorithmic recommendations. California launched a new consumer privacy mechanism—the **Delete Request and Opt-out Platform (DROP)**—that allows residents to request deletion of personal information held by more than 500 registered data brokers. The tool, available via `privacy.ca.gov/drop`, supports identity and residency verification either by entering personal details (e.g., name, date of birth, address) or by using a *login.gov* account (which may require uploading government ID). The platform operationalizes expanded state privacy rights by centralizing deletion requests, aiming to reduce the exposure and resale of personal data by the data broker ecosystem.
1 month ago