A familiar storyline is hardening into regulatory doctrine across Europe: frame social media use as addiction, then require platforms to reengineer themselves around age segregation and digital ID.
The European Commission’s preliminary case against TikTok, announced today, shows how that narrative is now being operationalized in policy, with consequences that reach well beyond one app.
European regulators have accused TikTok of breaching the Digital Services Act by relying on what they describe as “addictive design” features, including infinite scroll, autoplay, push notifications, and personalized recommendations.
Officials argue these systems drive compulsive behavior among children and vulnerable adults and must be structurally altered.
What sits beneath that argument is a quieter requirement: any mandate to deliver different “safe” experiences to minors and adults depends on a reliable method of telling those groups apart.
Platforms cannot apply separate algorithms, screen-time limits, or nighttime restrictions without determining a user’s age with a level of confidence regulators will accept.
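To make that dependency concrete, here is a minimal sketch of the kind of feature gating regulators are describing. Every name in it is invented for illustration; nothing here is drawn from TikTok’s systems or prescribed by the DSA. The point is simply that the restrictive branch can only exist if the platform first resolves each user into an age band it trusts.

```python
# Hypothetical sketch only: none of these names come from TikTok or the Commission.
# It illustrates why a differentiated "minor experience" presupposes an age signal.
from dataclasses import dataclass
from enum import Enum, auto


class AgeBand(Enum):
    MINOR = auto()
    ADULT = auto()
    UNKNOWN = auto()  # age could not be established to the required confidence


@dataclass
class ExperienceConfig:
    autoplay: bool
    infinite_scroll: bool
    overnight_push_notifications: bool
    daily_screen_time_limit_minutes: int | None  # None means no limit


def configure_experience(age_band: AgeBand) -> ExperienceConfig:
    """Return feature settings for a session, branching on the resolved age band."""
    if age_band is AgeBand.ADULT:
        return ExperienceConfig(
            autoplay=True,
            infinite_scroll=True,
            overnight_push_notifications=True,
            daily_screen_time_limit_minutes=None,
        )
    # Minors and unresolved ages both fall back to the restrictive profile,
    # because a regulator-acceptable regime cannot treat "unknown" as "adult".
    return ExperienceConfig(
        autoplay=False,
        infinite_scroll=False,
        overnight_push_notifications=False,
        daily_screen_time_limit_minutes=60,
    )
```

Notice what the sketch cannot do: it takes the age band as an input. Producing that input is the age-verification step, and it is precisely the part the “addictive design” framing leaves unstated.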
Commission spokesman Thomas Regnier described the mechanics bluntly, saying TikTok’s design choices “lead to the compulsive use of the app, especially for our kids, and this poses major risks to their mental health and wellbeing.” He added: “The measures that TikTok has in place are simply not enough.”
The enforcement tool behind those statements is the Digital Services Act, the EU’s platform rulebook that authorizes Brussels to demand redesigns and impose fines of up to 6% of global annual revenue.