The European Commission has accused TikTok of designing its platform in a way that may be addictive — particularly for minors — marking a major step under the Digital Services Act (DSA).
It is the first time the European Union (EU) has treated platform “addictiveness” as a systemic risk under the law.
If the findings are confirmed, TikTok could be forced to redesign key parts of its app and could face fines of up to 6% of its global annual revenue.
Addictive By Design?
In its preliminary findings, the Commission said TikTok’s infinite scroll, autoplay functions, push notifications, and its highly personalised recommender system create what it describes as an “addictive design.”
Regulators argue that these features continuously reward users with new content, fuelling the urge to keep scrolling and shifting users’ brains into ‘autopilot mode’.
According to the Commission, TikTok failed to properly assess how its design choices could harm users’ physical and mental well-being, particularly minors and vulnerable adults. Officials pointed to indicators such as:
- Extended night-time use of the app by minors
- Frequent app-opening patterns
- Limited friction when dismissing screen-time warnings
- Parental controls that require extra effort to activate
The Commission believes TikTok’s current screen-time management tools are not sufficiently robust to mitigate these risks and do not meaningfully reduce excessive usage.
A First For The Digital Services Act
While the DSA has previously been used to address misinformation and transparency issues, this is the first time it has been used to directly challenge a platform’s core design as a threat to mental health.
EU tech chief Henna Virkkunen described the findings as part of a broader effort to protect young users from systemic harms linked to social media platforms.
What Could Change?
The Commission has signalled that TikTok may need to:
- Disable or significantly modify infinite scrolling
- Introduce stricter, more effective screen-time breaks — including night-time limits
- Adjust its recommendation algorithms
- Strengthen parental control systems
TikTok’s Response
TikTok has strongly denied the allegations. Spokesperson Paolo Ganino called the findings “categorically false and entirely meritless,” saying the company would use “every means available” to challenge them.
Under the DSA process, TikTok now has the right to review the evidence and submit a written defence. The European Board for Digital Services will also be consulted before a final decision is made.
Given the scale of TikTok’s parent company ByteDance, even a 6% fine could amount to billions of euros.
Broader Impact On Big Tech
The case could have wider consequences beyond TikTok. Platforms like Meta — which owns Facebook and Instagram — are already under investigation for similar design practices. Digital rights advocates such as Jan Penfrat from European Digital Rights (EDRi) believe the TikTok case could become a template for how the EU handles “attention engineering” across the sector.
For Brussels, this investigation is about more than just one app. It is about defining what responsible platform design should look like in the digital age. By explicitly classifying addictive design as a systemic risk, the EU may be setting a legal standard that other countries could soon follow.
Whether TikTok ultimately changes its design or fights the ruling in court, one thing is clear: Europe is drawing a line against the mechanics of endless scrolling that were once considered standard practice in social media design.
