The European Commission has preliminarily concluded that TikTok violated the Digital Services Act (DSA) through the addictive design of its platform, issuing a formal warning that features such as infinite scroll, autoplay, push notifications, and highly personalised recommender systems pose serious risks—especially to minors and vulnerable users.
In a statement released on Friday, February 6, 2026, the Commission said its investigation showed TikTok failed to adequately assess and mitigate the negative effects of these elements on users’ physical and mental well-being. The Commission described certain design choices as deliberately engineered to trigger compulsive behaviour, constantly “rewarding” users with fresh content and shifting their engagement into what it called “autopilot mode”.
European Commission spokesperson Thomas Regnier told reporters in Brussels:
“These features lead to the compulsive use of the app, especially for our kids, and this poses major risks to their mental health and well-being.”
Regnier cited internal data showing TikTok is the most-used platform after midnight among children aged 13–18 in the European Union. He added that 7% of children aged 12–15 spend between four and five hours daily on the app, underscoring concerns about excessive screen time and its impact on sleep, concentration, and psychological health.
At this preliminary stage, the Commission is calling on TikTok to fundamentally change the core design of its service. Proposed measures include:
- Disabling or significantly restricting infinite scroll and autoplay functions
- Implementing mandatory and effective screen-time breaks
- Adapting the recommender system to reduce addictive patterns and limit the promotion of content that encourages prolonged use
TikTok now has the opportunity to respond to the preliminary findings. The European Board for Digital Services will also be consulted before any final decision is reached. If non-compliance is confirmed after the full procedure, the Commission may impose a fine of up to 6% of TikTok’s global annual turnover—potentially amounting to billions of euros given the platform’s scale.
The preliminary findings are part of the Commission’s broader enforcement of the Digital Services Act, which came into full effect in 2024 and places strict obligations on very large online platforms (VLOPs) to assess and mitigate systemic risks, including those related to mental health, the protection of minors, and addictive design.
Snapchat also under scrutiny
During the same press briefing, Regnier confirmed that Snapchat is also under active monitoring by the Commission. He revealed that Brussels has already sent a formal request for information to the platform, focusing on the alleged sale of illegal products to minors—such as vapes, alcohol, and other restricted goods—through its services. Regnier emphasised that the Commission is actively supervising multiple platforms under the DSA framework.
Response to Trump-shared video
Regnier was also asked about a widely circulated video posted on Thursday by US President Donald Trump on his Truth Social account, which depicted former US President Barack Obama and former First Lady Michelle Obama in an offensive and racist manner, portraying them as monkeys. The spokesperson responded carefully:
“It feels good to feel safe at home in Europe. This is precisely why we have a regulation in place. So racism, hate speech, illegal content, has no place online.”
He clarified, however, that the DSA does not involve the Commission reviewing or sanctioning individual pieces of content. “We’re not looking at individual content from a DSA point of view. This can be done by criminal investigations—not in our remit,” Regnier said.
He reiterated the EU’s firm stance on enforcement: “We’re not shying away from taking action with American platforms, if need be.”
The Commission’s preliminary findings against TikTok represent one of the most significant enforcement actions to date under the DSA targeting platform design and its impact on minors. If upheld, the case could set a precedent for how the EU regulates addictive features across social media, potentially forcing major changes to how platforms like TikTok, Instagram Reels, YouTube Shorts, and others operate in the 27-member bloc.
TikTok has not yet issued a formal public response to the preliminary conclusions but is expected to submit detailed representations in the coming weeks. The company has previously defended its approach, pointing to the parental controls, screen-time reminders, and restricted modes it offers for younger users.
The development underscores the EU’s increasingly assertive approach to tech regulation, prioritising the protection of children and mental health in the digital age—even when it means confronting powerful US-based platforms.
