BRUSSELS — Online platforms pose the main risks to democratic elections, the safety of children, and societal cohesion, and are the principal channel for the spread of illegal or harmful content across the European Union, a European Commission spokesperson stated on Wednesday, February 4, 2026.
Thomas Regnier, speaking during a regular press briefing, emphasized that the bloc’s landmark Digital Services Act (DSA) places direct legal responsibility on very large online platforms (VLOPs) to identify, mitigate, and prevent systemic risks arising from their services.
“Who can pose risk to our elections? It’s online platforms. Who can pose risks to our kids? Online platforms. Who can amplify illegal, harmful content? Again, it’s online platforms, and I can go on,” Regnier said. “We want to use them, but they can clearly pose a risk to our citizens and to our societies.”
The spokesperson made the remarks in response to a recent report by the U.S. House Judiciary Committee, which alleged that the European Commission has indirectly influenced electoral processes in Europe, including in Romania. Regnier firmly rejected the claim, stressing that the Commission never interferes in national elections.
“The European Commission does not interfere in national elections,” he stated. “What we do is we can organize roundtables with platforms when requested by national authorities—as was the case with TikTok in Romania. These discussions aim to ensure that algorithms do not amplify content from one political party over others.”
Regnier highlighted the Commission’s ongoing formal proceedings against TikTok, launched in December 2024, over concerns related to election integrity, the protection of minors, and the platform’s obligations during political campaigns. He described TikTok as having been “extremely cooperative” throughout the investigation and noted that the company has already implemented several measures to align with DSA requirements.
The spokesperson underscored that the DSA is designed as a “powerful tool” to safeguard citizens and maintain a level playing field in the digital space. Crucially, he clarified that liability under the regulation rests with the platforms themselves—not individual executives or CEOs.
“We are not attacking individuals with the DSA. We’re looking at the platforms,” Regnier explained. “So if the CEO indeed algorithmically promotes a political party and decreases the visibility of all the others, then he doesn’t become liable. The platform becomes liable. So our focus is always, always, always on the platform.”
Addressing the situation with X (formerly Twitter), Regnier confirmed that the platform continues to engage constructively with the European Commission through its dedicated compliance team. He stated that no procedural breaches have been identified to date and reiterated the Commission’s preference for dialogue and cooperation.
“Dialogue is our priority,” he said, “but we will act when necessary to protect the internal market and uphold the rules that apply to all very large online platforms operating in the EU.”
The comments come amid heightened scrutiny of social media platforms surrounding major elections across Europe in 2025 and 2026, including national votes in several member states and ongoing concerns about foreign interference, disinformation, and algorithmic amplification of divisive or illegal content.
Under the DSA, platforms designated as VLOPs—including Facebook, Instagram, YouTube, TikTok, X, and others—are subject to stricter obligations, including risk assessments, transparency reports, independent audits, and swift removal of illegal content. Failure to comply can result in fines of up to 6% of global annual turnover.
The European Commission has repeatedly stressed that its enforcement actions aim to ensure a safer, more transparent, and fairer online environment without targeting specific political viewpoints or interfering in sovereign electoral processes.
Regnier’s briefing reflects the Commission’s consistent messaging: while digital platforms are essential tools for communication, innovation, and public discourse, their scale and influence require robust oversight to prevent harm to democratic processes, vulnerable groups—especially children—and societal stability across the 27-nation bloc.
The ongoing TikTok investigation and continued engagement with other platforms are expected to remain focal points of DSA enforcement in the coming months, with the Commission promising decisive action whenever systemic risks are identified.
