France’s Education Minister, Edouard Geffray, has announced the filing of a criminal complaint against the social media platform TikTok, accusing its algorithm of promoting content that may encourage suicide among young users.
Speaking on France Inter radio on Thursday, Geffray said the complaint had been submitted to the Paris public prosecutor’s office. The case alleges that TikTok’s recommendation system exposes minors to harmful material and may also involve the unlawful transfer of user data.
According to the minister, the decision followed growing concerns about the platform’s impact on adolescents, particularly its ability to rapidly direct vulnerable users toward distressing or dangerous content.
“We must stop these deadly spirals that lead teenagers toward paths that are dangerous for their lives,” Geffray said during the interview.
As part of its assessment, the French Education Ministry conducted an internal experiment to evaluate how TikTok’s algorithm interacts with younger users. Officials created a test account designed to mimic the profile of a 14-year-old user and observed the type of content recommended by the platform.
Geffray revealed that within just 20 minutes of using the account—without liking or engaging with any videos—the algorithm began suggesting content related to depression and suicide. The findings, he said, raised serious concerns about the platform’s content moderation and recommendation systems.
The complaint also alleges that TikTok may be involved in transferring user data in ways that violate French or European regulations, though further details on this aspect of the case were not immediately disclosed.
The move by French authorities comes amid increasing scrutiny of major social media platforms across Europe, particularly regarding their influence on mental health and the safety of younger users. Regulators have been calling for stricter oversight, transparency in algorithm design, and stronger safeguards to prevent exposure to harmful content.
TikTok, owned by Chinese company ByteDance, has faced similar criticism in other countries, with governments and watchdog groups raising concerns about both data privacy and the psychological impact of its highly personalized content feeds.
The company has previously stated that it is committed to user safety and has introduced measures such as content moderation policies, screen time controls, and tools aimed at protecting minors. However, critics argue that these measures are insufficient given the speed and scale at which content is distributed.
Legal experts note that the French complaint could test the extent to which social media platforms can be held accountable for the behavior of their algorithms. If pursued, the case may also intersect with broader European Union regulations, including the Digital Services Act, which places obligations on tech companies to address harmful content and ensure greater transparency.
The outcome of the complaint could have significant implications not only for TikTok but also for other platforms that rely heavily on algorithm-driven content recommendations.
If the investigation proceeds, the case will further highlight the growing tension between governments and technology companies over their responsibility to protect users, especially minors, from online harm while balancing free expression and digital innovation.
