European regulators have issued a preliminary finding that TikTok’s most addictive features likely violate EU law, and they are seeking major changes.
The European Commission, the executive arm of the European Union, announced today that the social media app’s infinite scroll, autoplay, push notifications, and personalized recommendation algorithm likely breach the bloc’s Digital Services Act (DSA). Regulators say the company failed to properly assess how those design choices could harm users, particularly minors.
The Commission stated that TikTok has not adequately assessed the risks its features pose to users’ well-being. It also alleges that the company disregarded warning signs of compulsive behavior, including the amount of time minors spend on the app late at night.
According to the preliminary decision, certain TikTok features fuel users’ urge to keep scrolling and put their brains in “autopilot mode,” which could lead to compulsive behavior and a loss of self-control.
“Social media addiction can have detrimental effects on the developing minds of children and teens,” said Henna Virkkunen, the European Commission’s executive vice-president for tech sovereignty, security, and democracy, in a statement. “The Digital Services Act makes platforms responsible for the effects they can have on their users. In Europe, we enforce our legislation to protect our children and our citizens online.”
The Commission says TikTok will now have the opportunity to respond to the allegations before any final decision is made. But if regulators ultimately confirm the violations, the company could face fines of up to 6% of its global annual revenue.
“The Commission’s preliminary findings present a categorically false and entirely meritless depiction of our platform, and we will take whatever steps are necessary to challenge these findings through every means available to us,” a TikTok spokesperson told Gizmodo in an emailed statement.
The EU’s findings stem from a probe launched in 2024 examining whether TikTok was violating the DSA, specifically looking at the so-called “rabbit hole effect” of its algorithm and the safety of minors on the platform. The Digital Services Act, which took effect in 2023, provides a set of rules that large online platforms must follow to protect users.
The Commission said it believes TikTok may need to make concrete design changes like disabling its infinite scroll, adding screen-time breaks, and tweaking its recommendation algorithm.
The preliminary finding comes as governments around the world are increasingly scrutinizing the impact of social media on children and teens. This year, Australia became the first country in the world to ban social media accounts for children under 16. Other countries, including Spain, Denmark, and Malaysia, have said that they plan to pursue similar restrictions.
In the United States, TikTok and other social media platforms like Meta, Snap, and YouTube are facing a growing number of lawsuits accusing the companies of designing addictive products that harm young users. Last month, TikTok settled one such case in California.
TikTok’s U.S. operations also changed ownership earlier this year through a new joint venture that includes managing investors Oracle, Silver Lake, and Abu Dhabi–based MGX. The arrangement was created to comply with a September executive order from President Donald Trump allowing TikTok to continue operating in the U.S. under new ownership. The new structure gives the joint venture more control over the app’s recommendation algorithm, which some U.S. politicians have called a national security threat, though without detailing the reasoning behind that concern.