The EU’s new digital ambition, starting with TikTok.
Brussels has accused TikTok of adopting an addictive design and is demanding changes, extending the Digital Services Act (DSA) from a content regulator into a regulator of platform design itself. Motivations aside, the move would deepen the micromanagement of Europeans' digital lives, and its regulatory reach increasingly imposes standards that end up shaping technology worldwide.
For years, the European Union (EU) has been engaged in a battle against large technology companies. Through regulations such as the Digital Services Act (DSA), the Digital Markets Act (DMA), and the General Data Protection Regulation (GDPR), Brussels has created a complex, bureaucratic, and punitive regulatory environment that makes it difficult for truly competitive European tech companies to emerge and grow.
While imposing increasingly intrusive rules on foreign platforms, Europe has failed to produce its own technology giants. Instead of fostering innovation through open competition and entrepreneurial freedom, the EU has chosen to regulate incumbents ever more heavily. The result is a landscape of limited European innovation and growing dependence on technologies developed outside the continent, the very technologies that Brussels, in turn, continues to shape according to its regulatory preferences. Major platforms comply simply because losing access to the European market is not an option.
In February 2026, under the DSA, Brussels concluded that TikTok's design is addictive, citing violations of Articles 34 and 35, which require platforms to assess and mitigate systemic risks, including risks to mental health and to the protection of minors. Brussels also cited Article 25, which prohibits manipulative interface design.
While the DSA was originally designed to address content on platforms (already a problematic principle in itself, given that it places a central authority in charge of regulating speech), this new step goes further. The EU is now expanding its interpretation of existing law and intervening directly in platform design, claiming that certain features were intentionally created to encourage compulsive use.
By requiring measures such as limiting or disabling infinite scroll over time, introducing mandatory screen breaks (including at night), and altering recommendation algorithms to make them less "addictive," the regulation shifts away from content and begins to dictate how platforms must function.
Although the protection of children is the main justification, these measures do not apply only to minors. The Commission's conclusions also affect adults, and because major platforms typically ship a single global design, rules adopted in Brussels end up shaping user experiences worldwide, including in the United States, where tensions around the DSA have been growing.
One example comes from the report “The Foreign Censorship Threat, Part II,” published in February 2026 by the House Judiciary Committee. Based on thousands of internal documents obtained through subpoenas, the report accuses the European Commission of pressuring major platforms, across more than 100 closed-door meetings, to strengthen their content moderation policies globally, including the removal or suppression of political content and factual information that is legal in the United States.
Regardless of the report's political framing, it highlights a growing risk: when European regulators impose heavy requirements, platforms tend to apply the same rules worldwide in order to avoid fines of up to 6% of global revenue. Penalizing, under threat of such fines, the very design that made an application successful goes far beyond the proper role of the state and carries wider consequences for innovation. Forcing companies to deliberately redesign their products and abandon the formulas that made them successful discourages technological progress, both at the targeted company and across the industry.
Above all, this approach erodes individual responsibility. Instead of trusting parents and users themselves to manage their digital habits, the state assumes the role of an ever-present guardian that decides what is acceptable.
Throughout history, society has consistently adapted to new entertainment and information technologies. Television sparked decades of concern about its impact, yet those concerns were largely addressed through industry self-regulation, rating systems, and parental tools. Games triggered even stronger moral panics, from pinball machines to video titles such as Mortal Kombat and Grand Theft Auto, but were ultimately integrated through classification systems and recognition as protected forms of expression.
In both cases, adaptation came through individual responsibility, market competition, and innovation, without the state dictating how technologies should be technically designed.
The relationship between individuals and technology should rest on personal responsibility, not on overreaching government mandates that seek to replace the freedom to innovate, to speak, and to consume entertainment with control.