
The €120 Million Fine Imposed on X Is a Sign That Europe Is Losing Its Digital Neutrality

Science and Technology - December 11, 2025

The €120 million fine imposed on X is not just a technical dispute: it reveals the discretionary drift with which Brussels is applying the Digital Services Act, transforming a tool created to ensure transparency into a political power capable of affecting European digital pluralism.

Europe, which claims to regulate the global digital ecosystem with the authority of the law, is increasingly at risk of slipping into the realm of political discretion. The €120 million fine imposed by the European Commission on X—the first in the history of the Digital Services Act—does not just open a technical dispute with a platform. It opens up a cultural fault line that affects the very conception of Europe as a space of freedom, pluralism, and legal certainty.

The point is not Musk. The point is how and why Brussels chose to target a single actor with a nine-figure fine based on violations that are vaguely defined, interpretative, and unmeasurable. And this is where the European Conservatives and Reformists group speaks out forcefully.

A DSA that slips towards arbitrariness

The position of MEPs Nicola Procaccini and Patryk Jaki, co-chairs of the ECR Group, is unequivocal. The Commission’s decision, they say, reveals structural problems in the way the DSA is interpreted and applied. This is not a technical issue, but a political alarm. Procaccini, as reported in the parliamentary group’s official statement, notes that when Brussels imposes a €120 million fine for violations “defined in vague and highly subjective terms,” a legitimate doubt arises about the proportionality and neutrality of the decision. He adds a decisive point: “A digital right without legal certainty risks becoming an instrument of political discretion.” This is not an attack on regulation. It is an attack on the lack of clear criteria, on the possibility that a supranational apparatus could use flexible rules to impose selective sanctions, to “send a signal” rather than to enforce the law.

When platforms fear politics rather than the law

But it is Patryk Jaki who points out the most dangerous consequence: self-censorship. If platforms start to fear not what the regulation says, but what the Commission might think, the result will be a less free, more conformist, more predictable—and therefore more easily controllable—digital environment. As Jaki states: “If companies fear that controversial design choices or interpretations of transparency could lead to huge fines, the result will not be greater security, but more self-censorship and less open debate.” In a liberal democracy, platforms’ reverential fear of political power is an indicator of institutional disease. Not a regulatory success.

Opaque proportionality and non-existent criteria: the sanction as a political act

Another critical point denounced by the ECR concerns the lack of transparency in the calculation of the fine. According to the official statement, the Commission has been unable to explain the parameters used to arrive at the exact figure of €120 million. No model. No formula. No numerical justification. Only a reference to “proportionality.” It is precisely this lack of method that makes the sanction a dangerous precedent: a European legislator who, instead of providing objective criteria, reserves the right to strike in a punitive and arbitrary manner, creating the impression that compliance is not assessed on the basis of the law, but on the basis of changing political expectations.

And the ECR is not limiting itself to criticism: it will formally ask the Commission to provide explanations on the logic, criteria, and proportionality of the intervention.

The systemic risk: a Europe that punishes those who do not conform

For years, the European debate on digital regulation has oscillated between two impulses:

  • the legitimate desire to protect users;
  • the much less legitimate temptation to regulate dissent, even when it takes the form of business models, platform design, or editorial choices.

The sanction against X falls precisely at this crossroads. And it creates the impression that Europe is embarking on a path of governance by intimidation, in which the “non-aligned” or simply more difficult to control platform becomes the exemplary target. The fact that the case is exploding in the midst of growing transatlantic friction over digital rules is not a detail: it is the geopolitical context in which this decision will be read by Washington, now led by an administration less inclined to consider Brussels a neutral arbiter.

Freedom, pluralism, and law: the conservative vision

The ECR’s position does not defend Musk or any particular platform. It defends a principle: regulatory power must be neutral, measurable, and verifiable. Because when the law becomes open to interpretation, citizens—and with them digital users—lose the fundamental guarantees of the rule of law. The DSA, in its original intent, was supposed to be the European tool for ensuring transparency, security, and accountability. But a DSA applied as in this case risks turning into its opposite: a framework where discretion prevails over certainty, and where platforms learn more about avoiding political conflicts than about complying with technical standards. This is the crux of the matter: a Europe that punishes without explaining, interprets without clarifying, and regulates without guaranteeing neutrality does not defend democracy: it weakens it.

The real battle is over the future of digital pluralism

The ECR is right to demand clarity, proportionality, and legal certainty. It is right to call for regulation that does not become a political weapon. And above all, it is right to denounce the risk that fear of Brussels will create a poorer, more cautious, and less free digital ecosystem. Democracy does not grow in silence: it grows in the clash of ideas. And any law that encourages silence, even unintentionally, is a law that needs to be reviewed. If Europe wants to be a civilization before a system of rules, it must remember that freedom of speech is not defended with exemplary sanctions. It is defended with legal certainty, with the neutrality of institutions, and with the rejection of any temptation to exercise discretion. Case X is only the first test case. And it is good that someone is finally saying this clearly.

Focus – What is the Digital Services Act

The Digital Services Act is the new regulatory framework with which the EU intends to regulate large digital platforms, imposing much stricter obligations on Very Large Online Platforms (VLOPs), i.e., services with more than 45 million users in the Union. The DSA was created with the aim of increasing transparency, mitigating systemic risks (disinformation, foreign interference, manipulation of public debate), protecting users, and making the European digital market more predictable.

For VLOPs, this means having to make their advertising archives public, allow accredited researchers access to data, avoid deceptive practices in interfaces (dark patterns), assess and mitigate risks to democracy, and respond quickly to removal orders from authorities.

The political crux, as noted by the ECR, is that many key concepts in the DSA — ‘systemic risk’, ‘deceptive design’, ‘adequate mitigation’, ‘meaningful transparency’ — are broad and flexible, giving the Commission unprecedented interpretative leeway, combined with the power to impose penalties of up to 6% of global turnover and prescribe corrective measures. In theory, the DSA should ensure order and accountability; in practice, if applied with changing criteria, it risks becoming a tool for discretionary enforcement, where platforms fear not the law but the political mood in Brussels. This is the point contested by the European Conservatives in Case X: without clear criteria, regulatory neutrality falters and digital freedom becomes vulnerable.
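To give a sense of scale for the 6%-of-global-turnover ceiling mentioned above, the arithmetic can be sketched as follows. The turnover figure in the example is purely hypothetical (X’s actual global turnover is not stated in this article), so the numbers illustrate only how the ceiling is computed, not the real headroom in this case:

```python
# Back-of-the-envelope illustration of the DSA penalty ceiling.
# NOTE: the turnover figure below is hypothetical, for scale only.

DSA_MAX_PENALTY_RATE = 0.06  # DSA allows fines of up to 6% of global turnover


def max_dsa_fine(global_turnover_eur: float) -> float:
    """Maximum fine the DSA permits for a given global annual turnover."""
    return global_turnover_eur * DSA_MAX_PENALTY_RATE


# Hypothetical platform with €3 billion in global annual turnover
turnover = 3_000_000_000
ceiling = max_dsa_fine(turnover)
actual_fine = 120_000_000  # the €120 million fine imposed on X

print(f"Penalty ceiling: €{ceiling:,.0f}")                       # €180,000,000
print(f"Fine as share of ceiling: {actual_fine / ceiling:.0%}")  # 67%
```

Under this (assumed) turnover, the €120 million fine would sit well below the legal maximum; the ECR’s objection is precisely that the Commission has published no formula explaining where within that range the figure came from.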