The European Commission’s decision on 5 December 2025 to levy a €120 million fine against X under the Digital Services Act (DSA) marks a sharp intensification of its regulatory approach toward major online platforms.
As the first non-compliance sanction issued under the DSA, the ruling is being held up as evidence of the EU’s determination to enforce its new framework for transparency, accountability, and platform oversight.
But to the European Conservatives and Reformists (ECR) Group, the decision simultaneously exposes persistent weaknesses in the DSA’s enforcement logic—weaknesses that risk converting a law intended to ensure fairness into one prone to arbitrary, inconsistent, or politically coloured application.
The ruling lands in an environment already shaped by escalating interventions from Ireland’s Data Protection Commission (DPC), which has imposed billions of euros in GDPR fines this year alone. That pattern has revived concerns about proportionality, legal certainty, and the regulatory burden placed on an already disadvantaged European tech ecosystem. Many observers argue that Europe’s regulatory posture, if pushed further, risks suffocating innovation in pursuit of abstract compliance ideals.
At the core of the Commission’s case are three alleged transparency failures. The first involves X’s redesigned blue checkmark, which regulators describe as “deceptive design.” Since the company rebranded from Twitter to X, the blue tick, formerly a controlled verification marker, has been purchasable with minimal identity scrutiny. The Commission argues this blurs the boundary between verified and unverified accounts, increasing exposure to impersonation and misinformation. While the DSA does not require verification, its rules on deceptive interface design (Article 25) prohibit features that imply authentication where none exists.
The second alleged breach concerns deficiencies in X’s advertising repository, the searchable ad archive that Article 39 requires very large platforms to maintain. Researchers report slow loading, incomplete entries, and missing details about targeting criteria, content, and sponsor identities. According to the Commission, these gaps undermine the detection of systemic risks such as political manipulation or discriminatory ad practices.
The third issue involves barriers to researcher access. X’s restrictions on scraping and public-data analysis, the Commission claims, violate Article 40(12), which requires very large platforms to give independent researchers access to publicly available data in order to study platform behaviour and systemic risks.
The Commission asserts that the €120 million fine reflects the seriousness and duration of these infringements, yet it has declined to explain how the figure was calculated, relying instead on broad references to proportionality and user impact. X must now fix the verification system within 60 working days and submit action plans for advertising transparency and researcher access within 90. Failure to comply could trigger periodic penalty payments. The proceedings opened in December 2023, produced preliminary findings in July 2024, and run alongside other investigations into alleged illegal content and manipulation.
Individually, questions of verification integrity, advertising disclosure, and research access are valid. But as ECR Co-Chairmen Nicola Procaccini and Patryk Jaki note, the Commission’s approach reveals deeper structural dysfunction in the DSA’s enforcement architecture.
The ruling highlights an expanding gap between the DSA’s stated goals and its operational reality. Although presented as a neutral transparency regime, early enforcement suggests regulators are leaning on highly subjective interpretations of “risk” and “harm.” For the ECR, this shift is dangerous: when intent rather than demonstrated conduct becomes the basis of enforcement, platforms are subjected to uncertain, discretionary, and potentially politicised regulatory power. In the absence of defined metrics, enforcement risks becoming selective under the guise of consumer protection.
The Commission’s silence on sanction methodology compounds these concerns. Without clear metrics, industry cannot meaningfully assess compliance obligations. For legislators, researchers, and platform operators, this opacity shifts the DSA away from a predictable, rule-based system and toward discretionary intervention.
This episode also crystallises a broader philosophical divide between regulatory cultures. The United States emphasises post-violation remedies and judicial correction, while the EU increasingly relies on anticipatory control and expansive compliance structures. The X ruling stands at the centre of this divide. To critics within the ECR and beyond, the decision demonstrates a regulatory culture more concerned with symbolic assertion than practical outcomes, one that risks provoking retaliatory measures from Washington and destabilising transatlantic cooperation.
Within the EU, the fine may intensify growing doubts about the DSA’s trajectory. Member states hosting major tech employers are becoming uneasy about the scale and pace of enforcement. Without clearer benchmarks and stronger safeguards against selective application, both industry and governments may begin resisting Commission oversight. Rather than signalling regulatory strength, the X case may become the focal point for a broader reckoning over the EU’s economic strategy and constitutional balance.
These developments unfold within a wider climate shaped by Ireland’s assertive GDPR enforcement. Ireland, home to numerous European tech headquarters, has become the bloc’s primary enforcement hub, imposing €4.04 billion in fines since 2018, more than four times the amount levied by any other authority. Across the EU, GDPR penalties in 2025 alone exceeded €5.88 billion. TikTok received a €530 million fine in May for transferring EEA data to China without adequate safeguards, while Meta saw the activation of a €1.2 billion penalty tied to post-Schrems II violations.
Conservative MEPs have long warned that such cumulative pressures threaten Europe’s competitiveness relative to the US and China. Critics argue that the DSA’s stated aims—protecting minors, combating disinformation, addressing systemic risks—are being stretched into tools of pre-emptive content control. In 2024 hearings, ECR MEP Piotr Müller questioned Commissioner Virkkunen on undisclosed communications with platforms, echoing earlier concerns about Commissioner Thierry Breton’s letters to Elon Musk urging moderation decisions. These interactions risk blurring the line between oversight and coercion, reducing space for dissent.
The economic consequences are significant. Europe’s overcautious regulatory reflex is already suppressing innovation and deepening reliance on imported technologies. Large fines divert capital from research and product development into compliance overheads, and parallel DSA investigations into Meta, AliExpress, and others add further uncertainty.
The ECR has proposed pragmatic reforms: tie enforcement to demonstrable harm, adopt objective sanction metrics, and require innovation and free-speech impact assessments before major penalties.
X’s response will test the Commission’s posture. The company is likely to appeal while simultaneously preparing for compliance—a dual process consuming substantial resources. For the EU, the case will determine whether the DSA emerges as a principled regulatory tool or a politically flexible instrument.
The US response underscores the geopolitical stakes. Senior Trump administration officials have portrayed the fine as an attack on American innovation and free expression. Vice President JD Vance wrote on X on December 4: “Rumors swirling that the EU commission will fine X hundreds of millions of dollars for not engaging in censorship. The EU should be supporting free speech not attacking American companies over garbage.”
Given the tone of the US response and the administration’s willingness to defend American firms aggressively, diplomatic compromise seems unlikely. Washington now views the Commission’s actions—and the officials directing them—with open hostility.
The stakes extend beyond the immediate dispute with X. The DSA, along with the DMA, AI Act, and evolving cybersecurity regimes, forms the backbone of the EU’s ambition to define a distinctly European model of digital governance. Yet the X case raises an uncomfortable question for policymakers: can the EU maintain regulatory sovereignty without undermining its own technological relevance? Europe remains structurally dependent on American cloud providers, Asian hardware, and external AI models. By layering compliance obligations onto platforms already wary of the European market, Brussels risks accelerating a slow-motion exit of major digital firms—an “innovation drain” already visible in investment patterns, with venture capital increasingly flowing to the United States and Gulf states.
There is also a democratic dimension to consider. The DSA grants the Commission exceptional discretionary powers, including the ability to demand rapid content removal during crises, issue binding orders to platforms, and compel extensive data access. Civil liberties groups have begun warning that without strict procedural limits, these powers could be invoked in politically sensitive contexts. The X ruling, in their view, demonstrates how enforcement can drift toward interpretive flexibility rather than strict legality. If such tendencies continue, the credibility of EU regulatory governance—long premised on predictability and rule-of-law legitimacy—may erode just as global trust in European institutions is faltering.