EU Expands Probe into Elon Musk’s X Over Content Moderation Compliance

The EU expands its probe into X’s compliance with the Digital Services Act, focusing on algorithm transparency and systemic risks.

The European Commission has escalated its investigation into X, formerly known as Twitter, over potential violations of the Digital Services Act (DSA).

The probe, launched in December 2023, examines whether X’s algorithms comply with legal requirements to prevent the amplification of harmful content and misinformation.

As part of this expanded scrutiny, regulators have ordered X to submit internal documents detailing its recommender systems by February 15, 2025, and retain all future records of algorithmic updates until December 2025.

Henna Virkkunen, the EU’s digital chief, highlighted the investigation’s focus, stating, “Today we are taking further steps to shed light on the compliance of X’s recommender systems with the obligations under the DSA.”

The Commission’s demands include access to certain commercial APIs, allowing direct analysis of X’s content moderation processes and account visibility metrics. These actions aim to uncover systemic risks posed by X’s algorithms and their impact on the platform’s operations.

Related: Meta Ends Third-Party Fact-Checking, Turns to Community Notes

Retention Orders and Technical Access Requests

A central aspect of the investigation involves recommender systems—algorithms designed to personalize user experiences by suggesting content based on behavior and preferences. While these systems can enhance user engagement, they also carry risks of promoting divisive or harmful material.

The DSA obligates Very Large Online Platforms (VLOPs) to ensure their algorithms mitigate such risks, with transparency measures enabling regulatory oversight.

The retention order issued to X requires it to safeguard internal records concerning future changes to its algorithms until the investigation concludes or December 31, 2025, whichever comes first.

Related: Apple: Meta’s Interoperability Demands for the EU Undermine Privacy

In addition, the Commission has requested access to commercial APIs—interfaces allowing external parties to access data, such as content moderation actions and the virality of posts. These APIs could reveal whether X’s algorithms unfairly amplify certain political narratives or misinformation, a recurring concern raised by EU regulators.

An official statement from the Commission explained, “These steps will allow the [Commission’s] services to take all relevant facts into account in the complex assessment under the DSA of systemic risks and their mitigation.”

Broader Context of the Investigation

The investigation into X represents one of the most prominent tests of the DSA since the legislation came into full effect in February 2024.

The law requires platforms with over 45 million monthly EU users to implement measures that reduce the spread of illegal content, ensure transparency in algorithmic operations, and provide data access to researchers. Non-compliance can result in fines of up to 6% of a company’s global revenue.
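To put that 6% ceiling in concrete terms, the following sketch computes the maximum possible DSA fine for a given revenue figure. The revenue number used here is purely hypothetical and not a statement about X’s actual finances.

```python
def max_dsa_fine(global_revenue_eur: float, rate: float = 0.06) -> float:
    """Upper bound on a DSA fine: up to 6% of a company's global annual revenue."""
    return global_revenue_eur * rate

# Hypothetical example: a platform with EUR 3.4 billion in global revenue
# would face a fine ceiling of EUR 204 million.
print(max_dsa_fine(3.4e9))  # 204000000.0
```

The actual fine in any enforcement action would be set by the Commission within this ceiling, based on the severity and duration of the infringement.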

Related: Meta Hit with €800M EU Fine for Facebook Marketplace Push

X’s relationship with European regulators has grown increasingly contentious under Elon Musk’s leadership. Musk has criticized the DSA, claiming it undermines free speech. Meanwhile, his controversial interventions in European politics—such as publicly supporting Germany’s far-right Alternative for Germany (AfD) party—have drawn additional scrutiny.

Critics allege that X’s algorithms may amplify polarizing or extremist content, potentially influencing political discourse in the EU.

In July 2024, the Commission issued a warning to X over its inadequate content moderation measures, cautioning that the platform could face significant financial penalties if it failed to comply with the DSA.

Stress tests conducted in mid-2024 further exposed potential gaps in X’s adherence to the law, though specific findings have not been disclosed.

Understanding the Technical and Regulatory Stakes

Recommender systems have become a focal point for regulators due to their ability to shape user engagement and public discourse. Under the DSA, platforms must provide mechanisms for users to understand how these algorithms work, such as by disclosing criteria used for personalized content recommendations.
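To illustrate what such a disclosure might cover, here is a deliberately simplified, hypothetical sketch of a recommender scoring function built from weighted behavioral signals. The signal names and weights are invented for illustration only and do not describe X’s actual system; the point is that when the criteria and weights are disclosed, the personalization logic becomes auditable.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    # Hypothetical behavioral signals a platform might disclose under
    # DSA transparency rules; none of these reflect X's real system.
    predicted_like: float   # estimated probability the user likes the post
    predicted_reply: float  # estimated probability the user replies
    author_followed: bool   # whether the user follows the post's author

def score(s: Signals) -> float:
    """Toy linear ranking score with disclosed weights.

    Replies are weighted more heavily than likes, and a small bonus is
    given for followed authors; candidate posts would be ranked by
    descending score.
    """
    return (1.0 * s.predicted_like
            + 2.0 * s.predicted_reply
            + (0.5 if s.author_followed else 0.0))

print(score(Signals(predicted_like=0.3, predicted_reply=0.1, author_followed=True)))  # 1.0
```

Regulators’ concern is that opaque weights on engagement-driven signals like these can systematically favor divisive content, which is why the DSA pairs disclosure duties with mandatory risk assessments.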

The law also requires regular risk assessments to identify and mitigate potential harms stemming from algorithmic decisions.

Related: EU Targets Musk’s X: SpaceX and Neuralink May Be Fined Next

The Commission’s investigation extends to broader issues of transparency and accountability. For instance, earlier reports highlighted concerns over X’s refusal to provide researchers with sufficient access to its data, as required by the DSA.

These challenges are compounded by allegations of deceptive user interface design, particularly regarding verification badges tied to subscription-based services.

EU Officials Address Allegations of Political Influence

The investigation has also sparked debate over the EU’s regulatory motivations, particularly in light of Musk’s political associations.

Responding to claims of bias, EU spokesperson Thomas Regnier stated, “The Commission’s actions are entirely independent from any political considerations or indeed any specific events recently happening.” The assertion reflects the EU’s commitment to enforcing its legal framework objectively.

Henna Virkkunen reinforced this stance, emphasizing, “We are committed to ensuring that every platform operating in the EU respects our legislation, which aims to make the online environment fair, safe, and democratic for all European citizens.”

Future Implications for X and the DSA

As the investigation progresses, X’s compliance will serve as a crucial test of the DSA’s enforcement mechanisms. The EU’s ability to demand transparency from one of the world’s largest social platforms could set a precedent for how other digital platforms operate within the region. X has not issued a formal response to the latest demands, maintaining its previous stance of limited public engagement with regulatory matters.

The outcome of this investigation could influence the regulatory landscape for years to come, shaping how platforms balance innovation with accountability and user safety.

Markus Kasanmascheff
Markus has been covering the tech industry for more than 15 years. He holds a Master's degree in International Economics and is the founder and managing editor of Winbuzzer.com.
