Seven major digital platforms, including Meta, TikTok, and X, are set to undergo a stress test on January 31 to evaluate their capacity to address disinformation ahead of Germany’s national election on February 23.
The exercise, organized by the European Commission, is designed to assess whether the platforms comply with the Digital Services Act (DSA), which mandates risk mitigation systems for platforms with over 45 million monthly active users in the EU.
As Reuters reports, the Commission is coordinating with German authorities to ensure safeguards are in place for one of Europe’s most politically significant elections of the year.
“The stress test is really going through potential scenarios where DSA comes into play and to check with platforms how they would react to these specific scenarios,” said Thomas Regnier, a spokesperson for the European Commission. Participants include Google, Snap, LinkedIn, Microsoft, Meta, TikTok, and X, although only TikTok has publicly confirmed attendance.
The stress test is the first such exercise to focus specifically on a national election. Previous stress tests, such as one conducted ahead of the European Parliament elections in 2024, provided critical insights into platforms’ readiness to handle coordinated disinformation campaigns.
Examining Risks and Recent Precedents
The stress test will simulate various scenarios, including misinformation campaigns, algorithmic manipulation, and coordinated influence operations, to measure the platforms’ ability to counter risks effectively.
Regnier described the test as “not theoretical but a practical evaluation of operational preparedness.” The closed-door exercise will involve senior compliance officers and technical experts from the invited platforms, tasked with demonstrating how their systems align with DSA requirements.
Germany’s election presents unique risks, amplified by recent controversies. Earlier this month, X, under Elon Musk’s ownership, drew criticism when Musk hosted a live interview with Alice Weidel, leader of Germany’s far-right Alternative for Germany (AfD) party.
The incident raised concerns about the platform’s role in amplifying divisive narratives, particularly during politically sensitive periods. The European Commission has also been investigating X for compliance with the DSA, focusing on whether its algorithms unfairly promote harmful or extremist content.
Past cases, such as TikTok’s involvement in Romania’s annulled 2024 election, highlight the stakes. Romanian intelligence services identified coordinated disinformation campaigns linked to Russian interference, leading to the election’s nullification.
In December 2024, the EU launched formal proceedings against TikTok, accusing the platform of failing to mitigate risks adequately under the DSA.
DSA Enforcement and Broader Regulatory Challenges
The Digital Services Act, which entered into force in late 2022 and whose obligations for the largest platforms began applying in 2023, is a cornerstone of the EU’s digital governance framework. It requires Very Large Online Platforms (VLOPs) to conduct regular risk assessments, implement transparency measures for algorithms, and provide regulators with access to internal data. Platforms that fail to comply face fines of up to 6% of their global annual revenue.
However, enforcing the DSA has proven complex. X, for example, has been ordered to submit detailed records of its recommender systems, which personalize content for users based on their behavior.
Critics argue that these algorithms often amplify polarizing material, complicating efforts to maintain election integrity. “Today we are taking further steps to shed light on the compliance of X’s recommender systems with the obligations under the DSA,” said Henna Virkkunen, the EU’s digital chief, earlier this month.
Meta, which faces similar scrutiny, has drawn attention for its recent overhaul of content moderation policies in the United States. The company replaced third-party fact-checking partnerships with a user-driven “Community Notes” system modeled after X’s approach.
Joel Kaplan, Meta’s global policy chief, defended the shift, stating that “one to two out of every 10 of these [fact-checking] actions may have been mistakes.” While Meta continues to work with fact-checkers in Europe to comply with local regulations, the policy change has raised questions about its long-term commitment to combating misinformation globally.
Balancing Regulatory Enforcement Amid Geopolitical Pressures
The EU’s focus on the German election comes amid broader regulatory challenges. Earlier this month, the European Commission paused investigations into Apple, Google, and Meta under the Digital Markets Act (DMA), citing the complexity of enforcement strategies.
This decision coincides with mounting lobbying efforts by Silicon Valley executives, including Meta CEO Mark Zuckerberg, who criticized EU regulations as financially burdensome. “EU regulators have forced us to pay more than $30 billion in penalties over the past 20 years,” Zuckerberg said in a recent statement.
The regulatory pause has also been linked to geopolitical considerations, particularly as the incoming U.S. administration under Donald Trump signals a preference for lighter tech regulations.
EU officials, however, have emphasized their commitment to holding platforms accountable. “This is about ensuring a fair and safe democratic process,” Regnier stated, dismissing claims of political interference.
The Implications for Future Elections
The January 31 exercise will serve as a litmus test for the EU’s ability to enforce the DSA effectively. Success could set a precedent for regulatory oversight in future elections, while failures may expose gaps in the current framework. As platforms face increasing scrutiny, the outcomes of the stress test are likely to shape how digital companies balance innovation with accountability.
The stakes are high not only for the German election but for the EU’s broader efforts to regulate Big Tech and preserve democratic processes.
Virkkunen reinforced the importance of these efforts, stating, “We are committed to ensuring that every platform operating in the EU respects our legislation, which aims to make the online environment fair, safe, and democratic for all European citizens.”