Meta has clarified that its transition from third-party fact-checking to a user-driven moderation system will, for now, apply exclusively to the United States.
The company announced earlier this month that it would replace its U.S.-based fact-checking partnerships with a feature called Community Notes.
Speaking with Bloomberg Television’s Francine Lacqua at the World Economic Forum in Davos, Nicola Mendelsohn, Meta’s head of global business, said, “Nothing is changing in the rest of the world at the moment; we are still working with fact-checkers globally.”
This decision comes amid increasing scrutiny of Meta’s moderation practices and concerns about misinformation. By limiting the rollout to the U.S., the company seeks to test its new system while navigating the complexities of differing global regulations. Mendelsohn added, “We’ll see how that goes as we move it out over the years.”
The Shift to Community Notes and Its Rationale
Community Notes, modeled on a similar feature on X (formerly Twitter), allows users to append contextual notes to posts they believe are misleading.
Unlike the previous system, which prominently flagged and restricted potentially misleading content, the new approach leaves posts in place: according to Meta’s announcement, a note is published only when contributors with a range of perspectives agree it is helpful, supplying added context without removing the content.
Joel Kaplan, Meta’s global policy chief, justified the move by highlighting the flaws of the earlier system, stating, “One to two out of every 10 of these actions may have been mistakes.”
Meta aims to reduce these errors and restore trust in its content moderation. The company has argued that excessive reliance on automated enforcement and external fact-checkers created a system prone to overreach, stifling free expression.
In a statement on the policy shift, Kaplan wrote, “As well-intentioned as many of these efforts have been, they have expanded over time to the point where we are making too many mistakes, frustrating our users and too often getting in the way of the free expression we set out to enable.”
Political Timing and Strategic Implications
The U.S.-only rollout coincides with the inauguration of President-elect Donald Trump, a vocal critic of social media platforms’ moderation policies. Trump has frequently accused companies like Meta of bias against conservative voices, and in a recent statement he praised Meta’s decision, saying, “Meta has come a long way.”
Adding to the controversy, Meta has appointed UFC CEO Dana White, a known Trump ally, to its board of directors. Critics argue that the appointment signals an alignment with conservative interests.
Meta’s decision to relocate its trust and safety teams from California to Texas has further fueled speculation about its political motivations. While the company has framed the move as a way to make its moderation practices more regionally representative, observers read it as a gesture aimed at appeasing conservative audiences.
Balancing Global Moderation with Regional Regulations
While Meta is experimenting with user-driven moderation in the U.S., it must maintain traditional partnerships with external fact-checkers in regions governed by stricter regulations, such as the European Union.
The EU’s Digital Services Act (DSA) allows regulators to fine large platforms up to 6% of their global annual turnover for failing to mitigate systemic risks such as disinformation. By retaining its global fact-checking partnerships, Meta seeks to ensure compliance with such laws.
However, critics argue that scaling back oversight in one region sets a risky precedent. Whistleblower Frances Haugen, who previously exposed Meta’s failures to prevent harmful content during the Rohingya crisis, raised concerns about the potential consequences of reduced enforcement. She warned, “What happens if another Myanmar spirals out of control?”
Internal and Public Reactions
Internally, Meta employees have expressed frustration with the decision to eliminate third-party fact-checking in the U.S. Leaked documents revealed that many staff members felt blindsided by the policy shift and criticized the lack of transparency surrounding it.
Advocacy groups, including Hope Not Hate, predict that the new moderation system will embolden extremist groups and fuel divisive narratives.
Externally, public response has been mixed. While some users welcome the changes as a step toward greater transparency and free expression, others fear a surge in harmful content. According to Google Trends data, searches for terms such as “how to delete Facebook” and “alternative to Facebook” have spiked since the announcement.