Meta Dismantles Misinformation Control in the United States

2025-01-15 companies

Menlo Park, California, Wednesday, 15 January 2025.
Meta has ended its misinformation control systems in the U.S., stopping fact-checking partnerships and allowing viral misinformation to spread unchecked. This aligns with incoming policy changes under the Trump administration.

Major Policy Shift

On January 7, 2025, Meta Platforms (META) announced the termination of its fact-checking partnerships in the United States [1][2]. The company has invested over $100 million in its fact-checking program since its inception in 2016 [3]. CEO Mark Zuckerberg justified the decision by claiming that fact-checkers have been ‘too politically biased’ and ‘have destroyed more trust than they’ve created’ [3]. This dramatic shift comes as Meta aligns its policies with the incoming Trump administration [1].

System Dismantling and New Approach

The company has begun dismantling systems that previously reduced the reach of misinformation by over 90% [1]. These systems included machine-learning classifiers and partnerships with third-party fact-checkers [1]. In place of these controls, Meta plans to implement a ‘community notes’ system, similar to that of X (formerly Twitter), allowing users to add their own annotations to posts [3]. The effectiveness of this new approach remains uncertain: research shows that 74% of accurate notes correcting false claims go undisplayed [7].

Broader Policy Changes

The changes extend beyond fact-checking to include significant modifications in content moderation policies. Meta has updated its ‘hateful conduct’ policies [5], relaxed certain community standards [1], and ended its diversity, equity, and inclusion program [1]. The company will no longer reduce the reach of posts pending fact-checker evaluation [1], and previously restricted content, including viral hoaxes from past elections, will now be eligible for amplification across Facebook, Instagram, and Threads [1].

Global Implications

While these changes currently apply only to the United States [1], they have sparked international concern. The decision could destabilize fact-checking operations in other countries, as Meta has supported such initiatives in over 100 nations [3]. Medical experts and advocacy groups have expressed particular concern about the potential spread of harmful misinformation [5], while international fact-checking partners have warned that this decision ‘is a step backward for those who want to see an internet that prioritizes accurate and trustworthy information’ [3].

Sources


misinformation meta platforms