Meta Shifts from Fact-Checking to Community Notes Model

Menlo Park, Tuesday, 7 January 2025.
Meta is ending its third-party fact-checking program and replacing it with a community-driven system intended to enhance free expression and reduce moderation errors, a change affecting billions of users across Facebook, Instagram, and Threads.

Major Policy Shift

Meta Platforms Inc. (NASDAQ: META) announced on January 6, 2025, a significant overhaul of its content moderation approach [1][2]. The company is discontinuing its third-party fact-checking program, which has been in place since 2016 [2][3], in favor of a community-driven system similar to X’s Community Notes feature. This change represents a fundamental shift in how content verification will be handled across Meta’s platforms, which collectively serve over 3 billion users [3].

Reasons Behind the Change

CEO Mark Zuckerberg cited excessive moderation mistakes and censorship concerns as key drivers of the decision [1][2]. According to Meta's December 2024 internal assessment, approximately 10–20% of content moderation actions may have been errors [2]. 'A program intended to inform too often became a tool to censor,' Zuckerberg stated [4], underscoring the company's renewed focus on free expression. The decision comes at a time when social media platforms face increasing scrutiny over their content moderation practices [1].

Implementation and Structural Changes

The new Community Notes system will be phased in across the United States over the coming months [2][4]. In a notable operational shift, Meta plans to relocate its trust and safety teams from California to Texas [1][2]. The company will also stop demoting fact-checked content and will replace full-screen warnings with more subtle labels indicating that additional information is available [4][6].

Industry Response and Future Implications

The announcement has sparked mixed reactions from industry observers. While X’s CEO Linda Yaccarino praised it as ‘a smart move’ [3], critics like Ross Burley view it as ‘a major step back for content moderation’ [3]. Meta plans to enhance the Community Notes model throughout 2025 [3], focusing automated systems primarily on high-severity violations such as terrorism and drug-related content [3]. The company will also implement new transparency reporting regarding content moderation mistakes [2].

Sources

