Meta Revamps Moderation: Embracing Community Notes
Meta is abandoning its U.S. fact-checking program in favor of a community-based moderation system and reducing restrictions on sensitive topics. The policy shift aligns with Meta's push for free expression, following leadership changes and criticism of its prior censorship practices. The new model will rely on user-generated context added to posts.

Meta has announced the end of its U.S. fact-checking program, opting instead for a community-based system akin to the one on Elon Musk's X. The pivot permits freer discussion of contentious subjects such as immigration and gender identity, underscoring Meta's shift back toward free expression.
CEO Mark Zuckerberg, who previously supported stringent content moderation, acknowledged that the company had made too many mistakes. He says the new direction aims to simplify policies and restore free speech across its platforms. The announcement follows the appointment of Joel Kaplan and Dana White to high-ranking positions within Meta.
While the independent Oversight Board has welcomed the move, some of Meta's fact-checking partners have expressed surprise and concern. The community notes model, already under scrutiny in Europe, will let users add context to posts and flag misleading ones during its U.S. rollout.
(With inputs from agencies.)