Meta’s Fact-Checking Policy: Disinformation Experts Slam Zuckerberg’s Decision
- Laxmi Galani

- Feb 12
Washington: Meta’s reported decision to scale back or alter key aspects of its fact-checking and content moderation policies has triggered sharp criticism from disinformation researchers, media watchdogs and civil society groups, who warn that the move could weaken safeguards against online falsehoods at a politically sensitive time in the United States.

The controversy comes amid a long-running and deeply polarised debate over the role of fact-checking in moderating online speech. While conservative advocates in the U.S. have frequently argued that third-party fact-checking partnerships unfairly target right-leaning content and suppress free expression, researchers maintain that such systems are essential to curb the spread of harmful misinformation.
What Changed
Meta, the parent company of Facebook and Instagram, has indicated adjustments to its content moderation approach, including recalibrating how certain claims are labelled and how external fact-checkers are used in the review process. Company representatives have framed the changes as part of a broader effort to promote “free expression” and reduce what they describe as over-enforcement.
According to Meta, the updated approach aims to ensure that users can engage in robust political debate without excessive content removal, particularly around contentious issues such as elections, public health and immigration.
Experts Raise Alarm
Disinformation experts, however, say the shift risks emboldening bad actors who exploit algorithm-driven platforms to amplify false narratives. They argue that scaling back fact-checking signals a retreat from commitments Meta made after facing intense scrutiny over its role in the spread of misinformation during past election cycles and the COVID-19 pandemic.
“Fact-checking is not censorship; it is contextualisation,” several researchers have noted in public statements, emphasising that labels and corrections help users make informed decisions rather than silencing viewpoints outright.
Experts also caution that reduced oversight could lead to:
- Faster viral spread of false claims.
- Increased targeting of minority communities through coordinated misinformation campaigns.
- Erosion of public trust in democratic institutions.