A recent study shows that the Community Notes system, adopted by X and recently by Meta, remains largely ineffective.
A Risky Bet on Participatory Moderation
Faced with the proliferation of misleading content, platforms have sought ways to strengthen fact-checking. X (formerly Twitter) introduced Community Notes, a system in which users themselves add context to disputed posts. Meta followed by announcing the end of its third-party fact-checking program in favor of the same system.
But according to a study by Maldita, a Spanish site that specializes in fact-checking, the system still relies heavily on sources long used by professional fact-checkers. When Community Notes cite organizations accredited by networks such as the International Fact-Checking Network (IFCN) or the European Fact-Checking Standards Network (EFCSN), they are judged more reliable and are more likely to be rated helpful.
A glaring lack of visibility
The problem: the large majority of these notes are simply never seen. The study finds that 85% of Community Notes remain hidden from X users. On average, only 8.3% of submitted notes are displayed, a rate that rises to 15.2% when a fact-checking organization is cited.
Why such opacity? For a Community Note to be published, it must win the agreement of users with divergent political opinions. This requirement, intended to guarantee consensus, in practice acts as a brake on surfacing factual corrections, particularly on polarizing topics.
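To make that requirement concrete, here is a minimal, deliberately simplified sketch of a "bridging" rule under which a note is shown only if raters from different viewpoint clusters agree it is helpful. The cluster labels, ratings, threshold, and function name are invented for illustration; X's actual open-source scoring algorithm is considerably more sophisticated.

```python
# Toy illustration of a cross-perspective agreement rule: a note is published
# only if every viewpoint cluster of raters finds it helpful on average.
# This is NOT X's real scoring algorithm; all values here are hypothetical.

from statistics import mean

def note_is_published(ratings, threshold=0.7):
    """ratings: list of (viewpoint_cluster, helpful) pairs, helpful in {0, 1}."""
    by_cluster = {}
    for cluster, helpful in ratings:
        by_cluster.setdefault(cluster, []).append(helpful)
    if len(by_cluster) < 2:
        return False  # no cross-perspective agreement is possible
    # Every viewpoint cluster must, on average, rate the note as helpful.
    return all(mean(votes) >= threshold for votes in by_cluster.values())

# A factual correction on a polarizing topic: one cluster approves, the other rejects.
polarized = [("left", 1), ("left", 1), ("right", 0), ("right", 0)]
# A note citing an accredited fact-checker that both clusters accept.
consensual = [("left", 1), ("left", 1), ("right", 1), ("right", 1)]

print(note_is_published(polarized))   # False: blocked despite being factual
print(note_is_published(consensual))  # True: cross-group consensus reached
```

As the first example shows, a note that is accurate but divisive never clears the agreement bar, which is the filtering effect the study describes.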
A setback in the fight against disinformation
This heavy filtering compounds another setback: the end of direct partnerships between platforms and fact-checkers. Those collaborations helped stem the spread of fake news effectively. Lucas Graves, a journalism professor at the University of Wisconsin-Madison, believes that the effectiveness of Community Notes is seriously limited without the involvement of fact-checking professionals.
Alex Mahadevan, director of MediaWise, likewise argues that fact-checkers should be fully integrated into social media moderation strategies. In his view, if platforms truly want to combat misinformation, they must stop marginalizing experts and work more closely with them.
Necessary adjustments
Despite its limitations, the Community Notes system has clear potential. Maldita's study indicates that notes citing recognized sources are published on average 90 minutes earlier than others. This suggests that drawing on professional fact-checkers both speeds up consensus and strengthens the notes' credibility.
To improve their impact, several changes are needed: revisiting the political-consensus criterion, giving more weight to contributions from experts, and surfacing factual notes more prominently before disinformation goes viral.
As other platforms consider adopting similar mechanisms, a balance must be found between citizen participation and journalistic expertise. By relying solely on crowdsourcing without professional support, Meta and X risk undermining the credibility of their initiatives and letting the disinformation they claim to combat flourish.