Meta’s Bet on Decentralized Moderation Could Be Working
In January, Meta began replacing its fact-checking and content-removal program with a crowdsourced moderation system modeled on X’s Community Notes, and new data from the first few months of implementation show promising results. Meta kept the “Community Notes” name and built on X’s open-source code, but it has made its own changes to fit the service to its platforms.
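For a sense of what that open-source code actually does: at the core of X’s Community Notes scorer is a bridging-based matrix factorization, which models each helpfulness rating as a global mean plus a rater intercept, a note intercept, and a dot product of latent factors. A note earns a high intercept, and with it “helpful” status, only when raters who usually disagree both rate it well. The sketch below is a toy Python illustration of that idea; the synthetic ratings, hyperparameters, and plain SGD loop are assumptions for demonstration, not the production scorer.

```python
import numpy as np

# Toy sketch of the bridging-based matrix factorization behind X's
# open-source Community Notes scorer (github.com/twitter/communitynotes).
# Each rating is modeled as: mu + rater_intercept + note_intercept
#                            + rater_factor . note_factor
# A note whose learned intercept is high was rated helpful across
# viewpoints, not just by one faction. All data and hyperparameters
# here are illustrative, not the production values.

rng = np.random.default_rng(0)
n_raters, n_notes, dim = 50, 10, 1

# Synthetic helpful (1) / not-helpful (0) ratings: (rater, note, rating)
ratings = [(rng.integers(n_raters), rng.integers(n_notes), rng.integers(2))
           for _ in range(400)]

mu = 0.0
rater_b = np.zeros(n_raters)               # rater intercepts
note_b = np.zeros(n_notes)                 # note intercepts (the key output)
rater_f = rng.normal(0, 0.1, (n_raters, dim))  # latent viewpoint factors
note_f = rng.normal(0, 0.1, (n_notes, dim))

lr, reg = 0.05, 0.03
for _ in range(200):                       # plain SGD over all ratings
    for u, n, r in ratings:
        pred = mu + rater_b[u] + note_b[n] + rater_f[u] @ note_f[n]
        err = r - pred
        mu += lr * err
        rater_b[u] += lr * (err - reg * rater_b[u])
        note_b[n] += lr * (err - reg * note_b[n])
        new_uf = rater_f[u] + lr * (err * note_f[n] - reg * rater_f[u])
        new_nf = note_f[n] + lr * (err * rater_f[u] - reg * note_f[n])
        rater_f[u], note_f[n] = new_uf, new_nf

# In X's scorer, a note is shown once its intercept clears a threshold
# (roughly 0.4); disagreement between factions is absorbed by the
# factor term instead of inflating the intercept.
print("note intercepts:", np.round(note_b, 2))
```

The design choice matters: because agreement has to bridge viewpoints, a note cannot be voted to prominence by one side alone, which is what distinguishes this from a simple upvote tally.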
Meta came under intense public scrutiny over the last few years after removing content that was later determined not to violate its terms, or that turned out to be more factually accurate than it first appeared. The move to Community Notes signaled an effort by Meta to censor less content and instead add meaningful, high-quality context to information on the site.
The headline figure is startling: Meta reduced its own enforcement mistakes—defined as content that was removed but shouldn’t have been—in the United States by 50 percent between Q4 2024 and the end of Q1 2025.
Notes doesn’t deserve all the credit for this reduction; Meta explained that it also audited and disabled flawed automated systems that were moderating content too aggressively. Still, the program arguably allowed the company to be more lenient with content. Meta states that a main reason for the reduction in mistakes was “[r]equiring more confidence that content violates our policies before we remove it.” Improved automation aside, some of this success likely comes from the fact that Notes lets a platform be far more tolerant of borderline content while still supplying key contextual information to users.
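In concrete terms, “requiring more confidence” amounts to raising the score an automated classifier must reach before a removal fires, trading fewer false removals for leaving more borderline posts up, where a note can supply context. The sketch below illustrates that trade-off; the moderate function, the scores, and the threshold values are hypothetical, not Meta’s actual systems or numbers.

```python
# Hypothetical sketch of raising a removal confidence threshold.
# The classifier scores and thresholds below are invented for illustration.

def moderate(posts, remove_threshold=0.95):
    """Remove a post only when the (hypothetical) model is highly confident
    it violates policy; otherwise leave it up, eligible for a Community Note."""
    actions = []
    for post in posts:
        score = post["violation_score"]  # model confidence in [0, 1]
        if score >= remove_threshold:
            actions.append((post["id"], "remove"))
        else:
            actions.append((post["id"], "leave_up"))
    return actions

posts = [
    {"id": 1, "violation_score": 0.99},  # clear violation
    {"id": 2, "violation_score": 0.80},  # borderline case
]

# Under an aggressive 0.75 threshold both posts are removed; at 0.95 only
# the clear violation is, cutting false positives at the cost of leaving
# borderline content up, where a note can add context instead.
print(moderate(posts, remove_threshold=0.75))
print(moderate(posts, remove_threshold=0.95))
```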
Not every figure from the report was positive: Facebook saw a small increase in the prevalence of bullying and harassment (from 0.07 percent to 0.08 percent of content views), and violent and graphic content ticked up from 0.07 percent to about 0.09 percent. Any increase is concerning on the surface, but these movements are too marginal to be problematic in themselves or to justify reverting the quarter’s changes.
The shift to Community Notes shows that speech platforms like social media are dynamic systems, responding to user pressure and finding innovative ways to address concerns about content moderation and misinformation. While Notes isn’t a panacea for those concerns, it offers a legitimate way to significantly decrease the amount of content that platforms feel they need to censor. Taken together, the early results of Meta’s shift to Community Notes make a compelling case for decentralized moderation as a viable alternative to heavy-handed content control.