Meta’s shift to a Community Notes model proves we can fix big problems without big government
This analysis is in response to breaking news and has been updated. Please contact pr@rstreet.org to speak with the author.
Meta’s recent decision to replace its third-party fact-checking program with a user-driven “Community Notes” system modeled after X (formerly Twitter) highlights the importance of market dynamics and consumer demand in shaping the content moderation policies of even the largest social media platforms. This shift demonstrates that social media companies are capable of creating solutions to complex problems without the need for a heavy-handed public policy response.
Social media platforms have been under increasing pressure to address their role in spreading misinformation and to explain how they determine what gets flagged. They have long faced a seemingly intractable problem known as the “moderator’s dilemma,” in which any decision a moderator makes will be criticized by someone. Users were particularly dissatisfied with Meta’s centralized approach to content moderation, in which third-party fact-checkers made most of the determinations through a monolithic, rigid, and nontransparent process.
These frustrations boiled over into proposals to completely rework foundational internet laws like Section 230. The proposed changes would strip important liability protections from speech platforms and effectively shift more content moderation decisions from private platforms to some form of government regulation.
In 2021, while Congress considered its options, X rolled out what would eventually become Community Notes: a system in which any user can flag a post for review by other designated contributors. If a reviewer finds that the post contains misinformation, they can attach a contextual note to it. While X still deletes certain content it deems extreme, Community Notes allows the platform to censor less information by letting the community decide what should be flagged. The system may not be perfect, but the few studies conducted on Community Notes show its promise in helping address misinformation.
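The mechanics matter here. X’s published scoring approach does not surface a note simply because many users endorse it; it looks for “bridging” agreement, favoring notes rated helpful by contributors who normally disagree with one another. The sketch below is a deliberately simplified illustration of that idea, not X’s or Meta’s actual implementation: the `Rating` type, the `note_is_shown` function, the one-dimensional `rater_viewpoint` score, and the thresholds are all hypothetical stand-ins for the more sophisticated model X actually uses.

```python
# Toy illustration of "bridging-based" note scoring: a note is surfaced
# only when contributors from across the viewpoint spectrum rate it
# helpful. This is NOT the production algorithm, which learns viewpoint
# factors from rating history rather than taking them as given.

from dataclasses import dataclass

@dataclass
class Rating:
    rater_viewpoint: float  # hypothetical score in [-1, 1]
    helpful: bool           # did this contributor rate the note "helpful"?

def note_is_shown(ratings: list[Rating],
                  min_ratings: int = 5,
                  min_helpful_share: float = 0.7) -> bool:
    """Show a note only if (a) enough contributors rated it, (b) most
    found it helpful, and (c) helpful ratings came from both sides of
    the viewpoint spectrum -- the bridging requirement."""
    if len(ratings) < min_ratings:
        return False  # not enough data to judge the note

    helpful = [r for r in ratings if r.helpful]
    if len(helpful) / len(ratings) < min_helpful_share:
        return False  # the note is not broadly seen as helpful

    # Bridging check: helpful raters must span both viewpoint poles.
    left = any(r.rater_viewpoint < 0 for r in helpful)
    right = any(r.rater_viewpoint > 0 for r in helpful)
    return left and right

# A note praised only by one side is not shown...
partisan = [Rating(-0.8, True), Rating(-0.6, True), Rating(-0.9, True),
            Rating(-0.7, True), Rating(0.5, False)]
print(note_is_shown(partisan))   # False

# ...while one rated helpful across the spectrum is.
bridging = [Rating(-0.8, True), Rating(-0.4, True), Rating(0.6, True),
            Rating(0.9, True), Rating(0.1, True)]
print(note_is_shown(bridging))   # True
```

That design choice is what gives Community Notes a claim to legitimacy that a centralized board lacks: a note appears only when people who usually disagree converge on it.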
One study sampled 205 Community Notes and conducted surveys that showed “strong agreement on note topics (90% agreement), source credibility (87% agreement), and accuracy (96% agreement).” Another study showed that the Community Notes process significantly increased user trust, calling it “an effective approach to mitigate trust issues with simple misinformation flags.”
Meta’s Chief Global Affairs Officer, Joel Kaplan, acknowledged the effectiveness of X’s community-driven model, stating, “We’ve seen this approach work on X—where they empower their community to decide when posts are potentially misleading and need more context.” In short, Community Notes are likely the best tool currently available for platforms to address misinformation.
While recent elections may have highlighted the challenges of content moderation, attributing Meta’s policy change solely to public policy pressures oversimplifies the situation. The primary drivers of this shift are the inherent market incentive to provide accurate information and the necessity of maintaining user trust. As CEO Mark Zuckerberg stated in a video announcement, “[t]he recent elections also feel like a cultural tipping point towards once again prioritizing speech,” indicating that the elections served as a catalyst rather than the cause.
However, the success of Meta’s new system will depend on its implementation. Ensuring that notes reflect a diverse range of perspectives and that the system resists manipulation will be critical. Meta’s plan to phase in Community Notes across the United States over the next few months (with improvements expected throughout the year) indicates an awareness of these challenges and a commitment to refining the system.
In a competitive market, companies must adapt to consumer demands for transparency and reliability. Meta’s adoption of Community Notes exemplifies how private enterprises can innovate and self-regulate in response to market signals without the need for government intervention.
Finally, market-driven solutions can play a pivotal role in addressing content moderation challenges without removing fundamental internet laws or imposing a burdensome new regime. The government’s role would then be to assist the private sector in the study and design of these systems rather than regulating them from the outside.