Strong Intermediary Liability Protections in the USMCA are a Big Win for Americans
While the president’s U.S.-Mexico-Canada Agreement (USMCA) covers a wide array of subjects, one provision is key: strong liability protections for online tech platforms, including Facebook and YouTube. The merits of exporting U.S. law through trade agreements may be questionable, but the importance of this protection is not, on both foreign and domestic fronts.
The law the agreement is effectively exporting is Section 230 of the Communications Decency Act, and its significance cannot be overstated. The law states that an interactive computer service such as Facebook or YouTube cannot be held liable for the content its users post.
Legislators included this protection to address a quirk of tort law that developed in the courts, most notably in Stratton Oakmont, Inc. v. Prodigy Services Co. (1995): if a platform moderated any user content, it was deemed to have constructive knowledge of all of it and could thus be held liable for any user post that was potentially defamatory. As a result, platforms would either have to remove any potentially controversial post to avoid possible lawsuits or not moderate at all. Under either approach, legitimate speech would suffer.
By clarifying that platforms are not the publisher or speaker of user-generated content, the law allows platforms to moderate user content that serves to harass or stifle the speech of others while giving them the flexibility to leave up potentially controversial but legitimate posts without fear that they will be sued.
Including this language in the USMCA, then, is not only good for the economy but also vital to protecting user speech on these platforms. If platforms face liability for offensive content in foreign jurisdictions, they may be forced to remove that content entirely, not just in the country that objects. Indeed, just as China can manipulate American basketball players’ speech through pressure on the National Basketball Association, foreign governments can exert pressure on internet platforms to remove speech they don’t like. Intermediary liability protections in trade agreements add a layer of protection against such foreign government jawboning.
Despite the importance of these protections, fears about Big Tech here at home still drive many to question the wisdom of including them. Unfortunately, most criticisms rest on misunderstandings of the statute and of the internet ecosystem.
First, some conservatives argue that Section 230’s liability protection is predicated on a platform being neutral, citing the legislative findings in the statute. Yet these findings, which are not binding law, only state that platforms offer a forum for a “true diversity of political discourse.”
Of course, this was true then and remains true today. Social media, facilitated by liability protection, has amplified political positions and perspectives that might otherwise have been stifled by their lack of mainstream appeal. But nowhere does the law state that platforms must be neutral. Imposing such a requirement would run counter to the goal of the statute: allowing platforms to moderate content based on market and consumer interests rather than political pressure from Washington.
Second, the law isn’t a gift to, or a sweet deal for, Big Tech. The protections in Section 230 apply to any service that allows users to post and share content. Whether it is an innovative service like Wikipedia, a small startup trying to compete with Big Tech, or a traditional news outlet that lets readers comment on its stories, everyone in the online ecosystem benefits from and relies on the protections Section 230 provides. Indeed, many nascent and smaller companies could not survive the significant litigation costs that Section 230 shields them from. Because regulatory costs fall hardest on smaller startups and on Big Tech’s competitors, removing Section 230 protection would only bury small platforms under the tort liability regime of the 1990s and further entrench the dominant firms in their current market positions.
Finally, it is important to remember that this isn’t a partisan issue: there is widespread bipartisan agreement on the importance of strong intermediary liability protections. In July, for example, a coalition of civil society organizations and experts who study the regulation of user-generated content sent a letter to lawmakers outlining the general principles that should govern liability for user-generated content. The letter urged lawmakers to ensure that any amendments to the law hold users, not platforms, primarily responsible for the content they create, and to avoid discouraging platforms from moderating content.
Including strong intermediary liability protections in the USMCA is a tremendous win for users. Moving forward, we shouldn’t let fears about Big Tech ruin what made the internet what it is today.