Section 230

Author

Jeffrey Westling
Former Resident Fellow, Technology & Innovation

BACKGROUND

Modern social media services, news sites, e-commerce platforms and many other websites are built around input from their users, including messages, comments, reviews and pictures. These communication and community-building features connect people across the globe in diverse ways and break down barriers that previously limited the free flow of ideas. Indeed, this connectivity has generated significant benefits: individuals can do business, pursue shared hobbies, discuss politics or even date online.

At the same time, this increased connectivity also facilitates manipulation, fraud and a myriad of other harms. While most interactive websites implement community guidelines and moderate user-generated content, these practices have increasingly come under fire, accompanied by calls for regulation.

SECTION 230 AND THE MODERATOR’S DILEMMA

At the center of these calls for regulation sits Section 230 of the Communications Decency Act. Section 230 was passed as part of the Telecommunications Act of 1996 in response to a dilemma created by the application of traditional intermediary liability law to online services. Courts have held that intermediaries like bookstores or newsstands are liable for the content of the materials they stock only if they have knowledge of the specific content in question. In Stratton Oakmont, Inc. v. Prodigy Services Co., however, the New York Supreme Court held that an online service that moderated any user-generated content had the requisite knowledge to be held liable for everything its users posted. To avoid liability, a website would have to avoid moderating altogether. This “moderator’s dilemma” threatened to disincentivize websites from removing harmful, hateful or otherwise objectionable content through internal self-regulation.

Congress passed Section 230 in response to this dilemma. The law contains two basic principles. First, the provider of an interactive computer service cannot be treated as the publisher or speaker of user-generated content. Second, a “Good Samaritan” provision makes clear that good-faith actions to moderate content will not lead to liability. These two provisions work in tandem to solve the moderator’s dilemma. However, the provisions are not all-encompassing, and the protections do not extend to areas such as federal criminal law or intellectual property.

CALLS FOR REGULATION

There are two main camps calling for changes to Section 230. For those worried about the spread of misinformation and other harmful content, Section 230 appears to provide too much leeway to companies without imposing responsibility on them. They argue that the business models of many of these companies, which reward content that draws attention, disincentivize the removal of harmful material. Calls for regulation from this camp therefore focus on imposing additional requirements and responsibilities on platforms before the protections apply.

On the other end, some worry that Section 230 gives companies too much power to remove speech, cutting Americans off from the public forums where ideas are engaged and debated. Indeed, many calls for regulation on the right focus on conditioning Section 230 protections on a platform remaining politically neutral or adhering to a First Amendment standard for moderation.

IMPORTANT CONSIDERATIONS FOR LAWMAKERS

Evaluate whether changing or repealing Section 230 addresses the specific problem

Often, altering or eliminating Section 230 is offered as a solution to a problem that is entirely unrelated to the statute. For example, platforms have a First Amendment right to remove content, and no reform to Section 230 would change that. Likewise, much of the speech that platforms remove is not illegal (such as spam and pornography), and Congress would run into significant First Amendment problems trying to force platforms to remove specific types of content. Identifying illegal content with certainty is also difficult for moderators, who often lack context and are not equipped to replicate infallibly the judgments courts make when matters of unlawful content reach litigation. As lawmakers contemplate changes to the statute, they should carefully consider whether the problem actually lies with Section 230 or with a separate issue such as antitrust, privacy or the Constitution itself.

Consider the practical implications of the proposed changes

Section 230 resolves the moderator’s dilemma, and by changing the regime, Congress could recreate it. Section 230 works because it allows cases to be resolved early in litigation, preventing companies from drowning in legal costs. Limiting the applicability of the liability protections could drive those litigation costs back up, forcing companies to choose between over-removing content to avoid any potential liability and allowing everything through, letting harmful content spread more freely.

While this may seem obvious, even innocuous changes can have significant effects. For example, if a reasonableness standard were applied, litigation would likely need to proceed into the discovery phase or beyond to determine whether the company’s actions were in fact reasonable. This additional cost, amplified by the sheer volume of content on these services, would likely drive smaller companies out of business and recreate the very problems Section 230 was designed to resolve.

Consider whether the proposed changes would limit competition

While network effects and access to data can make new entry a challenge, many innovators in the space have arisen to upset existing dominant players. Section 230 prevents litigation from driving these new entrants out of business before they have a chance to compete. And as competition grows, the market itself can discipline the moderation practices of the larger firms. Removing these protections threatens to entrench the dominant players in their existing market positions, limiting consumer choice and reinforcing existing harms.

CONCLUSION

The online ecosystem exacerbates many problems that already exist in society, but this unprecedented level of connectivity also drives the spread of ideas, community and free expression online. When Congress looks to address these harms, it should carefully consider the delicate balance that Section 230 strives to achieve.