The repeal of Section 230 has broad ramifications beyond Big Tech
The reality is that repealing Section 230 would have harmful consequences for a broad spectrum of businesses.
It doesn’t apply only to social media companies and digital press outlets. It protects Grubhub, Airbnb, Peloton, Etsy, Walmart, Amazon, Bumble — any site that hosts user content, including reviews. It’s not hyperbole to say that a full repeal of Section 230 would mean our growing online economy as we know it would cease to exist.
Section 230 is part of the Communications Decency Act of 1996, a law that was gutted in a Supreme Court case the next year. What was left was, in large part, a “good Samaritan” provision — most notably, Section 230(c)(1), the most litigated portion of the law, which is just 26 words long. It says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” That means the users, not the service provider hosting user content, are responsible for what they say online, even if the provider engages in some amount of content moderation.
Without Section 230, any business that engages in any form of content moderation, such as removing spam or taking down inappropriate comments on a blog or website, could be held liable for user content. This creates what is known as the “moderator’s dilemma”: a perverse incentive for companies either to do no moderation whatsoever or to restrict user content heavily. No moderation means more undesirable content, spam, pornography, and harassment, making the internet a much less useful and safe place.
The alternative would be to moderate every user message, since the company would risk legal liability for each piece of content. That means rejecting any speech that is in any way controversial or critical, or that merely makes moderators uncomfortable. Whether content is unlawful, such as libel and other speech-related harms, is rarely open-and-shut; there is a reason such questions make their way to courts. Every moderation decision would risk expensive litigation from anyone who disagrees with it, effectively casting content moderators as fully accountable judges who must be immune from error and subjecting them to a heckler’s veto from anyone who wants content removed.
For example, dating sites would be riskier and less attractive for users without Section 230, as they would be unable to protect users from romance scams, a growing problem. Services such as Bumble continually fight to catch and remove scammers. But without Section 230, they might opt against moderation entirely for fear of litigation.
Other businesses depend on moderation so that customers can trust the legitimacy of the products they purchase, which encourages potential customers to frequent shops on the website. For example, Etsy is a home for all kinds of small businesses, particularly during the pandemic. In the third quarter of 2020, Etsy had 3.68 million active sellers, up from 2.65 million in the first quarter. It was recently reported that, in 2019, Etsy “removed or disabled access to more than 470,000 listings from nearly 97,000 sellers who didn’t meet its community standards.” Moderation at this scale is not easy for any company, particularly a smaller business.
Stack Overflow is a software developer community website where professionals and enthusiasts can ask questions and share insights. Knowledge exchange websites such as Stack Overflow offer large-scale social and practical value for free, whether one is trying to learn how to code or is looking for a spare tire for a used BMW. Stack Overflow has forums on virtually every conceivable software programming need. But because its users depend on the quality of the information they view, its moderation needs are complex. One former moderator published a blog post detailing the struggles of content moderation at Stack Overflow. He described how the service reprimands users for posting questions that are too broad or that don’t show the user has researched the question before posting. Such a culture might seem harsh, but it makes the website efficient and invaluable for all kinds of professional coders.
By design, Section 230 allows each company to moderate as it sees fit, in whatever manner best suits its business and its users. Some, such as Stack Overflow or the popular Wikipedia online encyclopedia, set a high bar for expertise and research. Others, such as Bumble and Etsy, focus on scams and fraudulent behavior.
And few businesses would thrive in an environment without such protections. Peloton, the networked bike company, has moved to prohibit political speech on its online forums after QAnon terminology appeared in its “leaderboards,” displayed while exercising and in other contexts. The very things people turned to during a global pandemic to stay healthy, such as Peloton or the AllTrails app and community, are at the same time also online services that must engage in content moderation and therefore depend on Section 230.
Even repealing Section 230 only for the largest services would have consequences for small businesses, which benefit from reaching specific audiences online through Facebook’s social media and Google’s search services. A bridal shop in Portland, Maine, can advertise on Facebook specifically to women in the Portland area who are engaged, for example, rather than blanketing all demographics with even a regionally targeted television ad. In 2019, Google reported that it drove 1.9 billion direct connections each month, “including phone calls, requests for directions, messages, bookings and reviews for American businesses.” Repealing Section 230 would risk harming not just Facebook and Google themselves but also every business built or supported through their platforms.
It’s reasonable to ask companies to do better and to expect elected officials to improve the internet experience. Transparency into moderation practices, accountability, and consistency in enforcing content policies can help preserve the freedom to innovate and promote diversity and heterogeneity in online experiences.
A total repeal or broad reform of Section 230 driven by partisan political fights, however, would have a significant negative impact on businesses, both large and small, that rely on its protections to flourish.