Billionaire businessman Elon Musk took the reins of Twitter earlier this year following extended corporate negotiations. Ever since Musk announced his intentions to pursue Twitter ownership, the subject has become a battleground for the American culture war. Musk’s transition to running the company—following the completed $44 billion acquisition—has forced a debate on social media content regulation and moderation. Twitter’s change in ownership and policy comes as the percentage of Americans who get their news from social media platforms continues to grow. This has prompted calls for increased government regulation of social media at both the federal and state levels. But as lawmakers mull over such proposals, they should question whether the government can truly help solve problems with social media.

Despite the widespread use of social media, it is important to remember that people are not forced to join these platforms; they voluntarily do so and agree to the terms. Moreover, these companies aren’t state-provided services or entities. Governments, therefore, shouldn’t interfere in their business dealings as long as users and the tech companies are operating legally.

Yet a host of progressives want to see increased government regulation because they claim social media companies aren’t doing enough to stop the spread of false information and hate speech. On the other side of the political divide, some conservatives are expressing concern about social media censorship and demanding government involvement. 

Legislators on both sides of the aisle have floated proposals to police social media companies’ content management policies. In fact, Sen. Lindsey Graham (R-S.C.) appears poised to introduce a measure—with support from some Democrats—that would create a social media licensing system, an appeals process for removed content, and unspecified programs for combating foreign misinformation campaigns.

Leaked documents from the Department of Homeland Security have recently revealed that the Biden administration targeted social media users spreading disinformation. And these federal attempts are hardly the extent of such activities. State governments have attempted to dictate how social media companies operate as well. 

Texas House Bill 20 is perhaps the most prominent state-led attempt to regulate the actions of social media companies. The measure, signed into law by Gov. Greg Abbott (R-Texas) last year, seeks to force large platforms to disclose their content moderation processes and to prohibit them from censoring the political viewpoints of their users.

The law was recently upheld by the 5th U.S. Circuit Court of Appeals, with judges stating that companies do not have a First Amendment right to censor the positions of users. A dissenting judge raised concerns about the lack of guiding legal precedent in the area of social media content moderation. The 5th Circuit later prevented the law from going into effect pending a Supreme Court decision. 

These attempts to regulate social media companies’ practices may look like valiant efforts to protect the individual rights of an untold number of people. After all, social media platforms are used by billions across the world, profoundly impacting civic life and political discourse. 

However, applying additional regulations on speech—either by limiting companies’ ability to moderate content or by requiring them to police additional content—places the government in the position of moderating individual speech in private spaces. The government of course has a role to play in protecting people’s privacy rights online, policing threats of violence and addressing the spread of illicit content. But when the focus shifts to misinformation and personal political views, the state should not play a role in determining what conduct is appropriate for social media companies.

Private companies are better placed to decide what content ought to remain on their platforms, and users should feel empowered to exercise their role in a market system by choosing to use only the platforms they find appealing. Market innovation has already shown the promise of free-market solutions to content moderation problems.

As citizens vote with their feet, these issues will sort themselves out. These are difficult conversations that society must have as we seek to build a more stable online community. If we outsource our individual ability and responsibility to determine truth to the government, we undermine the First Amendment rights guaranteed in the Constitution. If we cede that power to the state, we must recognize that we may never get it back. 