The Misinformation Congress Peddled at a Hearing to Combat Misinformation with Technology CEOs
At the end of March, two House Energy and Commerce subcommittees held a joint hearing titled “Disinformation Nation: Social Media’s Role in Promoting Extremism and Misinformation.” The title proved ironic: many of the very members of Congress conducting the hearing were spreading misinformation in real time across television, streaming platforms and social media.
Over multiple hours, several lawmakers continued to misunderstand the fundamentals of Section 230 of the Communications Decency Act; the intersection of free speech and online platforms; the role of government in protecting speech; who is legally responsible for content hosted on online and social media platforms; and the steps the various social media companies continue to take to remove harmful content and misinformation.
Let’s analyze what they got wrong one statement at a time.
1. The biggest whopper—ol’ faithful of American misinformation—made an appearance. “While it may be true that some bad actors will shout fire in a crowded theater, by promoting harmful content your platforms are handing them a megaphone to be heard in every theater across the country and the world,” said Rep. Frank Pallone (D-N.J.).
But the idea that one cannot “shout fire in a crowded theater” is actually a line from Justice Oliver Wendell Holmes, who wrote for the majority in Schenck v. United States that “The most stringent protection of free speech would not protect a man in falsely shouting fire in a theatre and causing a panic.” However, that case, in which the Supreme Court decided that Charles Schenck’s conviction under the Espionage Act for disseminating anti-war leaflets did not violate his First Amendment rights, was partially overturned by Brandenburg v. Ohio, which replaced Schenck’s “clear and present danger” standard with the stricter “imminent lawless action” test. As lawyer Ken White explains, “Holmes’ famous quote comes in the context of a series of early 1919 Supreme Court decisions in which he endorsed government censorship of wartime dissent — dissent that is now clearly protected by subsequent First Amendment authority.” Unfortunately, much like the famous geyser, this bit of misinformation gets spewed regularly by the likes of Sens. Herb Kohl (D-Wis.) and Chuck Schumer (D-N.Y.) and now-Justice Elena Kagan.
2. At another point, Rep. Michael Burgess (R-Texas) expressed confusion as to why Facebook was allowed to remove political opinions from both the left and right sides of the political spectrum. “Mr. Duncan eloquently pointed out there was restriction of conservative speech, and our colleague Angie Craig eloquently pointed out how there was restriction of trans-affirming speech,” said Rep. Burgess. “So that strikes me that we’re getting awfully close to the line of exercising editorial discretion. And forgive me for thinking that way, but if—and I’m sure I’m not alone in this—it does call into question then the immunity provided under Section 230. Maybe it is not the problem with the law itself, Section 230, maybe the problem is that the mission has changed in your organization and other organizations.”
However, Section 230 does not require that platforms operate in a viewpoint-neutral manner, nor should it. First, governments should never act as arbiters of what content is viewpoint-neutral, as they will always favor pro-government viewpoints. Second, requiring neutrality would violate the First Amendment. Third, platforms need flexibility to allow for various points of view and, further, not all bias is bad. AllTrails is a platform for hikers that removes all content that doesn’t help other hikers. The moderators of Stack Overflow, a software developer community platform, restrict content that they don’t think will be helpful to the community, such as questions where it’s obvious that the user didn’t try researching first. Groups on both the left and right should be allowed to have online communities catered to them if they so desire—it is not up to the government to decide one view is better than another. Finally, if both the left and right have examples of their content being restricted, as Rep. Burgess noted, it is strange to use those to argue that the platform is biased.
He continued: “Even if to the casual observer, it appears you’re exercising editorial authority and as such, maybe you should be regulated as a publisher as opposed to simply someone who is indifferent to the content they are carrying.” However, Section 230 does not require platforms to be “indifferent” to the content they host. Indeed, this entire hearing was held to ensure that they are not indifferent to many kinds of harmful content. Section 230 was created to ensure that platforms could moderate as they see fit and keep their communities safe, rather than be sued over harmful user content they failed to find.
3. Burgess also questioned why appending labels to tweets would not make Twitter liable as a publisher. “Mr. Dorsey, every presidential tweet that I read following the election had an editorial disclaimer appended to it by you,” said Rep. Burgess. “How does that not make you someone who’s exercising editorial discretion on the content that you’re carrying?” The answer is that Twitter is liable for any content it creates, such as warning labels; it simply isn’t liable for content created by its users. This is one of the basics of Section 230.
That being said, credit is due to Burgess for saying in the past that “Twitter, Facebook, and other online platforms are private entities and have a First Amendment right to regulate speech that they host.” This is a key point regularly missed by those discussing Section 230 and online content moderation.
4. A few lawmakers condemned these companies for not cooperating with law enforcement and asked them to do better. During an exchange with Rep. Gus Bilirakis (R-Fla.), Twitter CEO Jack Dorsey expressed confusion as to why the congressman might think Twitter was not actively working with law enforcement: “We would love to work with you in more detail about what you are saying, but we work with law enforcement regularly.” Twitter’s most recent public transparency report details its tens of thousands of interactions with government information requests and law enforcement requests. The congressman should know that anyone online can download that data.
5. Several members cited The Social Dilemma, a Netflix documentary that raised alarms about problems with social media but wove disinformation into its narrative to tell the story it wanted rather than a factual one. For starters, the documentary gave no room to dissenting viewpoints. As Mike Masnick writes in Techdirt, the film “uses straight up misinformation to argue that social media has some sort of godlike control on the people who use it. It’s nonsense.” He reminds viewers that Netflix—which produced and distributed the documentary—goes unmentioned in the film, even though it uses the very tools the film decries. Further, the reenactments throughout the film suggest that social media masterminds can radicalize an average child within two weeks. Masnick writes, “It is literally emotionally engaging misinformation designed to impact our beliefs and actions. In other words, the same thing that the film claims social media companies are doing.” The film even compared Hong Kong protesters to anti-vaxxers.
One of our favorite scenes from the movie is when a speaker confidently asserts that “no one got upset when bicycles showed up.” The Pessimists Archive, a Twitter account that archives “moral panic/luddism of yore,” disproved that statement in both a video and a podcast. In fact, this type of technopanic is all too common throughout history, including fears that society was being corrupted by recorded music, mirrors, the telegraph, umbrellas, electricity, novels, comic books, jazz music and even the waltz—just to name a few. It seems impossible at this point to invent a technology (or music genre) without someone being afraid of it—or going so far as to claim society is collapsing because of its influence.
6. Sidney “The Kraken” Powell, who said the 2020 presidential election was stolen, is now arguing in court that “reasonable people would not accept such statements as fact but view them only as claims that await testing by the courts through the adversary process.” Unfortunately, numerous members of Congress echoed her words for months after the election.
It is concerning that a hearing on misinformation contained so much of it. Further, there are more productive ways to spend time with the CEOs of top technology companies. For instance, Dorsey repeatedly referenced a “protocols over platforms” concept encouraged by scholars such as Masnick. The idea is worth real consideration and would give users a more dynamic online experience by letting them choose which integrations they do and do not want. Unfortunately, no member of Congress asked Dorsey about the idea, which he mentioned at least four times throughout the hearing. These and other CEOs regularly come before Congress to discuss social media, Section 230, misinformation and related topics. Next time they do, members ought to engage with ideas like Dorsey’s to understand technological solutions to the problems that concern them.