Response to: Senate Judiciary Committee Hearing, “The EARN IT Act: Holding the Tech Industry Accountable in the Fight Against Online Child Sexual Exploitation,” 116th Congress (March 11, 2020). https://www.judiciary.senate.gov/meetings/the-earn-it-act-holding-the-tech-industry-accountable-in-the-fight-against-online-child-sexual-exploitation.

By: Jeffrey Westling

Earlier this month, the Senate Judiciary Committee held a hearing on the “EARN IT Act,” a bill designed to combat the dissemination of Child Sexual Abuse Material (CSAM) by conditioning Section 230 protections on the adoption of a series of recommended best practices. While addressing this issue is an important and worthy goal, removing these protections may actually make the challenge of combating CSAM more difficult, while potentially limiting key privacy tools, such as encryption.

The EARN IT Act attempts to combat CSAM by limiting Section 230 protections for specific material if a platform fails to adhere to what proponents characterize as its basic moral responsibility. Specifically, the bill would create a 19-member commission of prosecutors, law enforcement personnel, victims, relevant experts, industry representatives and technologists, charged with developing a list of recommended best practices. The Attorney General (AG) would chair the commission and approve or deny the recommendations with the agreement of the Secretary of the Department of Homeland Security (DHS) and the Chair of the Federal Trade Commission (FTC).

The bill attempts to sidestep Fourth Amendment concerns by making adoption of these best practices an optional choice. As a practical matter, however, that “choice” is one in name only: any platform that refuses to adopt the recommended best practices loses its Section 230 protections for any content that violates federal CSAM law. This means that both local law enforcement and individuals can sue the company if it shares or distributes CSAM. Importantly, the EARN IT Act would also lower the “knowledge” requirement of these crimes to a “recklessness” standard, which means that a platform that consciously disregards a substantial risk that such content is being shared can be found guilty of transporting CSAM.

Throughout the hearing, a few common arguments emerged in favor of the bill’s adoption. Accordingly, we address them each individually in the following sections.

Argument #1: Technology companies have no sympathy regarding CSAM and its victims.

Unfortunately, as with SESTA/FOSTA before it, proponents of the EARN IT Act attempt to characterize opposition as callous (at worst) or ambivalent (at best) toward the harms that CSAM creates. Indeed, as Senator Lindsey Graham explained to a witness from the Internet Society: “I don’t buy anything you said about the tech companies and the internet ecosystem really caring.” In a similar vein, Senator Dianne Feinstein commented: “What’s amazing to me is that anyone should have a resistance to this.” In other words, opposing this bill is framed as some kind of moral failing. And, to be clear, statements like these likely come from a good place. As I explained during R Street’s live tweeting of the hearing, it is of the utmost importance not to dismiss the committee’s concerns or the work it is doing to prevent CSAM. And, on that front, the sponsors deserve commendation.

However, it is wrong to assume that platforms do not take the issue seriously—they do, although they could undoubtedly do more. Moreover, Congress has other avenues at its disposal to target CSAM.

Rather than punishing companies, Congress should focus on making it easier for platforms to develop tools that detect CSAM. For example, building detection systems that are highly accurate and efficient requires companies to preserve the offending content while the AI is trained to recognize it. As it currently stands, however, companies are only allowed to preserve such material for 90 days, which is scarcely enough time to produce the AI systems Congress and the public want, given the intricacies and nuances involved in training AI. In particular, to develop these CSAM detection tools, companies need massive sets of data so the artificial intelligence can detect patterns—the very data companies are forced to delete.

However, the patterns a model finds are essentially random at first, and researchers must test candidate versions over and over, keeping only the best—a technique called evolution by researcher selection. To aid in these efforts, Congress could extend the limit for holding the necessary data, which would give companies more time to train technical tools to identify similar content. As an added benefit, extending that period would also give law enforcement more time to act on reports of CSAM.
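To make that iterative selection process concrete, the sketch below is a minimal, hypothetical illustration, not any company’s actual tooling. It uses scikit-learn and synthetic feature vectors in place of real training material; the loop simply trains several candidate models, scores each on held-out data and keeps the best performer. It is exactly this kind of repeated training and evaluation that requires labeled data to be retained for longer than 90 days.

```python
# Hypothetical sketch: the iterative "train, evaluate, keep the best" loop.
# Synthetic feature vectors stand in for retained, labeled training material.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in for a retained, labeled dataset (features + labels).
X, y = make_classification(n_samples=5000, n_features=64, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=0
)

best_model, best_score = None, 0.0
# Candidate configurations start out as guesses; researchers keep whichever
# version scores best on held-out data and iterate from there.
for n_trees in (10, 50, 100, 200):
    candidate = RandomForestClassifier(n_estimators=n_trees, random_state=0)
    candidate.fit(X_train, y_train)
    score = candidate.score(X_val, y_val)
    if score > best_score:
        best_model, best_score = candidate, score

print(f"Best validation accuracy: {best_score:.3f}")
```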

Further, Congress could increase funding for the National Center for Missing and Exploited Children (NCMEC), the organization that works with both private industry and law enforcement to identify and remove CSAM. Currently, companies are asked to give NCMEC feedback on “false positives” in its database (which is to say, content that is reported as CSAM but isn’t).

However, more often than not, NCMEC doesn’t actually get around to correcting these errors. In light of this—and perhaps not surprisingly—companies come to see continued reporting as an exercise in futility. Additional resources, however, would allow NCMEC to act on these reports, ensuring that the database stays accurate and up to date and that the ecosystem functions more effectively for all parties involved.
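For a rough sense of how this feedback loop is supposed to work, consider the toy sketch below. It is illustrative only: real systems match uploads against NCMEC’s hash lists using perceptual hashes such as PhotoDNA rather than plain SHA-256, and the actual reporting pipeline is far more involved. The names and data here are hypothetical. The point is simply that a stale entry on the shared list keeps generating the same false positives until it is corrected upstream.

```python
# Toy illustration of hash-list matching with false-positive feedback.
# Real systems use perceptual hashes (e.g., PhotoDNA) and shared hash lists;
# everything here is hypothetical.
import hashlib

shared_hash_list = {"<listed hash>"}   # stand-in for entries on the shared list
reported_false_positives = set()       # feedback the platform has sent upstream

def check_upload(data: bytes) -> bool:
    """Return True if an upload matches a listed hash not yet flagged benign."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in shared_hash_list and digest not in reported_false_positives

def report_false_positive(data: bytes) -> None:
    """Record local feedback; until the upstream list is actually corrected,
    every platform must keep suppressing the same stale entry on its own."""
    reported_false_positives.add(hashlib.sha256(data).hexdigest())
```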

All of this is to suggest that platforms are not the enemy in this fight, and that those who oppose the bill also take seriously the challenges CSAM presents. Portraying this as a black-and-white moral debate obscures the bill’s practical problems, and failing to address those problems may actually make children less safe online.

Argument #2: The bill isn’t about encryption.

Because the bill would establish a Commission to develop recommended best practices, many opponents worry that what it would amount to—as a practical matter—is an attack on the use of encryption. These fears are compounded by the fact that the Attorney General, whose office has been critical of both encryption and Section 230 over the last few months, will chair the Commission and will ultimately implement the best practices. Because a direct attack on encryption has failed in the past, some have begun to refer to this as a back door to the back door.

Unsurprisingly, during the hearing, Senator Richard Blumenthal, one of the bill’s drafters and key proponents, made the case that encryption is not necessarily at odds with stopping CSAM on these platforms, and therefore the best practices may not even have to address it. In fact, he said, this bill never even mentions the word.

But, while the bill may not mention encryption on its face, its design allows the Commission, ultimately at the Attorney General’s discretion, to include prohibitions on end-to-end encryption among the recommended best practices. And while the drafters made some attempt to reduce the AG’s influence on the process, the problem remains: nothing in the bill prohibits a blanket ban on end-to-end encryption, and that silence leaves the matter open to interpretation. While some Senators may not believe this bill is about the AG’s attack on encryption, one of the drafters, Senator Graham, identified end-to-end encryption as a key issue during the hearing. It is likely that, knowing a ban on end-to-end encryption would receive significant pushback on its own merits, Senator Graham is using this convoluted method to work with the DOJ to target the practice by giving the AG leadership of the Commission and ultimate enforcement of the best practices. Again, this is not to suggest that Senator Graham or the AG’s office don’t care about identifying and removing CSAM, but, as Shoshana Weissmann has argued, their idea of combating the harm may be tied to ending the practice of end-to-end encryption.

Argument #3: The measure isn’t too draconian because companies have the choice to comply.

Proponents of the bill also argue that because it technically allows platforms to choose whether to comply with the best practices, they are not being forced to do anything. Under the bill, however, a platform’s continued protection from liability under Section 230 of the Communications Decency Act is contingent upon adherence to the best practices or reasonable measures relating to them. Given how heavily content moderation relies on these strong intermediary protections, this is not much of a choice at all.

On this point, the problem stems primarily from the “recklessness” standard the bill adopts for the underlying offense of handling or transporting CSAM. Currently, platforms are held to what is referred to as a “knowledge” standard, which effectively means they must have knowledge that they are transporting CSAM in order to be held liable. This is an important distinction because, as a practical matter, a platform that hosts a wide range of content posted by millions of individuals cannot be expected to have active knowledge of everything its users post in real time. And thus it can’t (and shouldn’t) be held immediately liable for what is, effectively, the content of others.

However, the EARN IT Act adopts a new “recklessness” standard, which means that even if a platform has no knowledge of CSAM on its service, it can still be held liable if it consciously disregards a substantial risk that such material is being shared. Such a standard is flawed at the outset because a substantial risk is inherent in any platform whose users can upload content autonomously and in real time. Moreover, the bill doesn’t provide solid guidance about what specifically would constitute “recklessness” as a practical matter. And while it’s true, as I’ve acknowledged elsewhere, that recklessness is likely hard to prove, the standard nevertheless overrides the moderator’s dilemma created by existing case law (wherein platforms can respond to the knowledge standard either by refusing to moderate at all or by over-removing content to avoid liability).

In practice, this means that platforms that choose not to comply with the recommended best practices, even small and fledgling ones, are forced to become moderators or risk liability. This would likely open internet platforms up to expensive and lengthy civil litigation, much of which could be built on nebulous and even frivolous claims. For a large platform like Facebook this may constitute less of an existential risk, but smaller startups may be shuttered before they ever get off the ground. In other words, the only “choice” being offered is simply to comply—or else.

Argument #4: The EARN IT Act is a better mechanism to protect children and society from the harms of CSAM.

As I’ve noted, supporters of the EARN IT Act are well-intentioned in their desire to do more to protect against the harms of CSAM. But the bill may actually be counterproductive in at least one key way, and that has to do with constitutionality. The Fourth Amendment protects against unreasonable searches and seizures, but currently platforms can implement measures to find and remove CSAM without running afoul of it, because they are private companies. If these companies are deemed to be acting as agents of the government, however, they become subject to the Fourth Amendment’s constraints.

For example, courts have determined that NCMEC is a government actor because federal law mandates that it maintain the CyberTipline and collaborate with law enforcement. Therefore, when NCMEC passes along reports about private citizens’ activities, Fourth Amendment protections apply. If this bill effectively forces platforms to adopt the best practices, they too may be deemed government agents and their activities made subject to the Fourth Amendment. And, as Shoshana Weissmann notes, should this happen, many positive actions they’re already taking to detect and remove CSAM, such as e-mail providers scanning and filtering for it, would suddenly become unconstitutional.

There is no doubt that Congress should continue its efforts to reduce the spread of CSAM, and there are many different avenues it can explore to do so. However, the EARN IT Act is the wrong approach. Rather than empowering platforms and lawmakers to eliminate this content, the bill would actually make the process more nebulous and difficult, while opening the door for major breaches of user privacy.
