On July 20, the House Energy and Commerce Committee held a markup for the American Data Privacy and Protection Act (ADPPA). The bill was reported favorably by the committee on a 53-2 vote after an Amendment in the Nature of a Substitute (AINS) was adopted, representing significant progress for federal data privacy and security legislation. Eleven amendments to the AINS were proposed, with six passing, one failing and four being withdrawn. This marks continued movement on privacy legislation after years of on-again, off-again negotiations, and it shows political will from both sides of the aisle to move privacy legislation forward.

The bill’s newest version has no shortage of edits: from expanded sensitive data categories to varying knowledge standards relating to children’s activity online. But there are three specific changes that reflect the bill’s larger journey and efforts at compromise through the most recent AINS.

On Preempting State Law

Providing a uniform federal law—instead of a patchwork of state laws—is clearly a goal of the ADPPA, as multiple members stressed during the markup. We have called for this in our privacy research as well. However, some continue to advocate for a law that serves as a floor: Rep. Anna Eshoo (Calif.-18) proposed an amendment at the markup to allow states to create stricter provisions than the potential federal law. The proposal failed by a 48-8 vote, with some members, such as Chairman Frank Pallone (N.J.-06), going so far as to say that the amendment would undermine the bill.

The adopted amendments try to further refine this balance. For example, the California Privacy Protection Agency would be able to enforce the ADPPA like it would otherwise enforce the California Consumer Privacy Act. They also expand the list of state laws preserved to add public health activities, reporting, data or services; laws pertaining to encryption; and new categories of civil laws. In addition, the issue of regulation by the Federal Trade Commission and the Federal Communications Commission was further clarified, but some members asked for additional reassurance that the two entities are not both regulating in the privacy space.

At the same time, the list of federal laws not impacted by the ADPPA was expanded to include the Confidentiality of Alcohol and Drug Abuse Patient Records rules and the Genetic Information Nondiscrimination Act, and the applicability of the Family Educational Rights and Privacy Act was clarified.

On a Private Right of Action  

Whether a federal law should even include a private right of action (PRA) was up in the air until the ADPPA, but some have critiqued the bill as having “major enforcement holes” or structural issues. The amended version seeks to provide more middle ground. To assuage consumer-advocate concerns, for example, the PRA’s waiting period was halved from four to two years. For business-minded stakeholders, the markup brings good news for smaller shops by providing an exemption from the PRA for covered entities that have less than $25 million in annual revenue, process data on fewer than 50,000 individuals, and derive less than 50 percent of their revenue from transferring data.

On a more global level, the bill now prohibits pre-dispute arbitration agreements regarding claims related to gender or partner-based violence or physical harm; some continue to call for a full ban on arbitration. Also, declaratory relief was added as an option for courts, and a new requirement for entities to demonstrate to a court that they cured a violation was incorporated.

On Countering Algorithmic Discrimination 

The ADPPA contains provisions around discriminatory uses of data and the impact of algorithms, with the latter receiving multiple edits. Consider the threshold for conducting algorithmic impact assessments, which has evolved from applying to a large data holder that “uses an algorithm,” to algorithms that “may cause potential harm,” to algorithms that pose a “consequential risk of harm” to individuals or groups of individuals. These shifts aim to strike a balance between requiring entities to conduct impact assessments and design evaluations for all or most algorithms and requiring them only for algorithms that involve important decisions or have significant effects, although the exact meaning of “consequential risk” is not clear and should be further defined. A similar change was made to the algorithm design evaluation provisions, where language expressing a preference for external auditors or researchers conducting the evaluations was also removed.

These three changes reflect blended feedback from stakeholders across the board, consistent with our research on reaching a compromise. And this feedback is not just from the period since the discussion draft’s release on June 3; it reflects years of discussions, disagreements and impactful research on privacy legislation. We acknowledge that the amended version is not perfect and that work should continue, a perspective conveyed by several members who spoke at the markup. With plenty of effort on the part of the House Energy and Commerce Committee, the baton now passes to the full House—and potentially the Senate thereafter—to continue the journey.

Buy-in from key senators is lacking, so time will tell if this second round of amendments helps alleviate concerns. And as the bill continues to shuttle through Hill procedure, stakeholders must decide whether the United States can continue without federal data privacy and security legislation or whether a law rooted in compromise—keeping in mind that compromises don’t leave everyone completely satisfied—will become the uniform standard. The future of American data privacy and security depends on it.