The R Street Institute respectfully submits these comments in response to the Notice of Proposed Rulemaking published July 8, 2022, regarding the California Privacy Protection Agency’s (CPPA) proposed changes to the California Code of Regulations, which would align the existing California Consumer Privacy Act regulations with the California Privacy Rights Act of 2020 (CPRA).

We appreciate the challenge facing the Agency: In many ways, the CPRA directly mandates specific changes to the regulations, whether or not those changes will produce better public policy or better outcomes for Californians. In most instances, we regard the proposed regulations as a reasonable attempt to implement the adopted law, but we note some problematic exceptions for which we propose an alternative approach.

I. Context

The new privacy regulations are being debated in California even as Congress considers a federal data privacy and security law, which could potentially render all or part of this work moot through federal preemption.[1] At the R Street Institute, we believe a comprehensive federal data privacy and security law is essential for national security, consumers and industry, but we also believe there is a role for states. R Street recently offered multiple recommendations for passing a federal law that addresses traditional roadblocks through compromise, informed by more than 120 stakeholder engagements.[2]

Part of the compromise requires finding a middle ground between state and federal privacy enforcement and applicability. While we believe in strong preemption to create a uniform federal standard, there should be carve-outs for select state legislation, room for state enforcement and a role for state data protection authorities like the CPPA.[3] From that perspective, it is worthwhile to continue with this process of developing CPPA rules despite the possibility of a federal privacy law, as these efforts are not necessarily contradictory.

As another crucial contextual note, the consideration of cybersecurity provisions and risk assessments is critical to an effective law and the protection of data. However, the Notice of Proposed Rulemaking states that rules on these topics will be covered by a future rulemaking. While we understand the need to limit the scope of this rulemaking, these sections should be prioritized because there can be no privacy in practice without security, and businesses may otherwise have inadequate guidance to conduct audits and assessments. Consider a company that transparently informs its users of its data practices but maintains weak access permissions for the information it collects: it would have inadequate defenses against unauthorized access and would be unable to provide adequate data protection to its customers.

To mitigate this, a symbiotic relationship between security and privacy should be fostered. This does not mean that the rulemaking now has to mandate certain encryption standards, for instance, but cybersecurity should be taken into consideration at all stages and not shelved for future action.

II. Consequences

The proposed regulations will be costly for California businesses and, in turn, for Californians. The CPPA estimates that the proposed regulations will have a cost impact of $127.50 per business, representing the labor cost of updating website information.[4] This estimate is misleading, in part because it assumes businesses are already in compliance with current law and addresses only the new economic impact, with the cost of existing regulations attributed to previous filings. Even beyond that legacy burden, the scale and complexity of the new requirements alone would seem to require not only the drafting of information, but also revamped internal processes to ensure the requested data is available and accurate, and likely legal counsel review to ensure compliance. Taken together, the true cost and burden of compliance would seem to be significantly higher, especially for smaller businesses. Given that these regulations are estimated to affect over 66,000 businesses in California, nearly 44,000 of them small businesses, both the individual and collective costs of compliance will be significant in a way that dwarfs the nominal regulatory estimate.[5]

Even more than the implementation cost, R Street is concerned by the possibility of rules that will drown users in excessive and unusable information. In the long history of privacy policy work, perhaps no challenge is more insidious than over-sharing. Numerous studies, such as a 2017 article co-authored by usable privacy expert Professor Lorrie Cranor, indicate that attempts to provide users with all information that may be relevant to a consumer decision are ineffective.[6] From the history of corporate privacy policies to the European Union’s infamous “cookie directive,” forcing users to confront significant information at the outset of engagement rarely achieves the right balance of informing and empowering effective consumer choice.

From that perspective, section 7011 mandates that a substantial volume of specific information be included within privacy policies. While the language includes ample softening descriptors like “explanation” and “in a manner that provides consumers a meaningful understanding” (e.g., 7011(e)(1)(C)), it seems implausible that ordinary consumers will spend the necessary time to read an individual company’s explanations no matter how plain the language. Similarly, few if any consumers are likely to compare the “categories of personal information the business has collected” (7011(e)(1)(A)) across similar services in order to choose among possible market options. The attempt at homogenizing privacy policies reflected in these regulations appears intended to make such comparisons more feasible, but for the everyday consumer it will more likely have the opposite effect, making it harder for a company to compete on the clarity and efficacy of its privacy policies and on how it frames its privacy strengths to the consumer.

To illustrate the fragility of the overly specific notice obligations in the proposed rules, proposed 7011(e)(6) requires a business providing a notice at the point of data collection to inform the user of any other businesses (“third parties”) who “control” the collection of such information. Yet proposed 7012(g)(1) requires all such third parties to also provide a notice to the user “at collection.” If “at collection” means that users must have visibility into the third parties’ notices when viewing the first-party website, users would be presented with a notice from the first party naming or describing the practices of any third parties involved, plus a separate notice from each of the third parties collecting data: double notification regarding each third party. Loading Yahoo’s homepage in Firefox (updated to its most current version) at the time of this writing, twenty-four (24) separate domains with tracking content are identified; each is designed to collect information from the user, although there is overlap among the parties providing them. Judiciously narrowing this set down to, say, ten unique third parties, a user would be presented with one notice from Yahoo identifying all ten of these parties, plus ten additional, separate notices.

An alternative interpretation of the rules that limits redundancy would follow the example presented in 7012(g)(4)(A), where the third party at issue, “Business G,” is directed to provide its notice “on its homepage,” meaning the website of the analytics service. Such a website would presumably be designed for Business G’s customers (other businesses), not end users, and it is unclear whether any end user would visit Business G’s website in order to read such a notice. It is also unclear whether doing so would turn the relationship into a first-party one, since the user would now be aware of Business G, visiting its website, and expecting to interact with it.

A more general notice that gives more opportunity for businesses to tailor notices related to third party collection and control, in a streamlined manner optimizing for utility to the user, would likely be more effective in practice than the specific guidance offered in the proposed rules.

III. Other concerns

The CPPA is obligated to implement the CPRA’s prohibition on “dark patterns”: user interface designs that steer users toward business-preferred choices. In theory, this goal is commendable and helps ensure the smooth and accurate operation of markets through informed and effective consumer choice. However, the proposed implementation rules regarding “symmetry in choice” would impose paternalistic and artificial limitations on product design that go beyond what is needed to implement the CPRA obligation. Several of the proposed examples create vague risk and the possibility of user frustration:

  • Example (A) imposes a hard limit on the number of clicks involved in making a choice, an artificial limitation where some options involve sub-options, or where a selection would be better informed by presenting the user with additional information before a decision is made.
  • Example (C) presumes that a user would only wish to choose “Accept All” or “Decline All,” whereas modern practice typically gives users more choice than this, including the ability to allow, for example, analytics collection from which the website operator may benefit. Ignoring the possibility that the user may wish to support the website operator, while still being protected from cross-site tracking, unnecessarily places the user and the website in a “hostile by default” relationship.
  • Examples (D) and (E) both impose a vague limitation that a business-preferred option not be presented in a more “eye-catching color” than others. Even setting aside that this bears no relation to the rule itself, which is limited to the length of the path that must be followed (not the visual appeal of the path), the requirement is vague, will undoubtedly be the subject of litigation, and is likely to lead businesses to force uniform button colors, unnecessarily and arbitrarily. By extension, would positioning the “Accept All” button on the left be viewed as preferential, given that English is read from left to right? Presumably this would not be viewed as an unfair advantage or an undue skew of a user’s fair choice.


IV. Conclusion

While the proposed regulations tackle an admirable goal, seeking to offer useful clarity and examples to help businesses comply, their excessive specificity, potential redundancy and occasional vagueness, layered on top of the volume of notice and obligation already imposed by the law, together create unnecessary risk of unjustified litigation and a likelihood of over-compliance. The result goes beyond protecting users and data, producing unhelpful homogenization and a deluge of detail that will not leave consumers feeling more informed or empowered.

We recommend that CPPA consider modifications to the proposed rules as follows:

  • Prioritize cybersecurity alongside privacy to invest in total user protection;
  • Streamline, reduce, and uplevel notice obligations as much as possible within the confines of the statute, giving businesses room to invest in meaningful user notice; and
  • Scale back and clarify “symmetry in choice” requirements to realize the statutory duty of limiting “dark patterns” without unnecessary and harmful restrictions on user interface design.

Whether for good or ill, CPRA is part of California law, absent future approved propositions or amendments to the state constitution. The proposed regulations are a reasonable start to providing clarity in the implementation of CPRA, though further improvements and tailoring would help minimize unnecessary obstacles and risk.

Respectfully submitted,

Chris Riley

Senior Fellow, Internet Governance

Brandon Pugh

Senior Fellow, Cybersecurity and Emerging Threats

Sofia Lesmes

Senior Research Associate, Cybersecurity and Emerging Threats

R Street Institute
1212 New York Ave. N.W., Suite 900
Washington, D.C. 20005

Contact: [email protected]


[1] Brandon Pugh and Sofia Lesmes, “Marking Up Momentum: What’s Next for the ADPPA,” R Street Institute, July 21, 2022.

[2] Tatyana Bolton et al., “The Path to Reaching Consensus for Federal Data Security and Privacy Legislation,” R Street Institute, May 26, 2022.

[3] Tatyana Bolton et al., “Preemption in Federal Data Security and Privacy Legislation,” R Street Institute, May 31, 2022.

[4] “Notice of Proposed Rulemaking regarding implementation of the Consumer Privacy Rights Act of 2020,” California Privacy Protection Agency, July 8, 2022.

[5] “Economic and Fiscal Impact Statement regarding proposed California Consumer Privacy Act Regulations,” California Privacy Protection Agency, June 28, 2022.

[6] Florian Schaub et al., “Designing Effective Privacy Notices and Controls,” IEEE Internet Computing, June 16, 2017.
