BEFORE THE FEDERAL TRADE COMMISSION

In the matter of: Competition and Consumer Protection in the 21st Century Hearings, Hearing #12: The FTC’s Approach to Consumer Privacy

Docket No. FTC-2018-0098

COMMENTS OF THE R STREET INSTITUTE

Last November, the R Street Institute[1] (“R Street”) filed comments with the National Telecommunications and Information Administration[2] in response to the administration’s request for comments on a variety of issues relating to consumer privacy.[3] R Street has also submitted numerous letters to a variety of congressional hearings on consumer privacy issues.[4] These comments summarize and expand upon R Street’s prior commentary to offer five key considerations for policymakers to take into account when evaluating different approaches to protecting consumer privacy: (1) effects on competition, market concentration, and startup entry; (2) effects on the development of new technologies such as artificial intelligence; (3) effects on cybersecurity and national security; (4) institutional implementation of privacy law at the Federal Trade Commission; and (5) the relevance of individual liberty to privacy legislation.

The Commission should carefully consider these principles when bringing enforcement actions or educating lawmakers.

     I.         Effects on Competition, Market Concentration, and Startup Entry

While the focus of these comments is consumer privacy, several commenters and policymakers have recently expressed concern about market concentration in many sectors of the economy[5]—including online services[6]—and the enormous monetary value of aggregate personal data.[7] These privacy and competition concerns are inextricably linked, so increasing protections in one area may impose limitations or harms on the other. Therefore, before embracing any new privacy proposal, it is essential to first consider the effects such a proposal would likely have on competition, market concentration, and startup entry.

Components of existing and proposed privacy laws cut in multiple directions with respect to competition. For example, one commonly proposed idea under the umbrella of commercial privacy is to mandate “data portability,” which entails enabling users to collect data (posted content, contact information, friend networks, etc.) from one service and port those data to a competing service provided by another firm.[8] Such provisions could enhance competition by avoiding lock-in effects and lowering switching costs to promote consumer choice. But these provisions could also restrict competition by foreclosing certain business models and limiting the ability of startups to disrupt industry incumbents with newer and better services. Open platforms and so-called “walled gardens” offer different benefits to users and come with different costs; consumers should generally be free to choose the services that best meet their needs. For that reason, imposing a one-size-fits-all approach to data portability on the entire ecosystem—even if done with the goal of promoting competition—may actually limit consumer choice and hurt competition over the long run. Policymakers should therefore tread carefully when considering ostensibly procompetitive regulations like data portability, as they may not always have the desired effect.

Similarly, privacy rules may impose substantial compliance costs on firms that raise barriers to entry for startups and smaller firms,[9] thereby favoring industry consolidation and likely diminishing competition.[10] A requirement to maintain a dedicated privacy officer within a firm,[11] for example, would impose a minimal burden on established firms with thousands of employees. However, startups with only a handful of employees would struggle to meet that requirement, likely forcing them to divert precious capital away from research and development, marketing, or other areas into legal compliance merely to avoid liability.

For that reason, evaluation of any privacy proposal must include consideration of any anti-competitive effects that it may impose, especially on small and medium-sized firms. One possible solution would be to set a minimum threshold on firm size before a relevant privacy regulation becomes operative against a firm.[12] This is certainly better than the one-size-fits-all approach. Yet it could potentially lead to the odd consequence of encouraging small firms to avoid growth, leading in turn to a bifurcated environment of very small startups below the threshold and highly concentrated giants above it.[13]

It is also worth noting that competition itself can motivate firms to protect user privacy. Companies can use privacy as a tool for differentiating their products and services from those of their competitors; Apple and DuckDuckGo are common examples.[14] Allowing the market to experiment with different levels of privacy protections enables consumers to make choices that most closely align with their privacy preferences.[15] To that end, it may be desirable to harness market forces in this area in order to push companies to take up various privacy proposals. This could include, for example, transparency requirements that direct companies to clearly and concisely disclose their privacy practices to consumers, allowing those consumers to make informed choices about which services they would like to use and then vote with their feet.

   II.         Effects on Development of New Technologies Such as Artificial Intelligence

Many artificial intelligence (“AI”) systems require access to large datasets—such as those including personally identifiable information—in order to improve. As AI continues to develop, high-quality training datasets used in machine learning (“ML,” a branch of AI) will only increase in importance. Many of the most valuable datasets used in ML training come from traditional technology platform services like Facebook, Google, and Amazon, as consumers interact with and reveal their preferences through these platforms in a variety of ways.[16] There is no doubt that this wealth of data has produced services of substantial value to the economy and to consumers. Advanced search technology and voice recognition are existing services that have resulted from applying AI techniques to volumes of consumer data; highly automated vehicles and new medical treatments are promises on the horizon for the same reason.[17]

Insofar as more stringent privacy rules would reduce the collection, analysis, and use of these consumer datasets, policymakers should think carefully about the trade-off these rules may entail between privacy on the one hand and AI performance and innovation on the other. Scholars have already determined that data-use restrictions—in the form of copyright and trade secret laws—can have serious consequences for AI development, both by limiting development of technologies overall and by potentially leading to pernicious inaccuracies that exacerbate societal biases.[18] Privacy laws could have a similar effect if not drawn up with attention to this possibility. Indeed, several researchers have noted that European privacy laws providing a “right to be forgotten” are difficult to implement when the data to be “forgotten” have already been used to train ML models: because a trained model’s parameters encode information derived from its training data, honoring a deletion request may require costly retraining.[19]

This is particularly relevant as we consider the international competition in AI development taking place between the United States and China. China, which is currently investing billions of dollars into research and development in an attempt to become the world leader in AI by 2030,[20] is actively cultivating new datasets without serious regard for consumer privacy[21] and handing them over to its domestic corporations. While China’s disregard for privacy is no justification for doing the same in the United States, there is certainly a need to be cautious about inadvertently hamstringing American competitiveness in AI through an unnecessarily stringent privacy regime.

 III.         Effects on Cybersecurity and National Security

Privacy is an important cybersecurity and national security issue in several respects. First, as noted above, access to large datasets (sometimes referred to as “Big Data”) containing private information and/or biometrics (such as facial images) can drive development of AI, which has many important military, intelligence, law enforcement, and economic implications. Second, protecting privacy through more effective cybersecurity will reduce the risk that adversaries will steal data for the purpose of, for example, identifying intelligence officers, human sources, and recruitment targets. Third, the combination of emerging technologies such as 5G networks and the Internet of Things will result in an explosion of highly personal digital data that adversaries will likely try to exploit for various malign purposes, including AI development.[22] Finally, adversaries will likely seek to steal this personal data and weaponize it in order to develop more effective means of breaching cybersecurity (such as through more persuasive and genuine-looking phishing emails) or to develop more targeted and effective influence operations to undermine elections and other democratic institutions.

Perhaps based on Benjamin Franklin’s misunderstood quip that those who sacrifice liberty in order to obtain temporary safety “deserve neither Liberty nor Safety,” many assume that privacy and security are a zero-sum trade-off. Yet when drafted correctly, privacy laws can potentially enhance national security and cybersecurity in at least two ways. First, to the extent that privacy regulation encourages good cybersecurity practices to prevent data breaches—especially breaches of critical infrastructure and information technology systems—that regulation also protects national security. Second, in view of the relationship between AI technologies and Big Data described above, data—especially data on Americans—are potentially of great value to foreign adversaries.[23] Privacy regulations that enhance cybersecurity and prevent unmonitored exchange of those data can thus help forestall malicious uses of those data against national interests.

Thus, it is certainly possible to adopt systems that enhance both privacy and security. Policymakers should, as a first-order matter, look to privacy regulations that strengthen both individual privacy and cybersecurity, which will in turn protect national security interests.

At the same time, there is no doubt that privacy regulations, even commercial ones, can impede otherwise appropriate and necessary law enforcement or intelligence activity. In that sense, increases in privacy can necessitate decreases in security, and vice versa. Where trade-offs are necessary, policymakers should be cautious in assessing the balance of harms. New privacy legislation or regulations should not, for example, enable an increase in malicious cybercrime, as some fear that recent European regulations may have done.[24] Predictive judgments regarding such effects are difficult, to be sure, but at a minimum, legislative consideration should include a careful evaluation of the national security and cybersecurity implications of any privacy proposal.

 IV.         Institutional Implementation of Privacy Law at the Federal Trade Commission

As important as the substance of any privacy proposal is the way in which it is implemented. There is a general consensus that the FTC, at the very least, should be involved in privacy enforcement, in no small part because the agency has already used its authority over unfair or deceptive acts or practices to bring hundreds of privacy cases to help protect consumers harmed by companies’ data practices.[25] Thus, we identify several important areas for examination with regard to the FTC in particular.

a.     Statutes, rules, and standards

The level of specificity in privacy proposals is important to consider, as different approaches come with different costs and benefits. For example, Congress could codify specific privacy rules into statute, or the FTC could codify specific privacy rules into the Code of Federal Regulations—either under Section 18[26] or via a new grant of authority from Congress. The former approach has historically been problematic because statutes are difficult to change and thus quickly become outdated.[27] Agency rulemaking is more flexible and thus more adaptive to current practices. Yet to the extent that additional rulemaking authority is granted, policymakers should be mindful of the risk that privacy enforcement becomes politicized or ineffective.

Congress constrained the FTC’s general rulemaking authority for good reason,[28] so policymakers should avoid repeating the mistakes of the past by giving the FTC, or any other enforcement agency, too much authority over privacy. The best way to address this issue would be for Congress either to codify general privacy standards into statute for the FTC to enforce case by case, or to grant the FTC additional rulemaking authority that is carefully limited to specific privacy issues.

The case-by-case approach is how the FTC enforces Section 5 today, so it would be relatively straightforward for Congress to provide the FTC with broad privacy standards that the agency can then interpret and apply in specific enforcement actions. This approach would also be consistent with common-law traditions, which are generally more flexible than code-based civil law traditions. However, many have rightly criticized the FTC’s case-by-case approach for providing insufficient guidance and certainty.[29] The optimal model for privacy enforcement, then, may be a hybrid of the two approaches. For example, legislation or rules could provide specific requirements and safe harbors for certain practices, in addition to more general standards that can be developed and enforced over time to address practices not covered by the specific rules or safe harbors.

b.    Enforcement procedures

Beyond the substantive statutes, rules, and standards governing privacy, policymakers must consider the process by which these privacy protections will be enforced. While most federal agencies make rules under the informal notice-and-comment provisions of the Administrative Procedure Act,[30] the FTC generally must follow the more stringent procedures of the Magnuson-Moss Warranty—Federal Trade Commission Improvement Act when issuing rules.[31] Therefore, if Congress were to direct the FTC to issue privacy rules, it would need to consider which rulemaking procedures should apply. If the procedure is too burdensome, a privacy rulemaking could hamstring the Commission’s ability to fulfill its mission by tying it up with protracted comment cycles, public hearings, and voluminous dockets. On the other hand, if procedures are too streamlined, rapid reversals of agency policy after each change in administration could lead to massive uncertainty and politicization. This should be a key consideration in any privacy proposal that relies on FTC rulemaking.

If the FTC is to develop case law through privacy enforcement actions, then Congress must consider whether the FTC has sufficient incentives and resources to litigate and establish such case law. While the Commission has brought hundreds of privacy complaints in recent decades, the overwhelming majority have ended with consent decrees that admit no liability and offer very limited guidance to industry.[32] This is a problem, but not entirely one of the FTC’s own making.

The FTC currently has two main remedies at its disposal: injunctive relief against behavior that violates its rules[33] and civil penalties for “knowing” or repeat violations.[34] In the privacy and online platform space, careful attention must be paid to the right levels and types of penalties available. Given the difficulty of discovering these kinds of violations and quantifying the associated harms, there seems to be general agreement that injunctive relief alone would be insufficient to remedy all privacy harms and deter all future privacy violations. Yet the FTC has relied heavily on injunctive relief because of its limited authority to assess civil penalties. With new privacy rules, the Commission’s civil-penalty authority would expand, increasing its ability to provide remedies and deterrence for informational injuries and other non-financial privacy harms. But even in that case, the FTC’s civil-penalty authority may need to be recalibrated. The FTC’s penalty authority is currently tied to the number of consumers harmed,[35] which can produce wildly disproportionate results for online platforms with billions of users as opposed to platforms with relatively few: with per-violation penalties in the tens of thousands of dollars, a platform with a billion affected users could in theory face penalties orders of magnitude beyond any plausible measure of harm. If civil penalties are to play a bigger role in the FTC’s privacy enforcement going forward, the precise nature of these penalties should be reconsidered.

Finally, some have argued that, given its massive purview but limited staff, the Commission simply cannot adequately protect consumers.[36] Regardless of whether Congress chooses to pursue specific privacy legislation, policymakers should strongly consider supplementing the FTC’s work with additional resources. A grant of additional authority may be entirely prudent, but the FTC must have adequate staff and other resources to fulfill its mission of protecting consumers and competition.

   V.         Relevance of Individual Liberty to Privacy Legislation

Privacy law intersects with individual liberty in several ways, and policymakers should consider these intersections in evaluating any privacy proposal. On the one hand, privacy itself is an important individual liberty. There is an extensive literature on privacy as a driver of self-expression, personal growth, and open democratic processes.[37] On the other hand, the ability of an individual to exchange private data for valuable services is an important economic right as well,[38] one that has arguably enabled the success of many of the free online services that people willingly enjoy today.

Certainly, there are known problems with the notice-and-choice framework. Even with transparent privacy policies, individuals may lack the capacity or willingness to read and understand the myriad terms of service with which they are presented daily.[39] These transaction costs and frictions in the technology market are reasonable problems for federal legislation to address. But some privacy proposals would replace personal decisions about private data with policymakers’ preferences—such as proposals that would prevent users from consenting to data collection in certain situations—on the theory that users are incapable of giving informed consent.[40] Proposals such as these are worrying not just because they prohibit successful business models but also because they override individual will and choice.

This is not to say that individual liberty necessarily conflicts with consumer protection. In many cases, default rules that favor consumers are beneficial, not least because they likely represent the actual choices that fully informed consumers would make. Additionally, many concerns to be addressed in privacy legislation involve privacy interests of third parties not involved in a commercial transaction, such as when a social network shares data about a user’s friends. In those cases, the privacy interests of the user’s friends must be accounted for regardless of what the user and the social media platform negotiate between themselves. But careful attention needs to be paid to ensure both that default rules represent real value judgments and that there is sufficient flexibility in such rules to enable informed bargainers to make their own decisions about their own privacy.

***

R Street thanks the Federal Trade Commission for the opportunity to submit comments in response to its recent public hearing on consumer privacy issues. R Street recommends that the Commission pursue the above-identified areas in its ongoing work on protecting consumer privacy.

Respectfully submitted, 

/s/ 

Charles Duan

Director, Technology and Innovation

Paul Rosenzweig

Senior Fellow, National Security and Cybersecurity

Tom Struble 

Manager, Technology and Innovation

Caleb Watney

Fellow, Technology and Innovation

Jeffrey Westling

Fellow, Technology and Innovation

The R Street Institute

1212 New York Ave., N.W.

Suite 900

Washington, D.C. 20005

May 31, 2019


[1] The R Street Institute (“R Street”) engages in policy research and outreach to promote free markets and limited, effective government. That mission includes policy research and outreach on issues relating to consumer privacy.

[2] Charles Duan et al., “Comments of R Street Institute,” In re Developing the Administration’s Approach to Consumer Privacy, Docket No. 180821780-8780-01, Nov. 9, 2018. https://bit.ly/2Jsf8W0.

[3] National Telecommunications and Information Administration, “Request for Comments on Developing the Administration’s Approach to Consumer Privacy,” Docket No. 180821780-8780-01, Sept. 25, 2018. https://bit.ly/2DBzJ8g.

[4] See Charles Duan et al., “Letter to the House Subcommittee on Consumer Protection and Commerce Re: Hearing on ‘Protecting Consumer Privacy in the Era of Big Data,’” Feb. 26, 2019. https://bit.ly/2QgTu7H; and Tom Struble, “Letter to the House Subcommittee on Consumer Protection and Commerce Re: Hearing on ‘Oversight of the Federal Trade Commission: Strengthening Protections for Americans’ Privacy and Data Security,’” May 15, 2019. https://bit.ly/2WirW7q.

[5] See, e.g., Fiona Scott Morton et al., “Draft Report,” Market Structure and Antitrust Subcommittee of the Committee for the Study of Digital Platforms, George J. Stigler Center for the Study of the Economy and the State, The University of Chicago Booth School of Business, May 15, 2019, p. 11. https://bit.ly/2VV67eT.

[6] Elizabeth Warren, “Here’s how we can break up Big Tech,” Medium, March 8, 2019. https://bit.ly/2SRNIt5.

[7] James E. Short and Steve Todd, “What’s Your Data Worth?,” Massachusetts Institute of Technology Sloan Management Review 58:3 (2017). https://bit.ly/2IkXAao.

[8] See, e.g., “Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC” (“GDPR”), European Parliament, 2016, Article 20. https://bit.ly/2HPHMw8.

[9] See The True Cost of Compliance With Data Protection Regulations, Globalscape, Dec. 2017, p. 9. https://bit.ly/2ksrNKp.

[10] Mark Scott et al., “Six months in, Europe’s privacy revolution favors Google, Facebook,” Politico, Nov. 27, 2018. https://politi.co/2P42bQw.

[11] GDPR, Article 37.

[12] See, e.g., Cal. Civ. Code § 1798.140(c)(1) (effectively exempting businesses that have less than $25 million in annual gross revenue, handle the personal information of fewer than 50,000 consumers, households, or devices, and derive less than 50 percent of their annual revenues from selling consumers’ personal information).

[13] Cf. François Gourio and Nicolas Roys, “Size-dependent regulations, firm size distribution, and reallocation,” Quantitative Economics 5 (2014), p. 377. https://bit.ly/2Juj8oS.

[14] See, e.g., Jason Evangelho, “Why You Should Ditch Google Search and Use DuckDuckGo,” Forbes, Oct. 3, 2018. https://bit.ly/2Zg6DSx.

[15] Maureen K. Ohlhausen and Alexander P. Okuliar, “Competition, Consumer Protection and the Right [Approach] to Privacy,” Antitrust Law Journal 80:1 (2015), pp. 133–34. https://bit.ly/2HyJLa3 (“Each of these digital platforms is relatively new and, despite the size of social media players like Facebook, Twitter, and Google, has been able to quickly attract large volumes of consumer traffic by offering greater anonymity as an attribute of otherwise similar social media offerings.”).

[16] Will Rinehart, Understanding Calls for Regulating Artificial Intelligence, American Action Forum, Jan. 14, 2019. https://bit.ly/2LW3CVa.

[17] Eline Chivot and Daniel Castro, The EU Needs to Reform the GDPR to Remain Competitive in the Algorithmic Economy, Center for Data Innovation, May 13, 2019, p. 4. https://bit.ly/2LW01Gp.

[18] Caleb Watney, A Framework for Increasing Competition and Diffusion in Artificial Intelligence, American Action Forum, Mar. 15, 2019. https://bit.ly/2JSxsqP.

[19] Eduard Fosch Villaronga, Peter Kieseberg and Tiffany Li, “Humans Forget, Machines Remember: Artificial Intelligence and the Right to Be Forgotten,” Computer Law & Security Review 34:2 (2018), p. 304; and Andrew Burt, “How will the GDPR impact machine learning?” O’Reilly, May 16, 2018. https://oreil.ly/2xlHT1w.

[20] Arthur Herman, “China’s Brave New World of AI,” Forbes, Aug. 30, 2018. https://bit.ly/2Wmernu.

[21] Greg Williams, “Why China will win the global race for complete AI dominance,” Wired, Apr. 16, 2018. https://bit.ly/2vmrJnl.

[22] John Chen et al., China’s Internet of Things, Research Report on Behalf of the U.S.-China Economic and Security Review Commission, Oct. 2018, p. 5. https://bit.ly/2OPmdmK.

[23] For example, the Committee on Foreign Investment in the United States designated Beijing Kunlun Tech Co. Ltd.’s ownership of Grindr, a popular U.S.-based dating app, a national security risk. Echo Wang, “China’s Kunlun in Talks with U.S. over Grindr: filing,” Reuters, Apr. 1, 2019. https://reut.rs/2UlIMNy.

[24] Tyler Moffit, “Is GDPR a Win for Cybercriminals?,” Webroot, June 6, 2018. https://bit.ly/2V4K9ov.

[25] See, e.g., Privacy and Data Security Update: 2018, Federal Trade Commission, March 15, 2019. https://bit.ly/2TZvV7G.

[26] 15 U.S.C. § 57a.

[27] For example, the Stored Communications Act allows law enforcement to access emails without a warrant if they have been stored for 180 days. 18 U.S.C. § 2703(a). This made sense when users downloaded emails, but users now routinely keep emails for years through a webmail service.

[28] J. Howard Beales, Former Director, Bureau of Consumer Protection, Federal Trade Commission, Speech at The Marketing and Public Policy Conference, “The FTC’s Use of Unfairness Authority: Its Rise, Fall, and Resurrection,” May 30, 2003. https://bit.ly/2nkeI4H.

[29] See, e.g., Justin (Gus) Hurwitz, “Data Security and the FTC’s UnCommon Law,” Iowa Law Review 101:3 (March 2016). https://bit.ly/2Hxt7HX.

[30] 5 U.S.C. § 553.

[31] See 15 U.S.C. § 57a; and 5 U.S.C. §§ 556–57.

[32] Tom Struble, Reforming the Federal Trade Commission Through Better Process, The R Street Institute, Dec. 2017. https://bit.ly/2KPFlPW.

[33] 15 U.S.C. § 45(l).

[34] 15 U.S.C. § 45(m).

[35] Office of Public Affairs, FTC Publishes Inflation-Adjusted Civil Penalty Amounts, Federal Trade Commission, March 1, 2019. https://bit.ly/2C4xpn1.

[36] See, e.g., Dylan Gilbert, “The FTC Must Be Empowered to Protect Our Privacy,” Public Knowledge, June 18, 2018. https://bit.ly/2VGG4mA.

[37] See, e.g., Julie Cohen, “What Privacy is For,” Harvard Law Review 126:7 (2013), p. 1904. https://bit.ly/2WzbijX.

[38] See, e.g., Ann Cavoukian, Privacy as a Fundamental Human Right vs. an Economic Right: An Attempt at Conciliation, Information and Privacy Commissioner Ontario, Sept. 1999, p. 12. https://bit.ly/2w8d6BP.

[39] See, e.g., Richard Warner and Robert Sloan, “Beyond Notice and Choice: Privacy, Norms, and Consent,” Journal of High Technology Law 14 (Jan. 2013), pp. 15–27. https://bit.ly/2wbO2Kk.

[40] See, e.g., Daniel J. Solove, “Introduction: Privacy Self-Management and the Consent Dilemma,” Harvard Law Review 126 (2013), p. 1893. https://bit.ly/2LYHMQJ (“But the true consequences of information use for individuals cannot be known when they make these decisions. Furthermore, the consequences are cumulative, and they cannot be adequately assessed in a series of isolated transactions. Privacy self-management’s very structure impedes its ability to achieve its goal of giving people meaningful control over their data.”).
