RE: R Street Institute Opposition to Maryland Age-Appropriate Design Code Act, House Bill 603 and Senate Bill 571

Governor Moore:

My name is Josh Withrow, and I am a fellow with the Technology and Innovation Policy team at the R Street Institute (“R Street”), which is a nonprofit, nonpartisan, public policy research organization. Our mission is to engage in policy research and outreach to promote free markets and limited, effective government in many areas, including the technology and innovation sector. This is why we have a strong interest in the Maryland Age-Appropriate Design Code (AADC) legislation, House Bill 603 and Senate Bill 571.

HB 603 and SB 571 were written with the most noble of intentions; however, we are concerned that in pursuit of the worthy goal of protecting children online, the measures impose a duty of care upon online services that lacks clarity and is subjective in its terms. The inadvertent result of these nebulous bills is that online platforms will feel compelled to take measures that reduce the quality of their services and the quantity of information available to users. We believe this will compromise both platforms’ and users’ freedom of speech.

Much of this legislation is borrowed, in name and function, from the California Age-Appropriate Design Code Act of 2022, which in turn is based on a U.K. law.[1] In common with its predecessors, Maryland’s AADC places a duty of care on social media companies to design their online platforms with the “best interest of children” in mind. This vague fiduciary responsibility obligates these platforms to avoid anything that might result in “reasonably foreseeable and material physical or financial harm” or even “psychological or emotional harm” to children (defined as anyone under 18 years of age).[2] These are all highly unclear, subjective terms that leave platforms to determine what is best for kids or face significant ramifications, including tremendous fines. These terms will cause platforms to alter not only their product design but also the content they host.

The lack of clarity around many of these vague terms means that their interpretation falls to the Division of Consumer Protection of the Office of the Attorney General of Maryland. In the meantime, covered platforms are left in an unpleasant, likely unconstitutional Catch-22: they could modify their services to enact default safeguards that treat all users as if they were children, verify the age of all their users so they can create a separate experience for minors, or simply exclude minors altogether.[3]

Maryland’s AADC is likely to run afoul of the First Amendment due to its strong inducement for online platforms to over-censor content in order to avoid being penalized under the law’s vague concept of what might be harmful to minors. Every digital service is required to file a Data Protection Impact Assessment within 90 days of introducing any new service that minors might conceivably access, which requires them to “determine whether the online product is designed and offered in a manner consistent with the best interests of children.” Under threat of massive fines for misjudging what might hypothetically not be in the best interests of children, many platforms will certainly default to taking down all content on entire subjects, which is likely to remove self-help, educational, and satirical material along with anything genuinely harmful.[4]

This pressure to over-censor content is a core flaw in the very design concept of the Age-Appropriate Design Code. The British law from which the AADC derives did not need to consider the First Amendment’s stringent protections of freedom of access to speech, but a U.S. law does. California’s AADC is already under injunction, with a District Court judge ruling that it is likely to be found in violation of the First Amendment because of the pressure it creates for platforms to censor otherwise legal content.[5] This legislation would almost certainly invite a similar legal challenge, at great cost to Maryland taxpayers.

The 2024 edition of Maryland’s AADC does improve upon last year’s version by specifying that it should not “be interpreted or construed to… require a covered entity to implement an age-gating requirement.” In practice, however, it is likely that the vagueness of what makes a site “reasonably likely to be accessed by children” will lead many covered platforms to feel obliged to enact age-gating even in the absence of a hard mandate that they do so. The existing methods that websites can employ to estimate or verify age accurately are all to some extent intrusive and imperfect, and all create a barrier to accessing a given website or app.[6]

Moreover, the bills create many practical problems regarding user privacy and cybersecurity because websites, as well as the third-party services they utilize, will need to process more personally identifying information.[7] This raises the risk that hackers might obtain access to users’ personal data. Such an outcome would undermine the clear intent of the proposal with respect to protecting data privacy and security.

The Maryland Age-Appropriate Design Code, like the California legislation from which it is derived, is worthy in its aim to address the real and significant problems raised by minors coming into contact with harmful content and individuals on the internet. However, the law is simply too vague for even the most conscientious online service to comply with, and it would thus pose a likely unconstitutional burden on both platforms’ and users’ rights to free speech. In addition, the increased data collection these bills would require puts users at greater risk should hackers obtain that data. Therefore, we respectfully request your veto of HB 603 and SB 571.

Sincerely,

Josh Withrow

Fellow, Technology & Innovation Policy

R Street Institute

CC:        Eric Luedtke, Chief Legislative Officer, Office of Governor Wes Moore

[1] Assembly Bill No. 2273, The California Age Appropriate Design Code Act, California Legislature; “We Need to Keep Kids Safe Online: California has the Solution,” 5Rights Foundation, last accessed March 3, 2023. https://californiaaadc.com; “Introduction to the Age appropriate design code,” U.K. Information Commissioner’s Office, last accessed March 3, 2023. https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/age-appropriate-design-code.

[2] House Bill 603, Maryland Age-Appropriate Design Code Act, General Assembly of Maryland. https://mgaleg.maryland.gov/mgawebsite/Legislation/Details/HB0603.

[3] Corbin Barthold, “Brief of amicus curiae TechFreedom in support of plaintiff-appellee and affirmance,” U.S. 9th Circuit Court of Appeals, Case No. 5:22-cv-8861. https://techfreedom.org/wp-content/uploads/2024/02/TechFreedom-Amicus-Brief-Bonta-v-Netchoice-9th-Cir.pdf.

[4] Tamra Moore and Christopher P. Eby, “Amici Curiae Brief of Chamber of Progress, IP Justice, and LGBT Tech Institute in Support of Plaintiff’s Motion for Preliminary Injunction,” King & Spalding LLP, March 1, 2023. http://progresschamber.org/wp-content/uploads/2023/03/AS-FILED-Ex.-A-Amici-Curiae-Brief-of-Chamber-of-Progress-et-al.-NetChoice-1.pdf.

[5] “Order Granting Motion for Preliminary Injunction,” Case No. 22-cv-08861-BLF, U.S. District Court Northern District of California, San Jose Division. https://netchoice.org/wp-content/uploads/2023/09/NETCHOICE-v-BONTA-PRELIMINARY-INJUNCTION-GRANTED.pdf.

[6] See, e.g., “Online age verification: balancing privacy and the protection of minors,” Commission Nationale de l’Informatique et des Libertés, Sept. 22, 2022. https://www.cnil.fr/en/online-age-verification-balancing-privacy-and-protection-minors; and Shoshana Weissmann, “The Fundamental Problems with Social Media Age Verification,” R Street Institute, May 16, 2023. https://www.rstreet.org/commentary/the-fundamental-problems-with-social-media-age-verification-legislation/.

[7] Ibid.