Testimony from:
Josh Withrow, Fellow, Tech & Innovation Policy, R Street Institute

Testimony in Opposition to S. 289, “An act relating to age-appropriate design code.”

April 5, 2024

House Committee on Commerce and Economic Development

Chair Marcotte and members of the committee,

My name is Josh Withrow, and I am a fellow with the Technology and Innovation Policy team at the R Street Institute (“R Street”), which is a nonprofit, nonpartisan, public policy research organization. Our mission is to engage in policy research and outreach to promote free markets and limited, effective government in many areas, including the technology and innovation sector. This is why we have a strong interest in S. 289.

We are concerned that in pursuit of the worthy goal of protecting children, S. 289 places a duty of care upon online services that is vague and subjective in its terms. As a result, online platforms will feel compelled to take measures that reduce the quality of their services and the quantity of information available to users. We believe this would subvert both platforms’ and users’ freedom of speech.

Much of this legislation is borrowed, in name and function, from the California Age-Appropriate Design Code Act of 2022 (AADC), which in turn is based on a U.K. law.[1] In common with its predecessors, S. 289 places a duty of care on companies to determine what content and features on their platforms might be “to the detriment of a minor consumer.” They are obligated to figure out what might result in “reasonably foreseeable material physical or financial injury” or even “emotional distress” in minors (anyone under 18 years of age) and to avoid “dark patterns” or anything that might promote “excessive or compulsive use” of their platforms.[2] These are all incredibly vague, subjective terms that force platforms to determine what is best for kids or face significant penalties, including substantial fines.

Much of the interpretation of these vague terms is left to the discretion of the Vermont Attorney General. In the meantime, covered platforms are left in an unpleasant, and likely unconstitutional, Catch-22: they can modify their services to enact default safeguards that treat all users as if they were children; verify the age of all their users so they can create a separate experience for minors; or exclude minors altogether. Each of these outcomes is likely to be held in violation of the First Amendment.[3]

Assuming that platforms choose not to simply age-gate and exclude all minors, they face extreme pressure to over-censor content, a core flaw in the very design of the Age-Appropriate Design Code concept. Because the definition of what may be considered harmful to children is both broad and subjective, sites will be pressured to take down broad swaths of content relating to subjects such as mental health. The algorithms that sites must use to proactively remove such content are largely incapable of distinguishing context and intent, meaning that self-help and satirical content is likely to be taken down along with legitimately troubling content.[4]

The British law from which the AADC derives did not need to consider the First Amendment’s stringent protection of the freedom to access speech, but a U.S. law does. California’s AADC is already under a preliminary injunction, with a District Court judge ruling that it is likely to be found in violation of the First Amendment because of the pressure it creates for platforms to censor otherwise legal content.[5] S. 289 would almost certainly attract a similar constitutional challenge, at great cost to Vermont taxpayers.

In addition, although the bill specifies that it should not be construed to mandate age-gating, in reality platforms are still heavily incentivized to verify users’ ages in order to either exclude minors or create a separate experience for them. The covered entity prohibitions in § 2449e, for example, would likely require platforms to disable ease-of-use features like infinite scroll and autoplay and to make other changes that would reduce the overall usefulness of their platforms for affected users.

Creating a separate experience for minors obviously requires figuring out who the minors are, and thus subjecting all users of a platform to some degree of age verification. A de facto age verification requirement could subject S. 289 to yet another First Amendment question because of the barrier it creates for adult access to free and anonymous speech online. Moreover, it creates a host of practical problems relating to privacy and cybersecurity, as websites, or the third-party services they employ, will have to process more personally identifying information, thereby raising the risk that hackers might gain access to users’ personal data.[6] This outcome would subvert the clear intent of the rest of the bill with respect to protecting data privacy and security.

Ultimately, in addition to the concerns about constitutionality and ease of compliance, the AADC approach to regulating online services places platforms in the impossible position of making subjective decisions about what content is beneficial or detrimental to the minors on their platforms. Less restrictive alternatives already exist: for parents, the tools necessary to prevent minors from accessing unwanted sites and apps, and to limit and supervise their time spent online, are readily available at the device, browser and even network level.[7] Similarly, adding social media literacy education to school curricula is one helpful policy change that can help children and teens more safely navigate an increasingly online world.[8]

S. 289, like the laws it largely copies, is worthy in its intent to address the real and significant problems posed when minors encounter harmful content and individuals on the internet. However, the bill is simply too vague for even the most conscientious online service to comply with, and it would thus pose a likely unconstitutional burden on both platforms’ and users’ rights to free speech. We therefore respectfully ask for your opposition and an unfavorable report on S. 289.

Thank you for your time,

Josh Withrow
Fellow, Technology & Innovation Policy
R Street Institute
(540) 604-3871
[email protected] 


[1] Assembly Bill No. 2273, The California Age-Appropriate Design Code Act, California Legislature; “We Need to Keep Kids Safe Online: California has the Solution,” 5Rights Foundation, last accessed March 3, 2023. https://californiaaadc.com; “Introduction to the Age appropriate design code,” U.K. Information Commissioner’s Office, last accessed March 3, 2023. https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/age-appropriate-design-code.

[2] S. 289, “An act relating to age-appropriate design code,” Vermont Legislature. https://legislature.vermont.gov/bill/status/2024/S.289.

[3] Corbin Barthold, “Brief of amicus curiae TechFreedom in support of plaintiff-appellee and affirmance,” U.S. 9th Circuit Court of Appeals, Case No. 5:22-cv-8861. https://techfreedom.org/wp-content/uploads/2024/02/TechFreedom-Amicus-Brief-Bonta-v-Netchoice-9th-Cir.pdf.

[4] Mike Masnick, “Masnick’s Impossibility Theorem: Content Moderation at Scale is Impossible to Do Well,” TechDirt, Nov. 20, 2019. https://www.techdirt.com/2019/11/20/masnicks-impossibility-theorem-content-moderation-scale-is-impossible-to-do-well/.

[5] “Order Granting Motion for Preliminary Injunction,” Case No. 22-cv-08861-BLF, U.S. District Court for the Northern District of California, San Jose Division. https://netchoice.org/wp-content/uploads/2023/09/NETCHOICE-v-BONTA-PRELIMINARY-INJUNCTION-GRANTED.pdf.

[6] Shoshana Weissmann, “The Fundamental Problems with Social Media Age Verification,” R Street Institute, May 16, 2023. https://www.rstreet.org/commentary/the-fundamental-problems-with-social-media-age-verification-legislation/.

[7] For example, a quick step-by-step walkthrough of how to enable parental controls on any commonly owned mobile device: “Parental Controls,” Internet Matters, https://www.internetmatters.org/parental-controls/.

[8] Jennifer Huddleston, “Improving Youth Online Safety without Sacrificing Privacy and Speech,” Cato Institute, June 20, 2023. https://www.cato.org/briefing-paper/improving-youth-online-safety-without-sacrificing-privacy-speech.