Testimony from:
Josh Withrow, Fellow, Tech & Innovation Policy, R Street Institute

Testimony in Opposition to RI H 5291, “The Rhode Island Social Media Regulation Act”

March 27, 2025

Rhode Island House Innovation, Internet, and Technology Committee

Chair Baginski and members of the committee,

My name is Josh Withrow, and I am a resident fellow with the Technology and Innovation Policy team at the R Street Institute, which is a nonprofit, nonpartisan public policy research organization. Our mission is to engage in policy analysis and outreach to promote free markets and limited, effective government in many areas, including the technology and innovation sector.

We are concerned that, instead of empowering parents, H 5291 merely imposes a government mandate to replicate protections that are already easily available to them. In doing so, it erects barriers to accessing speech for all social media users and creates novel data privacy and security problems. While we share the goal of protecting kids and teens from harmful content and other dangers online, we believe that mandatory age verification for general-use social media platforms poses practical and constitutional concerns.[1] These concerns outweigh whatever limited good such mandates may achieve.

Increasingly, the major online platform owners are investing heavily to make their parental control tools at the device, browser, and platform levels more accessible and effective.[2] In addition, there has long been a robust market for third-party software that grants parents even more granular control over their children’s mobile device screen time and online access.[3] Given the ready availability of better private solutions to online safety, we believe that educational efforts aimed at providing both kids and parents with the knowledge of how to navigate the digital world more safely would be a better approach. Florida’s legislature, for example, passed legislation directing public schools to incorporate online safety education into their curricula.[4]

Instead, H 5291 mandates that social media platforms verify their users’ age and prohibit access to an account – new or existing – until they have established that the account owner is over 18 years of age. For those determined to be under 18, their accounts may not be accessed or created until the platform has obtained “the express consent of a parent or guardian.”[5]

How companies may go about verifying users’ ages and obtaining parental consent is left to a future rulemaking process by the Department of Business Regulation. However, because the bill also grants a private right of action to any parent who believes that the above requirements have not been faithfully met, platforms will be incentivized to adopt strict age verification, which means obtaining documentary confirmation of a user’s age or something equally intrusive. Current age-estimation technologies have improved but remain error-prone.[6] Even if rules allow companies to rely on supposedly privacy-protecting age-estimation technologies such as facial scans instead of hard verification, they will still have to adjudicate false minor identifications by collecting documentary proof of age.

Obtaining verifiable parental consent without requiring intrusive identity verification and the collection of further sensitive personal data is even harder.[7] Platforms are left with the obligation to figure out whether someone really is a valid parent or legal guardian of a minor account holder, which at best compromises the online anonymity or pseudonymity of the parent. It is even more difficult for platforms to discern parental status for children in non-traditional family situations, such as those with divorced parents or legal guardians who are not relatives. And for children in dysfunctional family situations, a parental consent requirement may deny them the potentially crucial outlet of social media connections altogether, until they turn 18.

The additional data that platforms will have to collect from consumers in order to comply with H 5291 also creates a tempting new trove of information for hackers. Even with a data minimization requirement, companies are put in the odd position of having to delete the information they collect to verify ages while also being able to prove, if brought to trial, that they complied with the verification requirements.[8] Just as problematic, if platforms use a third-party service to conduct age estimation or verification, those services are not immune to hackers either. As if to emphasize this point, one of the biggest services used by large social media platforms to verify user age and identity recently suffered a major data breach.[9] Similar data security concerns were one reason California’s Age-Appropriate Design Code was enjoined by the courts, with the district court finding that the law was “actually likely to exacerbate the problem by inducing covered businesses to require consumers, including children, to divulge additional personal information.”[10]

The fact that every social media user would have to undergo age verification to access these widely used online platforms almost certainly dooms this bill on constitutional grounds. Previous attempts to enact broad age-gating restrictions for online services have repeatedly been found to violate the First Amendment. In Reno v. ACLU (1997), the U.S. Supreme Court struck down the Communications Decency Act’s indecency provisions, finding that the law’s “burden on adult speech is unacceptable if less restrictive alternatives would be at least as effective in achieving the Act’s legitimate purposes.”[11] The ubiquity of parental control tools and guidance on how to use them means that this bill’s mandates would almost certainly fail this least-restrictive-means test.[12]

Similarly, in Brown v. Entertainment Merchants Association, the Supreme Court found it unconstitutional to require parental consent for minors to access lawful, non-obscene content. Writing for the majority, Justice Antonin Scalia noted “our doubts that punishing third parties for conveying protected speech to children just in case their parents disapprove of that speech is a proper governmental means of aiding parental authority.”[13] In essence, even minors have limited First Amendment rights to access non-obscene speech and online content, rights that may be curtailed by their parents but not by the government.[14]

On top of these concerns, the bill’s curfew on social media access flagrantly violates individuals’ right to access speech and parental rights in particular. If parents want to restrict their children’s access to social media during certain hours, they can do so either via software or by physically confiscating devices; the state has no authority to mandate such time limits.

The constitutional issues inherent in age-verification and parental consent requirements such as those in H 5291 have caused similar proposals in several states to be enjoined by courts, likely on the path to being struck down altogether.[15] Lawmakers would be better off focusing on ways to improve online literacy, both for parents and their children, and encouraging parents to exercise the substantial power they already have to control what content and interactions their kids can access online.  

Thank you for your time,

Josh Withrow
Fellow, Technology & Innovation Policy
R Street Institute
(540) 604-3871
jwithrow@rstreet.org 


[1] See Shoshana Weissmann, et al., “The Fundamental Problems with Social Media Age Verification Legislation,” R Street Institute, May 16, 2023. https://www.rstreet.org/commentary/the-fundamental-problems-with-social-media-age-verification-legislation/.

[2] See “Helping Protect Kids Online,” Apple.com, Feb. 2025. https://developer.apple.com/support/downloads/Helping-Protect-Kids-Online-2025.pdf; “Leading Technology Companies and Foundations Back New Initiative to Provide Free, Open-Source Tools for a Safer Internet in the AI Era,” PR Newswire, Feb. 10, 2025. https://www.prnewswire.com/news-releases/leading-technology-companies-and-foundations-back-new-initiative-to-provide-free-open-source-tools-for-a-safer-internet-in-the-ai-era-302371243.html.

[3] “Children Online Safety Tools,” Competitive Enterprise Institute, last accessed Feb. 16, 2025. https://cei.org/children-online-safety-tools/.

[4] HB 379, Florida Senate, 2023 Legislative Session. https://www.flsenate.gov/Session/Bill/2023/379.

[5] “The Rhode Island Social Media Regulation Act,” H 5291, Rhode Island General Assembly, 2025 Legislative Session, last accessed March 26, 2025. https://webserver.rilegislature.gov/BillText/BillText25/HouseText25/H5291.pdf.

[6] On error rates for the best age estimation technologies, see: Kayee Hanaoka, et al., “Face Analysis Technology Evaluation: Age Estimation and Verification,” NIST Internal Report 8525, May 2024. https://nvlpubs.nist.gov/nistpubs/ir/2024/NIST.IR.8525.pdf.

[7] “The State of Play: Is Verifiable Parental Consent Fit for Purpose?” Future of Privacy Forum, June 2023. https://fpf.org/verifiable-parental-consent-the-state-of-play/.

[8] Shoshana Weissmann, “Age verification legislation discourages data minimization even when legislators don’t intend that,” R Street Institute, May 24, 2023. https://www.rstreet.org/commentary/age-verification-legislation-discourages-data-minimization-even-when-legislators-dont-intend-that/.

[9] Jason Kelley, “Hack of Age Verification Company Shows Privacy Danger of Social Media Laws,” Electronic Frontier Foundation, June 26, 2024. https://www.eff.org/deeplinks/2024/06/hack-age-verification-company-shows-privacy-danger-social-media-laws.

[10] Adrian Moore and Eric Goldman, “California’s Online Age-Verification Law is Unconstitutional,” Reason, Nov. 28, 2023. https://reason.org/commentary/californias-online-age-verification-law-is-unconstitutional/.

[11] Reno v. ACLU, 521 U.S. 844 (1997), U.S. Supreme Court, June 26, 1997. https://supreme.justia.com/cases/federal/us/521/844.

[12] For example, a quick step-by-step walkthrough for how to enable parental controls on any commonly owned mobile device: “Parental Controls,” Internet Matters. https://www.internetmatters.org/parental-controls/.

[13] Brown et al. v. Entertainment Merchants Assn. et al., 564 U.S. 786 (2011). U.S. Supreme Court, June 27, 2011. https://supreme.justia.com/cases/federal/us/564/786.

[14] Jennifer Huddleston, “Courts Should Affirm First Amendment Rights of Youths in the Digital Age: A Case for 21st-Century Tinker,” Cato Institute, Mar. 28, 2024. https://www.cato.org/briefing-paper/courts-should-affirm-first-amendment-rights-youths-digital-age-case-21st-century#.

[15] See, e.g.: NetChoice LLC v. David Yost, U.S. District Court for the Southern District of Ohio, Eastern Division, 2:24-cv-00047. https://netchoice.org/wp-content/uploads/2024/01/2024.01.09-ECF-27-ORDER-Granting-TRO.pdf; NetChoice LLC v. Lynn Fitch, U.S. District Court for the Southern District of Mississippi, Southern Division, 1:24-cv-170-HSO-BWR. https://netchoice.org/wp-content/uploads/2024/07/NetChoice-v-Fitch-District-Court-Preliminary-Injuction-Ruling-July-1-2024.pdf; and NetChoice v. Sean Reyes, U.S. District Court for the District of Utah, 2:23-cv-00911-RJS-CMR and 2:24-cv-00031-RJS-CMR. https://netchoice.org/wp-content/uploads/2024/09/NetChoice-v-Reyes-2024.09.10-ECF-86-ORDER-Granting-PI.pdf.