Testimony from:
Josh Withrow, Fellow, Tech & Innovation Policy, R Street Institute

Statement on SB 1253, the “Children’s Device Protection Act”

February 5, 2024

Idaho Senate State Affairs Committee

Chairman Guthrie and members of the committee,

My name is Josh Withrow, and I am a fellow with the Technology and Innovation Policy team at the R Street Institute, which is a nonprofit, nonpartisan, public policy research organization. Our mission is to engage in policy research and outreach to promote free markets and limited, effective government in many areas, including the technology and innovation sector.

SB 1253 requires every mobile phone or tablet activated within the state of Idaho to have a built-in “obscenity filter” that is enabled by default. Although the bill’s aim of protecting children from exposure to obscene content is meritorious, a device-level content filter mandate is likely to prove ineffective and likely to be found unconstitutional because it places an unduly broad burden on access to speech.[1] It also substitutes a government mandate for parental choice in how best to protect children and teens who are given access to cell phones and tablets.

Practically speaking, the bill would require mobile device manufacturers to ship a modified operating system with a default-enabled content filter on every device sold in the United States, because manufacturers cannot know in advance which devices will be “activated in this state.” The filter must block obscene materials accessed via “internet browsers or search engines via mobile data networks” or other internet connections.[2] This requirement is at least more technologically feasible than previous iterations of this type of bill, which sometimes required a filter that could block all obscene content accessed by any means on the mobile device.[3]

However, better and more thorough means of protecting kids from accessing obscene material on a mobile device already exist than what this mandate is likely to accomplish. The U.S. Supreme Court considered a similarly broad duty to prevent the dissemination of obscene speech nearly three decades ago in Reno v. ACLU, striking down the central anti-indecency provisions of the Communications Decency Act (CDA) of 1996.[4] Among the Court’s findings in Reno was that “The CDA’s burden on adult speech is unacceptable if less restrictive alternatives would be at least as effective in achieving the Act’s legitimate purposes.”[5] Similarly, mandating a new, device-level content filter is redundant with the many other methods of protecting minors from obscene content that already exist: some are pre-built into apps, browsers and operating systems, and others are easily available for parents to download.[6]

Forcing manufacturers to modify their operating systems to build in a content filter that every adult would then have to figure out how to disable to gain full access to the internet is far from a less restrictive means of preventing children from accessing obscene content. Thus, device content filter mandates of this kind will likely suffer a fate in court similar to the CDA’s.

A government-mandated content filter would also compete with the diverse market for parental online safety tools that already exists. This would likely undermine investment and innovation in the online safety market, as many parents would simply default to accepting and trusting whatever filters device manufacturers can get approved by the government.

The mandate may also send a false signal to parents that filter-enabled devices fully protect their kids against obscene content, when in fact the filter can block only a couple of the many ways such content may be received and accessed. Educating parents about the superior resources already available to protect their children online would be a smarter and more effective approach than placing the onus on device manufacturers.

Altogether, SB 1253 imposes a new government mandate that would create additional burdens for both device manufacturers and their users in order to solve a problem for which private solutions already exist. Rather than burdening every adult who buys a smart device with figuring out how to disable content filtering, the responsibility should lie with parents to configure, as they see fit, any device to which their children may have access. And ultimately, if enacted, the bill will likely be struck down by the courts on First Amendment grounds after costly litigation.

Thank you for your time,

Josh Withrow
Fellow, Technology & Innovation Policy
R Street Institute
(540) 604-3871
[email protected]


[1] For more of R Street’s analysis of the problems posed by device content filter mandates, see: Josh Withrow, “Device-level Content Filter Mandates Defy Common Sense and the Constitution,” R Street Institute, June 1, 2023. https://www.rstreet.org/commentary/device-level-content-filter-mandates-defy-common-sense-and-the-constitution/

[2] HB 2661, “electronic devices; filters; obscene material,” Arizona State Legislature, 2023. https://www.azleg.gov/legtext/56leg/2R/bills/HB2661P.pdf

[3] E.g. HB 349, “An Act requiring an obscenity filter be enabled by default on electronic devices sold and activated in the state,” Montana State Legislature, 2023. https://leg.mt.gov/bills/2023/billhtml/HB0349.htm

[4] Reno v. ACLU, 521 U.S. 844 (1997), U.S. Supreme Court, June 26, 1997. https://supreme.justia.com/cases/federal/us/521/844/

[5] Ibid.

[6] See, e.g., “Child online safety tools,” Competitive Enterprise Institute, https://cei.org/children-online-safety-tools/ and “Parental Controls,” Internet Matters, https://www.internetmatters.org/parental-controls/