Testimony from:
Chris McIsaac, Governance Fellow, R Street Institute

Testimony in Opposition to S 44 and H 76, “An Act to Protect Against Election Misinformation,”

September 11, 2025

Joint Committee on Advanced Information Technology, the Internet and Cybersecurity

Chairman Moore, Chairwoman Farley-Bouvier and members of the joint committee:

My name is Chris McIsaac, and I am a resident fellow in Governance Policy at the R Street Institute. The R Street Institute (RSI) is a nonprofit, nonpartisan public policy organization. Our mission at RSI is to engage in research and outreach to promote free markets and limited, effective government in a variety of policy areas, including the intersection of artificial intelligence (AI) and election policy. This is why S 44 and H 76 are of particular interest to us.

Recent advances in artificial intelligence are impacting all aspects of modern life, including the way elections are administered and campaigns are run.[1] These advances have raised fears that AI technology will be used to create highly realistic “deepfakes” that could mislead voters at scale and undermine confidence in elections. In response to these concerns, there has been an uptick in the number of states regulating certain false statements about campaigns, candidates, and the election process, including six new states in 2025.[2] Twenty-six states now have laws prohibiting or requiring disclosure of deceptive uses of AI in certain election communications. However, these laws face legal challenges: an August 2025 decision struck down California’s prohibition law as a violation of the First Amendment, and a lawsuit is pending against Minnesota’s law on similar grounds.[3]

By approving S 44 or H 76, Massachusetts would join this list of states by prohibiting the malicious distribution of a “materially deceptive election-related communication” within 90 days of an election. The proposed restriction, which would make Massachusetts the fourth state to pursue the prohibition approach rather than disclosure, applies not only to synthetic media produced by artificial intelligence but also to other forms of written, audio, and visual media.[4] In addition, the bill outlines five specific subjects covered by the restriction: four relate to false statements about the election process, and a fifth concerns candidate endorsements. While these bills are well-intentioned efforts to protect the public from deception and ensure a trustworthy electoral process, they raise significant constitutional questions and would likely expose the Commonwealth to a First Amendment challenge.

The prohibition outlined in S 44 and H 76 is flawed for two primary reasons. First, the scope of the regulation extends beyond election administration, an area where the government has a legitimate role to play in dispelling false claims, and into the realm of political campaigns. Specifically, the prohibition applies to false information about the “express endorsement of a candidate or ballot initiative.” While the public has an interest in knowing truthful information about who supports a candidate or initiative, the responsibility for correcting false claims about endorsements ultimately falls to candidates and campaigns, not to a government enlisted to determine the truthfulness of core political speech.

Beyond the candidate endorsement issue falling outside of the government’s legitimate scope, the second flaw in the proposal is slightly more nuanced but equally problematic. In particular, the prohibitions outlined in the bills primarily target false claims about the election process, such as polling locations, election dates, registration deadlines, and the certification process. Focusing on these topics is entirely reasonable because the government has a compelling interest in protecting the election process and a duty to ensure that the public has correct information about how to participate in the democratic process. However, unlike political campaigns, where competition creates an incentive for candidates to respond to lies from the opposition, there is no natural constituency poised to respond to false claims about the election process. This means that the government, in its capacity administering elections, needs to play a role. Unfortunately, S 44 and H 76 take the heavy-handed approach of prohibiting false claims about election procedures. A better and less restrictive method is for state and local election officials to engage in more speech by proactively distributing true information about the process in advance and aggressively countering false claims as they arise.

Overall, the push to protect the public from election-related deception and the focus on false claims about the election process are well meaning, but imposing a prohibition on certain types of election-related communications creates an untenable burden on free speech. For these reasons, we urge an unfavorable report on S 44 and H 76.

Thank you,

Chris McIsaac
Fellow, Governance
R Street Institute
cmcisaac@rstreet.org


[1] Chris McIsaac, “Impact of Artificial Intelligence on Elections,” R Street Policy Study No. 304, June 2024. https://www.rstreet.org/wp-content/uploads/2024/06/FINAL-r-street-policy-study-no-304.pdf.

[2] National Conference of State Legislatures, “Artificial Intelligence in Elections and Campaigns,” July 23, 2025. https://www.ncsl.org/elections-and-campaigns/artificial-intelligence-ai-in-elections-and-campaigns.

[3] Kohls v. Bonta, Order (granting summary judgment), Aug. 29, 2025. https://storage.courtlistener.com/recap/gov.uscourts.caed.453046/gov.uscourts.caed.453046.101.0.pdf. Chris McIsaac, “Update on 2025 State Legislation to Regulate Election Deepfakes,” R Street Institute, March 2025. https://www.rstreet.org/commentary/update-on-2025-state-legislation-to-regulate-election-deepfakes/.

[4] Ibid.