“It’s just not workable,” Shoshana Weissmann, a fellow at the R Street Institute, tells the Sun. Although AI impersonation “is a problem” and fraud laws should protect against it, that’s not what this law would do, she says.

The bill defines likeness as the “actual or simulated image or likeness of an individual, regardless of the means of creation, that is readily identifiable” by virtue of “face, likeness, or other distinguishing characteristic.” It defines voice as “any medium containing the actual voice or a simulation of the voice of an individual, whether recorded or generated by computer, artificial intelligence, algorithm, or other digital technology, service, or device” to the extent that an individual is “readily identifiable” from the sound of it. 

“There’s no exception for parody, and basically, the way they define digital creations is just so broad, it would cover cartoons,” Ms. Weissmann says, adding that the bill would extend to shows such as South Park and Family Guy, both of which impersonate real people.

“It’s understood that this isn’t the real celebrity. When South Park made fun of Ben Affleck, it wasn’t really Ben Affleck. And they even used his picture at one point, but it was clear they were making fun of him. But under the pure text of this law, that would be unlawful,” she says. 

If the bill were enacted, “someone would sue immediately,” she says, adding that it would not pass First Amendment scrutiny.

Lawmakers should be more careful to ensure these regulations don’t “run afoul” of the Constitution, she says, but “instead, they have haphazard legislation like this that just doesn’t make any functional sense.”

While the bill does include a section providing a First Amendment defense, Ms. Weissmann says, it’s essentially saying that “after you’re sued under our bill, you can use the First Amendment as a defense. But you can do that anyway under the bill. That doesn’t change that.”

Because of the threat of being “dragged into court” and spending “thousands of dollars on lawyers,” the bill would effectively be “chilling speech,” she notes. 

Among the harms defined in the bill is “severe emotional distress of any person whose voice or likeness is used without consent.”

“Let’s say Ben Affleck said he had severe emotional distress because South Park parodied him,” Ms. Weissmann says. “He could sue under this law. That’s insane, absolutely insane.”

The bill would be more workable if it were made more “specific and narrow to actual harms, and also made sure that people couldn’t sue over very obvious parodies,” she says. As drafted, however, it is “going to apply to a lot more than they intended,” she adds.