California’s governor signed laws to protect actors from unauthorized AI use
California Governor Gavin Newsom on Tuesday signed legislation designed to protect actors and performers from the unauthorized use of artificial intelligence to create digital clones of their likenesses or voices. California is among the first states to enact such protections, following Tennessee, which passed a similar law for musicians in March.
The new laws are intended to promote responsible AI usage without stifling innovation, although critics, including the California Chamber of Commerce, argue that they may be difficult to enforce and could result in lengthy legal disputes. These measures are part of a broader set of regulations introduced this year to address the growing influence of AI, with other proposals, such as those targeting election-related deepfakes, still pending.
Newsom’s actions reflect a balancing act between protecting workers and fostering the state’s evolving AI industry. The legislation lets performers withdraw from contracts whose ambiguous language might permit studios to use AI to create digital versions of their likeness or voice. This law, supported by the California Labor Federation and SAG-AFTRA, will take effect in 2025.
Another law prevents the digital cloning of deceased performers for commercial purposes without permission from their estates. This move addresses concerns like those raised by an incident where a media company created an AI-generated comedy special in the style of the late comedian George Carlin without his estate’s consent.
SAG-AFTRA President Fran Drescher hailed the new laws as a significant victory, highlighting that California’s actions often set a precedent for national policy.