The entire entertainment industry is more or less aligned on the need for performers to be able to protect their likeness and voice in the context of AI. However, at the same time, the movie and gaming industries want to ensure any new rights in this domain don’t restrict their future output.
This is what we learned during a roundtable on ‘AI and protections for use of an individual’s name, image, likeness or reputation’, hosted by the US Patent & Trademark Office yesterday.
According to Law360, Benjamin Sheffner from the Motion Picture Association said that, while the movie studios support proposed new protections to help performers stop the distribution of unofficial deepfakes or vocal clones, “legislating in this area requires very careful drafting to address real harms, without inadvertently chilling or even prohibiting legitimate, constitutionally protected uses of technologies to enhance storytelling”.
Speaking for the gaming sector, Bijou Mgbojikwe from the Entertainment Software Association echoed those concerns, stating that any new ‘digital replica’ right must be defined in a way that doesn’t negatively impact on video game development, for example by forcing developers into “reducing characters to more cartoonish or alien depictions rather than realistic depictions” for fear of breaking the law.
The roundtable followed some key developments last week around clarifying and introducing legal protections against unauthorised AI-generated imitations of likeness and voice.
The US Copyright Office published a report stating that there is an urgent need for a new US-wide digital replica right, after concluding that existing copyright and trademark law, and state-level publicity rights, don’t provide sufficient protection to combat unofficial deepfakes and vocal clones.
Meanwhile in the Senate, the NO FAKES Act was formally introduced, which sets out a framework for a new digital replica right like that described in the Copyright Office’s report.
Performers and companies from across the entertainment industries, including the music industry, have expressed concern about the use of AI to imitate likeness and voice. That technology is obviously also an opportunity for performers and their business partners, but only if the law provides a clear process for stopping unauthorised uses.
However, there are some divisions within the entertainment sector on how this should all work. Within the music industry, concerns have been expressed that business partners like record labels might be able to pressure artists to sign over any new digital replica rights.
Both the Copyright Office report and the NO FAKES Act set out proposals to address those concerns, by prohibiting the assignment and restricting the licensing of any new rights.
Meanwhile, for some parts of the entertainment industry there are concerns that strict new digital replica rights might impact on freedom of expression, basically by making it harder for those creators and studios who are employing AI technologies in their productions to include real life individuals in their storytelling.
This is of particular concern to studios and producers, though it could also negatively impact on writers and directors. To that end, some of the entertainment industry’s lobbyists will be looking for restrictions on any new right similar to the fair use principle in copyright law.
During yesterday's roundtable, Sheffner said that the MPA supports the NO FAKES Act as a “thoughtful effort to address abusive uses of digital replicas of likeness and voices”. However, he added that, where possible, it would be better to rely on existing laws to deal with the challenges posed by AI, rather than rushing to create too many new rights.
He also disagreed with the Copyright Office, which has proposed that any new US-wide digital replica right should complement rather than replace state-level publicity rights that could also be used to stop the unauthorised use of a person’s likeness or voice. Sheffner argued that a single consistent approach across the entire US would be more desirable.