Sep 10, 2024 3 min read

NO FAKES Act is a “blunt solution” that “goes too far” say tech companies

The NO FAKES Act in the US - seen by many in the music industry as a breakthrough legislative safeguard against the threat of AI - is actually a bit rubbish. At least that’s what a group of signatories to a new letter say, adding that the new legislation could have unintended consequences.


The NO FAKES Act in the US has come under fire from organisations representing technology companies, libraries and digital rights advocates, which insist that the proposed new legal framework to regulate AI-generated digital replicas “goes too far”. The proposals, formally introduced in the US Senate in July and widely welcomed by the music industry, offer “a blunt solution before we understand the problem”, an open letter declares.

“We understand and share the serious concerns many have expressed about the ways digital replica technology can be misused, with harms that can impact ordinary people as well as performers and celebrities”, the letter says, adding, “these harms deserve the serious attention they are receiving and preventing them may well involve legislation to fill gaps in existing law”.

However, regulation already exists at both a federal and state level, the letter continues, through publicity rights and laws relating to fair competition, false advertising and privacy. Congress should therefore concern itself with introducing new rules only where there are actual gaps in the current legal framework, it argues, whereas “the recently-introduced NO FAKES bill goes too far in introducing an entirely new federal intellectual property right”.

Interestingly, while the letter expresses concerns about the NO FAKES Act introducing new liabilities for digital platforms - which is unsurprising given the tech company signatories - it also claims that the rights the act creates for individuals could be exploited by companies to the detriment of those individuals.

For the music industry, the NO FAKES Act is an attempt to create a new US-wide right that would allow musicians and performers to protect and monetise their voice and likeness in the context of AI. They could stop third parties from creating unapproved digital replicas and work with business partners to drive new income streams through the approved use of their voice and likeness in AI projects. 

There has been some concern in the performer community that movie studios and record labels might pressure actors and artists to assign them those rights - or assume they already have them via wide-ranging image rights clauses in old contracts - and then exploit them in a way that would benefit the studio or label more than the performer. 

However, the NO FAKES Act includes provisions to stop that from happening, in that the new digital replica right would be unassignable, licensing deals would be limited to ten years, and any deals would need to include “a reasonably specific description of the intended uses of the applicable digital replica”. 

But those protections are not sufficient, this new open letter argues. “While NO FAKES includes some limitations on license and transfer, it still leaves room for abuse”, leaving the door open “to the use of a simple click-through licence to obtain an exclusive ten year right to create indefinitely many sound recordings or audiovisual works that match a particular description”. 

“Professional performers and private people alike could find themselves alienated from their own likeness for up to a decade”, it claims, “unable to create or authorise the creation of works that incorporate their digital replica”. Plus, “nothing in the bill would stop a licensed user from deploying a digital replica to create misinformation, including videos that show someone doing or saying things they never did or said, with no disclosure that the performances are synthetic”. 

Unsurprisingly, First Amendment free speech concerns are also raised in the letter, which states that the act “creates a chilling presumption that any use of a digital replica requires authorisation, subject to a closed list of exceptions with uncertain scope”. And it also argues that the new digital replica right, as an intellectual property right, is in itself unconstitutional because “a person’s voice and appearance are matters of fact” and “the constitution prohibits granting an exclusive intellectual property right in their use”. 

Ultimately, the letter - and the organisations behind it - reckon that American law-makers should focus on regulating specific improper uses of generative AI, such as what it calls “AI-generated non-consensual intimate imagery”, rather than introducing a wide-ranging new right. 

The music industry’s lobbyists will likely argue that this letter is really about tech giants once again seeking to avoid being held responsible for content on their platforms while pretending to stand up for individual users and free speech. Nevertheless, it could slow the momentum in Washington behind new laws around digital replicas.

Signatories of the letter include the Association Of Research Libraries, American Library Association, Computer & Communications Industry Association, Center For Democracy And Technology and Electronic Frontier Foundation, as well as ReCreate, which brings together all those organisations. 
