The US Copyright Office has said there is “an urgent need for effective nationwide protection against the harms that can be caused” by unauthorised deepfakes and voice clones. To that end it urges US Congress to pass new laws similar to those proposed in the NO FAKES Act that was introduced in the US Senate yesterday.
That recommendation is made in the first of a series of reports on copyright and artificial intelligence, this one focused on what the Copyright Office refers to as AI-generated ‘digital replicas’, defined as “a video, image, or audio recording that has been digitally created or manipulated to realistically but falsely depict an individual”.
Launching the report, US Register Of Copyrights Shira Perlmutter says, “It has become clear that the distribution of unauthorised digital replicas poses a serious threat not only in the entertainment and political arenas but also for private citizens. We believe there is an urgent need for effective nationwide protection against the harms that can be caused to reputations and livelihoods”.
Based on comments submitted to the Copyright Office, as well as statements made during relevant Congressional hearings, the report runs through the role existing laws play in helping people to stop the distribution of unauthorised digital replicas. That includes state-level publicity rights, US-wide copyright and trademark law, and relevant powers exercised by the Federal Trade Commission and Federal Communications Commission.
Having concluded that the protection provided by those existing laws is insufficient, the report also considers what a new digital replica right might look like and how it might be managed. It advises against allowing individuals to assign away any new right of this kind and says that any licensing deals in relation to a digital replica right should be restricted in length.
As public interest in generative AI has surged in the last couple of years, there has been much debate as to whether publicity or personality rights, which allow people to control the use of their image or identity, could be used by creators, performers and people more generally to stop the distribution of unauthorised deepfakes and voice clones.
In the US, these operate at a state level. As a result, the Copyright Office says, publicity rights are “inconsistent” around the country and often “insufficient”.
The report says, “some states currently do not provide rights of publicity and privacy, while others only protect certain categories of individuals. Multiple states require a showing that the individual’s identity has commercial value. Not all states’ laws protect an individual’s voice; those that do may limit protection to distinct and well-known voices”.
Some states have revised their publicity rights in the context of digital replicas. For the music industry, of most note is the ELVIS Act in Tennessee, giving artists specific control over their voice. The Copyright Office’s report also notes relevant recent changes to publicity rights in Louisiana and New York. However, it concludes, in much of the country state-level laws do not provide sufficient protection.
Copyright law also provides some control, in that to train an AI model to imitate someone’s likeness or voice, the model will need to be exposed to content that features that person’s likeness or voice, and that content will likely be protected by copyright.
However, many AI companies argue that they can make use of copyright protected works without getting permission from the copyright owner. And even if it is ultimately decided that that is not the case, as the Copyright Office’s report notes, a person whose likeness or voice is being imitated may not own the copyright in video, images, audio or music in which they feature.
Having proposed the introduction of a new digital replica right, the Copyright Office’s report asks a number of questions about that new right, including whether it should protect an individual’s likeness or voice after their death, whether the right can be assigned, and how it might be commercialised.
Unsurprisingly, companies in the entertainment industry would like any new digital replica right to apply beyond a person’s death, and for people to be able to assign or at least license their digital replica right to a business partner.
The report notes that talent agency WME “argued in favour of postmortem rights” on the basis that unauthorised deepfakes “threaten to usurp estates’ valid interests in preserving and strengthening artists’ legacies through the legitimate use of AI”, and may “detract from the authenticity, credibility, and commercial value of an artist’s body of work”.
Meanwhile, Universal Music, it says, “stated that it is important that - as with all forms of intellectual property - a digital replica right should be eligible for assignment or licensing either in whole or in part, so that enforcement may be delegated”.
In the music industry, it seems likely artists will want to work with business partners like record labels to pursue opportunities around AI-generated digital replicas. However, some musician and manager groups have expressed concerns that artists may be pressured into assigning any new right over to a business partner on a long-term basis.
The Copyright Office seems to share that concern. It advises that people should be allowed to license their digital replica right to business partners - so that the partner can help protect and monetise the right - but that assignment, whereby ownership of the right would actually transfer to the business partner, should not be allowed.
“Although assignments are common in other areas of intellectual property, digital replica rights are most appropriately viewed as a hybrid of privacy interests and a form of property”, the report says. “Unlike publicity rights, privacy rights, almost without exception, are waivable or licensable, but cannot be assigned outright”. Therefore the Copyright Office says it recommends “a ban on outright assignments” of any new digital replica right.
There should also be restrictions on any licensing deals, it adds, given “unequal contracting power or knowledge, particularly in the context of employment or talent contracts”. Therefore, the Office suggests “limiting licences to a relatively short term, such as five or ten years”.
There are two sets of proposals currently in Congress that propose rights similar to those set out in the new report, the No AI FRAUD Act in the House Of Representatives and the NO FAKES Act in the Senate. The latter was first outlined in a discussion paper last year but was properly introduced yesterday.
The right proposed in the NO FAKES Act would extend beyond a person's lifetime, for up to 70 years, subject to some use-it-or-lose-it requirements. The right would not be assignable and licensing deals would be limited to ten years.
It’s interesting that the Copyright Office has opted mainly to use the term ‘digital replica’ rather than ‘deepfake’, a term first made popular on Reddit in 2017, initially in the context of content where the likeness of celebrities was inserted into pornographic videos.
The term deepfake does perhaps imply that a person’s likeness has been used without authorisation, although - as the Copyright Office’s report itself notes - the term has also been used in the context of legitimate projects where people have allowed their likeness to be imitated by AI.