Jan 22, 2024 2 min read

Anthropic expands on its fair use defence in music publisher legal battle

Anthropic has asked a court in Tennessee to reject a preliminary injunction request from a group of music publishers that accuse the AI company of copyright infringement. Anthropic insists there is no need for the court to intervene because training AI models with existing content is fair use.


AI company Anthropic last week formally set out a fair use defence in its legal battle with three music publishers, stating that using lyrics "to train an AI model, particularly one not technologically capable of outputting the texts of those songs going forward, is a classic fair use that does not constitute infringement of plaintiffs’ copyrights". 

That statement was made as the AI firm, which is backed by Amazon and Google, urged the courts in Tennessee to reject a request by Universal Music Publishing, Concord and ABKCO for a preliminary injunction that would affect its chatbot Claude. 

After suing Anthropic for copyright infringement in October 2023, the publishers then requested an injunction that would force the AI company to do two things. First, to ensure that their lyrics are not used to train any future AI models it develops. And secondly, to ensure that the current iteration of Claude doesn’t spit out any lyrics owned by the music companies. 

Anthropic deals with those two requests separately in its response to the court. The fair use defence is relevant to the first part of the injunction. The copyright industries, including the music industry, argue that training a generative AI model with existing content requires permission from relevant copyright owners. On that basis, the publishers say, Anthropic is liable for copyright infringement, because it trained Claude with lyrics without a licence from them. 

However, many tech companies counter that AI training constitutes 'fair use' under American law, meaning no permission is required. Continuing, Anthropic's latest legal filing says: "Relying on the fair use doctrine, courts have consistently found that making 'intermediate' copies of copyrighted materials to develop new technologies does not violate copyright law". 

On the second part of the injunction, Anthropic argues that no court intervention is necessary, because it doesn’t want Claude to provide users with any lyrics that are owned by the music companies. After all, it adds, the "purpose of training on data including songs is not to reproduce the lyrics themselves, but to learn general ideas and concepts about how language works, in all its forms". 

The publishers have claimed that if a user prompts Claude to provide lyrics to songs they have published - including 'American Pie', 'What A Wonderful World', 'Moves Like Jagger' and 'Uptown Funk' - the chatbot "will provide responses that contain all or significant portions of those lyrics". 

However, Anthropic says in its new filing that since the publishers instigated their lawsuit it has voluntarily "built additional safeguards to prevent display of plaintiffs’ works", which means “it is unlikely that any future user could prompt Claude to reproduce any material portion of the works-in-suit”. 

That said, when CMU provided some simple prompts to Claude, it still regurgitated some key elements of the publishers' lyrics. You can see more about those tests here. 

Anthropic previously responded to the music companies' lawsuit last November, but that initial response mainly focused on jurisdiction issues. 

The publishers have gone legal in Tennessee and Anthropic argues that the litigation should be fought in California, where it is based, and where many of the other lawsuits testing the copyright obligations of AI companies have been filed.
