Mar 4, 2024 3 min read

Stability files defence in important test case on AI and UK copyright law

Stability AI has argued that it can't be held liable for copyright infringement in the UK, because the training of its Stable Diffusion AI platform took place on servers in the US. Getty Images says that the company infringed its copyrights by training its AI model on pictures it owns without permission.

Stability AI last week filed a defence in its UK copyright legal battle with Getty Images. Stability argues that it is not liable for copyright infringement within the UK because the training of its Stable Diffusion platform happens in the US. Meanwhile, any images generated by the platform for UK users which are similar to Getty-owned pictures are not direct copies and should be considered "pastiche", which would allow the AI company to rely on a copyright exception. 

In a lawsuit filed with the London courts last May, Getty accuses Stability of making copies of its photos without permission when training Stable Diffusion. Although not a music case, it is an important legal battle testing the copyright obligations of generative AI companies under UK law. As a result, it is of interest to all copyright industries, including the music industry. 

"Stable Diffusion was not trained in the UK and we expect to be fully vindicated at trial", a spokesperson for the AI company told Law360 last week, after the formal defence papers had been filed. In its legal filings, Stability says that the training of Stable Diffusion took place on an Amazon Web Services cloud computing cluster located in the US, a few thousand miles away from the jurisdiction of the London courts. 

The location of the actual AI training is important. When sued in the US courts, AI companies argue that making copies of works as part of an AI training process constitutes fair use and therefore no permission is required. But there is no fair use defence under UK law, nor a specific data mining copyright exception that can be relied on. Hence Stability is so keen to stress that no training of Stable Diffusion happened within the UK. 

Even if Stability wins that part of the dispute, Getty has other arguments. For example, it wants the AI firm to be held liable for secondary infringement by making Stable Diffusion - wherever it may have been trained - available in the UK. Getty notes that UK law states that "the copyright in a work is infringed by a person who imports into the UK an article which is, and which he knows or has reason to believe is, an infringing copy of the work". 

Stable Diffusion is a copyright infringing article, Getty argues. Stability counters, however, that when that rule talks about 'articles' it means tangible objects, and therefore the rule does not apply to a person or company making software available via a website to UK users. 

Stability actually tried to have Getty's case struck out last year based on its arguments that all training took place in the US and the claim of secondary infringement was not valid. However, in December the judge overseeing the litigation said the case should go to trial so that more evidence could be gathered and presented to inform both those arguments. Hence the filing of a formal defence by Stability last week. 

The other claim that is commonly made in copyright lawsuits against AI companies is that some works generated by an AI model are sufficiently similar to works in its training dataset that there is a separate infringement claim on the outputs. 

Like other AI companies defending copyright actions, Stability argues that that is based on a misunderstanding of how its technology works. The "synthetic images" generated by its model are "diffused from random noise", it argues, and are never direct copies of any training materials. 

While it may be possible to get Stable Diffusion to generate images similar to existing Getty images, that requires very specific prompting from a user, it adds, which is only likely if that user - say Getty or its agents - is specifically seeking such similarity. Which is to say, any such image would have been "induced under highly artificial conditions". 

Plus, in what is perhaps Stability's most interesting argument, it says if any image generated by its AI is similar to a Getty image, that should be considered a pastiche. Doing so would be useful, because, under UK law, using a work "for the purposes of caricature, parody or pastiche does not infringe copyright in the work". 

The case continues.
