Stability AI has defeated the big copyright lawsuit brought against it in the UK by Getty Images. But before everyone gets excited, the judgement is not quite as ground-breaking as it could have been. When the litigation was first filed, many saw it as having the potential to be the landmark AI copyright case in the UK.
This is because the scope of the lawsuit was whittled down as it worked its way through the courts, meaning in the end the judge was only asked to rule on one specific and quite narrow copyright dispute.
The result is that, what initially looked set to be a landmark test case on copyright and AI in the UK, and therefore of relevance to all the copyright industries, including the music industry, actually leaves many issues unresolved.
Getty originally accused Stability of infringing the copyrights in its images by making unlicensed copies of them when training its generative AI image creation model Stable Diffusion.
However, Stability was able to demonstrate that its training - and therefore any copying - happened in the US, not the UK, meaning a UK court did not have jurisdiction to rule on that issue. As a result, Getty dropped those claims of ‘direct infringement’, which was the broader and more significant part of the original lawsuit.
That left the more specific claim that Stability was separately liable for so-called ‘secondary infringement’ by making its Stable Diffusion product available in the UK market. That is the only claim that Judge Joanna Smith has now rejected, something she stresses in her judgement.
“This court can only determine the issues that arise on the ‘diminished’ case that remains before it”, she writes. And while the creative industries would like clarity on other copyright issues in the context of AI, “it is not part of this court’s task to consider issues that have been abandoned or to consider arguments that are no longer of relevance to the outstanding issues”.
Nick Eziefula from entertainment law firm Simkins highlights the ramifications of Smith’s narrower judgement. “Today’s decision”, he says, “will frustrate many in the creative industries, who are calling for stronger, modernised copyright protections against the unauthorised use of their work by AI developers”.
However, the judgement does not, he continues, “address some central copyright issues: whether using copyright material to train AI models amounts to infringement and whether AI-generated outputs can themselves infringe”. That’s because “Getty dropped its claims on those key issues due to jurisdictional hurdles, as much of the AI training occurred in the US”.
Most copyright lawsuits filed against AI companies - including those filed by music companies in the US and Continental Europe - accuse AI platforms of infringing copyright by making unlicensed copies of works as part of their training processes, which is exactly what Getty originally accused Stability of doing.
Many tech companies argue that AI training either is already, or should be, covered by a ‘text and data mining’ copyright exception, meaning they don’t need permission to make use of existing works. In the US, they argue that AI training is covered by the concept of ‘fair use’, which would also mean no permission is required.
UK copyright law does not include an explicit exception for commercial text and data mining - and the fair use principle does not apply here either - which made Getty’s AI lawsuit under UK law particularly interesting.
However, because Stability’s AI training was done in the US, where it would claim the fair use defence, the UK courts could not consider Getty’s primary copyright infringement claim under UK law - that Stability had trained on copyright protected images and that no copyright exception in the UK allowed it to do so. As a result, that element of the claim faltered.
However, Getty was still able to go ahead with its secondary infringement claim. Under UK law, a party is liable for secondary infringement if they are shown to be importing or commercialising “an article which is, and which he knows or has reason to believe is, an infringing copy of the work”.
So, for example, if you manufacture mugs in China using an image protected by copyright in the UK then import them and sell them in the UK, you might be liable for secondary infringement.
That’s fairly straightforward when it comes to physical products like mugs and t-shirts, but Getty’s argument was more complex from a legal standpoint.
The image licensing company argued that because Stable Diffusion was developed by infringing Getty’s copyrights in the US, and the model was then made available to users in the UK, Stability was liable for secondary infringement. That argument prompted much debate in court as to whether Stable Diffusion - a digital service that produces AI-generated images created by training on copyright protected sources - is either ‘an article’ or ‘an infringing copy’.
In her judgement Smith summarises Getty’s secondary infringement argument, noting that the image library “does not say that Stable Diffusion is itself a copy of, or that it stores within it any copies of, the copyright works”.
However, she explains, “Getty Images contend that Stable Diffusion is an infringing copy” under UK copyright law “because the making of its model would have constituted infringement of the copyright works had it been carried out in the UK”. It was that argument - had it been successful - that would have provided the basis for other copyright owners to take action against AI platforms operating in the UK, no matter where they had undertaken their training.
Back to Nick Eziefula at Simkins, who says that Getty was basically arguing that “bringing an AI model trained on unlicensed works into the UK was akin to importing infringing copies”. Smith disagreed, he adds, “ruling that existing laws, drafted in an era of physical piracy, cannot easily be stretched to cover AI systems that do not store copies of the original works”.
As a result, Eziefula continues, “This case underscores a growing gap between old copyright law and new technology. The creative sector is now looking to lawmakers - and future court battles - to deliver clearer answers and fairer frameworks for ethical, transparent AI innovation”.