Mar 6, 2026

Lords tell UK government it needs to “choose between two AI futures”, before backing the kind of future the music industry wants

The UK government has to provide a plan around AI and copyright by 18 Mar. Ahead of that, a committee in the House Of Lords has published a report endorsing the music industry’s position: AI companies shouldn’t be given a free ride and must get licences when making use of existing content.

With the UK government obliged to provide an update on its plans around copyright and AI later this month, the Communications And Digital Committee in the House Of Lords has put out a report that basically endorses everything the music and wider copyright industries have been calling for. 

So that means: no new copyright exceptions for AI companies; legal obligations for AI firms to be transparent about training data; and new protections around digital replicas. 

According to the Lords, the government must now “choose between two AI futures”. 

Either “become a world-leading home for responsible, licensing-based artificial intelligence development”. Or “drift towards acceptance of large-scale use of unlicensed creative content by opaque US-based AI models, allowing damage to our creative industries to go unchecked”. 

Unsurprisingly, given that framing, the Lords support option one. As does the music industry.

Welcoming the new report, UK Music CEO Tom Kiehl echoes the Lords’ position. “The UK is at a crossroads and the government has a choice to make”, he says, “either become a global leader in ethical and transparent AI innovation, or sell our incredible cultural and creative sectors down the river to unscrupulous big tech firms”. 

“The future for AI in the UK should be based on transparent and responsible use of training data”, he goes on. “We are calling on the government to embrace the opportunities this presents, and to demonstrate its commitment to the UK’s gold-standard copyright regime and our outstanding creative industries in its forthcoming economic assessment and update on AI and copyright”. 

When the government embarked on a big consultation on copyright and AI at the end of 2024, it proposed introducing a new ‘text and data mining’ - or TDM - copyright exception similar to the one that already exists in the EU.

It would mean AI companies could make use of existing copyright protected content when training generative AI models without getting permission from copyright owners, except where rightsholders have explicitly ‘reserved their rights’ through some kind of formal opt-out process. 

Politicians saw this as a compromise position between AI companies - which say they need easy access to large quantities of training content - and copyright owners - who say AI businesses should ask for permission and pay for licences if they need to use existing creative works when training models.  

However, it’s not a compromise that works. The copyright industries are strongly against any new exceptions in copyright law to benefit AI companies. And it’s not clear that a TDM exception with rightsholder opt-out is that much use for AI companies either. 

Following a major backlash from the creative industries, ministers backtracked a little on their support for the TDM exception with opt-out, but have generally indicated they are still looking for a compromise position, rather than picking a side, even though it’s not clear any real workable compromise exists. 

AI expert and campaigner Ed Newton-Rex published a paper this week on another possible compromise that is being considered - a copyright exception that would allow AI companies to train models with copyright protected material without permission during the development phase, but said companies would then have to secure licences before commercialising any models.

As Newton-Rex explains, that doesn’t work for AI companies, because if just one copyright owner refuses to license at the commercialisation stage, the millions an AI developer potentially spent on training a model would be wasted.

The only workaround to that problem would be a compulsory licence - forcing rightsholders to license at that commercialisation stage, most likely at standard rates - but for the copyright industries that’s an even worse proposal than a TDM exception with opt-out. 

As far as the Lords on the Communications And Digital Committee are concerned, the government needs to abandon its bid to find some kind of compromise that can work for both freeloading AI companies and the country’s all-important creative industries, and instead stand up for copyright and support the growth of a UK AI licensing market.

Committee member Barbara Keeley says, “The government should now make clear it will not pursue a new TDM exception with an opt-out mechanism for training commercial AI models”. Instead, ministers should “create the conditions that will allow a licensing-first approach to AI training to flourish”.   

Because of provisions set out in the Data (Use And Access) Act last year, the government is obliged to publish a plan on AI and copyright by 18 Mar. It remains to be seen what ministers say. 
