The boss of cross-sector trade group UK Music, Jamie Njoku-Goodwin, has written to Culture Secretary Lucy Frazer urging the UK government to adopt five key principles when considering how it goes about regulating artificial intelligence.
In the letter, Njoku-Goodwin notes that the government has requested a clear position from the music industry regarding the challenges posed by AI, and especially generative AI. UK Music, and some of its member organisations, are also participating in a working group convened by the UK’s Intellectual Property Office which is seeking to put together a code of practice regarding copyright and AI.
Generative AI can create new content – but in order to do this, it needs to be ‘trained’ on examples of existing content. In some cases, this training is done using content specifically developed to train the AI. However, in other cases it is less clear what data has been used, how that data has been obtained, and whether use of any one particular dataset requires permission from copyright owners.
As AI tools become more sophisticated, and an increasing number of companies begin to develop and release generative AI models, the various copyright questions these tools raise become more pressing.
That includes the issue of whether or not developers who train their generative AI with existing copyright-protected materials need a licence from the relevant copyright owners. The music industry – and other copyright industries – would argue that using copyright-protected material to train generative AI definitely needs an explicit licence from the owners of the copyright.
However, some tech companies argue that the training of AI models is covered by exceptions in copyright law, at least in the countries where they base their servers, which would mean that licences are not required. And at one point the UK government proposed putting a specific exception covering ‘data mining’ – which would apply in this scenario – into copyright law, although it subsequently backtracked on that proposal.
Other questions relate to how copyright owners can even know whether their content has been used for training purposes, and whether there should be an obligation on AI companies to clearly state what data they have used to train their AI. And there is also the question as to whether content generated by AI should enjoy copyright protection.
There has also been lots of discussion in the music industry about AI tools that can generate ‘deep fake’ vocal clones of established artists, allowing almost anyone to create AI-generated output that mimics the vocal characteristics of a real person. Those cloned vocals can then be integrated into tracks that in some cases purport to be the work of the artist whose voice has been cloned.
That takes the conversation beyond copyright. How can artists protect their vocal style – or for that matter their visual identity – from being used without their permission? In some countries, they could possibly rely on what are variously called image, publicity or personality rights for such protection, although that concept doesn’t exist in UK law.
In his letter to Frazer, Njoku-Goodwin writes: “Ultimately, we believe that a responsible and balanced approach to AI must be centred on the principle of consent. As an industry, we are excited about some of the opportunities that AI offers, and want to work with the technology sector to help seize these opportunities”.
“However”, he goes on, “it is not acceptable for creators’ work or their identity to be used by AI developers without their consent. Taking other people’s work without their permission contravenes basic principles of property rights, undermining both creator incomes and the economic model which has enabled the UK to build a world-leading music industry”.
The letter is accompanied by a policy document which outlines the five key principles identified by UK Music that, it says, should guide any policy-making in this domain. That policy document echoes what Njoku-Goodwin previously wrote in an opinion piece on AI last month, and also the priorities of the globally focused Human Artistry Campaign which UK Music is supporting.
The five key principles are as follows…
1. Creators’ choice: The creator, or their chosen rightsholder, should be able to decide if and how they want to use their creative talent. This certainty, underpinned by legal rights (copyright), should not be undermined by any exception to copyright or compulsory licensing at the input stage. Users need to respect creators’ choice as the baseline for any discussions.
2. Record keeping: It is important that, at the input stage, technology providers keep an auditable record of the music ingested before the algorithm generates new music. This is the only point in the process at which these data points can be documented.
3. Human creativity: Without human creativity, there should be no copyright.
4. Labelling: Music generated by AI should be labelled as such.
5. Protection of personality rights: A new personality right should be created to protect the personality/image of songwriters and artists.