A group of organisations that speak for songwriters, performers and their collecting societies has published a set of seven principles which they say should guide the regulation of artificial intelligence and especially generative AI.
Echoing previous statements from the music industry, they insist that AI companies must seek licences when using existing music as training data, and that there should be full transparency about what data any one AI company has utilised.
Introducing those seven principles, the organisations state: “The cultural sector and international creative community acknowledge there are a number of useful and important purposes to which AI more generally is currently being applied. However, in the case of generative AI there is a clear and urgent need for policymakers around the world to take action, adapt and improve current regulatory regimes”.
And as policymakers do that, they add, “it is imperative that the cultural sector and international creative community are at the table in those policy discussions, to ensure their interests are incorporated and, in turn, that AI systems are transparent, ethical, fair and lawful”.
Generative AI models – which can generate original text, images, audio and video – are ‘trained’ by being exposed to existing content. The music industry is adamant that where copyright-protected works are used in that training process, AI companies must seek permission from the relevant copyright owners.
However, some AI companies have suggested that there may be copyright exceptions in one country or another that actually allow them to train their models with copyright-protected works without seeking permission or negotiating a licence.
With that in mind, the music industry – and the copyright industries at large – are seeking clarity from lawmakers regarding the copyright obligations of AI companies.
And as part of that, they also want those companies to be obliged to keep detailed records about what datasets have been used as part of the training process, and to make that information public, so that copyright owners know if their content has been used.
Getting clarity on the copyright and transparency obligations of AI companies has become all the more urgent as generative AI rapidly becomes more sophisticated and much more widely used. As a result, the music industry has become very vocal on this issue in the last year.
In March, the US music industry launched the Human Artistry Campaign which also sets out some key priorities for lawmakers considering how to regulate AI. Although originating in the US, that campaign is global and has been backed by numerous music industry organisations around the world, and groups representing copyright owners beyond music.
In terms of the campaign’s music industry supporters, there are trade bodies for record labels and music publishers, as well as organisations representing artists, songwriters and performers.
A number of the organisations involved in putting together the seven principles that were announced yesterday are also supporting the Human Artistry Campaign.
Although, with a couple of exceptions – ie IMPF, which represents independent music publishers, and CIAGP, which is an organisation for the visual arts – these organisations primarily represent music-makers and their collecting societies.
As is often the case with new technology, there are two challenges from a music-maker perspective. First, music-makers need to work alongside their business partners, like labels and publishers, to ensure that the music community isn’t screwed over by the tech sector.
But then music-makers may also need to go to battle with those business partners, some of which may negotiate and manage deals with the tech companies that seem to benefit their shareholders much more than the artists and songwriters they represent.
So when it comes to AI, expect to see music-makers campaigning side by side with labels and publishers, for example via the Human Artistry Campaign, but also seeking to ensure that any deals negotiated by the labels and publishers are fair for the wider music community.
That said, while the organisations behind yesterday’s seven principles are more skewed towards music-makers, those seven principles make similar demands to those of the Human Artistry Campaign, and are mainly focused on ensuring that governments put solid regulation of AI companies in place.
Those organisations, in addition to IMPF and CIAGP, include a number of global and regional organisations speaking for music-makers, and especially songwriters, including CIAM (global), ALCAM (Latin America), AMA (Africa), APMA (Asia-Pacific), ECSA (Europe) and MCNA (North America).
Also involved is CISAC, which brings together song right collecting societies around the world, and two organisations that bring together performer societies, AEPO-ARTIS in Europe and SCAPR on a global basis.
The seven principles are as follows…
1. Creators’ and performers’ rights must be upheld and protected when exploited by AI systems
AI systems analyse, scrape and exploit vast amounts of data, typically without authorisation. These datasets consist of musical, literary, visual and audiovisual works and performances protected by copyright. Those copyright works and datasets have a value, and creators and performers should be in a position to authorise or prohibit the exploitation of their works and performances and be compensated for such uses.
2. Licensing should be enabled and supported
Licensing solutions should be available for all potential exploitation of copyright works, performances and data by AI systems. This would encourage open exchanges between innovators who require the data, and creators and performers who wish to understand how and to what extent their works will be used.
3. Exceptions for text and data mining which do not provide for effective opt-out by rightsholders should be avoided
The introduction of exceptions, including for text and data mining, that permit AI systems to exploit copyright works and performances without authorisation or remuneration must be avoided. Some existing exceptions should be clarified, in order to provide legal certainty for creators of the underlying data and performers, as well as for AI systems wishing to benefit from such data.
4. Credit should be given
Creators and performers must be entitled to obtain recognition and credit when their works and performances have been exploited by AI systems.
5. Transparency obligations should apply to ensure fairer AI practices
Legal obligations relating to disclosure of information should apply. These should cover (i) disclosure of information on the use of creative works and performances by AI systems, in a sufficient manner to allow traceability and licensing; and (ii) identification of works and performances generated by AI systems, as such. This will ensure a fair approach towards creators, performers and consumers of creative content.
6. Legal responsibility for AI operators
There should be legal requirements for AI companies to keep relevant records. There should also be effective accountability for AI operators for activities and outputs that infringe the rights of creators, performers and rightsholders.
7. AI is only an instrument in the service of human creativity, and international legal understandings should reinforce this
AI models should be considered as simply an instrument at the service of human creativity. While there is a spectrum of possible levels of interactions between humans and AI to consider when defining the protectability of works and performances, policymakers should make clear that fully autonomous AI-generated works cannot benefit from the same level of protection as human-created works. This topic should be an urgent priority and global discussions should be initiated rapidly.