Nov 30, 2023

Sony Music digital chief asks Congress to close "legal loopholes" to ensure artists can stop unauthorised AI voice clones

Sony Music's President of Global Digital Business Dennis Kooker has spoken at a US Senate session on AI, calling for a US-wide publicity right to make it easier for artists to stop unauthorised voice clones


Sony Music’s President of Global Digital Business, Dennis Kooker, has called for a US-wide publicity right that would allow artists to protect their voices and identities, and to stop the unauthorised cloning of their vocals by AI.

He revealed that the major has now issued nearly 10,000 takedown notices seeking the removal of unauthorised voice clones, adding that digital platforms are using legal loopholes to "drag their feet" when dealing with such notices.

He’s also made it clear in a statement to US senators that he does not regard AI training as ‘fair use’ under copyright law - despite what many AI companies argue - saying: "Congress should assure, and agencies should presume, that reproducing music to train AI models, in itself, is not a fair use".

Kooker was speaking at the latest session convened by members of US Congress to explore the impact of AI on copyright, part of Senate Majority Leader Chuck Schumer's AI Insight Forum.

The Sony exec stressed that the music industry sees many positives in the rapidly evolving world of generative AI, but added that there are challenges to be tackled too, including the increasing number of "deepfakes and unauthorised voice clones of existing artists" that are being generated and posted online.

"An artist literally makes their livelihood from their voice", he said in his statement to Schumer's forum. "Deepfakes intentionally exploit an artist’s talent and reputation to steal that income stream. Every stream of a deepfake takes streams and royalty payments away from the legitimate artist".

An AI model that imitates an artist's voice needs to be trained on that artist's music. The music industry is pretty much unanimous in arguing that such training exploits the copyright in that music, meaning permission from the copyright owner is required.

However, there has also been much discussion this year about other legal protections that artists can use to stop unauthorised use of their voices, in particular publicity or personality rights.

In the US, publicity rights exist at a state level, and it remains unclear exactly how they apply in the context of deepfakes and voice clones. "Platforms are quick to point to the loopholes in the law as an excuse to drag their feet or to not take the deepfakes down when requested", Kooker revealed.

"Existing state ‘right of publicity’ laws are inconsistent", he went on, "and many are not sufficient to protect Americans against AI clones. Creators and consumers need a clear unified right that sets a floor across all fifty states".

He then noted the NO FAKES Act proposals presented by four US senators last month, adding that those proposals are "an important first step".
