SoundCloud has announced that it is changing its terms of service to address concerns that an AI clause added last year was too broad and suggested that the digital platform might train AI with its users’ content.
The changes were announced in a blog post by CEO Eliah Seton, who admitted that the AI clause as previously written wasn’t clear enough, adding “that’s on us - that’s why we’re fixing it”. He also insisted that SoundCloud’s use of AI is “focused on discovery - helping fans find new music and helping artists grow, starting with their first fans - that’s core to our mission”.
Although that change to SoundCloud’s terms “is being presented as a win for artists”, Ed Newton-Rex, founder of Fairly Trained, says in a blog post that it doesn’t address concerns he and others raised last week and, like previous SoundCloud statements, “it still reflects a position that is fundamentally terrible for artists when you read between the lines”.
SoundCloud's amended AI term now says “we will not use your content to train generative AI models that aim to replicate or synthesise your voice, music or likeness without your explicit consent, which must be affirmatively provided through an opt-in mechanism”.
However, Newton-Rex argues that limiting that commitment to “generative AI models that aim to replicate or synthesise your voice, music or likeness” makes it too narrow, because it leaves open the possibility of users’ content being used to train AI models that generate generic content.
“It would have been very easy to simply say, ‘We will not use your content to train generative AI models without your explicit consent’”, Newton-Rex writes, adding “Why didn’t they?”
SoundCloud needs to make further amendments to truly fix the problem, he adds, asking why the company has not been more explicit in how it will use - or not use - content for AI.
Concerns were raised last week about the change to SoundCloud’s terms, originally made last year, which said that creators uploading content to SoundCloud were automatically opted in “to explicitly agree that your content may be used to inform, train, develop or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services”.
After Newton-Rex highlighted that term online - and with a backlash mounting from outraged creators who felt the clause was sufficiently broad that it would allow SoundCloud to use content on the platform to train generative AI models - the company quickly issued a statement insisting that the AI clause was added simply so that users could benefit from AI-powered discovery, recommendation and anti-fraud tools.
After reiterating that point in his blog post yesterday, Seton added, “SoundCloud has never used artist content to train AI models. Not for music creation. Not for large language models. Not for anything that tries to mimic or replace your work. Period. We don’t build generative AI tools, and we don’t allow third parties to scrape or use artist content from SoundCloud to train them either”.
Noting that Seton’s blog post also says that “AI should support artists, not replace them”, Newton-Rex lays down a challenge, adding that if Seton “and SoundCloud genuinely believe this, the solution is simple”: make the new term unambiguous by stating “we will not use your content to train generative AI models without your explicit consent”.
“With that change”, he concludes, “musicians will be able to trust that SoundCloud will not train generative AI models on their music without their permission, and I suspect people would return to the service. I certainly would, and I’d be the first to thank them for listening to artists’ concerns”.