AI-generated blues character Eddie Dalton got a lot of column inches in the UK newspapers this weekend - including the Mail, Telegraph and Sun - but only because music released under that moniker has gained some slightly unusual traction on one of the UK’s official singles charts.
The mainstream news reporting was prompted by an AI-generated track credited to Eddie Dalton getting to what various click-bait journalists described as “number two” on the “official UK singles chart”. The Daily Mail reported that “Eddie Dalton’s song ‘Another Day Old’ peaked at number two on the official UK singles chart”, even though “Dalton’s velvet voice is actually just lines of generated code”.
What is not so obvious from that reporting is that ‘Another Day Old’ didn’t get to number two on what most people would conventionally recognise as the ‘singles chart’ - ie what is titled the ‘Official Singles Chart’ by the UK Official Charts Company - which uses a combination of sales and streaming data.
Instead, the song reached the number two position on a niche chart - the ‘Official Singles Sales Chart’ - which lists, the OCC explains, “the UK's biggest selling singles of the week, based on sales of downloads, CDs, vinyl and other formats, across a seven day period”. In other words, a chart that explicitly excludes streams - by far the most significant form of music consumption.
On the week ‘Another Day Old’ was at number two on the sales chart, it didn’t appear at all in the main Top 100 of the actual Official Singles Chart. Nor did it appear in Spotify’s Daily Top Songs chart for the UK, which is a snapshot of the 200 most streamed tracks.
Various tracks by Eddie Dalton - released on April Fools’ Day - have been getting impressive media coverage but have, in fact, achieved very modest numbers on Spotify. ‘Another Day Old’ is the most popular with 1.73 million Spotify streams globally, while the AI artist has nearly 485,000 monthly listeners.
Compare that to the actual Official Singles Chart number two track in the same week - ie number two in the chart that does include streaming - ‘iloveitiloveitiloveit’ by Bella Kay, which got 4.1 million Spotify streams just in the UK during that chart week alone. And which has more than 177 million total global streams on Spotify as of today.
‘Another Day Old’ got to number two in the sales chart because of its downloads. Partly because you don’t actually need massive numbers to climb the sales-only chart these days. And possibly because someone gamed the system to boost download sales, maybe in order to get the high sales chart ranking for an AI artist and the accompanying coverage in the mainstream press.
Though, it’s worth noting, artificially boosting download sales, while sneaky and against the rules, isn’t as damaging as artificially boosting streams, in that you can’t profit from such a scam (you only get back what you put in minus the download store’s cut). Which means it’s a marketing expense. But one that might get you articles on the Mail, Telegraph and Sun websites.
However, even if Eddie Dalton’s chart performance was far from the “astounding” feat the Telegraph reported, it has prompted further conversation within the music community about where AI-generated music should fit in, and what position different stakeholders in the industry should be taking.
What are the policies of streaming services and chart compilers when it comes to accepting, promoting and ranking AI tracks, and are they the right policies? And to what extent is AI music facilitating and furthering streaming fraud - which, unlike artificially boosting downloads, does negatively impact the royalties received by everyone else in the music industry?
The streaming services are all currently figuring out what approach to take as millions of AI-generated tracks are uploaded to their platforms. Spotify currently accepts entirely AI-generated tracks, not wanting to get into the business of judging AI-generated music as being somehow inferior. However, it will remove AI music if it believes whoever created and delivered a track is a bad actor.
That could mean good old-fashioned stream manipulation - using bots to artificially boost the number of plays a track receives in order to generate higher royalties. It’s no secret that people employing that scam increasingly use AI to generate the music they upload and manipulate.
And whenever AI-generated tracks gain traction on streaming services there is usually speculation that some sort of stream manipulation has been employed.
However, there is another increasingly prevalent form of AI-enabled streaming fraud, which involves tracks that include unapproved voice clones of known artists and/or deliberately misleading metadata designed to make it look like a track has been released by a known artist.
The people releasing that kind of AI-generated content may also employ stream manipulation or may simply hope that enough fans of the targeted artist organically stream the music to generate decent royalty payments, especially if their tracks appear on the artist’s official profile page within the streaming apps.
Spotify does have a form via which artists and their teams can report tracks that illegally exploit their likeness or trademarks. And last month it also announced a new beta service called Artist Profile Protection, which allows artists to approve any tracks before they appear on their Spotify profile, stopping tracks uploaded by bad actors, who are likely employing AI, from ever getting listed on those profiles.
That will also help make artists aware that rogue tracks are in circulation. Though - across all the different streaming platforms - there is still a significant amount of extra monitoring and admin work for artists and their teams in dealing with this kind of content.
As a result, more pressure is mounting on the distributors to do more to stop bad actors at the point of upload, possibly by implementing tougher ‘know your customer’ requirements when people set up accounts.
But what about people uploading AI-generated tracks that don’t employ stream manipulation or rip off the brands and voices of existing artists? Eddie Dalton is the creation of Dallas Little, who is listed as the composer and producer of the AI artist's tracks on Spotify. He has a company called Crunchy Records that specialises in making and releasing AI-generated recordings and videos.
He also insists that he is legitimately and transparently employing AI to generate recordings of songs he has written. In a statement to Showbiz411 last month, he said “I don’t appreciate how my work has been characterised - referring to it as a ‘content farm’ and suggesting people are being misled is inaccurate”.
“Every social media video is clearly labeled as AI-generated”, he added, and “many listeners are fully aware of that and enjoy the music for what it is”.
Currently Spotify treats those AI-generated tracks - which are not identified as being problematic - like any other track. But some in the industry believe AI-generated music should be treated differently, even in the absence of bad conduct.
That might mean those tracks are labelled as AI-generated within the streaming apps, or a platform could decide to not allocate any of its royalty pool to such tracks, or to not promote them via playlists and algorithms, or to block them from appearing on streaming services entirely.
One challenge is how the platforms identify what tracks are AI-generated. Apple Music is now encouraging labels and distributors to identify AI-generated music in the metadata when delivering new tracks, while Deezer has been hyping up the technology it has developed for spotting AI-generated music.
And most streaming services, including Spotify, are part of industry-wide conversations on how best to disclose the use of AI when delivering new music.
Another challenge is where to draw the line between ‘AI-generated’ and ‘AI-assisted’, which is important given the increasing number of artists who do make use of AI in some way as part of their creative process. The US Copyright Office has given some consideration to this question, as have some collecting societies that are allowing AI-assisted, but not AI-generated, works to be logged in their databases.
However, it’s a tricky line to draw, which is possibly one reason why the likes of Spotify are currently focusing their efforts on AI tracks where there is some other conduct that is problematic, rather than worrying about the extent to which AI might have been used in the creation of a song and recording.
A Spotify spokesperson tells CMU, “We believe the right response to AI in music isn’t any single policy, it’s a combination of proactive controls, industry-wide standards and a deeper investment in the human creativity behind every track. Our priority is addressing harmful uses like spam and impersonation, rather than trying to filter music based on how it was made”.
“That's partly because AI in music isn’t a binary”, they add, “it exists on a spectrum, from a producer using AI to master a track, to fully synthetic compositions, and everything in between. As the technology continues to evolve, we’ll keep rolling out new measures to protect artists and give listeners more context about what they're hearing”.