- AI-generated songs attributed to deceased artists, like Blaze Foley, were uploaded to Spotify
- The streaming service took the tracks down after they were flagged
- The tracks slid past Spotify’s content verification processes through distributors like SoundOn
Last week, a new country song called “Together” appeared on Spotify under the official artist page of Blaze Foley, a country artist shot and killed in 1989. The ballad sounded nothing like his known work, but there it was: cover art, credits, and copyright information, like any other new single. Except this was not an unearthed recording from before his death; it was an AI-generated fake.
After being flagged by fans and Foley’s label, Lost Art Records, and reported on by 404 Media, the track was removed. Another fake song attributed to the late country icon Guy Clark, who died in 2016, was also taken down.
The report found that the AI-generated tracks carried copyright tags naming a company called Syntax Error as the owner, though little is known about it. Stumbling over AI-made songs on Spotify is not unusual. There are entire playlists of machine-generated lo-fi beats and ambient chillcore, some already racking up millions of plays. But those tracks are typically released under fictional artist names and usually make no secret of their origins.
The attribution is what makes the Foley case unusual. An AI-generated song uploaded to an official artist page and falsely attributed to a real, deceased person goes many steps beyond simply sharing AI-created music.
Synthetic music inserted directly into the legacy of long-dead musicians, without the permission of their families or labels, is an escalation of the long-running debate over AI-generated content. The fact that it happened on a platform as large as Spotify, and wasn’t caught by the streamer’s own tools, is understandably worrying.
And unlike cases where AI-generated music is presented as a tribute or an experiment, these tracks were treated as official releases. They appeared in the artists’ discographies. This latest controversy adds the troubling wrinkle of real artists being misrepresented by forgeries.
Posthumous AI artists
As for what happened on Spotify’s end, the company attributed the uploads to SoundOn, a music distributor owned by TikTok.
“The content in question violates Spotify’s deceptive content policies, which prohibit impersonation intended to mislead, such as replicating another creator’s name, image, or description, or posing as a person, brand, or organization in a deceptive manner,” Spotify said in a statement to 404 Media.
“This is not allowed. We take action against licensors and distributors who fail to police this kind of fraud, and those who commit repeated or egregious violations can be, and have been, permanently removed from Spotify.”
That the tracks were taken down is good, but the fact that they appeared at all points to a problem with catching these issues earlier. Given that Spotify processes tens of thousands of new tracks daily, the need for automation is obvious. But it also means a track’s origin may receive no scrutiny at all, as long as the technical requirements are met.
This matters not only for artistic reasons, but as a matter of ethics and economics. When generative AI can be used to produce fake songs in the names of dead musicians, and there is no immediate or foolproof mechanism to stop it, you have to wonder how artists can prove who they are and collect the credit and royalties they or their estates have earned.
Apple Music and YouTube have also struggled to filter out deepfake content. And as AI tools like Suno and Udio make it easier than ever to generate songs in seconds, complete with lyrics and vocals, the problem will only grow.
There are verification processes that could be used, as well as embedded codes and watermarks for AI-generated content. But platforms that prioritize streamlined uploads may balk at the extra time and effort involved.
AI can be a great tool for producing and improving music, but only when it’s used as a tool, not as a mask. If an AI generates a track and it’s labeled as such, fine. But if someone intentionally passes off AI tracks as part of an artist’s legacy, especially an artist who can no longer defend it, that’s fraud. It may seem like a minor corner of the AI debate, but people care about music, and what happens in this industry can have consequences for every other aspect of AI use.