AI music is fine until it starts pretending to be real people

(Image credit: Pixabay/chienba)

AI-generated music is becoming more widespread, but not necessarily more popular. And that's just the publicly acknowledged AI music. Now, artists are finding their names and voices attached to music they never performed or approved, including artists who died decades ago.

The most recent high-profile incident involved English folk singer Emily Portman, who heard from a fan praising her new release. The album, Orca, though released under her name, was entirely fake. The whole thing had been pushed live on Spotify, iTunes, YouTube, and other major platforms without her knowledge or consent.

Portman took to social media to warn her fans about what was happening. The fact that the AI could mimic her artistic style well enough to trick some fans just added to the creep factor. It took Spotify weeks to address the problem, and the album listing is still visible on Spotify even though the music itself is gone.

Portman joins a growing list of acts whose work has been mimicked by AI without approval, from pop artist Josh Kaufman to country artists Blaze Foley, who died in 1989, and Guy Clark, who died in 2016.

It seems we’ve moved past the novelty of AI remixes and deepfake duets into digital identity theft with a beat. The thieves tend to keep their releases quiet, collecting whatever royalties trickle in before anyone notices.

Worse, even getting the music taken down might not be enough. A few days after the initial incident, Portman found another album had popped up on her streaming page. This time, it was just nonsense instrumentals, with no effort to sound like her at all.

AI's future sounds

Scammers using AI to steal from actual artists is obviously a travesty. There are blurrier middle grounds, though, like AI acts passed off as human musicians. That's where the AI-generated “band” Velvet Sundown stands.


The creators eventually admitted the band was AI-generated, but only after millions of plays from a Spotify profile showing slightly uncanny images of bandmates that didn’t exist. Because the music was original and not directly ripped from other songs, it wasn’t a technical violation of any copyright laws. The band didn’t exist, but the royalties sure did.

I think AI has a place in music. I really like how it can help the average person, regardless of technical or musical skills, produce a song. And AI tools are making it easier than ever to generate music in the style of someone else. But, with streaming platforms facing 99,000 uploads a day, most of which are pushed through third-party distributors that rely on user-submitted metadata, it’s not hard to slip something fake into a real artist’s profile. Unless someone notices and complains, it just sits there, posing as the real thing.

Many fans are tricked, with some believing Orca was really Emily Portman’s new album. Others streamed Velvet Sundown, thinking they’d stumbled onto the next Fleetwood Mac. And while there's nothing wrong with liking an AI song per se, there's everything wrong with not knowing it is an AI song. Consent and context are missing, and that fundamentally changes the listening experience.

Now, some people argue this is just the new normal. And sure, AI can help struggling artists find new inspiration, fill in missing instrumentation, suggest chord progressions, and provide other aid. But that’s not what’s happening here. These are not tools being used by artists. These are thieves.

Worse still, this undermines the entire concept of artistic ownership. If you can make a fake Emily Portman album, any artist is at risk. The only thing keeping these scammers from doing the same to the likes of Taylor Swift right now is the threat of getting caught by high-profile legal teams. So instead, they aim lower. Lesser-known artists don’t have the same protections, which makes them easier targets. And more profitable, in the long run, because there’s less scrutiny.

And there's the issue of how we as music fans are complicit. If we start valuing convenience and novelty over authenticity, we’ll get more AI sludge and fewer real albums. The danger isn’t just that AI can mimic artists. We also have to worry that people will stop noticing, or caring, when it does.


Eric Hal Schwartz is a freelance writer for TechRadar with more than 15 years of experience covering the intersection of the world and technology. For the last five years, he served as head writer for Voicebot.ai and was on the leading edge of reporting on generative AI and large language models. He's since become an expert on the products of generative AI models, such as OpenAI’s ChatGPT, Anthropic’s Claude, Google Gemini, and every other synthetic media tool. His experience runs the gamut of media, including print, digital, broadcast, and live events. Now, he's continuing to tell the stories people want and need to hear about the rapidly evolving AI space and its impact on their lives. Eric is based in New York City.
