Federal prosecutors have charged a North Carolina ‘musician’ with orchestrating an elaborate $10 million fraud scheme using AI-generated music.
Michael Smith, 52, was arrested Wednesday on charges of wire fraud, wire fraud conspiracy, and money laundering conspiracy.
Prosecutors allege that Smith used AI technology to create hundreds of thousands of fake songs by nonexistent bands, then employed bots to stream these tracks billions of times on popular platforms like Spotify, Apple Music, and Amazon Music.
“Through his brazen fraud scheme, Smith stole millions in royalties that should have been paid to musicians, songwriters, and other rights holders whose songs were legitimately streamed,” said US Attorney Damian Williams.
According to the unsealed indictment (a formal legal document made public after initially being kept confidential), Smith’s operation lasted seven years and involved creating thousands of fake streaming accounts using purchased email addresses.
Smith even allegedly developed software to play his AI-generated music on repeat from numerous computers, mimicking individual listeners in different locations.
To avoid detection, he reportedly spread the fake streaming activity across his vast catalog of fake songs, carefully generating unique names for the AI-created artists and tracks.
Some of these quirky, absurdist monikers included band names like “Callous Post” and “Calorie Screams,” and song titles such as “Zygotic Washstands” and “Zymotechnical.” I wonder what prompt he used to generate those.
The scheme proved exceptionally lucrative. In an email sent earlier this year, Smith boasted of reaching 4 billion streams and $12 million in royalties since 2019.
Prosecutors claim that by June 2019, Smith was earning about $110,000 monthly, sharing a portion with unnamed co-conspirators.
From an AI perspective, it’s unclear exactly how these songs were generated back in 2019, when far fewer high-quality music-generation tools existed than do now. Today, tools like Udio and Suno would probably make such a scam even easier to execute.
It’s worth noting that botting schemes have plagued streaming platforms for years, with artists, labels, and fraudsters all attempting to game the system.
Spotify, Apple Music, and other platforms have long fought fake streams, using AI to detect and block bot activity.
AI-generated music is rife on Spotify, and you’d think the platform would now start paying closer attention to its origins and intent.
What makes Smith’s case noteworthy, however, is the combination of large-scale botting with AI-generated content.
It was smart. But the long arm of the law ultimately caught up with him.
The music and AI industries are at loggerheads
AI and the creative industries have largely mixed like oil and water. While they don’t blend naturally, their combination has whipped up a volatile concoction full of both potential and risk.
Just months ago, the world’s three largest record labels filed federal lawsuits against text-to-audio platforms Suno and Udio, alleging “mass infringement of copyrighted sound recordings.”
The Recording Industry Association of America (RIAA), representing Universal Music Group, Sony Music Entertainment, and Warner Records, claims there is strong evidence that Suno and Udio used copyrighted music without permission to train their AI models.
Similarly, in April 2024, over 200 prominent artists, including Billie Eilish, Nicki Minaj, and Jon Bon Jovi, vowed to combat AI music.
AI’s integration into music isn’t universally viewed as a threat. Some see it as a democratizing force, letting producers experiment with different formats free from the gatekeeping of traditional record-signing processes.
How will genuinely talented musicians who struggle to monetize their work on streaming platforms feel about this?
Using AI to game the algorithms of platforms often criticized for underpaying artists might look like a victimless crime to some. However, dishonest manipulation of streaming numbers has far-reaching consequences.
While many artists feel shortchanged by streaming platforms, fraudulent practices like those alleged in the Smith case likely harm the entire music ecosystem.
They can dilute legitimate streams, skew discovery algorithms, undermine trust, and possibly make it harder for honest artists to succeed.
It’s yet another frontier on which artists and platforms will have to fight to ensure a fair, transparent ecosystem. Artists risk falling behind.