
AI soundalikes are everywhere. A decade-old lawsuit gave artists a way to fight back, but with no united front, big tech is scraping music unchecked.
Ryan Schmidt, Esq., is an entertainment attorney, music business educator, and co-founder of the music-ed tech app Foundation. For him, the blueprint for today’s AI copyright battles was laid by a courtroom clash over creative ownership more than a decade ago.
A fractured frontline: “What wasn’t okay pre-AI is still not okay,” Schmidt says. “Just like it was illegal to burn a CD or download an MP3, it’s illegal to do those same types of actions in the name of AI.”
But that legal clarity hasn’t translated into protection for most artists. Schmidt argues that gap is being exploited by big tech, thanks to the industry’s decentralized nature. Unlike film and television—with their powerful, unified unions—music is fragmented, leaving independent artists shut out of the “consent, credit, and compensation conversation” while their work is scraped to train the very models now under fire.
Blurred legal lines: A decade-old controversy still echoes through today’s music debates. “Everyone points to sampling, but everything really changed 10 years ago with the ‘Blurred Lines’ case,” Schmidt says. “We finally put a legal lens on music creation, and just because something was an accepted practice didn’t mean it was legal.” The case became a cultural turning point. Sensitivity around infringement has only grown since, creating an environment where AI-generated music faces scrutiny not just for what it copies, but for how it imitates.

Court over Congress: When it comes to regulation, Schmidt is betting on the courts, not a legislative overhaul. “I don’t think we can expect a complete overhaul of the U.S. Copyright Act,” he says. “It’s more likely going to come through the courts, with judges finding the Copyright Office’s guidance very persuasive.” To protect those independent artists, he also champions the NO FAKES Act, which would grant federal protection against unauthorized AI voice clones and deepfakes.
Poisoning the well: Alongside the legal fight, a technological arms race is unfolding. Schmidt points to emerging data “poisoning” tools that give artists a defensive edge by corrupting what a model can learn from a track. “When an AI model ingests it, the file is completely destroyed,” he explains. “It can’t hear the song it’s trying to train on.”
The global loophole: Schmidt offers a sobering reality check: even with robust U.S. protections, the fight is global. “No matter what the U.S. copyright law says, it’s unavoidable,” he warns, pointing to companies in territories with looser IP laws that can scrape and train with impunity. It’s a loophole domestic policy can’t close—and one that turns copyright compliance into a competitive disadvantage.
But as a global leader in tech innovation, the U.S. still has the chance to help set the tone for how AI and human creativity can ethically and legally coexist. “We’re not powerless—there’s still time to get this right,” Schmidt says. “But only if we stop pretending the rules no longer apply.”