
The debate over AI in music has been framed as a moral crusade of good versus evil. Yet, as the industry and fans rally against what they perceive as a rising tide of inauthentic content, the blame is being placed on the wrong shoulders. Instead of holding AI companies and platforms accountable for their systemic failures, the music industry is increasingly punishing ordinary creators and independent musicians.
We spoke with Davis Natzle, an AI Business Analyst at Symphonic Distribution and an artist himself. He argued that the entire framework of the AI debate is flawed, and that the industry is repeating its worst mistakes, with artists paying the price.
Natzle highlighted that the core issue with generative AI in music stems from an original sin in the business strategy of AI developers: the decision not to license any of the content that was critical to training and refining their generative AI models. "I wouldn't characterize AI companies as evil," Natzle said. "But I would characterize them as lazy. The smart thing would have been to solve the licensing issue first. These AI developers are posing as genius disruptors and innovators but they are totally ignorant of copyright, which is fundamental to the music economy. The result is you see multi-billion dollar businesses that rely on industrial-scale intellectual property theft. I wouldn’t call that innovation. You have to build licensing into your infrastructure, or else AI won't benefit the music industry and you’re just selling an illegal product."
At the same time, Natzle argued, the industry's fear of AI is nothing new. A "guilty until proven innocent" framework is already a daily reality for artists navigating the opaque world of digital streaming platforms, and the same mechanisms that wrongly punish artists for alleged streaming fraud are primed to be turned against those accused of using AI. He cited the high-profile case of musician Benn Jordan, who was accused of artificial streaming by Spotify and was only able to rectify the situation through a personal connection to the CEO of his distributor. For the average artist or fan without those connections, there is no recourse.
- The human cost: Natzle fears that this moralizing narrative will justify crushing independent artists who get caught in the middle. "I'm scared of what happens to the normal creative people who are going to get ground up in the machine," Natzle said. "This whole moralization, trumpeted by the industry, is going to be the justification to just run them over. And nobody's going to build any kind of system to give them a chance to appeal it."
- Napster déjà vu: In the early days of music downloads and streaming, the industry over-corrected to address fears of copyright infringement. Instead of improving the systems in place to protect artists and share music, it took out its ire on the average person. "This hearkens back to the Limewire, Frostwire, and Napster days, when the major labels ended up suing grandmas for digital piracy," he explained. "Now, if grandma makes a song with Suno and puts it on Spotify, she might end up being sued for copyright infringement while the AI companies are largely left unscathed. Meanwhile, any independent artist can be accused of using generative AI and be de-platformed, de-monetized, or publicly shamed without proof."
Licensing has long been an issue for the industry. Layer in AI, and the industry is scrambling to build an effective system. Natzle offered a blueprint for a solution, arguing that a fair and scalable model for mass licensing has existed for decades. He contended that the challenges posed by AI are not entirely new, but rather an amplification of an existing, unaddressed flaw in the music industry's approach to new technologies and creative reuse, such as its failure to adapt to sampling.
- The missed opportunity: The industry's failure to create a statutory license for samples, as it did for cover songs, has led to decades of legal issues and missed revenue opportunities for artists and rights holders alike. "Look at what happens with a lot of new hip-hop and rap artists. They often enter the scene with uncleared sample problems because to use a sample, you need permission," Natzle noted. "As a result, these newer artists are totally beholden to the original copyright owners on the sample because there's no statutory license for the recording, only for the song. We still need to fix sampling."
- The model that works: Natzle pointed to the compulsory mechanical license for cover songs as a proven, effective model that allows for widespread creative use while ensuring fair compensation to the original songwriter, without requiring their explicit permission. He provided an example, referencing the commonly covered song "Hallelujah" by Leonard Cohen. "If someone releases a new version of 'Hallelujah' without permission, royalties are paid to Leonard Cohen's estate regardless, because of what's called a compulsory mechanical license, which is a statutory license under copyright law," he explained. "That's how we fixed copyright for cover songs. New versions of the song can be created without permission, but the original creator still gets paid a fair rate."
By applying the statutory licensing model of cover songs to AI, Natzle argued the narrative can be flipped. AI would cease to be a threat that drains revenue and would instead become a massive new customer for the music industry. With a proper licensing framework, AI becomes a tool that grows the economic pie for everyone. He detailed a two-tiered system: AI companies would pay a license for training their models on existing copyrighted works, and then pay recurring fees for the ongoing generation of music. This approach, he explained, would legitimize the commercial use of AI-generated music by end-users, transforming AI companies into significant, new revenue streams for artists and rights holders.
- Growing the pie: Natzle believes proper licensing would transform AI companies into significant revenue generators for the music industry, expanding the overall market and pushing more of the money to creators and rights holders. "These licensing fees should not be cheap," he said. "The AI companies have to take the risk that the revenue they generate from users may not be enough to cover their licensing fees. The users have to take the risk that the royalties they receive from streaming their generated tracks may not be enough to cover the fees they are paying to the AI companies. With this influx of money to existing rights holders, you instantly flip the narrative of AI as a drain on the industry to a gain for the industry."
- Fairness in, fairness out: Natzle's blueprint includes weighted compensation for artists whose styles are explicitly invoked by AI users, ensuring fair recognition and additional royalties when their creative identity is referenced. "If somebody says by name, 'generate a rock song that sounds like Metallica,' that should result in a higher payout. That just makes sense."
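The two-tiered structure Natzle describes, an up-front training license plus recurring per-generation fees that are weighted upward when a user names an artist's style, can be sketched as a simple royalty calculation. All rates, multipliers, and names below are hypothetical illustrations, not figures from the interview.

```python
# Hypothetical sketch of the two-tiered licensing model described above.
# Every rate and multiplier here is an illustrative assumption.

TRAINING_LICENSE_FEE = 1_000_000.00   # tier 1: one-time fee to train on a catalog
PER_GENERATION_FEE = 0.05             # tier 2: recurring fee per generated track
STYLE_INVOCATION_MULTIPLIER = 3.0     # premium when an artist is named in the prompt

def generation_royalty(prompt: str, catalog_artists: list[str]) -> float:
    """Royalty owed to rights holders for one generated track."""
    fee = PER_GENERATION_FEE
    # "Fairness in, fairness out": explicitly invoking an artist's
    # style triggers a weighted, higher payout to that artist.
    if any(artist.lower() in prompt.lower() for artist in catalog_artists):
        fee *= STYLE_INVOCATION_MULTIPLIER
    return round(fee, 2)

generic = generation_royalty("an upbeat rock song", ["Metallica"])
named = generation_royalty("a rock song that sounds like Metallica", ["Metallica"])
print(generic, named)  # 0.05 0.15
```

Under this sketch, the AI company bears the risk that user revenue may not cover the training fee, while the per-generation stream flows back to rights holders, which is the mechanism that "flips the narrative" in Natzle's telling.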
Ultimately, Natzle advocates for a music ecosystem that embraces creativity, even when it's unconventional, while upholding the rights of artists. His vision is one where the industry allows for an open yet accountable environment. "To me, it's okay that AI use is a little open and messy," he said. "But we do have to honor the copyright."