
As artists grow more concerned about their work being used without permission, a new wave of consent-focused, or "rights-first," AI tools is emerging. At its core, ethical AI in music means building tools and systems that respect artists' control over their work, with features like transparent attribution, fair compensation, and clear ownership built in. The recent BeatStars acquisition of Lemonaide, which will integrate generative music tools into a creator marketplace already built around licensing and revenue sharing, is a step toward ensuring that artists whose work trains the models continue to receive credit and revenue. The deal signals a turning point for the music world: rights-first AI is becoming a viable business model.
To understand the implications of the change, we spoke with Riley Knapp, President of consultancy Riley Knapp Inc. and former Head of Growth for audio production tool Hit'n'Mix. A multi-platinum songwriter and product strategist with a novel perspective on music, technology, and the way they overlap, Knapp frames ethical AI as the better way to do business while making art today. "Consent-based AI creates the landscape of trust among musicians and creators," he says.
- An artist's anxiety: Even as rights-first platforms take shape, some creators remain skeptical, and without that trust, the foundation of these platforms will be shaky. For creators, AI raises not just questions but the real prospect of threatened livelihoods, a concern that is already prompting some platforms to take action against purely AI-generated music. “I'm watching my royalty statements dwindle, and in all these big lawsuits, where are the independent musicians? I don't have any protections for the songs I've released without a label," Knapp explains. "It's terrifying because we just don't know what's going to happen.”
So why is the ethical AI debate bubbling up right now? Because the technology itself has drastically improved. After years of lagging behind, ethical AI models from companies like BeatStars, LANDR, and Splice are demonstrating that they can compete on quality. But Knapp isn’t a blind optimist, noting that perfect attribution, still a thorny problem in generative AI and copyright, remains a holy grail yet to be fully realized.
- A common misconception: Knapp finds AI useful in producing music and notes that the way he uses it is more nuanced than some may imagine. “The biggest misconception is that AI music is just people creating slop by typing in a text prompt, outputting a thousand songs a day, and uploading them to streaming services to dilute the pro rata royalty pool while profiting off of human creators.”
- The dashboard dream: The hardest technical challenge for platforms right now is attribution. "The question is," Knapp asks, "how will we know who is using what and how much of it they're using? Will I have a dashboard that transparently shows when a major artist uses an AI trained on my work and precisely how much I contributed to their track?"
Despite these open questions, the economic upside of AI integration looks promising. Knapp's optimism is grounded in his own experience with industry-changing platforms: "As a writer and producer for over a decade, I saw how streaming changed my life by helping create the music industry's first-ever middle class. Ethical AI has the same opportunity to create an even wider middle class and generate income for people who would never have had that chance before.”
- Credit where it's due: In his view, AI gives artists who aren't properly given their flowers a chance to profit. "Session musicians are often not properly credited. You can play drums on a hit song and no one knows it's you," he says. "The idea of creating an AI model of myself that captures my own unique intricacies for anyone to use is incredibly powerful. It's a way to finally get paid and get proper attribution for my work."
- Scaling the masters: "I know of mastering engineers who've created AI versions of themselves, like Colin Leonard who mastered for Bieber. He created his own model so an independent artist who can’t afford his rates can get a quick turnaround and hear his process on their work. It gives people access. I think that's a really cool thing."
AI is already finding its way into the studio, so the central question moving forward isn't whether it will be used, but how it will be governed. With generative tools embedded directly into creator marketplaces like BeatStars and into production workflows, attribution becomes less an option than an imperative.
- Make the shift work for you: As this AI shift inevitably takes place, he calls on artists to explore how they want to participate and find agency where they can. "Your work is going to be trained on whether you like it or not, so you might as well get paid and have it attributed to you," he says. "As icky as it sounds, that's the reality."
If that participation can be structured correctly, Knapp believes the upside for musicians is big. Wider access to opportunities in the industry began with streaming, and he sees a similar possibility emerging here for both chart-topping artists and specialists who have historically gone uncredited. “What if I wanted Dave Grohl to play drums on my record and he had his own AI model? Suddenly, I could have a legend on my track for a small subscription fee.” In that scenario, established musicians scale their expertise, and emerging artists gain access to tools that once felt unreachable.
