
Despite a flurry of high-profile lawsuits pitting major studios against artificial intelligence firms, a different story is unfolding behind the scenes. While legal teams publicly battle over intellectual property, many of the same studios are quietly and urgently integrating AI into their workflows. The contradiction reveals a deeper conflict, one less about creative protection and more about raw power plays in an industry defined by secrecy. The public posturing masks a frantic, private scramble to adopt the very technology being villainized, often in dark corners and under strict non-disclosure agreements.
We interviewed Audrey Henson, a Marketing Strategist and Podcast Producer at KoobrikLabs, a firm that consults with studios on AI innovation. Drawing on her work producing the Technically Creative podcast, Henson argued that copyright in its current form rarely benefits creatives. Instead, it props up studio power while leaving individual artists with little more than symbolic rights.
She contended that the current wave of lawsuits, in which massive corporations like Disney and Universal target smaller AI firms such as Midjourney, looks less like a fight for artistic integrity than a battle between entrenched power players.
- It can't wait: "Who does copyright really benefit these days? It's never the artist," she said. "Provenance is the only reason a Vincent van Gogh picture is worth anything, and if provenance is important to a creator, that should be their right. But as a system, it’s broken. To rewrite copyright and IP with AI is going to take so long, and in the meantime, we need protections in place. That can't wait."
- A new kind of leader: Henson insisted that the first step toward a functional future was a human one. "Dedicated AI leadership is now crucial. Someone in an operational position who can be a communicator between the tech and the people. Your head of tech should not be dealing with AI on top of firewalls and infrastructure. You need a separate person innovating on behalf of the company who is asking, 'Have we thought about this? Can we do something here? What are our ethics and our codes?'"
Henson explained that when studios resist AI, it isn’t about defending artists; it's about protecting antiquated business models. She argued this resistance is rooted in a desire to maintain control through intentionally siloed systems.
- Power struggle: "Many studios feel overwhelmed by AI and don't know where to begin. Hollywood's legacy systems make disruption especially difficult, and information silos have historically kept power concentrated," explained Henson. "AI stands to democratize some of that power, which can be unnerving to those who have traditionally held it."
This systemic dysfunction makes the need for her proposed "AI communicator" all the more urgent. Henson described this role not as a traditional IT manager, but as a translator and strategist—someone who can bridge the cultural and technical divide.
- The translator role: "You need somebody willing to experiment, to just throw things at it and see what’s possible, because everyone else waits for someone to declare the industry standard," she said. "You don’t necessarily have to know how to code, you just need to understand the philosophical side of AI."
This philosophical divide is at the heart of the conflict, which Henson framed as an irreconcilable culture clash between Silicon Valley’s ethos of permissionless innovation and Hollywood’s entrenched, permission-based world.
- Two warring cultures: "In Silicon Valley, they say, 'Move fast and break things.' And in Hollywood, it's always, '...but just ask Legal first' as the tag-on," Henson noted. "Hollywood’s reaction is essentially, ‘They’re not asking for permission, they’re not going through the system.’ And yes, that’s exactly what makes it disruptive."
This gridlock has real-world consequences. Henson pointed to the 2023 Writers Guild strike as a prime example of the industry’s inability to engage in meaningful dialogue, an experience that cost her her own job at the time.
- The cost of silence: "They waited until month six or seven before they started bringing in professional AI negotiation teams to discuss the AI issues," she recalled. "And then you look at what the writers got out of it. It's just literally the permission to kick up sand, a legal nothing. The industry bled so much over AI in 2023, and since then, nobody has any grasp on what anybody's doing."
For Henson, the lawsuits, the secrecy, and the stalled negotiations all point to the same truth: Hollywood isn’t fighting to protect creativity, it’s fighting to protect control. And in clinging to that control, the industry risks mistaking resistance for protection, when in reality, resistance to AI has become resistance to growth. "I think if you're anti-AI as a company, you're anti-growth at this point. You're not looking out for your own best interest if you're pushing against AI too hard."