AI Music Deals: Big Content's Power Play Leaves Artists Behind

In what appears to be a dramatic reversal, Universal Music Group (UMG) has announced a partnership with AI music startup Udio - the same company it was suing just months earlier for alleged copyright infringement. This development highlights a troubling pattern emerging across the creative industries as media conglomerates strike deals with technology firms while working artists bear the consequences.

The Lawsuit Turnaround

Last year, Universal Music Group joined forces with major labels including Warner Records and Sony Music Entertainment to sue two AI music startups, accusing them of using copyrighted recordings to train text-to-music models without permission. The legal action positioned the music industry as defenders of artistic rights against technological encroachment.

However, the narrative shifted abruptly when UMG revealed its new partnership with defendant Udio to create an AI music platform. Their joint press release promised to "do what's right by [UMG's] artists", but the Music Artists Coalition responded sceptically, noting: "We've seen this before – everyone talks about 'partnership', but artists end up on the sidelines with scraps."

Creative Workers Pay the Price

As courts across the United States grapple with dozens of similar lawsuits, the fundamental question remains whether using creative work to train AI constitutes copyright infringement. In Andersen v Stability AI, one of the first class-action lawsuits concerning AI image generators, artists argue that using their work without credit, compensation or consent "violates the rights of millions of artists".

The impact on creative professionals is already measurable. In January 2024, a Society of Authors survey found that more than a third of illustrators had lost income due to AI competition. One study projects that audiovisual creators could face a 21% revenue loss by 2028 as generative AI increasingly displaces human creative labour.

Big Content's Trojan Horses

In response to these threats, new coalitions have emerged uniting entertainment executives and artists against the tech industry. The Human Artistry Campaign, founded on the principle that "AI can never replace human expression and artistry", brings together creatives and executives to endorse protective legislation.

However, copyright lawyer Dave Hansen, executive director of the non-profit Authors Alliance, warns that these legal battles may ultimately benefit large corporations rather than individual artists. He suggests that copyright lawsuits will likely lead to exclusive licensing deals between media and tech giants while "everybody else gets sort of left out in the cold".

Historical precedent supports this concern. When streaming services emerged, labels and studios secured lucrative deals while musicians, writers and actors saw diminished returns. The pattern appears to be repeating with AI. When Runway and Lionsgate struck a licensing agreement, United Talent Agency CEO Jeremy Zimmer questioned whether artists would share in the benefits: "If I'm an artist and I've made a Lionsgate movie, now suddenly that Lionsgate movie is going to be used to help build out an LLM for an AI company, am I going to be compensated for that?"

In several multimillion-dollar publishing deals with AI companies, authors received neither compensation nor the option to exclude their work from training datasets.

Flawed Solutions and Organised Labour

Even if US courts rule that tech companies must pay for training data, working artists are unlikely to see meaningful benefits. Under current power imbalances, media companies could pressure artists to sign away training rights as a condition of employment - a scenario already unfolding as voice actors face contracts demanding such concessions.

Proposed legislative solutions also show concerning limitations. The NO FAKES Act, supported by major entertainment coalitions, would create federal rights to regulate AI deepfakes. However, civil liberties groups including the Center for Democracy and Technology and the American Civil Liberties Union have criticised the bill for vague language, weak free speech protections and potential for abuse. The legislation would allow individuals, including children, to license digital replica rights for up to ten years, creating opportunities for exploitation.

These developments point toward a more effective strategy that entertainment executives genuinely fear: organised labour. Unionised creative workers in the Writers Guild and Screen Actors Guild–American Federation of Television and Radio Artists have secured meaningful AI protections through strikes and collective bargaining - demonstrating that traditional labour organising may prove more effective than copyright law in protecting artists' interests.

As Alexander Avila concludes, if big content truly cared about protecting artists from AI, it would stop trying to sell their voices as training data and start listening to them.