On April 9, Kobalt — one of the world's biggest independent publishers — signed a licensing deal with Udio, the AI music generation platform. Udio will use Kobalt's catalog to train its models.
Udio says artists can "opt in." The press release provides no details on how opting in works, how much artists get paid, or what participation actually means.
This is Udio's fourth deal in six months. UMG settled its lawsuit and signed in October 2025. Warner followed in November. Merlin — representing thousands of indie labels — signed in January 2026. Now Kobalt.
If your music is in any of these catalogs, it may already be training an AI model. And nobody asked you.
The Lawsuits That Became Licensing Deals
Here's what makes this uncomfortable.
Udio is the same company that was sued by UMG, Warner, and Sony in 2024 for copyright infringement "on an almost unimaginable scale." All three major labels filed together. The lawsuit accused Udio of scraping copyrighted recordings to train its AI without permission or payment.
Then UMG quietly settled. Signed a licensing deal. Dropped out of the lawsuit.
Warner did the same a month later.
Sony Music is still suing. And independent artists — the ones who actually had their work scraped without permission — are still fighting Udio in separate lawsuits that Udio is aggressively trying to dismiss.
The pattern is clear: the major labels and big publishers aren't fighting AI on behalf of artists. They're using lawsuits as leverage to negotiate better terms for themselves. Once the deal is done, the lawsuit disappears.
What the Deals Actually Say (and Don't Say)
None of Udio's deals, whether with UMG, Warner, Merlin, or Kobalt, commits to:
- Individual creator consent before training on specific works
- Transparent per-artist compensation breakdowns
- Opt-out mechanisms with teeth — meaning artists can actually remove their work from training datasets
- Auditable reporting on which songs were used and how
The labels and publishers believe they control the rights and can unilaterally opt artists in. Whether that's legally sound is still being tested — but in practice, it's already happening.
What's Already Happening to Independent Artists
While the majors cut deals, independent artists are dealing with the fallout.
Murphy Campbell, an independent folk artist, had her voice cloned by AI. Unauthorized versions of her songs appeared on Spotify. Then she got hit with fraudulent copyright claims on her own recordings through Vydia's Content ID system. The claims were eventually removed — but she had to fight for her own music.
The numbers tell a broader story:
- Deezer reported receiving 60,000+ fully AI-generated tracks per day, with 85% of AI music streams being fraudulent
- Spotify removed 75 million "spammy" AI tracks in the past year
- Independent artists filing lawsuits against Udio face a well-funded legal team working to get their cases dismissed
The artists who had their work scraped to build these models are now competing against the output of those same models — on the same platforms, for the same listener attention. And the companies that profited from their catalogs are the ones signing the deals.
The Real Issue Isn't AI. It's Control.
This isn't anti-AI doomerism. AI tools can be useful — for production, for discovery, for creative experimentation. The issue is who controls the decision.
If your music sits in someone else's catalog, on someone else's platform, behind someone else's licensing terms — you have no say in how it gets used. Your publisher can license it to train AI. Your label can settle a lawsuit on your behalf and call it a win. You find out from a press release.
When your publisher, label, or aggregator can license your music to an AI company without asking you, you don't own your career — they do.
The only relationship that actually belongs to you is the one with your fans.
Why Direct-to-Fan Changes the Math
When fans buy your music from your release page, that revenue goes to you. No intermediary can redirect it to train an AI model. No algorithm can dilute it with 60,000 AI tracks a day. No settlement negotiated over your head can bargain it away.
Direct fan revenue isn't dependent on per-stream rates that get diluted every time another batch of AI-generated tracks floods the platforms. It's a transaction between you and the person who values your work. That's it.
What Independent Artists Should Do Right Now
1. Check Your Agreements
If you're with a publisher or aggregator, read the fine print on AI licensing. Do they need your consent? Most don't explicitly require it. Look for clauses about "new technologies," "future uses," or blanket licensing rights. If the language is vague, assume it doesn't protect you.
2. Build Your Email List
Your fan email list is the one asset no platform, label, or AI company can take from you. Every fan you collect through your own channels is a fan you own the relationship with. Algorithms change. Platforms add and remove features. Your email list stays.
If you haven't started, read our guide on building a fan email list from scratch.
3. Start Selling Directly
Even if you're also on streaming platforms, having a direct sales channel — release pages, merch, subscriptions — means you have revenue that isn't dependent on per-stream rates being diluted by AI-generated content. It's revenue that belongs to you because a fan chose to pay you directly.
4. Don't Panic — But Don't Be Passive
The artists who built direct fan relationships before streaming fraud became a crisis are the ones who don't lose sleep over it. They have names, emails, and a way to reach their audience that doesn't depend on any single platform's rules.
The ones who waited are the ones scrambling now.
Start building today.
The Bigger Picture
The music industry has always had middlemen who make decisions on behalf of artists. Managers. Labels. Publishers. Platforms. Every era introduces a new entity that sits between the artist and the audience, taking a cut and making choices the artist didn't sign up for.
AI just made the stakes higher. Now it's not just your royalties being split — it's your creative identity being fed into a model that can generate something that sounds like you, at scale, for free.
The best protection isn't a better contract. It's a fan base that pays you directly.
As Spotify's own Loud & Clear 2026 report showed, the streaming economy is already tilted against independent artists. AI-generated content flooding the platforms only accelerates that imbalance.