RIAA Lawsuit Against Generative Music Startups Will Be the Bloodbath AI Needs

Like many AI companies, Udio and Suno relied on large-scale theft to build their generative AI models. They have admitted as much, even before the music industry's new lawsuits against them come before a judge. If it reaches a jury, the trial could be both a damaging exposé and a highly useful precedent for similarly unethical AI companies facing certain legal peril.

The lawsuits were filed Monday with great fanfare by the Recording Industry Association of America, putting us all in the uncomfortable position of rooting for the RIAA, which for decades has been the bogeyman of digital media. I myself have received nasty messages from them! The case is simply that clear.

The essence of the two lawsuits, which are nearly identical in content, is that Suno and Udio (strictly speaking, Uncharted Labs doing business as Udio) indiscriminately plundered more or less the entire history of recorded music to build training datasets, which they then used to train music-generating AIs.

And here let's quickly note that these AIs do not so much "generate" as match the user's prompt to patterns in their training data and then attempt to complete that pattern. In a sense, all these models do is produce covers or mashups of the songs they ingested.

That Suno and Udio ingested such data is, for all intents and purposes (including legal ones), indisputably true. The companies' leaders and investors have been recklessly cavalier about the copyright challenges inherent to the space.

They have admitted that the only way to create a good music-generation model is to ingest a large amount of high-quality music, much of which will be copyrighted. It is simply a necessary step in creating machine learning models of this type.

They have also admitted that they did so without the copyright owners' permission. As one investor told Rolling Stone just a few months ago:

Honestly, if we had deals with labels when this company started, I probably wouldn’t have invested in it. I think they needed to make this product without limitations.

Tell me you stole a century of music without telling me you stole a century of music. Got it. To be clear, by "limitations" they mean copyright law.

Finally, the companies have told the RIAA's lawyers that they believe ingesting all this media falls under the fair use doctrine, which, crucially, only comes into play when a work is used without authorization. Now, fair use is admittedly a complex and fuzzy concept, in both idea and execution. But a company with $100 million in its pockets stealing every song ever made so it can replicate them in bulk and sell the results: I'm no lawyer, but that seems a fair distance from the intended safe harbor of, say, a seventh grader using a Pearl Jam song as the background for their video about global warming.

Frankly, these companies' goose appears to be cooked. They clearly hoped they could take a page from OpenAI's playbook: secretly use copyrighted works, then deploy evasive language and misdirection to stall their less deep-pocketed critics, such as authors and journalists. If, by the time the AI companies' shenanigans are revealed, they are the only option for distribution, it no longer matters.

In other words: deny, deflect, delay. Ideally you drag things out until the tables turn and you can strike deals with your critics (for LLMs, that means media outlets and the like; in this case it would be the record labels, whom the music generators clearly hoped to eventually approach from a position of power). "Sure, we stole your stuff, but it's big business now; wouldn't you rather play with us than against us?" It's a common strategy in Silicon Valley, and a winning one, since mainly it just costs money.

But that is harder to pull off when your opponents hold irrefutable proof. And unfortunately for Udio and Suno, the RIAA included a few thousand smoking guns in the lawsuit: songs it owns that are clearly being regurgitated by the music models. Whether it's the Jackson 5 or Maroon 5, the "generated" songs are simply slightly distorted versions of the originals, something that would be impossible if the originals were not included in the training data.

The nature of LLMs (specifically, their tendency to hallucinate and lose the plot the longer they write) precludes the regurgitation of, say, entire books. This likely undercuts the authors' lawsuit against OpenAI, since the latter can plausibly claim that the snippets its model quotes were grabbed from reviews, first pages posted online and so on. (The latest goalpost move is that it did use copyrighted works early on but has since stopped, which is a bit like saying you only squeezed the orange once but don't anymore.)

What you cannot plausibly claim is that your music generator only heard a few bars of "Great Balls of Fire" and somehow managed to spit out the rest word for word and chord for chord. Any judge or jury would laugh in your face, and with luck a courtroom artist will get the chance to illustrate it.

This is not only intuitively obvious but legally consequential as well, since it is clear that the models are recreating entire works (badly at times, to be sure, but complete songs). This lets the RIAA claim that Udio and Suno are doing real and substantial harm to the business of the copyright holders and the artists being regurgitated, which in turn lets it ask the judge to shut down the AI companies' entire operations at the outset of the trial with an injunction.

Do the opening paragraphs of your book sound like they came from an LLM? That's an intellectually interesting question to be litigated at length. A dollar-store "Call Me Maybe" generated on demand? Shut it down. I'm not saying it's right, but it's likely.

The companies' predictable response has been that the system is not intended to replicate copyrighted works: a desperate, naked attempt to offload liability onto users under the Section 230 safe harbor. That is, the same way Instagram isn't liable if you use a copyrighted song to back your Reel. Here, the argument seems unlikely to gain traction, partly because of the aforementioned admissions that the companies themselves ignored copyright from the start.

What will be the consequence of these lawsuits? As with everything related to AI, it is quite impossible to say in advance, since there is little in the way of precedent or established, applicable doctrine.

My prediction, also made without any real legal expertise, is that the companies will be forced to expose their training data and methods, both of which have clear evidentiary interest. Seeing those, and their obvious misuse of copyrighted material, along with (likely) communications indicating they knew they were breaking the law, will probably precipitate attempts to settle or avoid trial, and/or a speedy judgment against Udio and Suno. They will also be forced to halt any operations that rely on the theft-based models. At least one of the two will try to carry on using legal (or at least legally adjacent) sources of music, but the resulting model will be such a huge drop in quality that users will flee.

Investors? Ideally, they would lose their shirts, having bet on something that was obviously and demonstrably illegal and unethical, and not just in the eyes of nebbishy authors' associations but in the legal minds of the infamously and ruthlessly litigious RIAA. Whether the damages amount to the cash on hand or the promised funding is anyone's guess.

The consequences could be far-reaching: if investors in a hot new generative media startup suddenly see a hundred million dollars vaporized by the fundamental nature of generative media, a different level of diligence will suddenly seem appropriate. Companies will learn from the trial (if there is one) or the settlement documents and so on what could have been said, or perhaps more importantly what should not have been said, to avoid liability and keep copyright holders guessing.

Although this particular lawsuit seems almost a foregone conclusion, not all AI companies leave their fingerprints around the crime scene quite so liberally. It will be less a playbook for suing or forcing settlements from other generative AI companies than an object lesson in hubris. It's good to have one of those every once in a while, even if the teacher is the RIAA.