Sarah Chen is a digital artist. For 15 years, she's built her craft—thousands of hours of work. Her style is distinctive. Her technical skill is undeniable. Last month, she discovered her art was used to train Stable Diffusion without her permission or compensation.

She's not alone. Artists, authors, and photographers have filed lawsuits against OpenAI, Google, Meta, and Microsoft. Getty Images sued Stability AI. The Authors Guild sued OpenAI. Individual artists are suing in coordinated class actions. Their shared claim: these companies trained AI models on copyrighted work without permission or compensation.

This is the defining legal battle of AI.

The Legal Battle

The lawsuits argue that using copyrighted work for AI training violates copyright law. Copying works into a training dataset is reproduction, and reproduction is a right the law reserves to the copyright holder. You can't simply take someone's work and use it for your own purposes without permission.

But the defendants argue differently: they claim fair use applies. Fair use is a legal doctrine that allows certain uses of copyrighted material without permission, for purposes such as commentary, criticism, parody, teaching, and research. Courts weigh four factors, including the purpose of the use, the amount of the work taken, and the effect on the market for the original. The AI companies argue that using training data to build models is "transformative": the model learns statistical patterns rather than storing copies, so the use falls under fair use.

This is the crux of the dispute. Is training an AI model a fair use of copyrighted material?

The Stakes

If courts rule against AI companies, it could reshape the industry. Training costs would spike dramatically. Companies would need to license datasets or pay artists directly for the use of their work.

If courts rule for AI companies, artists lose their main legal recourse. Their work becomes raw material for AI training. Every time an artist's work appears in a training dataset, the artist gets nothing.

The Precedent Matters

These cases will define whether AI development happens with artist consent or without it. The outcome will shape whether AI companies thrive by compensating creators or exploiting them.

This will also set precedent for other domains. If the rule is "fair use for training," then all copyrighted material becomes fair game for AI training. News articles, books, code, everything.

The Historical Parallel

This isn't the first time technology has disrupted creative work. Photography threatened painters. Recording technology threatened live musicians. The internet threatened publishers. But in each case, new rules emerged. A photographer can't simply reproduce a copyrighted painting without permission. Composers and performers earn royalties from recordings.

The question is: what new rules will emerge for AI?