Concluded · IP

Kadrey v. Meta Platforms

U.S. District Court, Northern District of California · United States · 2025-06-25 · 3:23-cv-03417-VC

Authors Richard Kadrey and Christopher Golden sued Meta for training LLaMA on pirated books from LibGen. Judge Vince Chhabria ruled that AI training constitutes fair use regardless of whether the training data was lawfully obtained — directly contradicting the Bartz v. Anthropic piracy distinction.

Holding

Judge Chhabria held that Meta's use of copyrighted books to train its LLaMA large language models is transformative fair use. Critically, the court found that the source legality of training data is irrelevant to the fair use analysis — the transformation inherent in AI training applies equally whether the books were purchased or pirated.

Arguments For / Positive Implications

  • Provides the strongest judicial endorsement of AI training as transformative fair use
  • Simplifies the legal framework by focusing on the nature of the use rather than data provenance
  • Follows the Second Circuit's Authors Guild v. Google precedent closely
  • Gives AI companies clarity that transformative use is the dispositive factor

Arguments Against / Concerns

  • Directly contradicts Bartz v. Anthropic's piracy distinction, creating a split within the same district
  • Effectively eliminates any incentive for AI companies to license training data rather than scrape it
  • Leaves authors with no remedy even when their works are obtained through piracy
  • The 'source legality is irrelevant' principle could have troubling implications beyond AI

Our Takes

Lawra (The Moderate)
This ruling is legally rigorous but creates a real problem. Judge Chhabria's fair use analysis is textbook — AI training genuinely is transformative, and the Supreme Court's framework doesn't include a 'clean hands' requirement. But by declaring source legality irrelevant, the ruling removes the only practical incentive for AI companies to pay for training data. We now have two federal judges looking at the same question and reaching opposite conclusions. The Supreme Court will need to sort this out.
Lawrena (The Skeptic)
This is the most dangerous AI ruling to date. A federal judge just told the world that tech companies can pirate millions of books, feed them to their AI, and call it 'transformative fair use.' Source legality doesn't matter? Tell that to the authors whose books were stolen from LibGen. If training on pirated data is fair use, what exactly is piracy anymore? This ruling guts copyright protection in the AI era.
Lawrelai (The Enthusiast)
Judge Chhabria followed the law where it led. Fair use is about the nature and purpose of the new use, not the chain of custody of the input. Google scanned millions of library books without permission and the Second Circuit blessed it, with the Supreme Court declining to review. Meta trained on books without permission and the same logic applies. Is the outcome uncomfortable? Maybe. But the legal framework is clear, and pretending that source legality changes the transformative analysis is wishful thinking, not legal reasoning.
Carlos Miranda Levy (The Curator)
This ruling forces us to confront a deeper truth: all human knowledge is built on accessing and learning from previous works. Every student, every scholar, every creator has consumed copyrighted material to learn. The transformation from reading to creating something new is what matters. But — and this is crucial — the absence of a clean-hands requirement creates a dangerous incentive: why pay for content if piracy has no legal consequence? The market needs both freedom to learn and incentives to create. A ruling that removes incentives for authors while freeing AI companies from any obligation is not balanced innovation — it's extraction. The Supreme Court needs to find the middle ground.

Why This Case Matters

Kadrey v. Meta is the anti-Bartz. Where Judge Alsup in Bartz v. Anthropic drew a bright line between training on purchased and pirated books, Judge Chhabria in Kadrey declared that distinction legally irrelevant. The result is a direct conflict between two federal courts in the same district on the most consequential question in AI copyright law — making Supreme Court review increasingly likely.

What Happened

Authors Richard Kadrey (known for the Sandman Slim series) and Christopher Golden filed a class action against Meta Platforms in July 2023, alleging that Meta trained its LLaMA family of large language models on pirated copies of their books obtained from Library Genesis (LibGen), an underground repository of pirated academic texts and books.

The case closely paralleled Bartz v. Anthropic — both involved AI companies training on books sourced from shadow libraries, both were filed in the Northern District of California, and both raised the same core question: is AI training on copyrighted books fair use?

The Fair Use Analysis

Judge Chhabria’s analysis followed the traditional four-factor test but reached dramatically different conclusions from Judge Alsup:

  1. Purpose and character: LLaMA transforms books into statistical model weights that generate entirely new text. This is “quintessentially transformative” — the model does not reproduce, summarize, or substitute for the original books. This factor strongly favored Meta.

  2. Nature of the works: The books are creative, published works entitled to strong copyright protection. This factor favored the plaintiffs, but the court noted this factor rarely determines fair use outcomes.

  3. Amount used: Meta used complete works, but the court found this was necessary for effective training. Using partial works would degrade model performance without meaningfully reducing the copyright impact.

  4. Market effect: LLaMA does not substitute for the books it was trained on. Users do not use LLaMA instead of reading Kadrey’s novels. This factor strongly favored Meta.

The Piracy Question

The most controversial aspect of the ruling is the court’s treatment of data provenance. Judge Chhabria acknowledged that Meta’s training data included pirated books from LibGen but held that this fact is irrelevant to the fair use analysis:

“The fair use doctrine asks whether the defendant’s use of the copyrighted work is transformative and whether it harms the market for the original. It does not ask how the defendant obtained the work. A use is either transformative or it is not, regardless of the chain of custody.”

This directly contradicts Judge Alsup’s holding in Bartz that training on pirated copies is “inherently, irredeemably infringing” regardless of the downstream use.

The Split

Kadrey and Bartz create a head-on conflict: two federal judges in the same district reaching opposite conclusions on the same legal question. Strictly speaking this is not a circuit split, since both are district court rulings with no binding precedential force, but the disagreement virtually guarantees appellate review and makes eventual Supreme Court intervention likely. Legal scholars have described the underlying question as "the most important copyright question of the 21st century."

The Broader Impact

For AI companies, Kadrey offers the most favorable legal framework imaginable: train on whatever you want, from wherever you get it, and the result is fair use as long as the model is sufficiently transformative. For authors and publishers, the ruling is devastating — it eliminates the one legal lever (data provenance) that Bartz had given them. The practical question now is which approach the appellate courts and ultimately the Supreme Court will adopt.

Sources

  • Kadrey v. Meta Platforms, Inc., No. 3:23-cv-03417-VC (N.D. Cal. June 25, 2025)
  • Authors Guild v. Google, Inc., 804 F.3d 202 (2d Cir. 2015)
  • Meta Wins Fair Use Ruling in AI Book-Training Lawsuit — Bloomberg Law (2025-06-25)

