
Getty Images v. Stability AI

UK High Court of Justice · United Kingdom · 2025-11-04 · IL-2023-000007

Getty Images sued Stability AI in the UK over the training of Stable Diffusion on millions of Getty's copyrighted photographs. In a landmark ruling, the High Court held that trained AI model weights do not constitute 'copies' of the training images under UK copyright law, fundamentally reshaping the global AI copyright debate.

Holding

The High Court ruled that AI model weights — the mathematical parameters learned during training — are not 'copies' of the training data under UK copyright law. The court found that while individual training images were reproduced during the training process, the resulting model is a statistical abstraction, not a reproduction. Getty's infringement claims based on the model itself were dismissed.

Arguments For / Positive Implications

  • Provides legal certainty for AI developers operating under UK copyright law
  • Offers a sophisticated technical analysis of how AI models actually work, distinguishing weights from copies
  • May influence other common law jurisdictions (Australia, Canada, India) with similar copyright frameworks
  • Recognizes that intermediate copying during training may be treated differently from final output

Arguments Against / Concerns

  • Creates a potential safe harbor for AI companies that could undermine photographers' and artists' rights
  • Contradicts the emerging U.S. approach where provenance of training data matters (Bartz v. Anthropic)
  • Getty's images were used without license or compensation despite generating commercial value for Stability AI
  • The 'weights are not copies' framework may not survive legislative reform in the UK or EU

Our Takes

Lawra (The Moderate)
This ruling is technically precise and legally important. The court understood what model weights actually are — statistical representations, not compressed copies of images. That's the right technical analysis. But the policy question remains: should AI companies be able to train on copyrighted works without compensation just because the model stores patterns rather than pixels? The UK parliament may need to answer that question.
Lawrena (The Skeptic)
So Stability AI scraped 12 million Getty photographs, used them to build a product that competes with Getty's own licensing business, and the court says the resulting model isn't a 'copy'? This is a masterclass in letting technical architecture dictate legal outcomes. The fact that model weights are mathematical abstractions doesn't change the economic reality: Getty's work was exploited without compensation. Photographers are the losers here.
Lawrelai (The Enthusiast)
The court got it right. Model weights genuinely are not copies — they're learned statistical patterns, more like memories than photocopies. This ruling gives the AI industry the legal certainty it needs to innovate. And it doesn't leave creators without recourse: the court acknowledged that individual training images were reproduced during the process, and that specific outputs that resemble training images could still infringe. The nuance is exactly right.
Carlos Miranda Levy (The Curator)
The UK court understood something fundamental: learning from existing works to create something new is how all human knowledge has always worked. Model weights are abstractions — patterns, not copies — just as our own memories of a painting are not reproductions of it. The real challenge isn't whether AI can learn from images, but how we build a creative ecosystem where photographers and artists are compensated fairly. Licensing frameworks, collective agreements, and revenue-sharing models are the path — not pretending that statistical pattern recognition is the same as photocopying.

Why This Case Matters

Getty Images v. Stability AI is the most important AI copyright ruling outside the United States. For the first time, a major common law court addressed the fundamental question: when an AI model is trained on copyrighted images, does the resulting model itself infringe copyright? The UK High Court’s answer — no, because model weights are not copies — has massive implications for the global AI industry.

What Happened

Getty Images, the world’s largest commercial photo agency, sued Stability AI in January 2023 for training its Stable Diffusion image generation model on approximately 12 million copyrighted Getty photographs. Getty alleged that Stability AI scraped these images from the internet without permission and that the resulting model constituted an unauthorized copy of Getty’s copyrighted works.

The case was closely watched because the UK’s copyright framework differs significantly from the U.S. fair use doctrine. The UK has narrower copyright exceptions, and there is no general “fair use” defense — only specific statutory exceptions. This meant Stability AI could not rely on the transformative use arguments being deployed in American courts.

The Technical Analysis

The High Court’s ruling turned on a careful technical analysis of how AI training actually works. The court found:

  • During training, individual images are reproduced in computer memory as they are processed by the neural network. This constitutes copying under UK law.
  • The resulting model weights, however, are mathematical parameters that encode statistical patterns learned across millions of images. They do not store, reproduce, or enable the reconstruction of individual training images.
  • The model itself is therefore not a “copy” of any individual work within the meaning of the Copyright, Designs and Patents Act 1988.

Divergence from U.S. Law

This ruling creates an interesting divergence with U.S. copyright jurisprudence. In Bartz v. Anthropic, Judge Alsup focused on the provenance of training data (purchased vs. pirated) rather than the technical nature of model weights. The UK court, by contrast, focused on the technical question of whether model weights constitute copies, largely setting aside the provenance issue.

The practical result: under UK law, the model itself is likely non-infringing regardless of how the training data was obtained, while under U.S. law, the legality of the model depends heavily on whether the training data was lawfully acquired.

The Broader Impact

Getty v. Stability AI establishes a key principle for AI copyright law in common law jurisdictions: trained model weights are mathematical abstractions, not reproductions. This principle will likely influence courts in Australia, Canada, Singapore, and other jurisdictions with copyright frameworks based on the UK model. It may also inform the ongoing policy debate in the UK about whether to create a specific text-and-data mining exception for AI training.

Sources

  • Getty Images (US) Inc. v. Stability AI Ltd., No. IL-2023-000007 (High Court, Ch. Div., Nov. 4, 2025)
  • UK Court Rules AI Model Weights Are Not Copyright Copies — The Guardian (2025-11-04)
  • Getty v. Stability AI: What 'Weights Are Not Copies' Means for Global AI Law — Oxford Journal of Intellectual Property Law (2025-12)

