Court Lets Key Claims Proceed in AI Art Lawsuit
Artists Score Major Win in AI Copyright Battle
Edited by Mackenzie Ferguson, AI Tools Researcher & Implementation Consultant
A federal judge allows copyright infringement and trademark claims to proceed in a lawsuit where artists accuse AI companies of using billions of copyrighted images without permission.
In a pivotal legal development, a federal judge has advanced key copyright and trademark claims against generative AI art companies in a groundbreaking lawsuit. The order could have wide-reaching implications for the industry, especially for firms that used the Stable Diffusion model. The artists allege that these AI systems were trained on billions of images scraped from the internet without authorization or compensation, an issue that strikes at the heart of intellectual property rights in the digital age.
U.S. District Judge William Orrick's recent ruling has given a significant victory to artists suing AI companies like Stability AI and Runway. Orrick found that the companies' AI tool, Stable Diffusion, may have been constructed using copyrighted works, potentially facilitating infringement by design. This means other AI companies using the model in their products might also become entangled in the lawsuit, signaling a potentially massive legal and financial predicament for the AI sector.
Though the court dismissed some claims, including breach of contract, unjust enrichment, and violations of the Digital Millennium Copyright Act, the lawsuit will proceed to the discovery phase. Discovery will likely uncover critical information about how the AI firms harvested copyrighted materials to train their models, and it has the potential to expose further misuse of copyrighted content, which could bolster the artists' case significantly.
The lawsuit, filed by a group of artists including Karla Ortiz – known for her work on blockbuster films such as Black Panther and Avengers: Infinity War – focuses on the use of the LAION dataset. This dataset, allegedly comprising five billion images scraped from the internet, was used by companies like Stability AI and Runway to train their AI models. Notably, the lawsuit also names Midjourney and DeviantArt as defendants for incorporating the Stable Diffusion model into their products.
The artists' copyright infringement claims highlight a significant concern for the future of AI in creative fields. AI-generated works are not themselves eligible for copyright protection, and the extensive use of copyrighted material for training puts existing intellectual property law to the test. Court decisions on these issues will likely influence the extent to which AI can be integrated into industries like filmmaking, potentially limiting its adoption if strict copyright infringement standards are upheld.
Judge Orrick's decision also delves into how Stable Diffusion operates, suggesting that its operation may rely on copies of, or protected elements of, copyrighted works. This operational aspect could be troublesome for AI companies, as it implies a foundational level of infringement. Orrick's comments indicated that Stability AI and Runway could have promoted copyright infringement intentionally, a damning assertion that could result in substantial damages if proven.
Stability AI and Runway argued against the artists' claims, asserting that their AI models do not store or reproduce the specific images used for training. However, the court found that, given the sheer size and nature of the LAION dataset and the way Stable Diffusion operates, the plaintiffs need not identify the specific works used in training at this stage of the lawsuit. Their claims may therefore proceed without that level of specificity, making it easier for the artists to press their case.
The implications of this lawsuit are far-reaching, with the potential to shape the landscape of AI development and its integration into business practices across sectors. Companies like DeviantArt have voiced concerns, noting that a ruling against AI firms could lead to a cascade of similar lawsuits, creating widespread legal and operational challenges. DeviantArt's lawyer highlighted the extensive impact such legal actions could have, given the number of companies in similar positions.
In addition to copyright infringement, trademark claims against AI companies are also proceeding. Midjourney's practice of producing images that mimic specific artists' styles and publishing them alongside the artists' names has raised questions about consumer deception and false endorsement. The court noted that these issues could be addressed in later stages of the case, adding another layer of complexity to the ongoing litigation.
CEO comments and internal communications from AI companies like Stability AI and Midjourney have surfaced, revealing the extent to which these firms acknowledge their models' capabilities. Statements from Stability AI's CEO describing how the model compresses internet-sourced images and can recreate them have fueled the artists' claims. During discovery, the artists' lawyers are expected to probe these admissions, seeking further proof of infringement and unauthorized use of copyrighted materials.
Overall, this lawsuit represents a landmark case exploring the intersection of AI technology and copyright law. With the legal process moving forward, the outcome could set a precedent affecting not just AI art generators but also other AI applications relying on large datasets. As this case progresses, businesses leveraging AI will need to closely monitor legal developments and potentially reconsider their data sourcing practices to mitigate similar legal risks.