In September 2025, Anthropic, the company behind the Claude AI models, agreed to pay $1.5 billion to authors and publishers for using illegally downloaded books to train its AI language models. What started as a lawsuit brought by three authors turned into a class action covering nearly half a million works and one of the largest copyright settlements in U.S. history.
In August 2024, three authors—Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson—brought a class action suit against Anthropic, claiming that the AI company downloaded millions of copyrighted books from “shadow libraries,” such as Library Genesis (“LibGen”) and Pirate Library Mirror (“PiLiMi”), to train its AI models. Documents later showed that Anthropic did download millions of books from pirate sites to train its AI models, but also purchased and scanned millions of books for the same purpose. The authors challenged Anthropic’s use of any of these books, without their permission, to train its AI models.
In June 2025, Judge Alsup, of the U.S. District Court for the Northern District of California, significantly narrowed the scope of the lawsuit. The Court held that Anthropic’s use of legally acquired books to train its AI models was fair use, explaining that the use was “quintessentially transformative.” But Judge Alsup denied Anthropic summary judgment on the piracy claims, finding that downloading books from pirate sites was not fair use.
In August 2025, the Court certified a class action on the piracy claims. The class includes all rightsholders of books Anthropic acquired illegally from LibGen and PiLiMi to train its AI models. A trial on those claims was set for December 2025, but in September 2025, Judge Alsup preliminarily approved a settlement agreement pursuant to which $1.5 billion would be split among rightsholders (the “Settlement”).
The Bartz class includes rightsholders, both authors and publishers, whose books Anthropic downloaded from pirated databases and used to train its AI models. Because Judge Alsup held that training AI language models on legally acquired books, even without the authors’ permission, constituted fair use, the Settlement applies only to rightsholders whose books were downloaded illegally and used to train AI models. To determine whether a book is included in the Settlement, authors and publishers can search the Anthropic Settlement website.
Authors and publishers have three options under the Settlement: (1) accept the Settlement terms and submit a claim form to request payment; (2) object to the Settlement and still submit a claim; or (3) exclude themselves from the Settlement (i.e., opt out).
While authors and publishers will receive money from the Settlement, its scope is relatively narrow. Anthropic will pay approximately $3,000 per work under the Settlement. The Settlement holds Anthropic accountable only for its past infringement; it does not cover claims based on the output of AI models, affect Anthropic’s ability to train on legally acquired materials, or create a forward-looking licensing framework.
Despite the Settlement’s relatively narrow focus, Bartz sends a strong message to AI companies: they may be liable for copyright infringement if they use books from pirate sites to train their AI models. The Settlement holds Anthropic accountable for illegally downloading copyrighted works by requiring it to compensate creators, even if only approximately $3,000 per work. Submitting a claim, whether or not a rightsholder also objects to the Settlement, is the only way to receive compensation.
The deadline to object to the Settlement or to opt out is January 29, 2026, and the deadline to submit a claim is March 30, 2026. The Court has only preliminarily approved the Settlement; a final hearing, at which the Court must find the Settlement “fair, reasonable, and adequate,” is scheduled for April 2026.
