
Opting out of the Anthropic Settlement: What Authors Should Know


The Authors Guild has received numerous emails from authors asking about letters they received from different law firms advising them to opt out of the Bartz v. Anthropic settlement and to join separate lawsuits with the promise of much higher awards. These communications state that the firm will bring a lawsuit on the author’s behalf and will obtain a much larger recovery for authors than they would receive from the $1.5 billion settlement (estimated to be around $3,000 per book, with half generally going to the publisher depending on the author’s contract).

We agree that the amount per book feels paltry in light of the gross theft, and the desire to obtain more money for authors is laudable, but the amounts promised by many of these firms are a bit pie in the sky, and some of their communications are less than fully transparent.

Whether to opt out of the settlement is ultimately a personal decision that each author needs to make based on their individual circumstances. We wish to ensure that authors have clear and correct information regarding what damages can realistically be expected, what is covered by the settlement, what it would mean to litigate an entirely new lawsuit, the costs and risks involved, and the likely timeline—so that authors can make fully informed decisions based on what is best for them.

To start, authors should review our Anthropic page, which provides a detailed overview of the settlement, including the opt-out process and relevant deadlines. The opt-out deadline is January 29, 2026, extended from January 15, 2026.

In particular, authors should bear in mind the following critical points:

1. Courts Rarely Grant Maximum Statutory Damage Awards

In their efforts to sign up individual plaintiffs, the law firms have cited the maximum statutory damages that courts may award to prevailing parties in copyright suits. Copyright law allows for awards of between $750 and $150,000 per work where the defendant willfully infringed copyright. We understand that some firms have suggested that authors will likely receive $150,000 per work, and at a minimum $30,000. While technically possible, authors should be aware that these maximum damages are rarely awarded.

Rather, most statutory damages awards are at the lower end of the range. One recent study found that the most common award was the statutory minimum of $750 per work and the second-most common was $3,000. Moreover, the per-work awards tend to be lower when there are more than seven works at issue—as would be the case in any suit against Anthropic. The study found that in cases involving more than twelve works, the median award was $3,000 per work.

Also, if you have an existing publishing agreement, be sure to check your contract to confirm that you are not obligated to consult with your publisher before bringing a suit.

2. The Settlement and the New Lawsuits Relate to Anthropic’s Piracy, Not to AI Use

Both the settlement and the new proposed lawsuits relate only to Anthropic’s downloading of books from pirate websites, not to its use of those books to train AI. The class action case against Anthropic alleged that the company infringed authors’ copyrights both by illegally downloading books from pirate websites and by using those books without permission to train its language models. The court rejected the claim related to AI training on the grounds that the training was fair use, and found Anthropic liable only for the illegal downloading from pirate sites. (To be clear, the Guild strongly disagrees with the training decision, which is not binding on courts in other cases.)

This means that the only claims remaining in the case when the settlement was reached were those involving Anthropic’s piracy. The settlement is intended to compensate copyright owners for that infringement only; it is not intended to provide compensation for the broader harms that AI training causes authors (such as losses from AI-generated books). The Guild and other plaintiffs are seeking relief for those injuries in other cases.

Our understanding is that the law firms seeking to file new cases against Anthropic are likewise basing their claims only on the piracy, not on the AI use. As such, authors should not expect any damages award to compensate them for AI-related losses.

3. Litigation Is Risky, Lengthy, and Burdensome

It is not a sure thing that any author who opts out of the settlement and files a separate suit will ultimately be awarded any damages. All litigation is risky, and based on the litigation to date, we can expect Anthropic to spare no expense in defending itself, which would make the case time-consuming and costly. It is highly unlikely that Anthropic would agree to settle these cases for larger per-work amounts than it agreed to pay as part of the settlement. If the Guild’s own experience in AI litigation is any indication, plaintiffs will have to go through an extensive, drawn-out discovery process, including depositions and production of numerous emails and other documents. That means any decision on the merits wouldn’t come until 2027 at the very earliest and could be much later; and if Anthropic appeals an award, it could be longer still before authors receive any money.

While we understand that the law firms would handle the cases on a contingency basis, meaning that the plaintiffs would not have to front litigation expenses, the time and burdens associated with discovery should not be underestimated. Moreover, expenses, which can total a million dollars or more, come off the top of any recovery. That is in addition to the contingency percentage that the firms are charging (most seem to be asking for 30 or 35 percent of the total award). If you decide to go with one of the firms, be sure to read the engagement letter carefully. It should spell out your rights as well as the fees and expenses.