AI companies illegally copied authors’ books from pirate ebook sites to develop their large language models (LLMs) without obtaining consent from authors or publishers, much less providing compensation. This means authors and publishers have no means of controlling uses of their books for training or in outputs that incorporate their works.

It is crucial that we move AI companies away from the current reliance on fair use to licensing regimes where rightsholders have the ability to say no to use of their works or to engage in licensing on terms and conditions they believe are fair, including restrictions on output use.

AI companies need, and will continue to need, high-quality, human-authored, copyrighted books and journalism to develop quality LLMs, but they do not need to use books and journalism for free and without restriction. It is imperative that we bring an end to the unlicensed use of copyrighted books and journalism and to the subsequent unrestricted use of LLMs to create copycat books.

Licensing is how copyright works, and that system should not be abandoned for the sake of AI companies. Licensing puts control back in the hands of rightsholders.

  • Licensing is a way to enforce copyright. Licenses come with limitations and restrictions, as well as compensation. Just as your publishing agreements are limited to the rights expressly granted and you can expressly withhold or forbid certain uses, so are AI licenses.
  • Replacing reliance on fair use with licensing systems gives authors the choice to not license at all. By establishing licensing as the norm, it will be even riskier for AI companies to use the works that authors have chosen not to make available for licensed uses.
  • Different authors have different interests when it comes to AI licensing, and these need to be respected and supported.
  • The markets developing for AI licensing are multifaceted. A work can be licensed without feeding the big LLMs or allowing outputs that use your work. When you license a work for “training only,” the LLM owner must block any uses of your work in outputs, and the license will say that those rights are reserved.
  • Authors can separately license rights just for research and reference uses, such as specific Retrieval-Augmented Generation (RAG) uses (where the AI system queries a database in real time for information to generate more accurate responses without incorporating the material into the model). In addition, licensing books for new derivative uses, such as allowing users to interact with a book or personalize it, is a developing marketplace where authors get to decide what they will allow and for how much.

These FAQs provide more information on AI licensing. We will continue to update them, given how fast AI is evolving.

1) Is the Authors Guild saying that authors should license their works for AI?

Absolutely not. Whether or not an author wants to license their works is their own choice. The Authors Guild has strongly advocated for authors to have the final say over whether they want their works included in any licensing deals, even where the rights might belong to the publisher. We want authors to have the option to license their works, on terms they choose, if they decide to do so.

2) Why does the Authors Guild support AI licensing given its concerns about generative AI?

In our AI best practices, we ask authors to be mindful that the current major LLMs were trained illegally on unlicensed content and that many allow users to generate outputs without sufficient restrictions. By converting unauthorized, unlicensed training to licensed uses, as should have occurred to begin with, authors and publishers can impose necessary restrictions on use and obtain fair payment. Authors and publishers can also decide not to engage at all and refuse to permit any use of their works.

Licensing mitigates, if not prevents, the harms caused by generative AI in this manner. Technological measures exist that AI companies can use to block or track outputs generated in response to certain types of prompts, and those measures need to be adopted. For instance, licenses can be conditioned on the AI company preventing its users from prompting its LLM to generate outputs that compete with the author’s works, such as book summaries, imitations of the author’s style or voice, outlines for sequels, and other uses that infringe the author’s copyrights.

The licensing solutions we are exploring and promoting, including through our partnership with Created by Humans (CbH), give authors and publishers control over whether or not they want to permit certain output uses, and if they do, to earn adequate revenue from them.

3) Won’t licensing encourage AI outputs that compete with authors’ own works?

No. Without licensing, there is rampant, unrestricted use of LLMs to mimic authors and their work. That is occurring now. Licensing will curtail these unrestricted output uses because it gives rightsholders the contractual ability to restrict and police them. Authors and publishers know best how to monetize (or not monetize) their works, and they will not permit uses that undermine the market for their own works.

Copyright history has shown that when legitimate licensed technologies come into the marketplace, they replace the unrestricted, infringing uses that preceded them, filling a market gap. And the sooner licensed uses are offered, the more likely they are to retain value. For instance, when the record companies delayed making digital music available to consumers in the 1990s, many music listeners moved to illegal downloading (e.g., Napster and its progeny). By the time the record companies got into the game and started licensing digital music, illegal downloads and streams had become acceptable in certain circles, legitimizing illegal behavior. The delay resulted in the industry downsizing by more than half. Through a multi-pronged approach of litigation and subsequent licensing, the record industry was eventually able to move users to licensed services (though too late to restore the prices consumers once paid for music).

4) What choices should authors have when licensing their works for AI?

The AG believes that authors should have the choice to determine whether they want their books used for AI on a title-by-title basis and should have the ability to decide what uses they wish to license.

An author might determine that for some works, licensing is worth the additional income, and for others, perhaps not. Some authors might be interested in allowing their works to be used for training with no permitted output uses. Others might wish to allow only specific types of research and reference uses, such as specific Retrieval-Augmented Generation (RAG) uses (where the AI system queries a database in real time for information to generate more accurate responses without incorporating the material into the model). Down the road, marketplaces might also develop for certain types of derivative uses, such as fan fiction apps; we imagine those will be one-off deals with relatively high rates of compensation. Whether to enter into any of these types of licenses is an assessment every author should have the right to make. When authors decide to license their works, we believe they should have control over the terms.

5) Who has the right to license AI uses: the author or the publisher?

AI training use was never contemplated in any publishing agreements until very recently, and rights to AI uses were not expressly granted to publishers. Because AI is capable of taking so much more from an author than just the text of the book, we believe that licensing authors’ works for AI training without their consent is unethical and unfair. Each author should have the right to decide whether their publisher may license out their work for AI use. We believe this is true even where the author entered into a journalism, academic, or textbook contract that provided for the publisher to acquire the entire copyright—because as authors have recently expressed, they may well have decided not to enter into the agreement had they known. That said, some publishers do claim that they possess the rights to license for AI training.

Publishers do, however, have rights to some types of output uses under most contracts, such as book summaries, abridgments, or personalization (e.g., allowing a user to pay to change the name of a character in a children’s book to their child’s), and they have an interest in protecting against those uses or being compensated for them. Moreover, the publisher often possesses the final copy used to print the book, which may be the copy preferred by the AI licensee.

As such, we recommend consulting with your agent (if you have one) and having an open conversation with your publisher before engaging in licensing. For training-only licenses, which are not addressed in past publishing agreements, if the publisher is involved and provides the final text file, we recommend a split of 80/20 or 75/25 (with the higher amount going to the author), similar to translation and foreign rights. Splits for other uses are governed by the existing contract.

The best licensing platforms will also accommodate both authors and publishers signing on as rightsholders, recognizing that in some or even many cases, it might be necessary for both the author and the publisher to sign off on AI use of a traditionally published, in-print book.

Authors Guild members may consult with the Guild’s lawyers by submitting a legal help request with any questions about their rights. You may also refer to our Model Contract for advice on rights and our model AI clauses.

6) What types of AI uses are currently being licensed?

Currently, AI companies are licensing rights to literary works to train their LLMs, and more recently they have also started to license trustworthy literary materials for specific Retrieval-Augmented Generation (RAG) uses for reference purposes to ensure factually accurate outputs. The latter might include the ability to summarize a text or quote verbatim from it. (Such RAG uses implicate rights that fall more squarely within the publisher’s subsidiary rights.) In addition, there is a burgeoning market for using books to create applications for information, to explore books interactively, or to create fan fiction, as well as to convert text into other formats. Many of these derivative uses may be best negotiated individually, but some may be amenable to collective licensing.

7) If I choose not to license my works, can AI companies use them anyway?

No. Not participating in a license does not give AI companies the right to use the works. That is still infringement.

8) What about works that were already used for AI training without permission?

To date, more than a dozen lawsuits have been filed by authors against AI companies for training their LLMs without permission. The Authors Guild is a named plaintiff in a class action lawsuit against OpenAI and Microsoft (and our lawyers have also filed class action lawsuits against Anthropic and Meta). The basis for the lawsuits is that the books used were not licensed; the AI companies did not bother to get permission. The Authors Guild strenuously disagrees with the AI companies’ claims that the training is fair use. Unlicensed, unrestricted AI training could destroy the ecosystem for books in the long run, and copyright exceptions do not extend to undermining the very purpose of copyright law.

Compensation for uses covered by the class action cases will be provided through the class action process—assuming the cases are successful.

9) How much will authors be paid under these licenses?

The fees will be set on a per-license basis; there are no standard fees to date. The platforms we have seen incorporate different ways to set prices, which authors will then be able to accept or reject. Rather than basing pricing on what AI companies have offered to pay, the licensing platforms we have spoken to are using econometrics to recommend reasonable price ranges, including pricing based on usage in AI prompts. From any income generated, the licensing platforms will take a small fee to cover operations. Your agent may also be entitled to a commission and your publisher to a portion of the revenue.

10) What is the Authors Guild doing to protect authors’ rights in AI licensing?

We are advocating for authors’ prerogative, in all cases, to approve or withdraw from licensing deals, regardless of the terms of their prior publishing agreements, which in no way contemplated AI use. For all new contracts, we recommend that authors include express language giving them the right to approve AI uses and to receive a fair share of any licensing deals. We continue, as we have for the last 122 years, to work to ensure that authors retain control and secure fair compensation when their works are used.

11) Where can I get help with an AI licensing deal?

If you are an Authors Guild member, our legal team would be happy to review your contract and assist you in understanding and negotiating an AI licensing deal. You may submit a legal request for assistance.

Revised November 4, 2025