Industry & Advocacy News
April 13, 2026
Last month, Hachette canceled the release of a forthcoming novel after allegations surfaced online that it was written using generative AI. The novel, Shy Girl by Mia Ballard, was published in the U.K. last fall and was scheduled for release in the U.S. in April under Hachette’s Orbit imprint. According to a New York Times story, a reporter “approached Hachette citing evidence that the novel appeared to be A.I.-generated.” Later that same day, Hachette announced that, following “a thorough and lengthy review of the text,” it had decided to cancel the book’s U.S. publication and discontinue sales in the U.K.
Many questions remain unanswered: What specific evidence did the Times bring to Hachette? What did Hachette’s review process consist of? Did Hachette give Ballard an opportunity to respond? (Ballard told the Times that she did not use AI to write the novel but that someone she hired to edit the book used it during that process.)
But beyond the specifics of Ballard’s situation, the Shy Girl story illustrates the uncertainties and risks for authors considering whether to use AI tools, and if so, for what purposes. This is a growing concern as the use of generative AI becomes increasingly widespread. The Authors Guild has heard from literary agents that they are seeing an increase in submissions that are clearly AI-generated and noticeably lack originality and voice. And, as recent media reports have noted, AI-generated books already dominate bestseller lists in certain categories on Amazon and other retailers. Suddenly faced with unfair competition from these materials, many authors understandably want to know when, if ever, it may be permissible to incorporate AI tools into their own writing process.
We urge authors to consult the Authors Guild’s best practices for using AI, which we are in the process of updating as of this writing. To begin with, authors should be mindful that, at least as of now, all the major foundational large language models (LLMs) have been trained on pirated, unlicensed books that were taken without permission from, or compensation to, authors or publishers. This means that when you use an AI model, you are supporting a product built on the theft of authors’ works.
If you do use AI, our best practices advise against using it to write for you. If you are claiming to be the author of a work, the words should be your own. This is different from using AI as a tool for non-text-generating purposes, such as research or outlining, which is generally permissible (bearing in mind the ethical considerations noted above). If you use AI for those tasks, take precautions not to copy AI-generated text into your work, just as you would to prevent plagiarism. For example, any ideas or facts that you want to use from AI outputs should be rewritten in your own voice.
These steps are particularly important for authors working with a publisher. Standard publishing contracts always include “representations and warranties” terms in which the author makes a legally binding representation that the text is original to them. The term “original” has the same meaning it has under copyright law—i.e., created by a human author. Federal courts and the U.S. Copyright Office have consistently determined that material generated by a machine lacks the required elements of human authorship and therefore is not copyrightable.
As a result, if your book contains more than a minimal amount of AI-generated text, and you haven’t disclosed that fact to your publisher, the publisher may have the right to terminate the entire agreement on breach-of-contract grounds. Similarly, an entirely AI-generated plotline or wholesale adoption of AI-generated characters may violate the warranty. As such, you must disclose to your publisher if you incorporated any AI-generated text, characters, or plot in your manuscript. There are cases where using AI (such as a book about AI or a character using AI) makes sense, and the publisher will approve it. To protect yourself, make sure that any AI use is reflected in your contract.
As always, the Authors Guild’s legal services department is available to members to review and advise on publishing contracts. You can submit a legal services request at staff@authorsguild.org.
Ultimately, as the Shy Girl case illustrates, we believe that readers continue to want a human connection with the author and feel cheated if they discover that a book or article they thought was written by a real person turns out to be AI-generated. Above all, authors should be honest with their readers and clearly disclose if AI was used to generate more than a minimal amount of text. And to assure readers that your work is human-authored, and to show support for human creativity more broadly, authors should consider registering for the Guild’s Human Authored certification mark.