AI Detection, Publishing Contracts, and You

Or, What Happens When Your Publisher Thinks AI Wrote Your Book

First, I’m not a lawyer, nor am I an agent or a publisher. I’m an author who has recently seen more and more content creators feeling uneasy about the AI clauses in their contracts, and I decided to do a bit of a deep dive into the matter.

Publishing Contracts and AI Clauses

Last year, the Authors Guild recommended an AI clause for publishing contracts suggesting that an author can’t use more than 5% AI-generated text in their work, and that they must disclose every instance of AI-generated text. This might sound hunky-dory to all of us who think that the whole point of writing fiction is the joy of writing. Asking ChatGPT to barf out a chapter and shoving it into a manuscript defeats the purpose of writing as creative output.

What Counts as AI Generated?

In the modern world, AI is everywhere. It is in the interface of this blog post I’m writing, a magic wand waiting in the wings in case I need it (I don’t). It’s the same magic wand near a Facebook post. It’s in Word. It’s in Scrivener. It highlights typos and suggests better phrasing, and if you use something like ProWritingAid or Grammarly, it will straight up improve and smooth out your sentences. It will find repeats, suggest dynamic verbs, and convert your passive sentences to active ones. If you accept those suggestions, are you using AI-generated text? I learned a great deal about writing dynamic, strong sentences from running my drafts through ProWritingAid. I don’t know where the law stands on this. I don’t think anybody really knows until there is a court case, and those court cases are coming.

What if you ask ChatGPT or Claude.ai for prompts? Or to name your characters? What if you feed it your chapters and ask it for a critique, or an analysis of pacing, and it gives you profound guidance that you then use?

How Do You Prove Your Work Is NOT AI Generated?

This one is scary, and it didn’t even occur to me as a possibility until I saw an entire LinkedIn feed full of content providers lamenting this situation. Publishers have AI checkers at their disposal, and what happens if an AI checker flags your work as AI? We all know AI makes mistakes. How do you prove your words are your own, and how do you rise above being insulted? Worse, if your publisher has the 5% clause in the contract and accuses you of using generated content, they can refuse to publish your book, ask you to remove the questionable content, or demand you tell them where it came from.

This is already happening in academia, so why not in fiction publishing?

Read Those Clauses and Protect Yourself

This is why you need an agent. If you don’t have an agent, hire a lawyer to review the publishing contract for you. You can always negotiate these clauses, either to protect yourself against being falsely accused of being a robot or to be more specific about what exactly counts as AI-generated. A brief search online brought up this helpful post on contracts.

Add Your Own Clauses if the Publisher Doesn’t Have Them

A more common clause, and one more and more publishers use, says that none of the work can be used for AI training. This is a good one, and you should definitely ask them to add it if they don’t have it. Just make sure it says that neither you nor the publisher can knowingly allow the work to be used for AI training. That word ‘knowingly’ is apparently important because, in this world, we simply don’t know where our work will end up. We can only hope for the best.

Do you have experience with this topic? Comments? Thoughts? I want to hear!

Emilya Naymark

Emilya Naymark is the author of the novels Hide in Place and Behind the Lie.
Her short stories appear in the Bouchercon 2023 Anthology, A Stranger Comes to Town, edited by Michael Koryta; Secrets in the Water; After Midnight: Tales from the Graveyard Shift; River River Journal; Snowbound: Best New England Crime Stories 2017; and 1+30: THE BEST OF MYSTORY.

When not writing, Emilya works as a visual artist and reads massive quantities of psychological thrillers, suspense, and crime fiction. She lives in the Hudson Valley with her family.

3 comments

  1. Thanks, Emilya, for this timely and important information. “Knowingly” is a word we need to remember. For the anthology I’m involved in, one of our authors was worried that using a program like Grammarly might qualify as “more than 5%” AI. It isn’t in our book. AI is a tool that (like so many others) can be used wisely and misused. I’m glad organizations like the Authors Guild and others are keeping their eye on the trend.
