AI and Book Publishing – Optimistic Embracer or Silent Sceptic?

By Hannah Bickerton · 13 min read

The impossible-to-avoid topic of AI is a dominant discussion in every boardroom, conference, dinner party and newsfeed. In publishing, like in many industries, the path ahead seems rather bumpy. As we all race to adopt – or even understand – AI fully, we can still only hypothesise as to what its impact will be. Opinions seem to land somewhere on the spectrum between revolutionising how we source, create and market, and ‘We’ll all be made redundant.’

Like many companies firmly rooted in the business of people, we have always proudly emphasised the value of trusted professional human intervention. So as the AI movement explodes across our industry we find ourselves wearing two hats: the optimistic embracer of the ‘new and better’ and the silent sceptic questioning the collateral damage. We should, however, try to visualise what the publishing industry might look like if AI is applied wholesale to the creation, production and publication of the written word. We should think about how AI will affect human imagination and readers’ relationship with the publishing landscape. Because if we fail to do so, we may find ourselves actively working to dilute creativity, and possibly even hastening a breakdown of enthusiasm in readers and writers alike.

With this in mind, we sought out multiple views to help us determine the role of creative talent as it adapts to this fast-changing world. Full disclosure: we aren’t experts in AI. But helpfully we know a few people who are, and who were happy to answer our questions on how professionals are already interacting with AI and what these interactions might look like in the future.

To understand the threat, the devil is in the detail

A survey of American publishing companies that have implemented AI in their processes found that ‘47% were using it in their marketing departments’ and ‘just 12% were using it for production’. Meanwhile, analysis by Goldman Sachs economists suggests that incorporating AI into business processes could lead to a ‘7% (or almost $7 trillion) increase in global GDP’ and a ‘1.5%’ boost in productivity.

A 7% increase in global GDP sounds incredibly alluring at a corporate level, but what does putting our trust in AI mean for the individual, and are there areas where we should be actively restricting its use? Bestselling author and podcaster Joanna Penn stresses the need to first be clear on definitions and applications.

‘“AI” is an almost meaningless term at this point, as so much of what we do encompasses aspects of AI. Publishers use Amazon, TikTok and Meta, all of which are publishing and marketing systems powered by AI. If you mean generative AI and the use of LLMs (Large Language Models), then, once again, the usage is becoming ubiquitous. Many writers use tools by Microsoft and Google, like Word or Docs, both of which now have generative AI built in. Microsoft has rolled out GPT-4 into its software, and Google is rolling out Gemini. Writers also already use editing tools like ProWritingAid and Grammarly, which, again, now use GPT-4 or similar. The most commonly used book design tools, like the Adobe Creative Suite, now have generative AI built in, for example with Adobe Firefly, and many stock photo sites are selling AI-generated images. Most audiobook distribution platforms now allow AI-generated audio, from Amazon KDP’s invite-only US beta service, to Findaway by Spotify now partnering with Google Play Books’ auto-narration.

And Joanna continues:

‘If the publishing industry wants to restrict the use of AI, it is going to have to specify very particular use cases, which are likely to expand over time as the capabilities of the systems become clearer. The best comparison is the development of the internet and how that has impacted publishing and life in general. There are pros and cons with the internet, but would you want to run a business without it now? Every company will have to decide for itself where it wants to draw the line.’

The debate seems to be over whether AI makes us better at what we do or, more menacingly, simply does what we do. Performance-enhancing tools on the machine-learning side of things enable more efficient processes and do the heavy lifting on time-intensive tasks. This allows for such benefits as sophisticated book marketing vehicles via personalised user content experiences, speedier delivery among publishing teams with translations at the touch of a button, or partially formed options for ad copy and design concepts. This frees up our brain capacity for other important activities.

Generative AI, on the other hand, formulates brand-new outputs such as text, images and design, fuelled by every piece of data in existence across the internet. And this feels far more threatening. Tools such as Story Engine from Sudowrite claim to be the writing partner you always wanted, by ‘beating out chapters’, and can ‘write 1,000s of words, in your style’. As much as some authors may want to hate it, many seem pleasantly surprised. If you visit CoverDesignAI’s website you can see their reproduction of the cover of The Little Prince, a result based on tidy prompts such as ‘create a dream-like atmosphere and capture the sense of wonder and exploration that is essential to children’s literature’. We can be snobs about it, but it looks rather good.

Time will tell where the line is drawn as industries decide how best to use these new tools. The ethical dilemma currently preoccupying publishing is: how do we use AI to enhance productivity, creativity and efficiency, while limiting its negative impacts? What, specifically, are those negative impacts, and who is most exposed? George Walkley, an independent consultant, publishing expert and academic researcher, summarises:

‘There are two areas in particular that are sensitive for publishers. The first is the distinction between routine and creative work: AI offers great potential for automating tasks and increasing productivity, but there’s real concern about the impact on creative workers such as authors and illustrators that has led many publishers to hold back on using AI for content creation. Secondly, generative AI outputs are subject to bias, error and fabrication, so while AI has great potential to help with ideation and drafts, I would not recommend using the output of generative tools without careful human review.’

While publishers deliberate over how to responsibly accommodate the role of creatives and creators amid the invasion, anyone with a career under this broad umbrella must be feeling a little alarmed, or at the very least wondering how they will need to adapt.

It’s time to change… the narrative

For an industry such as publishing, which values quality and originality of content and constantly strives for relevance in an evolving cultural landscape, creative excellence, editorial intervention and good judgement should always play critical roles. The extent to which AI can ever adequately replace these essential roles, and how we’d strike a harmonious balance between human and machine if it could, is still unclear.

Until AI programs gain sentience of their own, the feasibility of which is still in question, AI-generated content will always feed off data formed from existing human inputs. We wonder what this means for creativity and innovation. Does it become narrower and narrower, leading to an endless spiral of simulacra? Furthermore, can it be trusted? Human-generated content can fall victim to narrative inconsistencies, lazy stereotypes and linguistic mistakes, but we also have the self-awareness to challenge them. Tools such as ChatGPT can offer writers content suggestions, stylistic improvements and potential plot lines, but, as George highlighted above, they do have their limitations.

Speaking to the impact AI might have on the imaginative skill of content creators, freelancers and marketing professionals within publishing departments, George explained:

‘AI is already highly creative and prolific: in a recent study, ChatGPT beat 99% of human participants in a test of creativity. Generative AI is particularly good at generating a large volume of ideas, or creating the first draft of a document. It would be understandable if this worried creatives. But where those people excel is at judging context, refining ideas, and using AI as a tool. So I would not position this as human versus AI, but as AI augmenting skilled human judgement. The best creators will be learning how to get the most out of these tools and reinvesting the time saved versus manual processes in higher-order thinking, in exactly the same way that earlier generations of creatives embraced Photoshop or desktop publishing.’

From a creative standpoint at least, we can argue that generative AI functions best when there is skilled human interpretation and oversight. Rather than being resistant to change, we should be embracing the advantages it can offer. This is very much an approach that Joanna Penn encourages. When asked what advice she would give to aspiring writers regarding their knowledge and understanding of how AI programs function, she argues that it’s a mindset: 

‘The last twenty years have seen our industry changed by the internet, and the next twenty years will see a shift due to generative AI. You can wait to see how it plays out, or you can join in, learn, and make the most of the change. These AI systems are tools that you can use to help realise your creative goals. They can amplify your ideas, they can act as a collaborative co-pilot, and they can help you learn writing craft and business in specific ways that can accelerate your author career. Of course, in the same way as the internet can be used for wonderful or terrible things, these systems can also be used for good or ill. It is our responsibility as creators to use them responsibly, and to serve our (human) goals to write the best books we can. If authors and the publishing industry stand on the sidelines, then the future will be dictated without us, so we need to be involved.’

The prospect of AI superseding human abilities when it comes to the originality, quality and integrity of content, therefore, is perhaps a counterproductive outlook, and instead we should be focusing on forming a symbiosis between the two. Where AI can turbocharge inspiration and drive new ways of exploring creative concepts, humans still need to steer it and provide the astute judgement of what is relevant, accurate and compelling for readers.

A[I]uthorship issues

If anyone can enter a prompt such as ‘Write a biography of Oliver Cromwell from the perspective of his cat’ into a program like ChatGPT and receive a finished product in seconds, who gets to lay claim to the work and how should society acknowledge literary successes? In September 2023 the Guardian reported that ‘Jane Friedman, who writes about publishing, forced [Amazon] to remove five bogus titles in her name that appeared to be AI-generated.’ Other authors, Margaret Atwood and Philip Pullman to name a few, have raised concerns about the use of their IP in LLMs without compensation. At the beginning of this year, Japan’s prestigious Akutagawa literary prize was awarded to a novel by Rie Kudan, who was proud to admit that ‘five per cent of the book quoted verbatim the sentences generated by AI’.

There is also the issue of visibility. When we read a book or admire a piece of art, we are connecting with the creator, sharing experience and understanding through their work. It matters to us who we engage with and endorse. The Guardian has also stated that ‘someone styled as “Steven Walryn” published more than 30 books, mostly nonsensical and repetitive guides on how to use camera brands, as well as a couple of fantasy romances, with 15 published on the same day’. As AI-authored content parading under human aliases saturates sales platforms with utter nonsense, how will the reader experience, and readers’ connection with books, be affected? Will the lack of effort put into the production of publications lead to reader apathy?

Where should the publishing industry draw the line? Stuart Bache, a cover designer who has worked for HarperCollins, Puffin and Hodder & Stoughton and uses aspects of AI in his work, says:

‘The key principle is transparency. Many retailers are already asking publishers to label content to distinguish between human creation, AI assistance and AI generation. This highlights that, as well as communicating with readers, publishers also need to have a conversation with their authors about how they are using AI in their process.’

But this raises another question: how much work should be put into a piece of content by a human for it to be considered content created using AI rather than content created by AI? Stuart suggests the magic number is ‘95%’ input from humans for content to be considered created using AI.

‘Any AI-generated art needs to be a small aspect of the design to be considered AI-assisted. Photoshop now has a fantastic tool called Generative Fill, which is incredibly helpful for small things and has made my artworking process considerably faster. Anything more than that would make the cover the product of the AI.’

So what do we think?

We think we need to see more examples of publishing creatives such as Stuart successfully incorporating AI into their work in order to feel truly comfortable transitioning into this machine-powered wonderland. And there is no doubt that it’ll require a shift in our attitudes. Speaking purely for ourselves, we know we will need to be more open-minded. AI-generated content will increasingly push into our market and demand to be accommodated. We will continue to value professional human input – it seems clear that the best outcomes are achieved through this partnership – but we should expect ‘human input’ to start looking different, and for it to be a challenging process, involving upskilling and a redefinition of roles. And we hope industry regulators can catch up, to fairly protect and compensate everyone involved.

We thought we should seek one last opinion, so we asked AI. This is what ChatGPT had to say.

What impact will AI have on human professionals working within the book publishing industry?

AI is likely to have several significant impacts on professionals working within the book publishing industry. Here are some potential effects:

  1. Automated Content Creation:
    • AI algorithms can assist in content creation, generating drafts, or even entire articles. While AI is not likely to replace human authors, it can be a valuable tool for generating ideas, providing suggestions, or automating repetitive tasks in the writing process.
  2. Content Editing and Proofreading:
    • AI-powered tools can enhance the editing and proofreading process by identifying grammatical errors, improving sentence structure, and suggesting changes for better clarity. This can streamline the editing phase and improve the overall quality of content.
  3. Data Analysis for Market Trends:
    • AI can analyse vast amounts of data to identify market trends, reader preferences, and emerging topics. This information can help publishers make informed decisions about which genres or themes to focus on, optimising their content strategy.
  4. Personalised Recommendations:
    • AI algorithms can analyse reader behaviour and preferences to provide personalised book recommendations. This can enhance user engagement, increase sales, and improve the overall user experience on online platforms.
  5. Supply Chain Optimisation:
    • AI can be used to optimise the supply chain and distribution processes. Predictive analytics can help publishers forecast demand more accurately, reducing overstock or stockouts and improving inventory management efficiency.
  6. Enhanced Marketing Strategies:
    • AI can play a crucial role in marketing by analysing customer data to target specific audiences more effectively. Automated advertising, content optimisation, and personalised marketing campaigns can boost book sales and increase visibility.
  7. Audio and Visual Content Creation:
    • AI technologies like natural language processing (NLP) and computer vision can contribute to the creation of audio and visual content. This includes the development of audiobooks, book trailers, and other multimedia formats.
  8. Efficient Copyright and Plagiarism Detection:
    • AI tools can help in efficiently detecting copyright infringement and plagiarism. This is crucial for publishers to protect their intellectual property and ensure the authenticity of the content they publish.

While AI brings numerous benefits, it’s important to note that it won’t replace human creativity, judgement, and the need for a personal touch in certain aspects of the publishing industry. Professionals in the field may need to adapt their skills, embrace new technologies, and find ways to collaborate effectively with AI to stay competitive in the evolving landscape.


With special thanks to Charlie Ward-Daymond for his contributions to research and writing and Romilly King for our custom illustration. 

Hannah Bickerton
Hannah has worked in marketing for nine years, specialising in strategy development for start-ups and EdTech companies. Having recently jumped across industries to join the Whitefox team, Hannah isn’t a complete stranger to the publishing world with previous employment at Macmillan and TES Global. She is now dedicated to ensuring that anyone who has something interesting to say knows all about whitefox.