London Book Fair

Creative industry vs. big tech: ‘We generated it, and that is our money!’



Although AI was only one of many topics presented at last year’s London Book Fair, this year it became one of the main subjects of discussion, a clear sign of the publishing industry’s growing interest in AI. Panel discussions centered on AI drew significant attention from visitors, not all of whom were able to enter the halls where the AI events were held. In this review, we share a few observations and insights from the event.



The contrasting regional approaches to AI were notable. Asian publishers tended to emphasize promoting AI usage and encouraging efficiency and content creation, while European and American platforms predominantly tackled regulation issues.




Gareth Rapley, Director of the LBF, highlighted key insights:



“Insights from this year’s event reveal that copyright remains a core focus of industry interest and will continue to be so. Copyright underpins everything the industry is about, and there is always a significant amount of dialogue surrounding it. Now, added to the discussion is the challenge posed by AI in the copyright dialogue. Previously, there was a conversation on fair usage and remuneration for content creation and authorship.”



A central discussion unfolded on the Main Stage, titled “Copyright and AI: A Global Discussion of Machines, Humans, and the Law.” Nicola Solomon of the Society of Authors underscored the need to address the ramifications of AI-generated content, citing instances where AI-authored books infringed upon intellectual property rights. She referred to the Universal Declaration of Human Rights on the protection of one’s creations (Articles 17 and 27) and noted that these rights have been taken from authors without their even knowing.



The absence of specific regulations governing AI-created books was also noted. This gap has led to misinformation and legal disputes, such as the controversy over an AI-generated book about King Charles, which Buckingham Palace condemned, with reports that the royal household was preparing legal action. According to Palace representatives, the book, created by AI and sold on Amazon, included false claims about the monarch’s cancer diagnosis. Following these complaints, the book was removed from sale on Amazon.





The importance of consumer awareness was also highlighted, with concerns raised about the lack of labeling or filtering mechanisms on platforms like Amazon to distinguish AI-generated content. According to the panel, nearly 2,800 books have been created by AI, yet there are still no specific laws restricting AI from creating and publishing books, leaving the field an unregulated storm.



Glenn Rollans from Brush Education Inc. (Canada) expressed concerns about the strategy of tech companies, stating,



“I really think they want to eat our lunch.”



He pointed to the emergence of language models like ChatGPT and raised questions about remuneration and transparency in data management, interpretation, and marketing. Rollans also emphasized the need for clearer rules on fair dealing.



Despite Canada’s position at the forefront of championing responsible AI governance, Rollans reported a loss of 200 million USD in royalties due to the misuse of content by AI. Content creators are eagerly anticipating the Artificial Intelligence and Data Act (2025), which aims to ensure the safe and non-discriminatory use of AI systems in Canada while holding businesses accountable for fair technology usage.



The publishing industry is trying to figure out how to use AI to become more efficient and to expand access to books and content, while still addressing the risks posed by products built without remuneration or transparency.



RELATED: Translation trends at the London Book Fair: Japanese are overtaking the UK market, translations from Ukrainian are on the rise



Dan Conway, representing The Publishers Association, works closely with the UK Government, engaging in intense three-way discussions among writers, tech companies, and the government while awaiting consensus on AI regulations:


“Publishing has been heard on this issue. Even if we’re stuck in this temporary policy paralysis, we do need high-level policies to establish basic principles and foster fair remuneration.”





In an exclusive comment for Chytomo, he said:



“I feel like we’re making a lot of progress. The UK government understands that there is an issue that needs to be addressed, and there’s a high demand for transparency on how authorship is claimed and used. The difficulty we face is that it’s a governmental general election year. A dynamic licensing framework alone can’t solve the entire problem, as tech companies are currently disregarding copyright prerequisites. Despite creative industries being [responsible for] one of the UK’s main exports, regulation will add value without affecting the process.



There are rumors circulating that there might be AI exceptions, allowing publishers to withhold their rights without transparency. As authors and copyright owners, we should especially focus on transparency of measurements, attribution of the author, and fair remuneration. These are common statements around. 



The core issue is unjust enrichment, revolving around three main components: tech talent, computing power, and authorship. While the first two components represent billion-dollar industries, there’s a persistent expectation that the third, authorship, should be provided for free. This discrepancy underscores a fundamental imbalance in the current landscape of AI and content creation.



Copy editing: Lesia Waschuk, Terra Friedman King