Nearly 10,000 authors just published a book with no plot, no prose, no chapters. Just their names, 88 pages of them. The title is Don’t Steal This Book, and I kind of love it.
Composer and copyright campaigner Ed Newton-Rex organized the project to protest a proposed UK law that would let AI companies train on copyrighted works unless creators explicitly opt out. The signatories include Kazuo Ishiguro, Richard Osman, Alan Moore, Marian Keyes, Malorie Blackman, Philippa Gregory, and thousands of working authors you’ve never heard of but who pay their mortgages the same way.
If you’re reading this site, you probably use AI in your creative process. You might see “authors protest AI” and feel like this story has nothing to do with you.
It does. But not for the reasons most outlets are reporting.
What the UK Is Proposing
The UK government wants to change copyright law so that AI companies can train on any published work by default. Your novels, your short stories, your nonfiction, your blog posts. All fair game unless you actively opt out. How that opt-out would work hasn’t been spelled out yet, which is a huge part of the problem.
UK ministers have until March 18 to deliver an economic impact assessment on the proposal. That’s not a public comment deadline (there’s nothing you need to file), but it’s the next milestone worth keeping an eye on.
This Is a Consent Story, Not an Anti-AI Story
Most of the coverage frames this as authors versus technology. That framing is wrong.
The authors protesting at London Book Fair aren’t asking for AI to stop existing. They’re asking for the right to say yes or no before their work gets fed into a training dataset. That’s a copyright question.
And it should matter even more if you’re an author who uses ChatGPT to brainstorm or Claude to revise. Those tools are valuable to you precisely because you bring original creative work to the table, and they’re only as good as the creative ecosystem that feeds them. Protecting your right to decide how your work gets used is what keeps the relationship between authors and AI tools healthy: a copyright framework built on consent gives those tools long-term legitimacy, while one built on “we’ll use your work unless you figure out how to stop us” does the opposite.
What This Means If You Sell in the UK
If you publish through Amazon KDP, Kobo, Draft2Digital, or any distributor that reaches the UK market, this affects you. Copyright policy in one major market has a habit of influencing others, and the UK isn’t the only government watching how this plays out.
This is also a separate issue from AI companies training on pirated copies of books. That’s the subject of the Anthropic settlement and other ongoing lawsuits. What the UK is proposing goes further, making it legal to train on legitimately published works without permission.
What to Do Right Now
Nothing, yet. The March 18 deadline is the UK government’s internal checkpoint, not a call for public input. But this is the kind of policy shift that moves fast once it gets momentum. If you sell in the UK or care about how copyright law evolves globally, keep this on your radar.
I want AI to know my books. I also want to be the one who gets to say yes to that. Those two things aren’t in conflict, and that’s exactly what this protest is asking for.
Sources
- 10,000 Authors Protest AI With ‘Empty’ Book — Deadline’s coverage of the London Book Fair protest
- Don’t Steal This Book: Authors Protest AI at London Book Fair — Euronews coverage with details on the UK government’s March 18 deadline
- Writers Protest ‘Theft’ by AI Companies — Computing.co.uk report on the protest and proposed UK copyright changes