The New Loyalty Test Is 'Prove You're Human' and I Hate Everything About It

By Morgan Paige | Published April 4, 2026

We’re about five minutes away from having to pee in a cup to prove we wrote our own novels.

The Verge ran a piece on the growing pile of “AI-free” certification labels fighting for dominance, and I counted at least eight competing solutions before my eyes glazed over. Eight. We can’t even agree on one label for organic milk and that’s been around for decades. But sure, let’s fragment the creative authenticity space into a bunch of little fiefdoms, each with their own verification process that ranges from “we checked, pinky promise” to “show us your rough drafts and a human auditor will watch you sweat.”

Some of these services literally just hand you a badge to slap on your website. No verification. No process. Just vibes. Others want you to submit sketches, drafts, revision history. One is using blockchain (of course it is) to issue “Made by Human” tokens. The Authors Guild has its own Human Authored certification specifically for books. The UK’s Society of Authors launched a partner scheme at the London Book Fair in March, with 82 percent of its members expressing interest.

So What Counts as “Human-Made” Anymore?

What counts as “human-made” when AI is baked into basically every creative tool you touch?

One researcher quoted in the piece asked whether chatting with an LLM about your idea before writing it manually counts as using AI. That’s a real question. And it doesn’t have a clean answer. I brainstorm with Claude sometimes. I also brainstorm with my cat, and she contributes about as much to the final draft. (Sorry, Mochi.) Does talking through a plot problem with an AI make my book less mine? Does using Grammarly? Does using autocomplete in my email?

Not by AI, one of the bigger certification efforts, draws the line at 90 percent human-created. Which sounds reasonable until you try to figure out what that means in practice. Did the AI contribute 10 percent of the words? Ten percent of the ideas? Ten percent of the formatting? Nobody knows. It’s a vibe check dressed up as a standard.

Trust Me, I’m Human

These certification systems have a fatal flaw. They only work if people tell the truth.

The Authors Guild charges $10 per title for non-members. You sign a license agreement promising you didn’t use AI. No technical verification. No one watches you type. The Guild’s defense is that legal liability creates sufficient deterrence, but Jane Friedman estimates 25-50 percent of nonfiction already contains undisclosed AI-generated text, based on conversations at AWP. A signed promise isn’t stopping anyone.

The detection tools that could theoretically back this up are a disaster. False positive rates range from 2 to 15 percent depending on the tool. For ESL writers? Up to 61 percent. A Washington Post investigation found 12 percent of Pulitzer-nominated articles got flagged as AI-generated by at least one commercial detector. If your “AI detector” thinks Pulitzer nominees are robots, maybe the technology isn’t ready.

Do Readers Actually Care Though?

YouGov says 61 percent of Americans would feel “somewhat or much less fulfilled” discovering a book was AI-written, and 56 percent want to know about any AI contribution.

Sounds damning. Except there is zero market data showing a “Human Authored” badge actually sells more books. Friedman flagged this too. Publishers aren’t rushing to certify titles. If readers cared enough to change their buying habits, you’d see publishers all over this. They’re not.

Readers say they care about AI the same way they say they care about sustainable packaging. It matters in the abstract. It evaporates at the checkout.

The Scarlet Letter

The “human-made” label sounds innocent. Voluntary. Positive. A gold star for doing it the old-fashioned way.

Flip it around though. If “Human Authored” becomes expected, every book without the badge becomes suspect. Uncertified doesn’t mean AI-generated, but that’s what readers will assume. Suddenly every uncertified author looks like they have something to hide.

The push isn’t really toward voluntary human certification. It’s toward mandatory AI disclosure. The EU AI Act requires machine-readable marking of AI-generated content starting August 2026. Amazon already requires disclosure during publishing, though they refuse to show that information to consumers.

Mandatory AI disclosure is a scarlet letter on a tool. Ghostwriters have existed forever, and editors reshape entire manuscripts. Nobody demanded those processes be stamped on the cover. Slap “AI-assisted” on a book and suddenly it’s lesser, regardless of quality. The focus shifts from what was made to how it was made.

That’s backwards.

A bad book is bad no matter who or what helped write it. Readers care about stories. They care about characters that feel real and prose that makes them stay up past midnight. They do not care about your workflow.

Badges Won’t Save You

The answer for authors hasn’t changed. Write excellent fiction. Build a readership that trusts you.

No badge is going to do that for you. No blockchain token is going to make readers care about your books. The certification people mean well. I genuinely think they do. But they’re solving a social problem with a technical solution, and that almost never works.

More hoops. More fees. And the people who actually produce slop will either game the system or ignore it entirely. The only authors burdened by these labels are the ones already doing the work.

I want readers who can tell the difference between a book with a pulse and one without. I want writers using every tool available to them, including AI, without losing the part that makes their work worth reading.

That’s not something you can certify.
