90% of AI-Generated Content Is Garbage. Here's How to Be the 10%.

The internet is drowning in AI slop, but quality AI content isn't impossible—it just requires what most creators lack.

10 min read

Published 8 February 2026

Scroll through LinkedIn for thirty seconds and you'll see it: the great AI content apocalypse. Thousands of posts that read like they were written by the same brain-dead algorithm, starting with "In today's rapidly evolving landscape" and ending with nothing of substance whatsoever.

"In today's rapidly evolving landscape" has become the new Lorem Ipsum. It's the universal signal that what follows will be beige, boring, and bereft of any actual insight. AI has democratised content creation, yes—but it's also democratised mediocrity.

Here's the uncomfortable truth: 90% of AI-generated content is absolute garbage. Generic, flavourless drivel that says nothing whilst taking far too many words to say it. But here's what the doomsayers miss: the remaining 10% is bloody brilliant. And the difference between slop and substance isn't the AI—it's the human behind it.

The Real Problem Isn't AI

The problem isn't that AI writes poorly. The problem is that people with nothing to say are using AI to say it louder. They're feeding ChatGPT prompts like "Write about digital transformation in ecommerce" and expecting magic. What they get is 500 words of corporate word salad that could apply to any business in any industry at any point in the last decade.

I've been working in ecommerce for 26 years. When I use AI to write about headless commerce architectures, the output is different. Why? Because I'm not asking it to explain headless commerce to me—I'm asking it to articulate my knowledge about headless commerce to you. There's a crucial distinction there.

The AI becomes an amplifier, not a replacement. I feed it my experience, my opinions, my battle scars from implementing Shopify Plus for hundreds of brands. The result isn't generic because the input isn't generic.

Quality Control Is Human Control

Every piece of quality AI content has one thing in common: a human with domain expertise directing the process. They know when the AI is talking rubbish because they know the subject matter. They can spot the hallucinations, catch the logical gaps, and add the nuanced takes that only come from real-world experience.

The democratisation of content creation was supposed to level the playing field. Instead, it's created a new hierarchy: those who can direct AI effectively sit at the top, whilst those who treat it like a magic content machine produce endless variations of the same meaningless fluff.

Editorial voice matters. When I read a piece about ecommerce strategy, I can tell within two paragraphs whether it was written by someone who's actually run an ecommerce business or someone who's just Googled "ecommerce trends 2026". AI doesn't change this fundamental truth—it just makes the gap more obvious.

The Expertise Filter

The best AI content creators aren't using AI to think for them—they're using it to think with them. They bring three things to the table that the slop generators don't:

Proprietary knowledge: Stuff you can't Google. Internal data, first-hand experiences, industry insights that come from being there when the rubber meets the road.

Editorial judgment: Knowing what to include, what to cut, and what angle will actually matter to the audience. AI can generate; it can't curate.

Quality control: The ability to read something and know it's wrong, boring, or obvious. This only comes from understanding the subject deeply enough to have opinions about it.

When you combine these three filters with AI's raw output, you get something worth reading. Remove them, and you get LinkedIn's morning feed.

The Uncomfortable Reality

Here's what nobody wants to admit: most content creators don't have enough expertise to direct AI effectively. They're using it to mask their lack of knowledge, not amplify their abundance of it.

If you don't understand conversion rate optimisation deeply, AI won't help you write a good article about CRO. It'll help you write a longer, more polished version of your ignorance. The problem isn't the tool—it's what you're bringing to the tool.

The future belongs to people who can make AI produce expert-quality output. But here's the catch: to do that consistently, you have to be an expert yourself. AI doesn't eliminate the need for domain knowledge—it amplifies the advantage of having it.

So the next time you see another piece of AI-generated content that makes your eyes glaze over, remember: it's not the AI that's the problem. It's the human who thought they could outsource thinking to a machine and still produce something worth reading.

The 10% creating quality AI content aren't magic. They're just bringing something to the conversation that the other 90% aren't: actual knowledge, real opinions, and the judgment to know the difference between insight and noise.
