The Mountain of Garbage Behind Every Spark of AI Creativity

Or: Why Originality Isn’t the Same as Intent

The following is a follow-up to my previous blog post. I really didn’t plan to write about AI again so soon, but new developments seem to happen almost every day, and they shape whether human creativity will or won’t continue to be a force in our world. And as a writer, that question matters deeply to me.

I’ve said before — probably more than once — that AI can’t really create anything original. That it can remix, synthesize, echo, and adapt, but not originate. It can give you the next version of something, but not the first.

But I want to refine that claim, because after watching the 60 Minutes feature on Google DeepMind’s Project Astra, and thinking more about how AI systems generate outputs, I realized something:

AI can create something new.

But it does it the way nature does — not the way people do.

And that difference matters.

AI Originality Works Like Evolution, Not Inspiration

Here’s the model I’m starting to come around to: AI creativity mirrors Darwinian evolution — not artistic genius.

In evolution, organisms mutate randomly. Most mutations are useless. Some are actively harmful. But every once in a while, one shows up that’s useful. And natural selection keeps it around.

AI works the same way:

  • Generate a bunch of random variations.

  • Test them.

  • Keep the ones that “work.”

It’s mutation and selection, not vision and refinement. And like evolution, it’s messy, inefficient, and mostly failure. But sometimes, out of that noise, something surprising survives.
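The mutate-test-select loop described above can be sketched in a few lines of code. This is purely illustrative — a toy "evolve a string by random mutation" search, with a made-up target word and fitness function, not how any real AI system actually works — but it shows the shape of the process: lots of random garbage, a filter, and eventually something that survives.

```python
import random

TARGET = "creativity"
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(candidate: str) -> int:
    # "Selection": count the characters that match the target.
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent: str, rate: float = 0.2) -> str:
    # "Mutation": each character may randomly flip to another letter.
    return "".join(
        random.choice(ALPHABET) if random.random() < rate else c
        for c in parent
    )

# Start from pure noise.
best = "".join(random.choice(ALPHABET) for _ in TARGET)
generations = 0
while best != TARGET:
    generations += 1
    # Generate a bunch of random variations...
    variants = [mutate(best) for _ in range(100)]
    # ...test them, and keep whichever one "works" best.
    best = max(variants + [best], key=fitness)

print(f"Reached {best!r} after {generations} generations")
```

Notice that almost every variant generated along the way is thrown out. The one good result only exists because something outside the mutation process — here, the fitness function standing in for human judgment — decided what counted as "better."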

That’s how AI gets to originality. Not through taste. Not through intent.

Through volume and filtration.

The Garbage Is the Cost

And that’s the thing most people don’t talk about: how much garbage AI generates on its way to a single good idea.

If you’re using AI to create something new — not just autocomplete a sentence, but to truly break form — you’re going to get reams of nonsense. Broken ideas. Useless variations. Dead ends. And maybe, somewhere in there, a gem.

This is one of the main differences between human creativity and machine creativity.

Humans filter as we create.

AI just creates.

We feel when something is right. Eddie Van Halen picks up a guitar, does something strange, and knows — instantly — that it’s worth exploring. He might not be able to explain why. But the spark is there.

An AI doesn’t have that. It can create a sound. It might even generate something like the next Eddie Van Halen riff. But it has no sense of rightness, no internal signal that says, “This matters.”

Which means we still need a listener. A watcher. A reader.

Someone human to say: this one is good.

The Selection Filter Is Still Human

Even if you train a model on millions of human preferences — on what people liked, clicked, bought, or shared — you’re still building on subjective standards. The AI isn’t evaluating in a vacuum. It’s echoing back what we already decided was good.

And when it comes to truly new things — things that fall outside the training data, that don't yet exist in the world — AI has no basis for judgment. It has no values of its own. No desire. No taste. No purpose.

So even if it stumbles onto something great, it won't know it.

We will. Or we won’t. But either way, it’s still up to us.

Creativity by Mutation Is Also Costly

And there’s another angle to this that most casual observers miss: the energy cost.

Running large language models — especially ones that generate a billion possible outputs in search of one spark of originality — is resource intensive. Data centers, GPUs, electricity — all to produce a mountain of garbage and a single useful result.

Maybe that's worth it in some domains. But it’s not the elegant spark of insight we like to think of when we talk about creativity.

It’s brute force. It’s trial and error on steroids.

It’s evolution — sped up, but still messy and indifferent.

But Could AI Develop a Filter?

Here’s where I want to acknowledge something: maybe one day, AI will develop an internal filter.

Maybe it will learn to assign value in a way that isn't just regurgitating our preferences back to us. Maybe it will develop something akin to intent — a sense of what matters and why.

I don’t see how that would happen right now. I’m not even sure what that would look like.

But I’ve also learned not to speak too confidently about what AI can’t do — because its capabilities are improving rapidly, and the path forward isn’t always visible from here.

So I’m not saying it’s impossible. I’m just not ready to say that it is possible, either.

For now, what we call “good” still requires human judgment. And that judgment is shaped by culture, history, experience — by emotions and meaning that aren’t reducible to data points.

At least, I don’t think they are. I don’t feel like they are.

Even if AI learns to imitate that well, it still needs us to tell it when it got it right.

Because the rules for what’s good don’t come out of a vacuum.

So Can AI Be Creative?

Sure. But let’s be clear about what kind of creativity we’re talking about.

AI can create novelty, just like nature can create new species.

But it doesn’t know what it’s doing.

It doesn’t care if it fails.

It doesn’t feel anything when it stumbles onto something good.

That’s not inspiration. That’s just noise with a filter.

Until AI has its own standards for what is good — and maybe it never will — it’s still going to rely on us to tell it what’s worth keeping.

And that means, for now, we’re still the spark.

Javier

© 2025 Chapelle Dorée Publishing, LLC. All rights reserved. This content may not be reproduced, distributed, or modified without permission.

Further Viewing

If you haven’t seen it yet, the 60 Minutes segment that prompted this follow-up is well worth watching. It features Google DeepMind’s CEO demonstrating Project Astra — an early glimpse into what AI might look like when it starts perceiving and interacting with the world in real time.

You can watch it here:
Google’s AI Future: Project Astra on 60 Minutes

It’s impressive, a little eerie, and raises exactly the kind of questions this post is trying to wrestle with. Let me know what you think.
