I keep reading everywhere that AI makes you 10x more productive. And every time I read it, something nags at me. Not because it’s wrong, not really. But because it correctly answers the wrong question.

And the question that matters is not “how much more can I produce?”, but “what is the value of my output?”

There’s a difference between output and value. A profound one. Smart businesses don’t care how much stuff you make. They care how much of that stuff someone will pay for. If you can suddenly produce ten times the code, ten times the copy, ten times the design, that may feel great. But if everyone else can effectively do the same, because you all use the same tools in a similar way, guess what happens to the value of what you produce…

Duh! It goes down. That’s the law of supply and demand, the most basic and most relentless force in economics. And what you’re left with is more output at lower margins, which, if you do the math, often lands you right back where you started. And this is the part that doesn’t make it into all the LinkedIn posts and the YouTube videos I’ve been getting served these past few months.

There’s a name for this kind of trap. It’s the Red Queen effect, after the famous Lewis Carroll character who tells Alice she has to run as fast as she can just to stay in the same place.

That’s exactly what AI adoption could look like for most, if you properly consider the second-order effects. You adopt the tool not to leap ahead but to avoid falling behind.

I notice this in my work. I get excited about a new tool, a new workflow, or some new trick that seemingly cuts my previous manual process in half. And for a moment or two, it feels like a superpower. But then I realize everyone else can use the same trick. That’s no true advantage! The baseline just shifted upward, and I’m standing on the same rung of the ladder.

It is often the case that when a new capability becomes universally accessible, it stops being an advantage and starts being table stakes. The things that used to set someone apart, being fast, being technically skilled, being good at the details, are now things everyone can potentially do. Or at least, enough people can that the market stops rewarding them the way it used to.

There are exceptions, of course. If you’re genuinely early in your field, you can capture a window of arbitrage before everyone else catches up. I suspect that accounts for most of the success stories I hear these days.

And if you’re in a supply-constrained niche, pricing power may hold. Or if you’re building something so new that there’s no existing competition to undercut you, the whole dynamic is different. But this was always the case, not something AI uniquely enables.

For most of us, the fundamental challenge hasn’t changed at all. You still have to find a comparative advantage. You still have to escape the zone of comfort and commonality where you’re interchangeable with everyone else. AI doesn’t solve that. It just changes what the inputs to that equation are.

And here’s the crux of the matter: If the things that used to be scarce are now abundant, like implementation speed, technical chops, and shipping ability, then what’s valuable migrates somewhere else. It moves to whatever’s still hard to replicate. Deep domain expertise. Tacit knowledge. Judgment. Probably taste. Definitely trust.

That last one haunts me a little. Because I’m someone who loves tinkering. I love the craft of wasting time to make something work precisely as I imagine it, to get my hands dirty in the implementation details. And this argument suggests that the making part is becoming the easy part. The hard part, the valuable part, is the thinking that happens before you even begin building. Knowing how to find what’s worth making in the first place.

I don’t have this figured out. I don’t think anyone does yet. But I’m feeling more and more confident that the right question to answer is “Where has scarcity moved now, and how can I get there?”