The Deepdive

Accelerating Failure: Why AI Coding Tools Miss The Real Problem

Allan & Ida, Season 3, Episode 43


Ever felt like you’re flying through tasks but not getting anywhere that matters? We dig into the seductive speed of AI coding tools and expose the real bottleneck: shared understanding. The code may compile in seconds, but when requirements are fuzzy, that speed just turns misalignment into expensive, high-fidelity mistakes. We explore how “typing is not the bottleneck” went from a cult sticker to a hard truth shaping engineering strategy.

We walk through research showing why developers feel supercharged while actual time saved is small—and what that gap reveals about flow, satisfaction, and the hidden cost of rework. Then we unpack resonance drift, the quiet distance that grows between what product managers imagine, what engineers build, and what users need. With AI as the ultimate yes-man, ambiguity slides straight into production-quality code, creating technical debt on day one.

Here’s the real shift: domain expertise is now the moat. A compliance-savvy operator armed with AI can outpace a 10x coder because they can validate value, not just syntax. That’s where the “business architect” steps in, owning the blueprint while the AI lays the bricks. We share two concrete practices that change outcomes fast: Amazon’s working backwards press release, which forces clear promises before a line of code, and value stream mapping, which treats code as inventory and optimizes lead time from idea to live feature. Finally, we tackle the apprenticeship gap: if AI swallows the grunt work, how do juniors learn? We offer ways to build deliberate pathways for deep understanding so tomorrow’s architects actually emerge.

If you care about building the right thing, not just building fast, this conversation is your roadmap. Subscribe, share with a teammate, and leave a review telling us the single practice you’ll adopt this week to improve alignment.


Allan:

Okay, so picture this. It's two thousand nine.

Ida:

Right.

Allan:

The iPhone is, what, basically a toddler. Obama's in his first year. And this developer, uh, Sebastian Hermita, he designs this sticker that just becomes a kind of cult object in Silicon Valley.

Ida:

Oh, I know the one you're talking about.

Allan:

Yeah. It's this vivid, kind of chaotic image of a row of monkeys just hammering away on keyboards. And the caption is just five words: typing is not the bottleneck.

Ida:

And it is, I would argue, the most prophetic sticker in the history of software engineering.

Allan:

Because you fast forward to right now, and the entire industry is, I mean, it's hyperventilating over AI coding tools. You've got GitHub Copilot, Cursor, Devin. They all promise to do the typing for you. The whole pitch is basically, hey, we removed the bottleneck, now you can build the next Facebook in an afternoon.

Ida:

That is exactly the pitch. And look, if typing were the actual bottleneck, we'd be entering a golden age. But the research stack we're looking at today, and this has data from BNY Mellon, DORA metrics, some pretty scathing essays, it suggests that whole pitch is just fundamentally flawed.

Allan:

Because we're solving a problem we didn't... that we didn't actually have.

Ida:

Exactly. The core thesis across this entire stack, and it's an uncomfortable one, is that by using AI to accelerate the typing, we might actually be accelerating failure.

Allan:

Accelerating failure?

Ida:

We're running faster, but we're completely lost.

Allan:

Okay. Accelerating failure, that sounds expensive.

Ida:

And ideally, we'll unpack just how expensive it really is. We're gonna look at why a developer who, say, knows banking regulations is suddenly worth 10 times more than a master coder, and why the secret to speed might be writing a fake press release before you even think about writing a line of code.

Allan:

Okay, let's unpack this. Because I think most people, and I include myself in this, we assume that if I have a robot that writes code for me, I must be more productive. Sure. That just feels logically true. If I don't have to type public static void main for the millionth time, I'm winning, right?

Ida:

You would absolutely think so. But let's look at this Beyond the Commit study from BNY Mellon. They surveyed nearly 3,000 developers about using AI coding assistants.

Allan:

Okay.

Ida:

Now, the subjective feedback? It was glowing. Eighty-six percent of developers said they were satisfied or very satisfied. They felt like superheroes.

Allan:

So they love it. But I'm guessing, I'm guessing the objective data told a slightly different story.

Ida:

Completely different. When they actually measured the time saved on tasks, it was negligible. We're talking less than an hour a week, often.

Allan:

Wait. How can you be very satisfied with a tool that saves you almost no time? That's like, that's like loving a dishwasher that doesn't actually wash the dishes.

Ida:

Because it removes the pain, not the time. The developers said they were staying in that flow state longer. They didn't have to tab out to Google some regex string or, you know, look up library documentation. The AI handled all that grunt work. And that feels fast.

Allan:

It's like hitting all green lights on a road trip. You feel like you're making incredible time, even if you're actually driving in a circle.

Ida:

That is the perfect analogy. And this brings us right to that idea of accelerating failure, which comes from that coding is not the bottleneck article. The AI is a Ferrari engine. But if your map, your requirements, is wrong, that engine just drives you off a cliff faster than any human ever could.

Allan:

Because the AI doesn't push back.

Ida:

Never. It's the ultimate yes man. If you give it ambiguous instructions, and I mean human language is always ambiguous. It's not going to ask clarifying questions. It'll just hallucinate a solution that looks plausible. It translates your confusion directly into executable code.

Allan:

So you end up with technical debt on day one.

Ida:

Instantly.

Allan:

You have code that works, but it's doing the wrong thing.

Ida:

You end up with a very robust, very well-built bridge to absolutely nowhere. And this gets us to the translation gap. This is a really crucial concept from the reading. Phase one is understanding.

Allan:

Right. The human part.

Ida:

That's the coffee, the whiteboards, the arguments with the product manager. Phase two is translation. That's just turning that agreement into syntax.

Allan:

And we've automated phase two.

Ida:

We have made phase two almost instant, but we haven't touched phase one. In fact, you could argue we've made phase one harder because we're just skipping it. We're jumping straight to code because it feels cheap now.

Allan:

There's a term in the notes I really want to drill into: resonance drift. It sounds like something out of a sci-fi movie, but it's about human alignment.

Ida:

It is, yeah. It comes from VJ Anant's work. And you can think of it like the telephone game, but for, you know, million-dollar products. Resonance is when the product manager, the developer, and the user all have the exact same mental picture of what the software is supposed to do.

Allan:

Which almost never happens.

Ida:

Extremely rare. Usually the PM wants Amazon, the dev builds Shopify, and the user just wanted a buy now button. That gap, that is resonance drift. And normally, the slow, painful process of coding forces you to spot those disconnects. You hit a logic problem, you ask a question, you realign.

Allan:

But if the AI writes the code in five seconds, you skip the ask a question part.

Ida:

You skip it entirely. You build the wrong thing in high fidelity. You don't realize you've drifted apart until you ship it. And you know, fixing a misunderstanding after the code is written is exponentially more expensive than fixing it on a whiteboard.

Allan:

It's the difference between erasing a sketch and tearing down a house.

Ida:

Precisely.

Allan:

So if syntax is cheap and understanding is now the premium, that fundamentally shifts who's valuable on the job market. There's this scenario in the sources that I found just fascinating, this cage match between two types of workers.

Ida:

Yes. This is the revenge of the domain expert. I love this comparison because it really, really upsets the traditional tech hierarchy.

Allan:

Okay, let's set the stage. In the red corner, we have the 10x engineer. You know the type, master of JavaScript, dreams in algorithms, knows every framework.

Ida:

Right. The person who usually commands the highest salary.

Allan:

And in the blue corner, we have basically a bank teller. Well, let's say a back office ops person who knows a little bit of Python.

Ida:

Right. Maybe they copy-paste a lot.

Allan:

But they spent five years of their life just staring at banking regulations.

Ida:

Five years ago, you bet on the engineer every single time. No question. Today, the data suggests you bet everything on the bank teller.

Allan:

The numbers were just shocking to me. Can you walk us through them?

Ida:

So the source suggests that a domain expert, when you equip them with AI, has about a 95% success rate in building a usable tool. The pure coder with the same AI, somewhere between 25 and 40%.

Allan:

Why is the gap that wide? I mean, shouldn't the coder be better at prompting the AI? They know the technical lingo.

Ida:

They can prompt for code better, but they can't verify the value. The bank teller looks at the AI's output and immediately says, wait a minute, that violates the Anti-Money Laundering Act, or that workflow doesn't match how a loan officer actually thinks.

Allan:

Ah, they catch the failure immediately.

Ida:

Instantly. Whereas the coder looks at it and says, clean syntax, runs fast, ship it.

Allan:

And then the bank gets a massive fine.

Ida:

Exactly. The source notes that these domain experts move from requirements to a solution about 20 times faster. Not because they type faster, but because they don't have to rewrite the whole thing three times after misunderstanding the goal.

Allan:

Twenty. Twenty times faster. That is just insane.

Ida:

It creates a totally new role. The text calls it the business architect. You're not a bricklayer anymore. The AI lays the bricks. You're the architect making sure the walls are in the right place.

Allan:

There was that amazing specific example of the HIPAA-savvy developer.

Ida:

Right. This person saved their company an estimated $2 million in potential fines. Not because their code was elegant. It might have been ugly code, who knows? But because they understood healthcare privacy laws. In the AI era, domain knowledge is the moat. Syntax is a commodity.

Allan:

This feels like a massive pivot for the entire industry. I mean, we've spent two decades telling everyone learn to code.

Ida:

Every boot camp, every college course.

Allan:

And now the advice is basically learn the business. If you understand supply chain logistics deeply, or insurance law, or biological research, AI makes you a superhero. If you only know how to write a sorting algorithm in C, AI makes you redundant.

Ida:

Ouch. Yeah. It's harsh, but you can see the logic, can't you? I can.

Allan:

Okay, so if understanding is the real bottleneck, how do we attack it? The sources talk about slowing down to speed up, which, you know, sounds like a nice bumper sticker, but how do you actually do that?

Ida:

Well, two main strategies really stood out in the text. The first one is the Amazon method, working backwards.

Allan:

This is that famous press release technique, right?

Ida:

Exactly. And it sounds kind of bureaucratic, but in the age of AI, it's a really rigorous filter. The rule is simple. You write a one-page press release describing the finished product before you write a single line of code.

Allan:

And it has to be written for the customer, right? Not we used React 18.

Ida:

No, no. It's, this tool saves you time. New feature X saves you 10 hours a week by automating Y. And if you can't write that page clearly, or worse, you write it and it sounds really boring, you don't write the code. You just stop.

Allan:

That's tough. It's forcing you to face the reality that your amazing idea might actually suck before you've invested months building it.

Ida:

Right, but in a world where generating code is basically free, the cost of a project is no longer development, it's adoption. This forces you to validate that adoption risk first. It creates a shared artifact.

Allan:

So it's a firewall against that resonance drift.

Ida:

Exactly.

Allan:

If we all agree on the press release, we're all building the same thing.

Ida:

Correct. If the press release is vague, the code will be wrong. Simple as that. The second strategy comes from DORA research and David J. Anderson: value stream mapping.

Allan:

This sounds very industrial, like optimizing a Toyota factory floor.

Ida:

It's pretty much that concept applied to software. So instead of measuring developer velocity, which is really just how fast you type, you measure lead time.

Allan:

Lead time.

Ida:

How long does it take for a customer's idea to become a real deployed feature they can use?

Allan:

And usually most of that time isn't spent actually coding.

Ida:

Almost never. It's spent waiting, waiting for approval, waiting for QA, waiting for someone to clarify a requirement. The source describes code as inventory. If an AI writes a bunch of code that just sits in a QA queue for four days, you haven't created value. You've just created a pile of inventory that's depreciating by the minute.
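To make that framing concrete, here's a minimal sketch with entirely invented numbers: a hypothetical value stream for one feature, where the coding stage is a small slice of the total idea-to-production lead time. The stage names and hours are illustrative, not from the episode's sources.

```python
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    hours: float        # elapsed time the work item spends in this stage
    value_adding: bool  # True if someone is actively working on it

# Hypothetical value stream for one feature, idea -> production.
stream = [
    Stage("clarify requirements", 16, True),
    Stage("waiting for prioritization", 72, False),
    Stage("coding", 6, True),
    Stage("waiting in QA queue", 96, False),
    Stage("QA and fixes", 8, True),
    Stage("waiting for release window", 48, False),
]

# Lead time: total elapsed time. Touch time: hours of actual work.
lead_time = sum(s.hours for s in stream)
touch_time = sum(s.hours for s in stream if s.value_adding)

print(f"lead time:  {lead_time:.0f} h")
print(f"touch time: {touch_time:.0f} h ({touch_time / lead_time:.0%})")
# Coding is 6 of 246 hours here; even halving it barely moves lead time,
# while draining the QA queue would cut lead time by days.
```

Under these made-up numbers, active work is roughly 12% of lead time, which is the episode's point: attack the waiting, not the typing.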

Allan:

I love that framing. Code is inventory. We usually think of code as the asset.

Ida:

Code is a liability until it is in production, solving a real problem. Every line you write is something you have to debug, maintain, secure. The goal shouldn't be more code. It should be smoother flow.

Allan:

So the bottleneck is the handoff.

Ida:

It's the handoff and the decision making. If you want to go faster, don't buy a faster keyboard. Buy a faster decision-making process.

Allan:

It's so ironic, isn't it? The entire tech industry is just obsessed with speed, move fast and break things. But what the sources are all saying is the only way to effectively use these new high-speed tools is to be incredibly deliberate and slow at the beginning.

Ida:

Slow is smooth and smooth is fast. That Navy SEALs mantra pops up in the text for a reason. If you nail the understanding phase, the AI will help you execute instantly. But if you rush the understanding, you're just generating legacy code at the speed of light.

Allan:

Before we wrap up, I do want to touch on the final provocative thought from the sources because it's it's a bit darker. It touches on something that's been nagging me this whole conversation, the junior developer.

Ida:

Ah, yes, the apprenticeship gap. It's a genuine structural crisis that ordep.dev highlights.

Allan:

Right. Here's the logic. If AI handles all the grunt work, the bug fixes, the simple functions, the boilerplate, that's great for productivity today. But that grunt work is traditionally how juniors learn the ropes.

Ida:

Exactly. You develop intuition by breaking things. You learn why a database query is slow by writing a bad one and watching the server catch fire.

Allan:

You learn not to touch the hot stove by touching it.

Ida:

And now the AI puts a guardrail around the stove, or it just cooks the entire meal for you. If the AI prevents you from ever writing a bad query, do you ever actually understand how the database works at a deep level?
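That "watch why it's slow" lesson can be shown in miniature. This is an illustrative sketch, not from the episode: the table and column names are made up, and it uses SQLite's EXPLAIN QUERY PLAN to show the same lookup doing a full table scan without an index versus an index search with one.

```python
import sqlite3

# Build a throwaway in-memory table with 10,000 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (id INTEGER PRIMARY KEY, officer TEXT)")
conn.executemany(
    "INSERT INTO loans (officer) VALUES (?)",
    [(f"officer_{i % 100}",) for i in range(10_000)],
)

query = "SELECT COUNT(*) FROM loans WHERE officer = 'officer_7'"

# Without an index, the plan is a full scan: every row gets examined.
plan_before = conn.execute(f"EXPLAIN QUERY PLAN {query}").fetchall()[0][-1]
print(plan_before)

conn.execute("CREATE INDEX idx_officer ON loans (officer)")

# With an index, the plan becomes a B-tree search on the officer column.
plan_after = conn.execute(f"EXPLAIN QUERY PLAN {query}").fetchall()[0][-1]
print(plan_after)
```

Writing the bad version first and reading the plan is exactly the kind of hot-stove moment the hosts worry an AI guardrail would skip.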

Allan:

It's like using a GPS your whole life. You never actually learn the map of the city. If the GPS breaks, you're helpless.

Ida:

We risk creating this massive seniority gap. In five or ten years, where do the senior architects come from? If we wipe out the junior role because it's suddenly inefficient, we're basically cutting off the bottom rungs of the ladder.

Allan:

That is a very sobering thought. We might need to completely reinvent how we teach this stuff. Maybe computer science degrees need to become systems thinking degrees.

Ida:

I think that's inevitable. Less syntax, more semantics, less how to type, more what to build.

Allan:

So if I'm a listener, maybe I'm a dev, maybe I'm a manager, what do I do on Monday morning with all this?

Ida:

Put down the keyboard, pick up the whiteboard marker.

Allan:

Don't just dive into the code.

Ida:

The value isn't in the code anymore. It's in the clarity of your thought. If you can articulate the problem perfectly, the AI can probably solve it. If you can't, the AI will just help you fail much, much faster.

Allan:

Understanding is the bottleneck.

Ida:

Always has been. The monkeys on the keyboards were right back in 2009, and they are even more right today.

Allan:

I'm gonna go print that sticker out. Thanks for listening to this deep dive. Go write a press release. We'll see you next time.