DEV Community

Leonid Bugaev

What is the new engineering bottleneck?

Something I keep thinking about:

Maybe AI is not exposing a new problem in engineering. Maybe it is exposing an old one that we were already bad at.

We talk about AI like the bottleneck is still writing code. But honestly, writing code has not been the hard part for a long time. The hard part is all the surrounding context.

Why are we making this change? Who asked for it? Which customer depends on the current behavior? Was this weird edge case intentional? Is this a product decision or just an implementation accident? Did security already review something similar before? Is the documentation describing the current behavior or the behavior we wish we had?

And the uncomfortable part is that most of this context is not in one place.

It is scattered across GitHub, Slack, docs, tests, and the head of the engineer who left six months ago.

This was already a problem. AI just makes it harder to ignore. Because now we can generate more output from less context: more code, more tests, more docs, more confident explanations.

But if the context is incomplete, then all of that output is built on sand. This is the part I find interesting.

Not “will AI replace engineers?” I don’t think that is the most useful question.

The more interesting question is: What happens when engineering teams can generate implementation faster than they can preserve intent?

Because that is where things get messy.

You can have a clean PR. You can have passing tests. You can have updated docs. You can even have a very convincing AI-generated explanation.

And still nobody can answer the basic question: “Is this actually the right change?”

That question is much more expensive than people admit. I have felt this many times in open source and infrastructure work.

You look at a small change and think, “This should be simple.” Then you start pulling the thread. There is a backwards compatibility issue. There is some behavior that looks wrong but someone depends on it. There is a test that protects the implementation but not the real promise. There is a doc page that says one thing and production behavior says another. There is a customer workaround that became part of the product without anyone naming it. Suddenly the small change is not small.

And this is why I think “AI will make everyone ship faster” is only half true. AI can make the creation part faster. But creation is not the same as shipping. Shipping means the organization understands the change well enough to stand behind it.

That is a different problem. I don’t have the perfect answer yet.

But I think “AI coding” is the wrong frame. The real problem is not coding. The real problem is engineering memory.

And most teams’ engineering memory is held together with Slack search, old PR comments, and someone saying: “I think I remember why we did that.”

That does not scale.
