Jono Herrington

Posted on • Originally published at jonoherrington.com

The Spark Is Leaving Before the Code Breaks

The spark is leaving before the code breaks.

I had a conversation with an engineer last week who works at a culture analytics platform. The kind of company that powers engagement surveys, pulse checks, and culture diagnostics for organizations worldwide. I've been asking people across the industry the same question: From your seat, what impact is AI having on you and your team right now — the good, the bad, and the ugly?

His response stuck with me.

"There's uncertainty on how to best make the most of it. I've seen engineers lose the spark in the eyes in their craft having to just plan and review the output. In other areas, I've seen great results with people being able to make better decisions through knowing different options that's available to them from the planning phase. The sheer pace of AI is wearing down engineers though."

I told him that "losing the spark" line felt real. I was hearing it everywhere.

He told me it hasn't happened to his own team. Not yet. But he's observing it in others. First the engagement drops. Then that trickles into how they approach problems. Then into how they review. A gradual loss of rigor, both in how they approach problems and in how they review.

The code still compiles. Tests still pass. Shipping continues.

The breakage hasn't happened yet.

I've seen engineers lose the spark in the eyes in their craft having to just plan and review the output.

What Makes This Different

This engineer isn't just watching his own team. He's got a vantage point that spans organizations, industries, geographies. He sees patterns in aggregate that most of us only glimpse in our own narrow slice.

When he says he's watching engineers lose the spark across the industry, he's not speculating. He's seeing the data. He's hearing it in the open-ended responses from thousands of survey participants. He's watching engagement scores drift while productivity metrics stay flat or climb. He knows what disengagement looks like before it becomes turnover. He can spot the pattern because his entire platform is built on measuring exactly this.

And what he's seeing is a decay pattern that starts with engagement and leaks into everything else.

The Progression He Described

First the engagement drops. The spark goes away. Not from overwork or bad management or unreasonable deadlines. From the sheer pace of a tool that generates faster than humans can properly evaluate. From planning and reviewing output instead of building. From supervising something that feels increasingly alien to the craft they fell in love with.

Then that trickles into how they work and how they see the work. The rigor in approaching problems starts to thin. Engineers who used to sit with hard problems until they understood them now reach for the prompt window at the first sign of friction. The muscle for wrestling with ambiguity atrophies because the tool offers immediate relief.

Then the reviewing gets thinner. Less questioning. More approval of code they didn't write and solutions they didn't think through. The standards drop not because anyone decided to lower them, but because the energy to maintain them drains away when you're not actually building anymore.

The code still compiles. Tests still pass. Shipping continues.

First the engagement drops. Then that trickles into how they approach problems. Then into how they review.

What Your Dashboard Won't Show

Your AI adoption metrics are showing you velocity. They're showing you active users and code review throughput and deployment frequency. They're not showing you who's still shipping but stopped caring. They're not capturing the engineers who are in the room but already gone.

The engineer gave me the kicker at the end of our conversation: "We accept what we tolerate."

I sat with that for a minute.

We've built an entire framework for AI adoption that tolerates disengagement as long as output stays high. We celebrate velocity gains without asking whether the humans generating that velocity still find meaning in the work. We track adoption metrics without tracking what adoption is doing to the relationship between engineers and their craft.

When I told him I was hearing the "losing the spark" narrative everywhere, he didn't seem surprised. He's watching it happen across the industry in real time. The sheer pace of AI is wearing people down. The tool moves faster than the culture can adapt. Engineers are trying to keep up with generated output they can't fully evaluate while maintaining the judgment that used to come from building things themselves.

The Harder Question

I don't have a clean answer for whether we need to accept this. Culture analytics platforms exist because companies want to measure engagement and catch drift before it becomes attrition. But measuring drift isn't the same as preventing it. You can have perfect visibility into declining engagement and still not know what to do about it.

The harder question is what we're willing to tolerate. If we only celebrate speed and output, we shouldn't be surprised when engineers optimize for those things at the expense of meaning. If we only track adoption metrics, we shouldn't be shocked when adoption happens in ways that hollow out the craft. If we tolerate thin reviews and superficial engagement because the code compiles, we are accepting a drift we will eventually have to pay for.

The engineer hasn't seen it on his own team yet. But he's watching it happen to others. The ones who are still showing up but already gone. The ones whose code still works but whose spark has already left.

We accept what we tolerate.

The code will break eventually. It always does. The question is whether the people who need to fix it will still have the engagement to care, or whether we'll be left with velocity metrics that kept climbing while the humans who generated them checked out long before the system needed their judgment.

Your dashboard won't capture that. Only the humans can tell you what's actually happening. And only if you create space for them to say it without defending the rollout.


One email a week from The Builder's Leader. The frameworks, the blind spots, and the conversations most leaders avoid. Subscribe for free.
