Jono Herrington

Originally published at jonoherrington.com

The Mandate Had No Return Address

The first time I opened Cursor, I used it for ten minutes and shut it down. It felt foreign. The suggestions arrived faster than I could evaluate them. The workflow I'd built over fifteen years wanted no part of it. I closed the laptop, told myself I'd come back when things were less busy, and didn't think much more about it.

I recognized the feeling later. Fear of change, dressed up as productivity skepticism. I'd seen it in junior engineers resisting new frameworks. I'd seen it in tech leads protecting workflows that had stopped scaling. I hadn't expected to see it in myself.

That moment stayed with me when I started thinking about how to introduce AI to my team at Converse. Because I knew something the people who send mandate emails don't know about themselves: the resistance engineers might feel isn't a character flaw or a performance issue. It's the same thing I felt the first time I sat down with a tool that asked me to change how I think.

A mandate doesn't make room for that.

What 500 Experienced Engineers Said

I spent time in a thread with over 500 experienced engineers describing what happened when their companies introduced AI. Senior engineers. Staff engineers. Tech leads. Engineering managers. People who've been writing code since childhood, who've built systems from scratch, who understand the difference between what a tool does and what a tool is actually worth.

The frustration in that thread wasn't about the tools. It was about how the tools showed up.

The pattern was consistent across almost every comment I read. Leadership sends an email. Something like: All teams will integrate AI tools by end of quarter. Approved vendors attached. Adoption metrics to follow. No conversation about which types of work actually benefit from AI and which don't. No pilot where a team tries it on real work and reports back what they found. No feedback loop where engineers can say "this helps here but hurts here." Just a mandate and a number to hit.

Different companies. Same story.

What happens next looks fine for months. Adoption metrics climb. Velocity holds, or even rises. The quarterly review slide looks clean. And then, quietly, something starts to degrade. Decisions that used to come from judgment start coming from autocomplete. The people who were exceptional at the hard parts start to feel like the hard parts don't matter anymore. Engineers who never fully built a system from scratch can't debug it when it breaks, because they never developed the mental model for how it works.

The failure curve is slow and invisible until it isn't. By the time it shows up in production, leadership has already moved on to the next initiative.


What the Dashboard Can't See

Here's what the mandate email misses. AI is exceptional at certain things. Boilerplate. Test scaffolds. Exploring an unfamiliar API. Generating the skeleton of something you already know how to build but don't want to type out for three hours. That's real leverage ... real time returned to the things that require your full brain.

The picture changes on nuanced system design. Security critical paths. Code that needs to survive five years of edge cases from customers you haven't met yet. When you mandate usage without making those distinctions explicit, engineers stop making them too. The boilerplate and the critical path start to blur. And the engineer who was great at knowing the difference ... the tech lead who would have flagged it in review, the staff engineer who would have pushed back in planning ... stops being asked to make the call.

Mandates backfire because adults do the exact same thing our kids do when you dictate to them: they push back. Sometimes loudly, sometimes quietly. Put someone in a corner, tell them this is how they work now, measure whether they're complying ... and you haven't driven adoption. You've driven compliance. Compliance means engineers use the tool on the tasks that get measured and quietly stop applying full judgment everywhere else.

The organizations in that thread weren't measuring the wrong things. They weren't measuring anything. Compliance dashboards don't capture where AI helps and where it doesn't. They only capture whether people opened the tool.

What I Did at Converse

When I introduced AI to my team, I didn't send an email. I gave the team two weeks of blocked time to explore without delivery pressure. Not meetings. Not deliverables. Just room to try things without a deadline breathing down their necks.

But before I gave my team anything, I had to give myself time with it. It took me about a month to come back to Cursor after that first session. My tech lead had been using it and kept pushing me to give it another shot. He was already deep in it ... building workflows, training agents, figuring out where it actually saved time and where it introduced noise. I eventually came back, and once I did, I stayed. Built my own workflows. Understood, from the inside, what it changed and what it didn't.

That experience shaped everything about how I framed the rollout. The frame wasn't "we need to adopt this." It was "let's go see what this thing can do."

My tech lead started a weekly call where engineers shared what worked, what didn't, and what surprised them. I wasn't even in all of those sessions. I didn't need to be. The point was that curiosity was driving it, not a dashboard. We had real conversations about where AI creates leverage and where it doesn't. I showed them how it had changed my own workflow ... not a demo, not a slide deck, but an actual walkthrough of something I was working on. What I tried. What surprised me. What I still don't trust it with.

I made space for them to bring discoveries back. I let it be visible when I was learning from what they found. We were all in learning mode together, because it was genuinely new to all of us and to the industry at large.

That feedback loop changed what we built next. It told us which workflows were ready for AI acceleration and which ones needed a human in the loop. It surfaced things no adoption metric would have caught, because it was built on a conversation, not a compliance check.

The Posture Is the Strategy

When you're measuring people, they optimize for the metric. When you're curious alongside them, they start optimizing for the thing itself.


The difference between a team running on curiosity and one running on compliance is the leader's posture. The same tool produces completely different outcomes depending on who showed up to lead the rollout. One posture gets you adoption numbers. The other gets you engineers who understand the tool well enough to know when not to use it.

My team uses AI constantly now. They also love their work. Those two things aren't in conflict. But they would be if I'd sent the email and called it a strategy.

A mandate is a statement of intent. Intent with no return address tells engineers exactly what kind of feedback the organization wants.

None.

The Question Worth Sitting With

If something in this is snagging for you, the question to sit with is this ... how did your team adopt AI, and who decided how it would be used?

Not which tools they use. Not whether adoption is up quarter over quarter. Who decided ... and how did that decision arrive at the engineer who actually has to live with it every day?

If the answer is a Slack announcement or an all hands slide, you already know what kind of signal came back.

Ask three engineers on your team, individually, what they wish had been different about how AI tools were introduced. Don't defend the rollout. Just listen. Write down what you hear.

The gap between what your dashboard says and what your engineers know is exactly the size of the conversation you haven't had yet.



One email a week from The Builder's Leader. The frameworks, the blind spots, and the conversations most leaders avoid. Subscribe for free.
