DEV Community

Natália Spencer
What Engineering Managers Look For in Performance Reviews

Performance reviews aren't some mysterious process happening in a conference room. From a manager's perspective, they're actually pretty straightforward. But there's a gap between what we're actually evaluating and what developers think we're evaluating. This gap creates friction—developers undersell their work, managers feel like they're missing context, and the whole process feels less useful than it should be.

I've managed teams for years and conducted more performance reviews than I care to remember. Here's what I actually look for, and why the way you document your work matters far more than you might think.

The First Problem: I Can't Remember Everything You Did

Let me be honest: I don't have perfect memory of your last six months. I'm managing multiple people, handling incidents, attending strategy meetings, and shipping my own deliverables. Your day-to-day work—even the important work—isn't always top of mind for me when review season arrives.

This isn't an excuse for poor memory. It's just reality. What this means is that developers who document their work well have a massive advantage. They're not being evaluated on my memory. They're being evaluated on documented evidence.

When you come into your review with clear achievement statements, links to pull requests, and context around your impact, you're making my job easier and ensuring you're evaluated on your actual work, not on what I happened to remember or what I noticed casually.

The developers who struggle most in reviews are usually excellent engineers. But they assume their work speaks for itself. It doesn't. Documentation speaks for itself. Work just disappears into the codebase.

What I'm Actually Looking For

Let me break down what goes into a performance rating from my side of the table.

Evidence of Impact

I'm looking for demonstrated impact. Not effort, not hours worked, not "I was really busy." Impact.

The difference is huge. Effort is internal—how hard you worked. Impact is external—what changed as a result of your work.

Here's what I'm listening for:

Business impact: Did your work improve revenue, retention, user satisfaction, or reduce costs? Example: "Optimized our database queries, reducing infrastructure costs by $15K annually" is impact. "Refactored database queries" is not.

Team impact: Did your work unblock other engineers, improve their productivity, or accelerate their development? Example: "Set up automated testing framework, reducing QA cycle time from 2 weeks to 3 days for the mobile team" is impact. "Wrote testing code" is not.

User impact: Did your work improve the user experience, fix critical bugs, or ship features users actually want? Example: "Reduced API latency by 40%, improving dashboard load times and user satisfaction scores by 8%" is impact. "Performance optimization" is not.

The pattern is: context → action → measurable result. Managers want to see all three.

[Image: Four things managers actually evaluate in performance reviews: 1) Evidence of Impact—business, team, and user outcomes with context, action, and quantified results. 2) Evidence of Growth—new skills, stretch projects, acting on feedback, career trajectory. 3) Quality of Judgment—technical decision-making, thinking about tradeoffs, honesty about challenges. 4) Collaboration—working well with others, helping teammates, sharing knowledge, being receptive to feedback.]

Evidence of Growth

I'm also looking for signals that you're growing as an engineer. What new skills did you pick up? What stretch projects did you take on? What feedback did you act on from previous reviews?

Growth shows ambition. It shows self-awareness. It shows you're thinking about your trajectory. Engineers who grow become better engineers. That matters for your career and for the team.

Without documentation, growth is invisible. If you learned a new programming language, architected a new system, or led a project for the first time, I might not know it unless you tell me.

If you document it as you go—"Led the API redesign project, making architectural decisions for a system that will scale to 10x current usage"—then it becomes a clear narrative of growth.

Quality of Judgment

I'm evaluating your technical judgment. Did you make good decisions? Do you think about tradeoffs? Can you defend your technical choices?

The easiest way to demonstrate judgment is through architecture decisions and technical leadership. But it also shows up in how you've documented your work. Are you being honest about challenges? Do you acknowledge tradeoffs instead of pretending your solution was perfect?

Developers who document thoughtfully—"We chose Option A despite Option B being technically superior, because the engineering cost wasn't justified by the timeline"—demonstrate better judgment than those who just say "I shipped X."

Collaboration and Communication

I'm watching to see if you work well with others. Are you helping teammates? Are you sharing knowledge? Are you receptive to feedback?

Documentation matters here too. "Mentored three junior developers through their first features" is collaboration. "Did my job and sometimes helped people" is not. The specificity proves you were actually thinking about collaboration, not just coasting.

What I'm NOT Looking For

Let me also be clear about what doesn't matter.

Busyness. "I was really busy" doesn't influence my rating. You could have been busy writing code that nobody needed. I care about direction and impact, not activity level.

Code quantity. Lines of code, number of commits, number of PRs. These are vanity metrics. Someone could ship 100 trivial PRs and create less value than someone shipping one complicated PR that solved a critical problem.

Being the smartest person in the room. Intelligence is assumed. What I'm evaluating is what you did with that intelligence. Can you apply it? Can you explain it? Can you help others use it?

Going above and beyond on everything. I don't expect you to be extraordinary at every single task. I expect you to be solid at your job and growing in key areas. "Went above and beyond" across everything raises a red flag—it usually means you're unsustainable or you're not prioritizing effectively.

[Image: Four things managers don't care about in performance reviews: busyness, code quantity, being the smartest person in the room, and going above and beyond on everything.]

How Documentation Influences Promotion and Compensation

I'll be direct: promotion and compensation decisions are heavily influenced by how well you've documented your impact.

This might sound unfair. Shouldn't the quality of your work speak for itself? In a perfect world, yes. In reality, there's selection bias. The developers I see the most clearly are the ones who communicate best. They're the ones whose work is easiest to defend in promotion discussions.

Here's how it actually works:

When promotion season arrives, I'm building a case for you with my leadership. I need to explain why you deserve a higher level. The case is stronger when I can say:

"Sarah owned the API redesign project. She made strategic decisions about tradeoffs, documented the technical RFC, and led three other engineers through implementation. The new API reduced latency by 50% and enabled us to ship features three times faster. She also mentored two junior developers in parallel."

versus:

"Sarah did good work on the API team."

Both might be true. But one is promotable. One is not. The difference is documentation.

[Image: Comparison showing the same work with two different outcomes—one write-up promotable, the other not.]

The same is true for compensation. When negotiating budget for raises and refreshes, I need to justify why specific engineers deserve larger increases. Documentation lets me make those arguments. Without it, even excellent engineers can get underpaid simply because their impact isn't visible.

The Presentation Matters Too

How you present your work in the review itself matters. I'm not just looking at what you've documented. I'm also listening to how you talk about it.

Tell stories, not lists. Don't just say "I shipped feature X, fixed bug Y, and refactored system Z." Walk me through the context. Why did the feature matter? What was broken about the bug? Why did the refactor enable the team to work faster?

Back your claims with evidence. If you say you reduced latency by 40%, have the metrics. If you say you unblocked three engineers, be able to talk about what you unblocked them on. This isn't about being overly formal. It's about being credible.

Be honest about challenges. If something went sideways, say so. The developers who impress me most are those who acknowledge when things didn't go as planned and can articulate what they learned. That's maturity.

Discuss impact on business and team. Not just technical impact. Connect your work to what matters: shipping faster, better user experience, reduced costs, happier team members, whatever the business cares about.

What Happens When You Don't Document Well

I see this all the time. A genuinely strong engineer walks into their review with minimal documentation. They're vague about their accomplishments. They struggle to explain the business impact of their work.

The review becomes uncomfortable. I'm trying to advocate for them, but I don't have ammunition. Their rating ends up being lower than it should be. They feel undervalued. I feel frustrated because I know they're better than their review reflects.

Meanwhile, the developer who documented their work comes in prepared. Even if their actual output was slightly lower, the documentation makes them look more senior. They walk out with a better rating and better compensation.

This isn't a complaint about life being unfair. It's an observation about how to navigate the reality of performance management. Documentation is a leverage point. Using it well is professional, not self-serving.

How to Prepare for Your Review

Based on all of this, here's what I'd recommend:

Start early. Three months before your review, list your major accomplishments. One month out, gather evidence from Git, pull requests, and communications. Two weeks out, write impact statements that tell the story.

You don't need fancy formatting. You need clarity. For each accomplishment: What was the problem? What did you do? What changed as a result? Why did it matter?
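One way to keep the problem → action → result structure consistent is to capture each accomplishment as structured data and render it the same way every time. Here's a minimal sketch; the field names, the `impact_statement` helper, and the example values (including the PR number) are all hypothetical, not from any particular tool:

```python
from dataclasses import dataclass

@dataclass
class Accomplishment:
    problem: str   # what was broken or missing
    action: str    # what you did about it
    result: str    # what measurably changed
    evidence: str  # link or pointer: PR, dashboard, doc

def impact_statement(a: Accomplishment) -> str:
    """Render one accomplishment as a single review-ready sentence."""
    return f"{a.problem} {a.action}, {a.result}. (Evidence: {a.evidence})"

entry = Accomplishment(
    problem="Dashboard queries were timing out under load.",
    action="Rewrote the three heaviest queries and added covering indexes",
    result="cutting p95 latency from 4.2s to 600ms",
    evidence="PR #1234",  # hypothetical example
)
print(impact_statement(entry))
```

The point isn't the code itself—it's that forcing every entry through the same four fields makes it obvious when you're missing the result or the evidence, which are exactly the parts managers need.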

Document growth explicitly. If you learned something new, took on a stretch project, or acted on feedback, write it down. Make the narrative of growth visible.

Quantify when possible. Numbers are credible. "Improved test coverage from 45% to 78%" is stronger than "improved test coverage." "Mentored three junior developers" is stronger than "helped with mentoring."

Prepare examples. Be ready to discuss two or three specific accomplishments in depth. Not a list of twenty vague things. A few clear stories with context and impact.

Practice your narrative. In the actual review, you should be able to talk naturally about your work. If you sound like you're reading from a script, it's less credible. If you sound like you're genuinely reflecting on what you've accomplished, it's more credible.

The Manager's Job Is to Advocate For You

Here's one more thing: your manager's job is to advocate for you. I want you to succeed. I want you to get promoted, fairly compensated, and recognized for your contributions.

But I can only advocate effectively when I have clear evidence of your impact. Documentation gives me that evidence. It makes my job easier. It makes promotion and compensation discussions easier. It makes your career trajectory clearer.

The developers who understand this—who document their work systematically and present it clearly—advance faster. Not because they're more talented. But because they're more effective at communicating their value.

Think of it this way: you've already done the work. You've already shipped the features, fixed the bugs, solved the problems. Documentation isn't adding more work. It's just capturing work you've already done.


Next Steps

Start with your last three months. What were your major accomplishments? List three or four things you're proud of. Then, for each one, gather supporting evidence from Git, pull requests, or team conversations. You now have the foundation for a strong performance review.

If you want to make this systematic—having documentation ready throughout the year rather than scrambling before review season—check out the six types of developer impact you should be documenting.

The clearer you are about your work, the better your manager can advocate for you.
