
Natália Spencer

Originally published at bragdoc.ai

Output-Based Performance Reviews: What Engineers Need to Know

Meta, Amazon, and X switched to output-based performance reviews. Learn what this means for engineers and how to document your impact for better reviews.

Update: Amazon just laid off 16,000 employees this week—the second wave of a 30,000-person cut that started in October. Their reason? Too many 'layers of bureaucracy' and not enough 'ownership.' Meanwhile, the employees who remain are being evaluated on their Forte system, which requires 3-5 specific accomplishments with measurable impact. The message is clear: output matters, effort doesn't.


Your performance review is changing. Meta, Amazon, and X just made it official: they're rating engineers on output, not effort.

Earlier this month, [Meta announced](https://finance.yahoo.com/news/meta-changing-performance-review-reward-171427657.html) a complete overhaul of its performance review system. The new platform, called Checkpoint, does away with the old model and introduces something more direct: employees will be rated on what they delivered, not how hard they worked.

Amazon made a similar move earlier this month. Their internal review system, Forte, now requires employees to submit 3-5 specific accomplishments—not reflections on their work style, but concrete examples of what they delivered.

X has been doing this since 2022, when Elon Musk required employees to explain their weekly accomplishments.

This isn't a coincidence. It's a shift in how tech companies evaluate engineers. And if you're not adapting, you're falling behind.

What "Output Over Effort" Actually Means

Let's be specific about what changed.

The old model rewarded presence and activity. You showed up, you worked hard, you participated in meetings, you were a good teammate. At review time, you wrote about your "contributions" and "involvement" in projects.

The new model rewards results. What did you ship? What impact did it have? Can you quantify it?

Here's how Amazon's internal guidance puts it:

"Accomplishments are specific projects, goals, initiatives, or process improvements that show the impact of your work."

They give an example of what not to write: "I contributed to team projects."

And what to write instead: "Led a cross-functional team to reduce server downtime by 15%, resulting in $2 million in savings."

The difference isn't subtle. One describes activity. The other describes output.

Why This Is Happening Now

Three forces are converging:

1. The "Year of Efficiency" became permanent. Meta called 2023 its "year of efficiency." Three years later, that mindset hasn't relaxed—it's intensified. Companies are running leaner and expecting more from fewer people. That means every person needs to justify their seat.

2. AI is changing how productivity is measured. According to internal communications reported by Fortune, Meta's head of people, Janelle Gale, told staff that "AI-driven impact" will become a core expectation starting in 2026. Employees will be assessed on how effectively they use AI to achieve results. The bar for what counts as productive output is rising.

3. Competition for roles is fierce. With 500,000 tech workers laid off since ChatGPT launched, there's no shortage of qualified engineers. Companies can afford to be selective—and they're selecting for people who can demonstrate impact, not just effort.

What This Means for You

If you're an engineer at a company that hasn't announced these changes yet, don't assume you're exempt. The pattern is clear: what starts at Meta and Amazon spreads across the industry.

Here's the practical reality:

Your manager can't advocate for you with vague descriptions. When your manager goes into a calibration meeting to argue for your promotion or raise, they need specific examples. "She works really hard" doesn't compete with "She shipped the new authentication system, reducing login failures by 60% and saving 200 support tickets per month."

Recency bias will hurt you even more. If the only outputs your manager remembers are from the last few weeks, the other eleven months of your year simply don't count. Everything you shipped in Q1 needs to be documented, or it effectively didn't happen.

"I was busy" is no longer an excuse. Busy with what? What was the result? If you can't answer that clearly, you're in trouble.

How the New Review Systems Work

Let's look at what you're up against:

Meta's Checkpoint System

  • Four ratings: Outstanding (top 20%), Excellent (70%), Needs Improvement (7%), Not Meeting Expectations (3%)
  • Bonus multipliers: Outstanding = up to 300% bonus. Excellent = 115%. Needs Improvement = up to 50%. Not Meeting = no bonus.
  • Frequency: Two review cycles per year, bonuses paid twice annually

The gap between "Excellent" (115% bonus) and "Outstanding" (up to 300% bonus) is massive. The difference? Documented, quantifiable output.
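
To see how wide that gap is in dollars, here's the arithmetic on a hypothetical target bonus. The $40,000 base below is an illustrative assumption, not a figure Meta has published; only the multipliers come from the reporting on Checkpoint.

```python
# Illustrative arithmetic only: the $40,000 target bonus is a made-up base,
# since individual bonus targets aren't public. The multipliers are the
# reported Checkpoint ones.
target_bonus = 40_000

excellent = target_bonus * 1.15        # "Excellent" pays 115% of target
outstanding_max = target_bonus * 3.00  # "Outstanding" pays up to 300% of target

print(f"Excellent:   ${excellent:,.0f}")              # $46,000
print(f"Outstanding: up to ${outstanding_max:,.0f}")  # $120,000
print(f"Gap:         up to ${outstanding_max - excellent:,.0f} on the same base")
```

Same engineer, same salary, same year: the rating alone can move the bonus by tens of thousands of dollars.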

Amazon's Forte System

  • Required: 3-5 specific accomplishments with measurable impact
  • Linked to compensation: Your "Overall Value" rating determines pay
  • Explicit criteria: Must connect accomplishments to Amazon's Leadership Principles

Amazon's guidance is explicit: "Consider situations where you took risks or innovated, even if it didn't lead to the results you hoped for." They want specifics, even about failures.

The Engineers Who Will Thrive

The engineers who succeed in this environment share one trait: they document their output as it happens.

Not at the end of the quarter. Not the week before reviews. As it happens.

Here's why that matters:

Details fade. You shipped something important in March. By November, you remember that you did it, but not the specifics. Not the metrics. Not the context that made it matter.

Context gets lost. The feature you built was important because it unblocked another team, saved the company from a compliance issue, or prevented a customer from churning. Six months later, that context is gone unless you wrote it down.

Impact compounds. A single achievement might not seem impressive. But when you can show a pattern—five performance improvements, three mentorship wins, two architectural decisions that paid off—the story becomes compelling.

What to Do Starting This Week

You don't need to wait for your company to announce a new review system. Start documenting your output now.

1. Shift your language from effort to output.

| Effort-based (old) | Output-based (new) |
| --- | --- |
| Worked on the payment system | Shipped payment retry logic, reducing failed transactions by 12% |
| Helped with the migration | Led database migration affecting 2M records with zero downtime |
| Participated in code reviews | Reviewed 47 PRs, catching 3 critical security issues before production |

[Image: BragDoc achievements list showing output-based achievement statements with impact metrics]

2. Capture metrics when they're fresh.

The week you ship something, you know the numbers. You know how much faster it is, how many errors it prevents, how much time it saves. Write that down immediately. In three months, you won't remember.

3. Document the "why," not just the "what."

"Fixed a bug" tells your manager nothing. "Fixed a race condition in the payment queue that was causing 2% of transactions to fail silently, protecting approximately $50K/month in revenue" tells a story.

4. Review monthly, not quarterly.

Set a calendar reminder. Once a month, look at what you shipped and make sure it's documented with impact. This takes 15 minutes and saves hours of scrambling before reviews.

The Bottom Line

The shift to output-based reviews isn't a trend—it's the new normal. Meta, Amazon, and X aren't experimenting. They're standardizing an approach that rewards engineers who can demonstrate what they delivered, not just that they showed up.

This is actually good news for engineers who do great work but struggle to talk about it. The new systems don't reward self-promotion or politics. They reward documentation. Specifics. Impact.

If you've been doing good work quietly, now's the time to start writing it down. Your next review depends on it.


Start documenting this week. The simplest approach: create a running document and add one win every Friday. Note what you shipped, what impact it had, and any metrics you can attach.
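
If you want to make that Friday habit nearly frictionless, a few lines of script will do it. Here's a minimal sketch, assuming a plain Markdown file at ~/brag.md; the file name and the entry format are arbitrary choices, not a prescribed tool.

```python
# brag.py: a minimal sketch of the "one win every Friday" habit.
# Assumptions (not from the article): the log lives at ~/brag.md and each
# entry is a dated Markdown bullet with what shipped, the impact, and a metric.
import sys
from datetime import date
from pathlib import Path

BRAG_FILE = Path.home() / "brag.md"

def log_win(what: str, impact: str, metric: str = "") -> None:
    """Append one dated, output-focused entry to the running brag doc."""
    entry = f"- {date.today().isoformat()}: {what}. Impact: {impact}."
    if metric:
        entry += f" Metric: {metric}."
    with BRAG_FILE.open("a", encoding="utf-8") as f:
        f.write(entry + "\n")

if __name__ == "__main__":
    # Example:
    #   python brag.py "Shipped payment retry logic" \
    #       "fewer failed checkouts" "12% drop in failed transactions"
    log_win(*sys.argv[1:4])
```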

If you want something that captures your output automatically, that's what we built BragDoc to do. It connects to your GitHub, extracts achievements from your commits and PRs, and helps you translate them into the kind of impact statements that perform well in the new review systems.

[Image: BragDoc dashboard showing weekly achievements automatically extracted from Git commits]
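
And if you'd rather roll your own first, the underlying idea is simple: your merged pull requests are already a timestamped record of output. The sketch below is not how BragDoc works internally; it just shows one way to pull a week of merged PRs from GitHub's public search API as raw material for impact statements. The username, the seven-day window, and the GITHUB_TOKEN environment variable are placeholders to adapt.

```python
# Sketch: list your merged PRs from the past week via GitHub's search API,
# as raw material for a brag doc. Placeholders: GITHUB_TOKEN env var,
# the username, and the date window.
import os
from datetime import date, timedelta

import requests

USERNAME = "your-github-username"  # placeholder
since = (date.today() - timedelta(days=7)).isoformat()

resp = requests.get(
    "https://api.github.com/search/issues",
    params={"q": f"is:pr is:merged author:{USERNAME} merged:>={since}", "per_page": 50},
    headers={
        "Accept": "application/vnd.github+json",
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
    },
    timeout=10,
)
resp.raise_for_status()

for pr in resp.json()["items"]:
    # Each title is only a starting point; add the impact and metrics yourself.
    print(f"- {pr['closed_at'][:10]}  {pr['title']}  ({pr['html_url']})")
```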

Your effort matters. But only your documented output will be rewarded.
