DEV Community

Natália Spencer

Posted on • Originally published at bragdoc.ai

Meta Built an AI to Write Reviews. You Still Need to Remember.

Meta built an AI to help write performance reviews. But you still need to remember what you accomplished. Learn how to track work achievements automatically.

Late last year, Meta launched an AI-powered performance review assistant for their employees. The tool, which integrates their internal AI assistant Metamate with Google's Gemini, helps staff write better self-reviews and peer feedback.

There's just one problem: you still have to tell it what you accomplished.

The AI can make your writing more polished. It can format your achievements into compelling narratives. It can ensure your review aligns with company values. But it can't remember the critical bug fix you shipped in March. Or the architectural decision you made in July. Or the teammate you mentored in September.

Meta's AI solves the output problem. It doesn't solve the input problem.

What Meta Actually Built

In December 2025, Meta launched their AI Performance Assistant for year-end reviews. According to Business Insider, the tool does exactly what you'd expect from a modern AI writing assistant.

Employees feed their accomplishments into the system. The AI—using both Meta's internal Metamate (trained on company docs) and Google's Gemini—generates polished review text. Employees edit the output. Submit the review.

Based on available reports, the workflow appears to work like this:

  1. Employee manually compiles what they worked on
  2. Employee inputs accomplishments into AI tool
  3. AI drafts review text
  4. Employee refines the AI-generated content
  5. Employee submits review

Notice what's missing? Step 0: the part where you actually remember what you did. The AI only enters at step 3; everything before that is still on you.

How to Remember Accomplishments for Performance Reviews

If you've ever prepared a performance review, you know the hard part isn't writing eloquent prose. The hard part is answering a much simpler question: what did I actually accomplish in the last six months? That is the universal challenge: remembering your accomplishments after months have passed.

You shipped code every day. You closed tickets. You reviewed PRs. You fixed bugs. You participated in design discussions. You mentored teammates. You made architectural decisions. All of this happened, but can you list it six months later?

Most engineers can't. Not because they didn't do the work, but because human memory doesn't work that way.

You remember the big projects. The feature launch that took three months. The incident that kept you up until 2am. But what about everything else? The steady stream of smaller contributions that actually make up most of your work?

That refactor in April? Gone. The performance optimization in May? Fuzzy. The bug fix that saved a customer from churning? You know you did something like that, but when was it exactly?

This is the documentation problem that Meta's AI doesn't solve. You can have the most sophisticated AI writing assistant in the world, but if you can't remember what you accomplished, it never makes it into your review. The challenge isn't writing—it's capturing and tracking work achievements before memory fades.

Meta's Approach Still Requires Manual Input

Here's what Meta's AI Performance Assistant actually does:

Metamate searches internal docs. If you wrote design docs, sent emails, or posted in internal forums, Metamate can find and summarize those. That's useful for context.

It generates draft text. Feed it bullet points about your work, and it'll turn them into polished review language. It knows company terminology. It can map your work to Meta's values.

It combines multiple AI models. Using both Metamate and Gemini gives employees different strengths—internal context from one, broader reasoning from the other.

But based on available information, it doesn't appear to automatically capture code commits, merged PRs, or closed bugs. Employees still need to tell the system what they accomplished.

Even with access to internal documentation, the system still relies on you remembering what's worth documenting. If you didn't write a design doc for that critical bug fix, Metamate won't find it. If you didn't post about that architectural decision in the internal forum, it's not in the system.

Your actual work—the code you shipped, the PRs you reviewed, the commits you made—isn't automatically captured.

Diagram showing what Metamate can access (internal docs, emails, forums) versus what it cannot access (Git commits, PRs, bugs, code reviews)

The Missing Layer: Automatic Achievement Capture

This is where the automation gap becomes obvious. Meta employees still need to do the same thing everyone else does: manually track their accomplishments and compile them from memory before they can use the AI writing tool.

Compare this to a system that starts with your actual work:

  1. Your Git commits are automatically captured as they happen
  2. PRs you created and merged are tracked
  3. Issues you closed are documented
  4. Technical context is preserved (branch names, commit messages, dates)
  5. When review time comes, you have a complete record

This is what automatic achievement tracking provides. You're not relying on memory. You're not scrolling through six months of Git history at 11pm. The documentation already exists because it was captured while you were working.
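To make the idea concrete, here is a minimal sketch (not BragDoc's actual implementation) of what "capture as it happens" can look like. It assumes you feed it the output of `git log --since="6 months ago" --pretty=format:"%ad|%h|%s" --date=short`; the function name and the sample commits are invented for illustration.

```python
from collections import defaultdict

def group_commits_by_month(log_text):
    """Group commit subjects by month from pipe-delimited git log output."""
    by_month = defaultdict(list)
    for line in log_text.strip().splitlines():
        date, sha, subject = line.split("|", 2)
        month = date[:7]  # "2025-03-14" -> "2025-03"
        by_month[month].append(f"{sha} {subject}")
    return dict(by_month)

# Sample of what git log emits with the format string above
sample = """2025-03-14|a1b2c3d|Fix race condition in payment retry queue
2025-03-02|d4e5f6a|Refactor session cache to cut lookup latency
2025-04-21|b7c8d9e|Add index to orders table for reporting query"""

record = group_commits_by_month(sample)
for month, items in sorted(record.items()):
    print(month)
    for item in items:
        print("  -", item)
```

Run something like this on a schedule and the "complete record" already exists by review time, organized by month, with no memory involved.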

Comparison diagram showing Meta's manual approach (remember, input, AI writes) versus automatic capture approach (work captured automatically, context preserved, complete record ready)

Then—and only then—does an AI writing assistant become useful. It can help you synthesize that complete record into compelling review language. But it needs the complete record first.

Why This Matters Now

Meta isn't just helping employees write reviews. They're changing how they evaluate them. Starting in 2026, "AI-driven impact" becomes a formal part of performance reviews. Employees are assessed on how effectively they use AI tools to deliver results.

This raises the stakes for documentation. It's not enough to say "I shipped features." You need to show how AI tools amplified your work. What did AI accelerate? Where did your strategic decisions matter most? What impact resulted from combining your judgment with AI capabilities?

Answering these questions requires complete documentation of what you actually did. Not a partial list assembled from memory. Not the big projects you remember plus a few smaller ones you happened to write down.

The complete record. Every contribution. Every decision. Every outcome.

What Meta Got Right (And What They Missed)

Meta deserves credit for recognizing that AI can improve the performance review process. Writing clear, compelling self-reviews is hard, and an AI assistant helps. The multi-model approach—combining their internal Metamate with external models like Gemini—is smart architecture.

But they solved the wrong bottleneck. The hard part isn't writing polished prose. It's remembering what you accomplished in the first place.

Put differently: Meta built a better typewriter when what employees needed was a better notebook.

How to Track Work Achievements Automatically

The solution to tracking work achievements isn't better AI writing. It's automatic capture.

Your Git history already contains a complete record of your technical work. Every commit. Every PR. Every code review. Timestamped. With context. With diffs showing exactly what changed.

This is objective documentation that requires zero additional effort. You create it automatically every time you push code.

The problem isn't the data—it's extracting meaningful achievements from it. That's where automated achievement tracking changes the equation.

Instead of manually compiling your work six months later, the system processes your Git history continuously. As you work. Commit by commit. PR by PR. When review time comes, you have a comprehensive record documenting every work achievement.
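One concrete signal already sitting in that history: merged pull requests. On GitHub, a default merge commit subject reads "Merge pull request #123 from branch", so even a crude filter recovers a list of PRs you shipped. This is a hedged sketch of the idea, not how any particular product does it; the function name and sample subjects are invented.

```python
import re

# Default GitHub merge commit subjects look like:
#   "Merge pull request #123 from owner/branch"
MERGE_RE = re.compile(r"Merge pull request #(\d+) from \S+")

def merged_prs(subjects):
    """Extract PR numbers from a list of commit subject lines."""
    prs = []
    for subject in subjects:
        m = MERGE_RE.match(subject)
        if m:
            prs.append(int(m.group(1)))
    return prs

subjects = [
    "Merge pull request #481 from team/fix-retry-queue",
    "Bump dependency versions",
    "Merge pull request #492 from team/cache-refactor",
]
print(merged_prs(subjects))  # prints [481, 492]
```

Squash merges and rebase workflows won't match this pattern, which is exactly why dedicated tooling pulls from the GitHub API rather than commit messages alone; but the point stands: the raw data is already there.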

That's when an AI writing assistant becomes truly valuable. It can help you synthesize that complete record into compelling review language. But it needs the input first.

The Shift to AI-Driven Performance Reviews

Meta's move signals where the industry is heading. Other companies will follow. [AI-driven impact](https://www.bragdoc.ai/blog/documenting-impact-ai-coding-era) will become a standard evaluation criterion.

This makes comprehensive documentation even more critical. You'll need to show:

  • What problems you solved
  • How you used AI to amplify your work
  • Where your strategic decisions mattered
  • What impact resulted
  • How you enabled others to do better work

You can't demonstrate any of this if you forgot half your accomplishments. The engineers who thrive in this environment will be the ones with complete achievement records. Not because they're better self-promoters. Because they have better systems.

Start With Better Input

Meta built an AI to help write performance reviews. But the real leverage comes earlier: automatic capture while you work.

Start with complete data, then use AI to polish the writing. Everything else is trying to write a good story about work you forgot you did.

Where to Go From Here

If you're using Meta's AI Performance Assistant—or any similar tool—make sure you're feeding it complete information. The AI can only work with what you give it.

That means documenting work achievements continuously. Git commits, PRs you created, issues you closed—all captured as they happen, so you never have to reconstruct your accomplishments from memory.

That's what we built BragDoc to do. It extracts achievements from your GitHub activity automatically. When review time comes, you have a comprehensive record ready. Use AI writing tools to polish it, but start with complete input.

Meta built AI to help employees write reviews. You can solve the harder problem—remembering what to write—by capturing your Git history. Your actual work is already documented. It's time to use it.

Top comments (1)

Bhavin Sheth

This is so true. I’ve faced this exact problem during yearly reviews — writing wasn’t the hard part, remembering was. Now I keep a simple habit: whenever I ship something important or fix a real issue, I note it in a personal log. Even a one-line note helps later. AI can polish the story, but if you don’t capture the work when it happens, half your impact gets lost.