Your Team's Code Reviews Are Disappearing — I Built PRReviewIQ to Fix That

Notion MCP Challenge Submission 🧠

Every code review your team does contains hard-won knowledge: a bug pattern caught, a performance trap avoided, a naming convention enforced. And almost all of it evaporates the moment the PR is merged.

PRReviewIQ is my attempt to fix that. It's an AI code review tool that doesn't just comment on diffs — it remembers. Every insight gets logged into a living Notion knowledge base via Notion MCP, so your team's collective code quality wisdom compounds instead of getting buried in closed PRs.


Video Demo


Show Us the Code

🔗 github.com/himanshu748/dev-challenge-4


The Problem Worth Solving

Code review is one of the highest-leverage activities on any engineering team — but it's almost entirely ephemeral. Comments live on GitHub. Patterns go untracked. New teammates repeat the same mistakes. There's no institutional memory.

I wanted Notion to be that memory. Not as a place to manually paste notes, but as a database that an AI agent actively writes to as part of the review process itself.


How It Works

PRReviewIQ ships as two interfaces over the same core:

Web app — a FastAPI server at http://127.0.0.1:8000 where you can paste diffs and trigger reviews through a browser UI.

CLI — run it directly against a local git repo:

python review.py --repo /path/to/repo

# Or target a single file for pre-commit checks
python review.py --file path/to/file.py --pr-title "Pre-commit review" --repo-name my-repo
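Since the CLI can review either a whole repo or a single file, the core has to break raw `git diff` output into per-file chunks before sending anything to the model. A minimal sketch of that parsing step (a hypothetical helper, not the repo's actual code):

```python
def split_diff_by_file(diff_text: str) -> dict[str, str]:
    """Split unified `git diff` output into {file_path: file_diff} chunks."""
    chunks: dict[str, str] = {}
    current_path, current_lines = None, []
    for line in diff_text.splitlines():
        if line.startswith("diff --git "):
            if current_path:
                chunks[current_path] = "\n".join(current_lines)
            # "diff --git a/app/main.py b/app/main.py" -> "app/main.py"
            current_path = line.split(" b/", 1)[1]
            current_lines = [line]
        elif current_path:
            current_lines.append(line)
    if current_path:
        chunks[current_path] = "\n".join(current_lines)
    return chunks
```

Each chunk can then be reviewed independently, which is what makes the `--file` mode for pre-commit checks cheap.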

Under the hood, every review runs through the HuggingFace API, and every finding gets written into Notion through MCP.


How I Used Notion MCP

Here's the part I'm most proud of from an architecture standpoint.

Most Notion integrations make direct REST calls. I went a different route: PRReviewIQ runs the official @notionhq/notion-mcp-server locally via npx, and passes NOTION_TOKEN directly to the MCP process. The app itself never touches the Notion REST API — all writes happen through MCP.

This matters because it means the LLM is using Notion as a tool, not just a destination. When it reviews a diff, it decides what's worth logging, how to categorize it, and where it belongs in the knowledge base — the same way a human reviewer would decide what's worth writing down.
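In MCP terms, each write the model decides to make is a `tools/call` JSON-RPC request sent to the Notion server over stdio. A sketch of what framing such a request might look like (the tool name and argument shape here are illustrative, not the server's actual schema):

```python
import json

def build_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Frame an MCP tools/call request (JSON-RPC 2.0, per the MCP spec)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical finding the model chose to log into the reviews database:
request = build_tool_call(1, "create_page", {
    "parent_database": "Reviews Log",
    "title": "N+1 query in user lookup",
    "category": "performance",
})
```

The key design point is that the model fills in the `arguments` itself, which is what makes Notion a tool rather than a dumb sink.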

What gets built in Notion:

  • A coding standards database that evolves as the agent identifies recurring patterns. You can query it live via GET /api/standards — it reads directly from Notion and returns JSON, so the app and the team always share the same source of truth.
  • A reviews log — every PR review stored as a structured Notion entry, searchable and linkable.
  • Weekly digest pages — POST /api/weekly-digest aggregates trends across recent reviews and writes a summary page your team can read asynchronously on Mondays.
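The digest endpoint presumably rolls recent review entries up into trend counts before writing the summary page. A rough sketch of that aggregation step (the entry fields here are assumptions, not the app's real schema):

```python
from collections import Counter

def summarize_reviews(entries: list[dict]) -> dict:
    """Aggregate review entries into per-category counts for a weekly digest."""
    categories = Counter(e["category"] for e in entries)
    return {
        "total_reviews": len(entries),
        "by_category": dict(categories),
        "top_category": categories.most_common(1)[0][0] if entries else None,
    }
```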

The compounding effect is the real value: the knowledge base gets more useful the more you use it.


Technical Note on the MCP Setup

Running @notionhq/notion-mcp-server via npx means Node.js needs to be installed alongside Python. The MCP server process is spawned by the app at runtime — no separate setup step required beyond having npx available.
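Spawning looks roughly like this: build the npx command and hand NOTION_TOKEN to the child process environment (a sketch of the launch spec; the app's actual spawn code may differ):

```python
import os

def mcp_server_spec(notion_token: str) -> tuple[list[str], dict[str, str]]:
    """Command + environment for launching the official Notion MCP server."""
    cmd = ["npx", "-y", "@notionhq/notion-mcp-server"]  # -y skips the npx install prompt
    env = {**os.environ, "NOTION_TOKEN": notion_token}
    return cmd, env

# e.g. subprocess.Popen(cmd, env=env, stdin=PIPE, stdout=PIPE) for the stdio transport
```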

# Prerequisites
python 3.11+
node + npx
# .env
HF_API_KEY=hf_...
NOTION_TOKEN=...
NOTION_PARENT_PAGE_ID=...
# Run
pip install -e .
uvicorn app.main:app --reload

The first call to POST /api/setup provisions the full Notion workspace structure — databases, log pages, and the standards section — and you're ready to start reviewing.


What I'd Build Next

The CLI is useful, but I want to wire it into a git pre-commit hook so reviews happen automatically before every commit. The Notion knowledge base is already there; the next step is making the feedback loop tighter.
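That hook could be as small as this (a sketch using the CLI flags from above; save it as .git/hooks/pre-commit and make it executable):

```shell
#!/bin/sh
# Review each staged Python file before the commit lands (sketch).
for f in $(git diff --cached --name-only --diff-filter=ACM -- '*.py'); do
  python review.py --file "$f" --pr-title "Pre-commit review" --repo-name my-repo || exit 1
done
```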
