This is a submission for the Notion MCP Challenge
What I Built
WriteRight is an AI-powered writing analysis tool that helps K–8 teachers give differentiated feedback to every student — without spending hours reading and planning.
Teachers store student writing samples in a Notion database. WriteRight reads them all via Notion MCP, analyzes each one against Common Core State Standards (CCSS) for the student's grade level, and generates:
- Per-student teaching points (2–3 next-step instructional moves)
- Strengths (what the student is doing well, tied to standards)
- Standards addressed (e.g. W.3.1, L.3.2)
- Smart small groups — students clustered by shared instructional need, with a suggested activity
The full analysis saves back to Notion in one click.
Live app: https://write-right-app.web.app
GitHub: https://github.com/watts4/write-right
The Problem I Solved
A third-grade teacher with 28 students has 28 different writing stages. Reviewing every sample, finding the right CCSS standard, planning differentiated groups — that's 3–4 hours of prep per unit. WriteRight does it in under 2 minutes.
Video Demo
Show us the code
GitHub: https://github.com/watts4/write-right
The repo includes:
- React 18 / TypeScript / Vite / Tailwind frontend (Firebase Hosting)
- Node.js / Express / TypeScript backend (Google Cloud Run)
- Notion OAuth 2.0 + PKCE flow
- MCP agentic read loop with parallel tool execution
- CCSS Writing + Language standards K–8
- Seed script for mock Grade 3 class (8 students with deliberate skill gaps)
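The OAuth 2.0 + PKCE item above boils down to generating a code verifier and its S256 challenge before redirecting the teacher to Notion's authorization page. A minimal sketch of that step (the function name is illustrative, not the repo's actual helper):

```typescript
import { createHash, randomBytes } from 'node:crypto';

// PKCE: a high-entropy verifier stays server-side; only its SHA-256
// challenge travels in the authorization URL (S256 method, RFC 7636).
export function generatePkcePair() {
  const verifier = randomBytes(32).toString('base64url'); // 43 chars, unpadded
  const challenge = createHash('sha256').update(verifier).digest('base64url');
  return { verifier, challenge };
}
```

On callback, the stored verifier is sent with the token-exchange request so Notion can confirm it hashes to the challenge.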
How I Used Notion MCP
Notion MCP is the core reading engine of WriteRight — not a thin wrapper, but the actual mechanism that makes agentic, schema-flexible reading of student writing possible.
Three-Phase Pipeline
```
Notion DB ──► Phase 1: MCP Agentic Read
                        │
                        ▼
              Phase 2: Claude Analysis (CCSS-anchored JSON)
                        │
                        ▼
UI Results ──► /api/save ──► Notion (REST write-back)
```
Phase 1 — Notion MCP Agentic Read
The backend spawns @notionhq/notion-mcp-server (the official npm package) as a subprocess on a random local port. Claude then runs an agentic tool-use loop — calling MCP tools to query the database, list student pages, and fetch the block content (the actual writing) for each one.
The key performance optimization: all tool_use blocks within each response are executed in parallel via Promise.all. This cut Phase 1 from 4+ minutes to ~90 seconds for a class of 30.
```typescript
import { spawn } from 'node:child_process';
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

// Spawn Notion MCP server with the teacher's OAuth token
const mcpServer = spawn('notion-mcp-server', ['--port', String(port)], {
  env: {
    ...process.env,
    OPENAPI_MCP_HEADERS: JSON.stringify({
      Authorization: `Bearer ${notionToken}`,
      'Notion-Version': '2022-06-28',
    }),
  },
});

const client = new Client({ name: 'write-right', version: '1.0.0' });
// The transport expects a URL object, not a bare string
const transport = new StreamableHTTPClientTransport(new URL(`http://localhost:${port}/mcp`));
await client.connect(transport);
```
```typescript
// Agentic loop — Claude calls tools until all student data is extracted
let iterations = 0;
while (iterations < MAX_ITERATIONS) {
  const response = await anthropic.messages.create({
    model: 'claude-sonnet-4-6',
    max_tokens: 8192,
    tools: mcpTools,
    messages,
  });
  messages.push({ role: 'assistant', content: response.content });
  // Claude signals it is done by stopping without a tool request
  if (response.stop_reason !== 'tool_use') break;
  const toolUses = response.content.filter((b) => b.type === 'tool_use');
  // Execute all tool_use blocks in parallel ← the key performance win
  const toolResults = await Promise.all(
    toolUses.map((b) => client.callTool({ name: b.name, arguments: b.input as Record<string, unknown> }))
  );
  // Feed every result back so the next iteration can keep reading
  messages.push({
    role: 'user',
    content: toolUses.map((b, i) => ({
      type: 'tool_result',
      tool_use_id: b.id,
      content: JSON.stringify(toolResults[i].content),
    })),
  });
  iterations++;
}
```
Phase 2 — Claude Analysis (No Tools)
A single Claude API call receives all the extracted writing samples + grade-level CCSS standards and returns structured JSON: per-student teaching points, strengths, standards addressed, and small group recommendations with activity suggestions.
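Because Phase 2 returns structured JSON, the backend needs a defensive parse step before the UI can render results. A sketch of what that could look like — the field names below mirror the features described above, not necessarily the repo's exact schema:

```typescript
// Hypothetical shape of Phase 2 output — illustrative field names only.
interface StudentAnalysis {
  student: string;
  teachingPoints: string[]; // 2–3 next-step instructional moves
  strengths: string[];
  standards: string[];      // e.g. ["W.3.1", "L.3.2"]
}

interface SmallGroup {
  need: string;
  students: string[];
  activity: string;
}

// Claude sometimes wraps JSON in markdown fences; strip them before parsing.
export function parseAnalysis(raw: string): { students: StudentAnalysis[]; groups: SmallGroup[] } {
  const cleaned = raw
    .replace(/^```(?:json)?\s*/m, '')
    .replace(/```\s*$/m, '')
    .trim();
  const parsed = JSON.parse(cleaned);
  if (!Array.isArray(parsed.students) || !Array.isArray(parsed.groups)) {
    throw new Error('Analysis JSON missing students/groups arrays');
  }
  return parsed;
}
```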
Phase 3 — Notion REST Write-Back
The "Save to Notion" action uses the @notionhq/client REST API — not MCP. MCP is ideal for flexible, agentic reads where the schema is unknown; REST is faster and more predictable for batch writes with a known structure.
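To illustrate why REST write-back is straightforward with a known structure, the payload can be built as a pure function and handed to @notionhq/client's `notion.blocks.children.append`. The block shapes follow the Notion REST API; the repo's exact content structure may differ:

```typescript
// Build Notion block objects for a student's teaching points.
// (Illustrative — in the save route this array becomes the `children`
// argument of notion.blocks.children.append.)
type RichText = { type: 'text'; text: { content: string } };
type Bullet = {
  object: 'block';
  type: 'bulleted_list_item';
  bulleted_list_item: { rich_text: RichText[] };
};

export function buildAnalysisBlocks(points: string[]): Bullet[] {
  return points.map((p) => ({
    object: 'block',
    type: 'bulleted_list_item',
    bulleted_list_item: { rich_text: [{ type: 'text', text: { content: p } }] },
  }));
}
```

Because every page gets the same known shape, there is no need for an agentic loop here — just one append call per student page.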
Why @notionhq/notion-mcp-server (npm) — not the hosted server
I initially tried Notion's hosted mcp.notion.com server via Anthropic's mcp_servers parameter. It only accepts Notion's own OAuth flow — incompatible with injecting a teacher's access token server-side.
The npm package solves this cleanly: spawned as a subprocess with the token injected via OPENAPI_MCP_HEADERS, giving each teacher session a fully authorized MCP connection.
Other Engineering Wins
Parallel tool execution cut time by ~70%
Each MCP loop iteration may produce multiple tool_use blocks. Running them sequentially was the original bottleneck. Promise.all on every batch reduced Phase 1 from 4+ minutes to ~90 seconds.
Stateless analyze route + separate save route
Cloud Run buffers responses until the Node.js event loop is idle. Fire-and-forget Notion writes inside the analyze route caused requests to hang. Solution: /api/analyze is pure read + analyze; /api/save is a separate, explicit user action.
24-hour in-memory session store
Teacher OAuth tokens stored in a session map (UUID → token + database ID) with auto-expiry. No database required for MVP.
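A minimal sketch of such a store, assuming lazy expiry on read (names are illustrative, not the repo's exact code):

```typescript
import { randomUUID } from 'node:crypto';

const TTL_MS = 24 * 60 * 60 * 1000; // 24 hours

interface Session { token: string; databaseId: string; expiresAt: number }

// UUID → token + database ID, expiring 24h after creation
const sessions = new Map<string, Session>();

export function createSession(token: string, databaseId: string): string {
  const id = randomUUID();
  sessions.set(id, { token, databaseId, expiresAt: Date.now() + TTL_MS });
  return id;
}

export function getSession(id: string): Session | undefined {
  const s = sessions.get(id);
  if (!s) return undefined;
  if (Date.now() > s.expiresAt) {
    sessions.delete(id); // lazy expiry — no background timer needed
    return undefined;
  }
  return s;
}
```

Lazy expiry keeps the MVP simple: stale entries are evicted the next time they are looked up, so no sweep interval or external database is required.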
Tech Stack
| Layer | Tech |
|---|---|
| Frontend | React 18, TypeScript, Vite, Tailwind CSS |
| Backend | Node.js, Express, TypeScript |
| AI | Claude Sonnet 4.6 (Anthropic SDK) |
| Notion MCP | @notionhq/notion-mcp-server (npm subprocess) |
| Notion Auth | OAuth 2.0 + PKCE |
| Notion Writes | @notionhq/client (REST) |
| Deploy | Firebase Hosting + Google Cloud Run |
| Standards | CCSS Writing + Language K–8 |