Miracleio
Meet Zhu Li: The AI Agent That Manages Your Notion Workspace From Any Channel

Notion MCP Challenge Submission 🧠

This is a submission for the Notion MCP Challenge

Here's how a typical interaction looks:

Me: hey, I just deployed a fix for the payment button issue on iOS Safari

Zhu Li: Got it. I found the issue in your Notion workspace — marking it resolved now
        and linking it to your latest commit. Done. Your changelog has been updated
        with a note about the fix. Anything else?

That exchange is the core promise of Zhu Li: an AI project operations assistant that turns plain-language messages into real work inside Notion. You can report a bug, submit a feature idea, ask what is blocking a release, or mention that you just shipped a fix, and Zhu Li updates the right records for you.

I built it because project context is usually scattered across chat apps, commits, dashboards, sticky notes, and half-remembered ideas. Bugs get reported but not triaged. Feature requests get discussed but not tracked. Fixes ship but never make it into release notes. I wanted one place where all of that could live in a structured way, and one assistant that could keep it up to date without constant manual cleanup.

That place is Notion, and Notion MCP is what makes the whole experience practical. Instead of treating Notion like a passive document store, Zhu Li uses MCP to read from and write to a real workspace full of issues, features, insights, commits, changelog entries, and queued events.

iMessage is just one part of a much bigger system.

That iMessage conversation — and the Notion updates it triggered — is the most visible slice of a much deeper system. Under the hood, Zhu Li is a fully featured project management backend with an AI pipeline for intake, triage, deduplication, pattern detection, and changelog generation — all writing to a structured Notion workspace that becomes your single source of truth for everything happening in your project.

Screenshot of Conversation with Zhu Li on iMessage

The name comes from The Legend of Korra — Zhu Li is the loyal, meticulous assistant who quietly handles everything behind the scenes so you don't have to. That's the whole idea.

Live: zhuli.aevr.space

Home page / landing page of zhuli.aevr.space



The problem: project data lives everywhere except where you can use it

Bugs get reported in five different places in five different ways. Feature ideas land in Slack threads and Figma comments and never get tracked. Commits go in with no link to the issue they fix. Release notes write themselves... never. And recurring problems stay invisible until someone connects enough dots to notice the pattern.

I wanted all of that in one structured place that an AI could read and write. The structured place is Notion. The AI that keeps it current is Zhu Li.

Here's what Zhu Li solves:

  1. Noise reduction — Users report the same bug in five different ways across five different channels. Zhu Li's dedup agent reads related issues from your Notion Issues database and only creates a new record when it's genuinely novel, incrementing an affectedCount counter on duplicates.
  2. Triage fatigue — Manually setting severity, category, and component for every incoming issue is slow. Zhu Li's triage agent does it automatically.
  3. Lost patterns — Recurring issues are invisible until your Insights database surfaces them. Zhu Li's insight agent periodically scans resolved and open issues and writes pattern summaries back to Notion.
  4. Changelog pain — Nobody wants to write release notes on Friday afternoon. The code mapper agent links commits to issues and auto-drafts changelog entries in Notion.
  5. Scattered context — Because everything resolves to Notion, you get one source of truth that any AI assistant, human, or tool can read.
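The dedup behaviour in point 1 boils down to a small decision rule. Here is a hedged sketch of that rule; the type names, the token-overlap stand-in for similarity, and the threshold are all illustrative (in Zhu Li the similarity judgment is made by the LLM over Notion query results, not by string matching):

```typescript
// Illustrative sketch of the dedup decision — names and threshold are mine,
// not Zhu Li's actual implementation.
interface ExistingIssue {
  pageId: string;
  title: string;
  affectedCount: number;
}

interface DedupResult {
  action: "create" | "increment";
  pageId?: string; // set when incrementing an existing issue's affectedCount
}

// Cheap stand-in for the LLM's semantic match: token-overlap (Jaccard) similarity.
function similarity(a: string, b: string): number {
  const ta = new Set(a.toLowerCase().split(/\W+/).filter(Boolean));
  const tb = new Set(b.toLowerCase().split(/\W+/).filter(Boolean));
  const inter = [...ta].filter((t) => tb.has(t)).length;
  const union = new Set([...ta, ...tb]).size;
  return union === 0 ? 0 : inter / union;
}

function dedupDecision(
  incomingTitle: string,
  existing: ExistingIssue[],
  threshold = 0.6
): DedupResult {
  for (const issue of existing) {
    if (similarity(incomingTitle, issue.title) >= threshold) {
      // Duplicate: bump affectedCount on the existing Notion page
      // instead of creating a new record.
      return { action: "increment", pageId: issue.pageId };
    }
  }
  return { action: "create" };
}
```

The useful property of this shape is that "how many people hit this?" becomes a single number on one record instead of five near-identical tickets.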

How it's wired together

┌──────────────────────────────────────────────────────────┐
│                     Zhu Li Platform                      │
├──────────┬──────────┬──────────────────┬─────────────────┤
│  Web UI  │ REST API │   MCP Server     │ Telegram / iMsg │
│ Next.js  │  /api/v1 │   /api/mcp       │  Webhooks       │
└────┬─────┴────┬─────┴────────┬─────────┴────────┬────────┘
     │          │              │                  │
     ▼          ▼              ▼                  ▼
┌──────────────────────────────────────────────────────────┐
│                   AI Agent Pipeline                      │
│   Intake → Triage → Dedup → Insight → CodeMapper → CLog │
│          Groq Llama 3.1-8b / 3.3-70b (+ Gateway)        │
└──────────────────────────┬───────────────────────────────┘
                           │
                           ▼
┌──────────────────────────────────────────────────────────┐
│             Notion MCP  (mcp.notion.com)                 │
│       Read / write the user's Notion databases           │
└──────────────────────────┬───────────────────────────────┘
                           │
                           ▼
┌──────────────────────────────────────────────────────────┐
│             User's Notion Workspace                      │
│  Issues · Features · Insights · Changelog · Commits     │
└──────────────────────────────────────────────────────────┘

The stack is a Next.js App Router monorepo deployed on Vercel. Every channel feeds the same AI agent pipeline, which always writes to Notion via MCP.

| Layer | Technology |
| --- | --- |
| Framework | Next.js · App Router · Turbopack |
| AI runtime | Vercel AI SDK + Groq (Llama 3.1/3.3) |
| Notion (primary) | @ai-sdk/mcp — Notion MCP SSE transport |
| Notion (bootstrap) | @notionhq/client — create databases on signup |
| Database | MongoDB Atlas (Mongoose) |
| Auth | JWT (jose) + bcrypt + Notion OAuth |
| MCP Server | @modelcontextprotocol/sdk |
| UI | Tailwind CSS v4 + shadcn/ui |
| Deployment | Vercel |

The organizational backbone: six Notion databases

After connecting Notion via OAuth, a one-click bootstrap creates six structured databases that become the persistent state of your project:

  • Issues — Bug tracking with severity, status, component, and assignee
  • Features — Feature requests with priority, effort, and status
  • Insights — AI-generated pattern analysis across recurring issues
  • Commits — Git commit log with hash, message, and linked issues/features
  • Changelog — Auto-generated release notes linked to commits
  • Event Queue — Processing pipeline status (synced from MongoDB)
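The bootstrap itself goes through @notionhq/client rather than MCP. As a hedged sketch of what one of those database schemas might look like, here is a builder for the Issues database's properties; the property names, select options, and colours are my guesses, not Zhu Li's actual schema (the payload shape follows the Notion API's `databases.create` request format):

```typescript
// Hypothetical property schema for the Issues database — illustrative only.
function issuesDatabaseProperties() {
  return {
    Name: { title: {} }, // every Notion database needs exactly one title property
    Severity: {
      select: {
        options: [
          { name: "critical", color: "red" },
          { name: "high", color: "orange" },
          { name: "medium", color: "yellow" },
          { name: "low", color: "gray" },
        ],
      },
    },
    Status: {
      select: {
        options: [
          { name: "open", color: "blue" },
          { name: "in-progress", color: "purple" },
          { name: "resolved", color: "green" },
        ],
      },
    },
    Component: { rich_text: {} },
    "Affected Count": { number: { format: "number" } },
  };
}

// Usage sketch with the official client:
// await notion.databases.create({
//   parent: { page_id: rootPageId },
//   title: [{ type: "text", text: { content: "Issues" } }],
//   properties: issuesDatabaseProperties(),
// });
```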

Everything that comes in — whether a text from iMessage, a GitHub Actions payload, or a direct API call — gets classified and written into these databases. The structure is consistent no matter which channel it came through.
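That channel-agnostic consistency can be pictured as a normalization step: every source payload is mapped onto one common event shape before the pipeline ever sees it. The `IngestEvent` shape below is my sketch, not Zhu Li's actual EventQueue schema:

```typescript
// Illustrative common event shape — Zhu Li's real schema may differ.
type Channel = "imessage" | "telegram" | "github" | "api" | "web";

interface IngestEvent {
  channel: Channel;
  kind: "issue" | "feature" | "commit" | "chat";
  text: string;
  receivedAt: string; // ISO timestamp
}

// Each channel adapter does exactly one job: map its payload to IngestEvent.
function fromGitHubPush(payload: { head_commit: { message: string } }): IngestEvent {
  return {
    channel: "github",
    kind: "commit",
    text: payload.head_commit.message,
    receivedAt: new Date().toISOString(),
  };
}

function fromChatMessage(channel: Channel, text: string): IngestEvent {
  return { channel, kind: "chat", text, receivedAt: new Date().toISOString() };
}
```

Once everything is an `IngestEvent`, the downstream agents never need to know which channel a message came from.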

Notion workspace showing the six databases



The AI agent pipeline: from raw input to structured Notion data

Every submission enters a multi-stage pipeline. Each stage is a separate AI agent with a single responsibility:

| Agent | Model | What it does |
| --- | --- | --- |
| Intake | llama-3.1-8b-instant | Classifies input, extracts title/description/component, creates a Notion page |
| Triage | llama-3.3-70b-versatile | Sets severity (critical/high/medium/low), category, and component |
| Dedup | llama-3.3-70b-versatile | Searches Notion Issues for semantic matches; marks duplicates |
| Insight | llama-3.3-70b-versatile | Detects recurring patterns; upserts Insight pages in Notion |
| Code Mapper | llama-3.3-70b-versatile | Links commits to open Issues/Features; drafts Changelog entries |
| Changelog | llama-3.3-70b-versatile | Produces formatted release notes from mapped commits |

By the time a bug lands in your Notion workspace it already has a severity, a component, a duplicate check, and links to any related commits. No triage meeting required.
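The staged design above can be sketched as a sequential runner where each agent is an async function over a shared context. This is an illustrative reconstruction, not Zhu Li's actual orchestrator (in the real thing each stage body is a `generateText()` call with Notion MCP tools):

```typescript
// Illustrative pipeline runner — stage names mirror the table above.
interface PipelineContext {
  input: string;
  notionPageId?: string;
  severity?: "critical" | "high" | "medium" | "low";
  log: string[];
}

type Stage = (ctx: PipelineContext) => Promise<PipelineContext>;

function makeStage(name: string, work: (ctx: PipelineContext) => void): Stage {
  return async (ctx) => {
    work(ctx); // in Zhu Li, this is an LLM call with MCP tools attached
    ctx.log.push(name);
    return ctx;
  };
}

async function runPipeline(input: string, stages: Stage[]): Promise<PipelineContext> {
  let ctx: PipelineContext = { input, log: [] };
  for (const stage of stages) {
    ctx = await stage(ctx); // each stage has one responsibility; order matters
  }
  return ctx;
}

// Stage order from the table: Intake → Triage → Dedup → Insight → CodeMapper → Changelog.
// "page_123" is a placeholder, not a real Notion page ID.
const stages: Stage[] = [
  makeStage("intake", (c) => { c.notionPageId = "page_123"; }),
  makeStage("triage", (c) => { c.severity = "high"; }),
  makeStage("dedup", () => {}),
];
```

Keeping each stage to a single responsibility is also what lets the smaller, faster model handle intake while the 70b model handles the judgment-heavy stages.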

A single submitted issue in Notion — severity, component, and category all set automatically


The chat agent: your Notion workspace, conversational

The part that makes this more than a data pipeline is the chat agent. It's backed by Notion MCP, which means it can read and write your entire workspace in response to plain-language requests.

Ask it questions:

You: "Which component has the most critical open issues right now?"

Zhu Li: "The payments component has 3 critical open issues — two related to iOS Safari checkout and one involving webhook retry logic."

Or give it instructions:

You: "I just merged a fix for the iOS Safari payment bug. Mark it resolved."

Zhu Li: "Got it — I've marked the issue as resolved in Notion and linked your latest commit."

Under the hood, Zhu Li calls notion_query_database to find the matching issue, then notion_update_page to close it. No custom CRUD code anywhere in that path — the model reasons about what tools to call and orchestrates Notion directly through MCP.

This is what makes the multi-channel experience feel natural. Whether you use the web dashboard, Telegram, or iMessage, the outcome is the same: your Notion workspace becomes something you converse with — and it acts.


What you can submit, and how

The same pipeline and chat agent are accessible from whichever interface fits your workflow:

For the demo flow, I focus on web dashboard chat and Telegram, while still supporting iMessage channels.

Web Dashboard — A chat interface with structured views for the backlog, insights, and changelog.

Dashboard chat interface

[SCREENSHOT: Dashboard chat interface — the message thread showing a question like "What are our open issues?" and an AI response]

REST API + SDK — External services submit issues or query data via an API key:

import { ZhuLiSDK } from "@untools/zhuli";

const zhuli = new ZhuLiSDK({ apiKey: "zhuli_abc123...", baseUrl: "https://zhuli.aevr.space" });

// Report a bug from any service
await zhuli.submitIssue({
  title: "Payment button unresponsive on iOS Safari",
  description: "Tapping checkout does nothing. Network tab is empty.",
  severity: "critical"
});

// Ask a natural language question backed by your Notion data
const answer = await zhuli.chat("What critical issues have been open for more than a week?");

// Fetch structured insight records
const insights = await zhuli.getInsights();

Git Integration — A post-commit hook or GitHub Actions step submits every commit automatically:

# .github/workflows/zhuli.yml
on: [push]
jobs:
  notify:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with: { fetch-depth: 2 }
      - run: |
          curl -sf -X POST https://zhuli.aevr.space/api/v1/ingest/commit \
            -H "Authorization: Bearer ${{ secrets.ZHULI_API_KEY }}" \
            -H "Content-Type: application/json" \
            -d "{\"hash\":\"$(git rev-parse HEAD)\",\"message\":\"$(git log -1 --pretty=%s)\",\"branch\":\"$GITHUB_REF_NAME\",\"author\":\"$(git log -1 --pretty='%an')\",\"committedAt\":\"$(git log -1 --pretty=%cI)\"}"

GitHub Action YAML


Telegram Bot — Users message the bot; Zhu Li replies using the same AI agent backed by Notion MCP tools.

Setting up Telegram is pretty straightforward:

Setting up Telegram - Provide bot token

Then you can chat with your agent:

Telegram conversation

[SCREENSHOT: Telegram conversation — user asks "How many commits do I have?", bot responds with a formatted list pulled from Notion]

iMessage via Blooio — This is the hosted iMessage integration. Blooio sends inbound message.received webhooks to Zhu Li, and Zhu Li routes them in three ways:

  • Plain text messages go straight to the chat agent
  • /issue ... creates an issue event and pushes it into the intake pipeline
  • /feature ... creates a feature request event and queues it for processing

That means you can text Zhu Li from iMessage like you would text a teammate: ask for your current backlog, report a production bug, or submit a feature request from your phone. Because Blooio is hosted, this path does not require running your own Mac server or changing SIP settings.
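The three routing rules above reduce to a small parser. A sketch, with function and type names of my own choosing:

```typescript
// Illustrative router for inbound iMessage text — mirrors the three rules above.
type Route =
  | { kind: "chat"; text: string }     // plain text goes straight to the chat agent
  | { kind: "issue"; text: string }    // "/issue ..." enters the intake pipeline
  | { kind: "feature"; text: string }; // "/feature ..." is queued for processing

function routeInbound(raw: string): Route {
  const text = raw.trim();
  if (text.startsWith("/issue ")) {
    return { kind: "issue", text: text.slice("/issue ".length) };
  }
  if (text.startsWith("/feature ")) {
    return { kind: "feature", text: text.slice("/feature ".length) };
  }
  return { kind: "chat", text };
}
```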

Conversation in iMessage

iMessage thread through Blooio

iMessage via BlueBubbles — I also support BlueBubbles as the self-hosted alternative for teams that want to use their own Mac as the iMessage bridge instead of a hosted provider.

MCP Server — Claude Desktop and any MCP-capable AI assistant can connect directly to Zhu Li's own MCP server at /api/mcp. Six tools are exposed: submit_issue, get_insights, get_backlog, update_status, search_workspace, generate_changelog. Each tool internally calls Notion MCP to fulfil the request.

Public Portal — Each workspace gets a shareable /portal/<id> page where anonymous visitors can submit issues and feature requests, see the public backlog, read the published changelog, and find documentation links — all without a Zhu Li account. Here's an example: https://zhuli.aevr.space/portal/zlp_9428dee93575424a81

Public portal page showing the issue submission form


Portal showing public issue list, and published changelog section

Public portal showing the public issue list and published changelog section


Video Demo

[VIDEO: Walkthrough — register, connect Notion, submit from web dashboard and Telegram, watch events land in Notion with severity + component set]


Show us the code

GitHub: github.com/miracleonyenma/zhuli

npm SDK: @untools/zhuli

Live: zhuli.aevr.space


How I Used Notion MCP

Notion MCP is the centrepiece of Zhu Li — not just an add-on.

Connection flow

When a user signs up with "Continue with Notion", an OAuth flow runs:

Auth page → GET /api/v1/notion/oauth/initiate?mode=login
  → Notion authorisation screen
  → GET /api/v1/notion/oauth/callback
    → Creates / finds account by Notion workspace ID
    → Encrypts access token (AES-256-GCM) → stores on User document
    → Creates session cookie (JWT)
    → Redirects to /dashboard/setup (workspace bootstrap)

The access token is encrypted at rest and decrypted on-demand. The MCP client is created per-user and cached with a 5-minute TTL so tool initialisation latency only hits once per session:

// lib/chat/agent.ts (simplified)
import { experimental_createMCPClient } from "ai";

async function getNotionMCPTools(userId: string) {
  const cached = mcpClientCache.get(userId);
  if (cached && Date.now() - cached.createdAt < MCP_CACHE_TTL_MS) {
    return cached.tools;
  }

  const user = await UserModel.findById(userId).select("+notionConnection.accessToken");
  const token = decrypt(user.notionConnection.accessToken); // AES-256-GCM

  // experimental_createMCPClient is async; the auth header rides on the SSE transport
  const client = await experimental_createMCPClient({
    transport: {
      type: "sse",
      url: "https://mcp.notion.com/sse",
      headers: { Authorization: `Bearer ${token}` },
    },
  });

  const tools = await client.tools();
  mcpClientCache.set(userId, { tools, createdAt: Date.now() });
  return tools;
}
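For the "encrypted at rest" part, here is a hedged sketch of what `encrypt()`/`decrypt()` helpers for AES-256-GCM look like with node:crypto. Zhu Li's actual helpers may pack the IV, auth tag, and ciphertext differently; the key here is generated on the fly purely for illustration, whereas a real deployment derives it from an environment secret:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Illustration only: a real deployment loads a 32-byte key from an env secret.
const KEY = randomBytes(32);

function encrypt(plaintext: string): string {
  const iv = randomBytes(12); // 96-bit IV, the recommended size for GCM
  const cipher = createCipheriv("aes-256-gcm", KEY, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag();
  // Pack iv + auth tag + ciphertext together so decrypt() is self-contained.
  return Buffer.concat([iv, tag, ciphertext]).toString("base64");
}

function decrypt(payload: string): string {
  const buf = Buffer.from(payload, "base64");
  const iv = buf.subarray(0, 12);
  const tag = buf.subarray(12, 28);
  const ciphertext = buf.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", KEY, iv);
  decipher.setAuthTag(tag); // GCM authenticates as well as encrypts
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
}
```

GCM is a good fit here because a tampered token fails authentication at decrypt time instead of silently producing garbage.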

How agents use MCP tools

Every AI agent that reads or writes Notion data receives the MCP tool set as part of its generateText() call. The agent decides which tool to call — Zhu Li doesn't wire up explicit logic for "call this specific Notion API endpoint". The model reasons about what it needs and calls the right MCP tool:

// lib/agents/triage.ts (simplified)
const { text } = await generateText({
  model: groq("llama-3.3-70b-versatile"),
  tools: { ...notionMCPTools },          // ← the full Notion MCP tool set
  maxSteps: 5,
  system: `You are a triage agent. Given a new issue, read the Issues database
           to understand conventions, then update the issue's Severity, Category,
           and Component properties.`,
  prompt: `Issue: ${title}\nDescription: ${description}\nNotion page ID: ${pageId}`,
});

The model will call notion_query_database to read existing issues, understand the team's severity conventions from real data, then call notion_update_page to write back the triage decision. No hand-coded Notion API logic anywhere in the triage step.

[SCREENSHOT: Server logs or a dev console showing the MCP tool calls being made during a triage run — notion_query_database → notion_update_page]

MCP-to-MCP: Zhu Li as an MCP server

Zhu Li also exposes its own MCP server, meaning any MCP-capable AI assistant can use Zhu Li as a tool — and Zhu Li in turn calls Notion MCP to fulfil the request. That's MCP all the way down:

Claude Desktop
  → POST /api/mcp (Zhu Li MCP server)
      tool: get_insights
  → Zhu Li resolves user from API key
  → getNotionMCPTools(userId)
  → AI agent: generateText with Notion MCP tools
      → notion_query_database (Insights database)
  → Returns structured insight list to Claude

The six tools exposed through the Zhu Li MCP server:

server.tool("submit_issue",      submitIssueSchema,     handleSubmitIssue);
server.tool("get_insights",      getInsightsSchema,     handleGetInsights);
server.tool("get_backlog",       getBacklogSchema,      handleGetBacklog);
server.tool("update_status",     updateStatusSchema,    handleUpdateStatus);
server.tool("search_workspace",  searchWorkspaceSchema, handleSearchWorkspace);
server.tool("generate_changelog", changelogSchema, handleGenerateChangelog);
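To make one of those registrations concrete, here is a hedged sketch of the submit_issue handler. In the real server the schema is presumably a zod shape passed to `server.tool()`; a plain validator stands in here so the sketch stays dependency-free, and the handler body is a stub where Zhu Li would resolve the user and queue an intake event:

```typescript
// Illustrative submit_issue tool — names mirror the registration above,
// but the implementation is a sketch, not Zhu Li's actual handler.
interface SubmitIssueArgs {
  title: string;
  description: string;
  severity?: "critical" | "high" | "medium" | "low";
}

function validateSubmitIssue(input: unknown): input is SubmitIssueArgs {
  const x = input as Record<string, unknown>;
  return (
    typeof x?.title === "string" && x.title.length > 0 &&
    typeof x?.description === "string" &&
    (x.severity === undefined ||
      ["critical", "high", "medium", "low"].includes(x.severity as string))
  );
}

async function handleSubmitIssue(args: SubmitIssueArgs) {
  // In Zhu Li: resolve the user from the API key, create an EventQueue item,
  // and let the intake pipeline write to Notion via MCP.
  // MCP tools answer with content blocks, so return one text block.
  return {
    content: [{ type: "text", text: `Issue queued: ${args.title}` }],
  };
}
```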

The chat agent: Notion MCP as a conversational interface

The chat agent (lib/chat/agent.ts) is the most direct use of Notion MCP. For queries, the model calls notion_query_database and reasons over the results. For mutations — like the "mark it resolved" example at the top — it calls notion_update_page. No custom aggregation or CRUD code anywhere in that path.

The Blooio integration brings that same Notion-backed chat to iMessage. Zhu Li verifies the Blooio webhook signature, parses inbound message.received events, and routes them to chat or queues them as pipeline events. The reply goes back through the Blooio REST API:

iMessage user
  → Blooio webhook
  → /api/v1/channels/blooio/webhook
      → verify HMAC signature
      → parse inbound text
      → route to chat() or create EventQueue item
  → Notion MCP tools / agent pipeline
  → reply sent back through Blooio REST API
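The "verify HMAC signature" step in that flow is worth spelling out, since it's the line between a real webhook and anyone POSTing to your endpoint. A sketch, assuming HMAC-SHA256 over the raw request body (Blooio's actual header name and signing scheme may differ):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Hedged sketch of webhook verification — the signing scheme is an assumption.
function verifyWebhookSignature(rawBody: string, signatureHex: string, secret: string): boolean {
  const expected = createHmac("sha256", secret).update(rawBody).digest();
  const received = Buffer.from(signatureHex, "hex");
  // timingSafeEqual throws on length mismatch, so guard first.
  if (received.length !== expected.length) return false;
  // Constant-time comparison prevents timing attacks on the signature check.
  return timingSafeEqual(received, expected);
}
```

Note the comparison runs over the raw body bytes as received, before any JSON parsing; re-serialized JSON will not reproduce the signed bytes.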

[SCREENSHOT: Blooio dashboard webhook configuration showing the Zhu Li webhook URL and message event type]

For external channels (Telegram, iMessage), the chat agent routes to the AI Gateway primary model (larger context window, better tool-call reliability) and falls back to Groq on error — important when users send bursts of messages or MCP tool results are large.
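That primary-then-fallback routing is a small combinator. A sketch with illustrative names; in Zhu Li the two thunks would be `generateText()` calls against the AI Gateway model and against Groq:

```typescript
// Illustrative fallback combinator — not Zhu Li's actual routing code.
async function withFallback<T>(
  primary: () => Promise<T>,
  fallback: () => Promise<T>
): Promise<T> {
  try {
    return await primary();
  } catch {
    // Primary (e.g. the AI Gateway model) failed — rate limit, timeout,
    // or oversized MCP tool results. Retry once on the fallback provider.
    return await fallback();
  }
}
```

Wrapping the model call rather than the whole chat handler means the fallback reuses the same tools, system prompt, and conversation state.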

What Notion MCP unlocks that wasn't possible before

Without MCP, connecting an AI agent to Notion required manually wrapping every endpoint worth using, writing pagination and filter logic for each database, re-implementing rich text parsing, and maintaining all of it as the API evolved.

With MCP, I describe the intent in plain language. The model figures out which tools to call. The agent code became thinner, more readable, and much easier to extend.

It also meant the chat interface — including the bidirectional action-taking — required almost no hand-coded Notion logic. Queries, mutations, cross-database lookups: the model handles the orchestration. I just write the system prompt.


Built for the Notion MCP Challenge · March 2026

Top comments (10)

Harsh

The dedup agent is the piece most tools skip entirely, and it's probably the one that matters most in practice. Bug reports are messy. The same issue comes in five different ways from five different people, and without dedup you end up with a bloated backlog where the signal is buried under noise. Incrementing affectedCount instead of creating duplicates is exactly the right call; it also gives you a severity signal that pure ticket count never captures.

The MCP-to-MCP architecture is worth calling out separately. Zhu Li consuming Notion MCP while also exposing its own MCP server means it can sit in the middle of a much larger tool chain: Claude Desktop calling Zhu Li, Zhu Li calling Notion. That's not just a feature; that's a composability decision that most agent projects don't think about at this stage.

One thing I'm curious about: how does the triage agent handle ambiguous severity? If someone texts "the checkout button is a bit slow sometimes", is that medium or low? Does the model lean on the existing Issues database conventions to calibrate, or is it mostly working from the system prompt?

Miracleio

I'm really glad you made that observation, @harsh2644!
I thought it was important to add deduplication early on because it's what we see a lot in real-world applications: one bug can lead to multiple users complaining about the same thing, so it's important to consolidate that as best we can to avoid noise.

As for the MCP-to-MCP architecture, I wanted Zhu Li to be able to give Notion MCP context it might not already have when interacting with your workspace. I'm glad you noticed that decision.

It would be really cool to have the triage agent lean on the existing Issues database, as you mentioned. That would make it smarter, but for now, to keep it lean, it mostly works from the system prompt.

Thanks for the really detailed feedback! I appreciate it!

Uche Emmanuel

Awesome

Miracleio

Thank you!

Azubuike Duru

Love this!!

Miracleio

Thank you! 🙏🏾

Fredrick Emmanuel

Would love to try this out

Miracleio

Please do! If you run into any issues, let me know.

Miracleio

I think it's worth mentioning that I set out to build an iMessage chat bot, and while I got the code working and even tested it for a limited time, I faced two major challenges:

  • First, I didn't want to do the risky setup involved in getting BlueBubbles to work on my machine. The docs mentioned disabling SIP and modifying the boot loader (I'm not entirely sure of the details, but I wasn't comfortable with any of it), and warned that I might not be able to install system updates afterwards. It makes sense why people buy a Mac mini just for this.
  • Second, I integrated a hosted service, blooio.com, but I exhausted my 20-message quota.

The integration still works: if you set up BlueBubbles or Blooio, you can use those channels. The web and Telegram chat, on the other hand, work fine and are easy to set up.

Miracleio

If you're trying out Zhu Li and you encounter an issue, you can drop it on zhuli.aevr.space/portal.

It should land in my Notion workspace.