Arya Koste

I Gave My Notion Workspace a Brain — Here's What Happened

Notion MCP Challenge Submission 🧠

This is a submission for the Notion MCP Challenge.

I spent a weekend building an AI that autonomously operates inside Notion. You type one sentence. It searches your workspace, reasons about what to do, creates pages, spins up databases, populates tasks — and you watch every single step happen in real time. Here's how I built it and why it might be the most useful thing I've ever made.


The Frustration That Started This

I've been a Notion user for a long time. My workspace is a sprawling mess of half-finished project pages, meeting notes that never got followed up on, and goals I set in January that I haven't looked at since.

The problem isn't Notion. Notion is incredible. The problem is that Notion is entirely passive. It sits there, waiting. It does nothing unless you tell it exactly what to do, how to structure it, and where to put it.

I kept thinking: what if Notion could just... think? What if I could say "set up a Q2 OKR tracking system" and it would actually do it — build the database, create the objectives, set up the weekly check-in templates, write the guidelines page — without me having to architect every single piece?

That's Apex.


What I Built

Apex is a full-stack AI command center that gives Llama 3.3 70B (via Groq) complete autonomous access to your Notion workspace. You chat with it naturally. It reasons, plans, and executes — using 10 Notion MCP tools in sequence — while you watch every operation happen live on screen.

It's not a chatbot that tells you how to use Notion. It's an agent that uses Notion for you.

Why Groq? Groq's free tier gives you 14,400 requests/day with no credit card required. Llama 3.3 70B handles tool calling remarkably well — and it's genuinely fast. Zero cost to run.

7 Intelligent Modes

Each mode has a deeply engineered system prompt that makes the AI behave like a domain specialist:

| Mode | What It Does |
| --- | --- |
| 💬 Chat | Natural-language Q&A — ask anything about your workspace |
| 🏗️ Project Blueprint | One sentence → hub page + task database + timeline + risk log |
| 🧠 Brain Dump | Raw chaos → organized, searchable Notion knowledge base |
| 📋 Meeting Intelligence | Paste notes → decisions, action items, owners, follow-ups |
| 📊 Weekly Review | AI scans entire workspace → CEO-level intelligence report |
| 🎯 OKR Tracker | Define goals → complete OKR system with check-in templates |
| 📅 Content Calendar | Describe strategy → 20+ content ideas across all channels |
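To give a feel for how mode switching could be wired up, here's a minimal sketch. The mode ids and prompt text below are illustrative stand-ins, not Apex's actual system prompts:

```typescript
// Hypothetical per-mode system prompt wiring. Prompt text is illustrative only.
type Mode = "chat" | "project" | "braindump" | "meeting" | "review" | "okr" | "calendar";

const SYSTEM_PROMPTS: Record<Mode, string> = {
  chat: "You can read and write the user's Notion workspace. Answer questions directly.",
  project: "You are a senior project manager. Turn one sentence into a hub page, task database, timeline, and risk log.",
  braindump: "Organize raw, unstructured notes into a clean Notion knowledge base.",
  meeting: "Extract decisions, action items, owners, and follow-ups from meeting notes.",
  review: "Scan the workspace and write a CEO-level weekly intelligence report.",
  okr: "Build a complete OKR system: tracker database, objectives, key results, and check-in templates.",
  calendar: "Turn a content strategy into 20+ ideas across channels in an editorial calendar.",
};

function systemPromptFor(mode: Mode): string {
  return SYSTEM_PROMPTS[mode];
}
```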

The Feature I'm Most Proud Of: Live Tool Use Visualization

Most AI tools are black boxes. You ask, you wait, you get an answer. You have no idea what happened in between.

Apex flips this. As the AI works, you see every single Notion operation as it happens — animated cards streaming in, showing you exactly what the AI is doing and why:

```
🌐  Scanning Workspace...                          ✓ 12 pages, 3 databases
🔍  Searching Notion...     "Q2 goals"             ✓ Found 2 related pages
📄  Creating Page...        "🎯 Q2 Goals Hub"      ✓ Created
🗄️  Building OKR Database.. "Q2 OKR Tracker"       ✓ Created with 8 fields
➕  Adding Record...        "Grow MRR by 40%"      ✓ Objective added
➕  Adding Record...        "Key Result: MRR > $50k" ✓ KR added
📄  Creating Page...        "📋 Weekly Check-in"   ✓ Template created
📄  Creating Page...        "🏆 Wins & Learnings"  ✓ Log created
```

Eight Notion operations, chained intelligently, executed autonomously. The whole OKR system is live in your workspace before you've finished reading the output.

This transparency is what makes it feel like a genuine AI colleague rather than a magic trick.


Video Demo


Technical Deep Dive

Stack

  • Frontend: Next.js 14 (App Router) + TypeScript
  • Styling: Tailwind CSS with glassmorphism, animated ambient orbs, dot-grid background
  • AI: Llama 3.3 70B via Groq SDK — streaming + multi-turn tool use
  • Notion: @notionhq/client — 10 MCP-compatible tools
  • Transport: Server-Sent Events for real-time streaming
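Since events stream over Server-Sent Events, each message is just a `data:` line with a JSON payload followed by a blank line. A minimal sketch of the framing (the event union is inferred from the payloads shown later in this post):

```typescript
// Event shapes assumed from the sendSSE calls shown in the streaming loop below.
type ApexEvent =
  | { type: "text"; content: string }
  | { type: "tool_start"; name: string; input: unknown }
  | { type: "tool_result"; name: string; result: unknown };

// One SSE frame: "data: <json>" terminated by a blank line.
function formatSSE(event: ApexEvent): string {
  return `data: ${JSON.stringify(event)}\n\n`;
}
```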

The 10 Notion MCP Tools

```typescript
const NOTION_TOOLS = [
  "search_notion",           // Full-text search across workspace
  "create_notion_page",      // Create pages with rich markdown → block conversion
  "get_page_content",        // Read any page's full content
  "update_notion_page",      // Append structured content to existing pages
  "create_project_database", // Task DB: Status, Priority, Due Date, Assignee, Tags
  "query_database",          // Retrieve records with optional filters
  "get_workspace_overview",  // Scan recent pages and databases
  "add_database_item",       // Insert rows with proper property types
  "create_goals_tracker",    // OKR DB: Objective/KR, Progress %, Owner, Quarter
  "create_content_calendar", // Editorial DB: Channel, Type, Status, Publish Date
]
```

Tools are defined in OpenAI-compatible format (which Groq uses), so each tool looks like:

```typescript
{
  type: "function",
  function: {
    name: "create_notion_page",
    description: "Create a new page in Notion with rich content...",
    parameters: {
      type: "object",
      properties: {
        title: { type: "string", description: "Title of the new page" },
        content: { type: "string", description: "Page content in markdown format..." },
        icon: { type: "string", description: "Emoji icon for the page" },
      },
      required: ["title", "content"],
    },
  },
}
```

The Multi-Turn Streaming Loop

This is the core of the whole thing. The AI doesn't just make one tool call — it chains as many as needed, reasoning about results before deciding what to do next:

```typescript
const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

// Keep looping until the model stops using tools
while (true) {
  const streamResponse = await groq.chat.completions.create({
    model: "llama-3.3-70b-versatile",
    tools: NOTION_TOOLS,
    messages: conversationHistory,
    stream: true,
  });

  // Accumulate streamed tool call fragments
  const toolCallsMap: Record<number, { id: string; name: string; arguments: string }> = {};

  for await (const chunk of streamResponse) {
    const delta = chunk.choices[0]?.delta;

    if (delta?.content) {
      // Stream text chunks to the frontend in real-time
      sendSSE({ type: "text", content: delta.content });
    }

    if (delta?.tool_calls) {
      for (const tc of delta.tool_calls) {
        // Accumulate streamed JSON fragments per tool call index
        if (!toolCallsMap[tc.index]) toolCallsMap[tc.index] = { id: "", name: "", arguments: "" };
        if (tc.id) toolCallsMap[tc.index].id = tc.id;
        if (tc.function?.name) toolCallsMap[tc.index].name = tc.function.name;
        if (tc.function?.arguments) toolCallsMap[tc.index].arguments += tc.function.arguments;
      }
    }
  }

  const toolCalls = Object.values(toolCallsMap);
  if (toolCalls.length === 0) break; // No more tool calls — done

  // Notify frontend each tool is starting
  for (const tc of toolCalls) {
    sendSSE({ type: "tool_start", name: tc.name, input: JSON.parse(tc.arguments) });
  }

  // Add assistant message with tool_calls to history
  conversationHistory.push({
    role: "assistant",
    tool_calls: toolCalls.map(tc => ({
      id: tc.id, type: "function",
      function: { name: tc.name, arguments: tc.arguments },
    })),
  });

  // Execute each tool against the Notion API
  for (const tc of toolCalls) {
    const result = await executeTool(tc.name, JSON.parse(tc.arguments));
    sendSSE({ type: "tool_result", name: tc.name, result });

    // Feed result back as a tool message for the next reasoning step
    conversationHistory.push({ role: "tool", tool_call_id: tc.id, content: JSON.stringify(result) });
  }
}
```

The key insight: the AI sees the result of each Notion operation before deciding what to do next. It's not blindly firing off a pre-planned list — it's actually reading what it created and adapting.

Markdown → Notion Blocks

One of the trickier engineering challenges: Notion's API doesn't accept markdown. Every page has to be structured as typed block objects. I wrote a full converter that handles 8 block types:

```typescript
function markdownToBlocks(markdown: string): NotionBlock[] {
  const lines = markdown.split("\n");
  return lines.flatMap(line => {
    if (line.startsWith("# "))   return [heading1Block(line.slice(2))]
    if (line.startsWith("## "))  return [heading2Block(line.slice(3))]
    if (line.startsWith("### ")) return [heading3Block(line.slice(4))]
    if (line.startsWith("- "))   return [bulletBlock(line.slice(2))]
    if (/^\d+\. /.test(line))    return [numberedBlock(line.replace(/^\d+\. /, ""))]
    if (line.startsWith("> "))   return [quoteBlock(line.slice(2))]
    if (line === "---")          return [dividerBlock()]
    if (line.trim())             return [paragraphBlock(line)]
    return []
  })
}
```

Headings, bullets, numbered lists, quotes, dividers, paragraphs — all properly typed and sent to Notion's API.
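For reference, the block builders the converter calls produce Notion's block object format. A sketch of a few of them (the helper names mirror the converter; the JSON shapes follow Notion's API):

```typescript
// Notion block objects wrap text in a rich_text array of text fragments.
const richText = (content: string) => [{ type: "text", text: { content } }];

const paragraphBlock = (text: string) => ({
  object: "block", type: "paragraph", paragraph: { rich_text: richText(text) },
});
const heading1Block = (text: string) => ({
  object: "block", type: "heading_1", heading_1: { rich_text: richText(text) },
});
const bulletBlock = (text: string) => ({
  object: "block", type: "bulleted_list_item", bulleted_list_item: { rich_text: richText(text) },
});
const dividerBlock = () => ({ object: "block", type: "divider", divider: {} });
```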

OKR Database Schema

When you say "set up my Q2 OKRs", Apex builds a database with exactly the right schema:

```typescript
// Note: Notion's API expects select options as { name } objects.
const okrSchema = {
  Name:       { title: {} },
  Type:       { select: { options: [{ name: "Objective" }, { name: "Key Result" }] } },
  Status:     { select: { options: ["Not Started", "On Track", "At Risk", "Behind", "Complete"].map(name => ({ name })) } },
  Progress:   { number: { format: "percent" } },
  Owner:      { rich_text: {} },
  "Due Date": { date: {} },
  Quarter:    { select: { options: [{ name: period }] } },
  Notes:      { rich_text: {} },
}
```

Not a generic template. A purpose-built structure that matches how real OKR systems work.
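And when `add_database_item` inserts a row into that database, it has to map each value onto the matching Notion property type. A hypothetical sketch of that mapping (the function name and row shape are mine, not Apex's exact code):

```typescript
// Illustrative: building Notion page properties for one OKR row.
function okrRowProperties(row: {
  name: string; type: "Objective" | "Key Result"; progress: number; owner: string;
}) {
  return {
    Name:     { title: [{ text: { content: row.name } }] },
    Type:     { select: { name: row.type } },
    Progress: { number: row.progress }, // 0–1, rendered by Notion as a percent
    Owner:    { rich_text: [{ text: { content: row.owner } }] },
  };
}
```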


Features That Make the UX Sing

Beyond the core AI functionality, I obsessed over the interface:

⌘K Command Palette — Hit Cmd+K to open a searchable command palette. Switch modes, fire quick actions, export your conversation, clear the session — all without touching the mouse.

Slash Commands — Type /project, /goals, /meeting directly in the chat input to instantly switch modes. Feels like a native app.
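The slash-command handling is a small lookup before the input ever reaches the AI. A minimal sketch — the command-to-mode table here is an assumption, not Apex's exact mapping:

```typescript
// Hypothetical slash command table; returns the target mode or null.
const SLASH_COMMANDS: Record<string, string> = {
  "/project": "project",
  "/goals": "okr",
  "/meeting": "meeting",
};

function parseSlashCommand(input: string): string | null {
  const cmd = input.trim().split(/\s+/)[0].toLowerCase();
  return SLASH_COMMANDS[cmd] ?? null;
}
```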

Voice Input — Click the mic button and just talk. Web Speech API converts your speech to text in real time, with animated pulse rings showing it's listening. Perfect for actual brain dumps.

Smart Follow-Up Suggestions — After every AI response, three contextual next-step prompts appear. Click any to instantly continue. It's like the AI is guiding you toward the next useful thing.

Session Stats Bar — A live counter in the top bar shows pages created, databases built, and total Notion operations performed this session. It's a small thing but makes the productivity feel visceral.
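Those stats can be derived entirely from the tool-result events already streaming to the frontend. A sketch, assuming the tool names from the `NOTION_TOOLS` list (the stats shape itself is an assumption):

```typescript
// Fold the session's tool events into the counters shown in the top bar.
type ToolEvent = { type: "tool_result"; name: string };

function sessionStats(events: ToolEvent[]) {
  const pageTools = ["create_notion_page"];
  const dbTools = ["create_project_database", "create_goals_tracker", "create_content_calendar"];
  return {
    pages: events.filter(e => pageTools.includes(e.name)).length,
    databases: events.filter(e => dbTools.includes(e.name)).length,
    operations: events.length,
  };
}
```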

Floating Success Toasts — Every time a Notion page or database is created, a beautiful animated toast slides up: "Page created: Q2 OKR Tracker — saved to Notion". Instant feedback, no hunting through your workspace.

Export Conversation — Download your entire session as a formatted markdown file. Great for sharing what you built.
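The export itself can be a simple fold over the message history. A minimal sketch — the message shape and labels are assumptions for illustration:

```typescript
// Turn the session history into a shareable markdown transcript.
type Message = { role: "user" | "assistant"; content: string };

function exportConversation(messages: Message[]): string {
  return messages
    .map(m => `**${m.role === "user" ? "You" : "Apex"}:**\n\n${m.content}`)
    .join("\n\n---\n\n");
}
```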

Workspace Dashboard — A separate /dashboard route showing live workspace stats, recent pages and databases, a full capability showcase, and a visual "How Apex Works" flow diagram.


Real-World Demo: OKR Mode in Action

What I typed:

"Set up Q2 OKRs — grow MRR by 40%, improve activation rate to 60%, ship mobile v1, and reduce churn below 3%"

What Apex did (30 seconds):

  1. Scanned workspace for existing goal-related pages
  2. Created a 🎯 Goals Hub page with vision statement and OKR methodology explained
  3. Built the Q2 OKR Tracker database with 8 properly typed fields
  4. Added each of the 4 objectives as database records (Type: Objective)
  5. Added 3 key results per objective (Type: Key Result, with measurable targets)
  6. Created a 📋 Weekly Check-in Template page — duplicate this every Monday
  7. Created a 🏆 Wins & Learnings Log for retrospectives

What I had to do: Type one sentence.

That's the whole thing.


What's Next

  • Scheduled Reports — Auto-run weekly review every Monday morning via cron
  • Notion Webhooks — React to workspace changes in real time
  • Multi-workspace — Switch between different Notion accounts
  • Template Library — Pre-built prompts for 20+ common use cases

Setup in 3 Minutes

```bash
git clone https://github.com/Aryakoste/apex-notion
cd apex-notion
npm install
cp .env.local.example .env.local
```

Edit .env.local:

```
GROQ_API_KEY=gsk_...
NOTION_API_KEY=secret_...
```
```bash
npm run dev
# Open http://localhost:3000
```

Getting your Groq API key (free):

  1. Go to console.groq.com → sign up (no credit card)
  2. API Keys → Create API Key → copy it

Getting your Notion API key:

  1. notion.so/my-integrations → New integration → copy the secret
  2. On each Notion page: ... menu → Add connections → select your integration

That's it. Everything else is handled by Apex.


Why This Matters Beyond the Demo

The reason I'm excited about this isn't the cool UI or the streaming animations — it's what it represents for how we use tools.

Notion has always required you to be the architect. You had to know what a good project page looks like. You had to remember to run your weekly review. You had to structure your OKRs correctly.

With MCP, the AI can be the architect. You just have to know what you want. The gap between intention and implementation collapses from hours to seconds.

I think this is the future of productivity software. Not AI that suggests. AI that does.


GitHub

Full source code: https://github.com/Aryakoste/apex-notion


#notionmcp #ai #nextjs #typescript #groq #llama #productivity #buildinpublic
