DEV Community

fliptrigga13

NEXUS Ultra Notion: I wired my local AI swarm to Notion MCP and it updates live every 35 seconds

The Problem I Solved

I was running 6 AI agents locally and had zero visibility into what they were doing. Every monitoring tool I tried added cloud costs I didn't want and still couldn't tell me why an agent drifted or looped silently.

So I built a self-monitoring AI swarm. The agents score each other. The best-performing agent each cycle gets logged. Lessons get written back to memory. After 72 generations it's meaningfully different from day one.

The problem: all of that intelligence was stuck in SQLite, Redis logs, and a terminal window.

What I Built with Notion MCP

I connected the NEXUS swarm to Notion using the Notion MCP API. Every 35 seconds, synced to the swarm's cycle interval, a Python connector reads the latest cycle data and pushes it into 3 live Notion databases automatically:

Cycle Reports — Every completed swarm cycle gets a row: cycle ID, score, MVP agent, task description, latency, status.

Agent Leaderboard — All 12 agents tracked in real time with individual scores and trend direction. The MVP agent gets flagged automatically.

Buyer Intelligence — SCOUT agent outputs, signals about potential buyers, flow directly into Notion where I can act on them without touching the terminal.
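To make the three databases concrete, here is a minimal sketch of what their typed-property schemas might look like in the Notion API's property-map format. The property names and types below are assumptions inferred from the column descriptions above, not the actual NEXUS code:

```python
# Hypothetical property schemas for the three databases the connector creates.
# Every Notion database requires exactly one property of type "title".

CYCLE_REPORTS_SCHEMA = {
    "Cycle ID": {"title": {}},
    "Score": {"number": {"format": "number"}},
    "MVP Agent": {"rich_text": {}},
    "Task": {"rich_text": {}},
    "Latency (ms)": {"number": {"format": "number"}},
    "Status": {"select": {"options": [
        {"name": "completed", "color": "green"},
        {"name": "failed", "color": "red"},
    ]}},
}

AGENT_LEADERBOARD_SCHEMA = {
    "Agent": {"title": {}},
    "Score": {"number": {"format": "number"}},
    "Trend": {"select": {"options": [
        {"name": "up", "color": "green"},
        {"name": "down", "color": "red"},
        {"name": "flat", "color": "gray"},
    ]}},
    "MVP": {"checkbox": {}},  # flagged automatically each cycle
}

BUYER_INTEL_SCHEMA = {
    "Signal": {"title": {}},
    "Source Agent": {"rich_text": {}},
    "Captured At": {"date": {}},
}
```

Defining the schemas as plain dicts keeps them easy to pass straight into the database-creation call and to version alongside the connector.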

Setup

pip install requests

Add to .env:

NOTION_TOKEN=secret_your_token_here

Get your token at notion.so/profile/integrations. Create an integration, share a page with it, then:

python nexus_notion_sync.py

The connector auto-creates all 3 databases on first run. No manual Notion setup beyond the token.
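The auto-creation step boils down to one `POST /v1/databases` call per database. Here is a hedged sketch of what that looks like with `requests`; the function names are mine, and the payload shape follows the public Notion API, not the actual connector source:

```python
import os

import requests

NOTION_API = "https://api.notion.com/v1"
NOTION_VERSION = "2022-06-28"  # a stable Notion-Version header value


def build_database_payload(parent_page_id: str, title: str, properties: dict) -> dict:
    """Build the JSON body for POST /v1/databases.

    `properties` is a Notion typed-property map, e.g. {"Cycle ID": {"title": {}}}.
    """
    return {
        "parent": {"type": "page_id", "page_id": parent_page_id},
        "title": [{"type": "text", "text": {"content": title}}],
        "properties": properties,
    }


def create_database(token: str, parent_page_id: str, title: str, properties: dict) -> str:
    """Create a database under a page the integration was shared with; returns its ID."""
    resp = requests.post(
        f"{NOTION_API}/databases",
        headers={
            "Authorization": f"Bearer {token}",
            "Notion-Version": NOTION_VERSION,
            "Content-Type": "application/json",
        },
        json=build_database_payload(parent_page_id, title, properties),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["id"]
```

Run once per database on first start, persist the returned IDs, and every later sync is just row inserts against those IDs.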

How Notion MCP Is Used

The connector uses the Notion REST API to create databases programmatically with typed properties, push structured rows, and configure itself entirely from .env.

The entire data flow: six local LLM agents run via Ollama (phi4:14b, qwen2.5:14b, mistral:7b) and complete a self-evaluation cycle. The REWARD agent scores them all. A Redis blackboard captures the state. The Python connector reads it every 35 seconds and pushes it to Notion via MCP.
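The last hop of that flow, reading the blackboard and pushing a row, can be sketched as below. The Redis key, the cycle-record fields (`id`, `score`, `mvp`, `status`), and the property names are all assumptions for illustration, not the real NEXUS schema:

```python
import json
import time

import requests


def cycle_to_properties(cycle: dict) -> dict:
    """Map one swarm-cycle record to Notion page properties (hypothetical fields)."""
    return {
        "Cycle ID": {"title": [{"text": {"content": str(cycle["id"])}}]},
        "Score": {"number": float(cycle["score"])},
        "MVP Agent": {"rich_text": [{"text": {"content": cycle["mvp"]}}]},
        "Status": {"select": {"name": cycle.get("status", "completed")}},
    }


def sync_forever(token: str, database_id: str, interval: float = 35.0) -> None:
    """Poll the Redis blackboard and push each new cycle as a Notion row."""
    import redis  # deferred import so the mapping above works without redis installed

    r = redis.Redis()
    headers = {
        "Authorization": f"Bearer {token}",
        "Notion-Version": "2022-06-28",
        "Content-Type": "application/json",
    }
    last_seen = None
    while True:
        raw = r.get("nexus:latest_cycle")  # hypothetical blackboard key
        if raw is not None:
            cycle = json.loads(raw)
            if cycle["id"] != last_seen:  # skip cycles already pushed
                requests.post(
                    "https://api.notion.com/v1/pages",
                    headers=headers,
                    json={
                        "parent": {"database_id": database_id},
                        "properties": cycle_to_properties(cycle),
                    },
                    timeout=30,
                ).raise_for_status()
                last_seen = cycle["id"]
        time.sleep(interval)
```

Tracking the last-seen cycle ID keeps the 35-second poll idempotent: if the swarm hasn't finished a new cycle, nothing is written.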

Results After First 20 Cycles

  • Cycle Reports: 6 rows and growing
  • Agent Leaderboard: 12 agents tracked per cycle with zero manual input
  • Buyer Intelligence: 8 real buyer signals already surfaced from SCOUT outputs
  • Zero cloud AI cost — all models run locally on RTX 4060

Code

https://github.com/fliptrigga13/nexus-notion-mcp

The product this swarm is helping sell: veil-piercer.com
