I Hooked My Autonomous AI Outreach Swarm to Notion via MCP — It Reports Every Cycle in Real-Time
Submission for the Notion MCP Challenge.
What I Built
NEXUS → Notion MCP Bridge: a Python client that connects an autonomous AI outreach swarm to a Notion database using the official @notionhq/notion-mcp-server over stdio transport — no REST API, pure Model Context Protocol.
Every 90 seconds, the swarm runs a cycle: scrape a live Reddit thread, write a reply, score it 0.0–1.0. Every passing cycle becomes a Notion database page automatically, giving me a real-time command center to review AI-generated content before it goes live.
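For context, a single entry in cycles.json looks roughly like this. The field names here are illustrative, not the exact schema from the repo, and the pass threshold is an assumed cutoff:

```python
# Hypothetical shape of one cycle record (field names assumed, not the
# exact schema from the repo)
cycle = {
    "id": "cycle-0042",
    "title": "Reply to r/selfhosted thread on uptime monitoring",
    "score": 0.87,           # 0.0-1.0 quality score from the swarm
    "posted": False,         # flipped once the reply actually goes live
    "timestamp": "2025-01-15T03:12:00Z",
    "body": "Full AI-generated reply text...",
}

PASS_THRESHOLD = 0.7  # assumed cutoff; only passing cycles reach Notion
passes = cycle["score"] >= PASS_THRESHOLD
```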
GitHub: https://github.com/fliptrigga13/nexus-notion-mcp
The Problem
I run an autonomous outreach swarm that generates Reddit replies for my AI monitoring product. The swarm ran overnight, and by morning I had a JSON file with 200 entries and no way to review them at a glance.
Notion was the obvious answer — a queryable, filterable view with a checkbox to track what was posted. The question was how to connect them.
How the Notion MCP Bridge Works
The key design decision: use the official Notion MCP server as the transport layer, not the REST API directly.
AI Swarm (Python) → cycles.json → MCP Bridge → notion-mcp-server (stdio) → Notion DB
The bridge spawns notion-mcp-server as a subprocess and communicates over stdin/stdout using JSON-RPC 2.0:
```python
import os
import subprocess

class NotionMCPClient:
    def start(self):
        # Pass the Notion integration token through to the MCP server subprocess
        env = {**os.environ, "NOTION_API_KEY": self._api_key}
        self._proc = subprocess.Popen(
            ["notion-mcp-server", "--transport", "stdio"],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            env=env,
            text=True,
        )
        self._initialize()

    def _initialize(self):
        # JSON-RPC 2.0 initialize handshake required by the MCP spec
        self._send({
            "jsonrpc": "2.0",
            "id": self._next_id(),
            "method": "initialize",
            "params": {
                "protocolVersion": "2024-11-05",
                "capabilities": {},
                "clientInfo": {"name": "nexus-notion-mcp-bridge", "version": "1.0"},
            },
        })
```
Once initialized, the bridge calls notion_create_page for each new cycle:
```python
def create_page(self, database_id: str, cycle: dict) -> bool:
    # Unpack the cycle record into the fields the Notion page needs
    title = cycle["title"]
    score = cycle["score"]
    posted = cycle["posted"]
    ts = cycle["timestamp"]
    body_block = {
        "object": "block",
        "type": "paragraph",
        "paragraph": {"rich_text": [{"text": {"content": cycle["body"]}}]},
    }
    resp = self._send({
        "jsonrpc": "2.0",
        "id": self._next_id(),
        "method": "tools/call",
        "params": {
            "name": "notion_create_page",
            "arguments": {
                "parent": {"database_id": database_id},
                "properties": {
                    "Name": {"title": [{"text": {"content": title}}]},
                    "Score": {"number": score},
                    "Posted": {"checkbox": posted},
                    "Timestamp": {"date": {"start": ts}},
                },
                "children": [body_block],
            },
        },
    })
    return bool(resp) and "error" not in resp
```
The Notion Database
Each row captures the full cycle: score, whether it was posted, timestamp, thread context, and the full outreach copy in the page body. The database becomes a live review queue — filter by Posted = false, sort by Score descending.
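That review-queue view can also be driven over the same MCP pipe. Here's a hypothetical `tools/call` payload for the filter described above; the exact query tool name exposed by notion-mcp-server may differ by version, so treat the name as an assumption:

```python
def build_review_queue_query(database_id: str, request_id: int) -> dict:
    """Build a JSON-RPC tools/call payload for the review-queue filter:
    Posted = false, sorted by Score descending."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "notion_query_database",  # assumed tool name
            "arguments": {
                "database_id": database_id,
                "filter": {"property": "Posted", "checkbox": {"equals": False}},
                "sorts": [{"property": "Score", "direction": "descending"}],
            },
        },
    }
```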
Setup
```bash
npm install -g @notionhq/notion-mcp-server
export NOTION_API_KEY=your_token
export NOTION_DATABASE_ID=your_db_id

python nexus_notion_mcp_bridge.py        # continuous polling
python nexus_notion_mcp_bridge.py --once # sync once and exit
```
Why MCP Instead of REST
The REST API would have worked. But using the MCP server as the transport layer means:
- Any MCP-compatible AI client can now interact with the same Notion workspace
- The bridge is swappable: point it at any other MCP server and the Python client stays the same
- The stdio transport keeps everything local, with no HTTP overhead and no additional auth surface
The MCP protocol is the right abstraction for AI-to-tool communication. The swarm is an AI system, Notion is the tool, MCP is the protocol that connects them.
What's Next
- Add a notion_update_page call when a cycle gets posted, to sync the checkbox back
- Build a filtered view surfacing only high-scoring unposted entries
- Connect the REFLECT agent's insights to a Notion notes page