This is a submission for the Notion MCP Challenge
What I Built
THAWNE is an autonomous intelligence agent that monitors the Tempo blockchain ecosystem 24/7 — crawling docs, tracking on-chain activity, indexing AI agent payments, and building a living knowledge base. The problem? All that data was trapped in a SQLite database on a headless server. Useful for machines. Useless for humans.
Notion MCP turned Notion into the human interface for an autonomous system.
I built a bridge agent — a dual MCP client that connects Notion MCP and THAWNE's own MCP server simultaneously. It creates a bidirectional pipeline:
THAWNE → Notion (Export):
Every day, an automated sweep scans the blockchain ecosystem — new projects, protocol changes, market developments, on-chain activity. The bridge pushes enriched entities (with descriptions, relationships, analyst notes) and developments (with strategic analysis and entity links) into structured Notion databases. A human can open Notion and see exactly what the AI knows, organized and browsable.
Notion → THAWNE (Ingest):
Drop raw intelligence into an Intel Inbox database in Notion. Set the status to "New". The bridge picks it up, reads the page content, classifies it (single entity vs. structured sweep), and writes it into THAWNE's knowledge base. Status flips to "Ingested" automatically. Need to update something? Change status to "Updated" — the bridge re-ingests with a full overwrite.
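The inbox flow above is a small state machine. Here is a minimal sketch of the routing logic, assuming only the three status values described (function and return values are illustrative, not the actual bridge code):

```python
def route_inbox_item(status: str) -> str:
    """Decide what the bridge does with an Intel Inbox page.

    Returns one of: "ingest" (new intel), "reingest" (full overwrite),
    or "skip" (already processed or not ready).
    """
    if status == "New":
        return "ingest"    # first-time ingestion into the knowledge base
    if status == "Updated":
        return "reingest"  # human edited the page: overwrite THAWNE's copy
    return "skip"          # "Ingested" or anything else: nothing to do


def next_status(action: str, current: str) -> str:
    """Status the bridge writes back to the Notion page after acting."""
    return "Ingested" if action in ("ingest", "reingest") else current
```

Keeping the routing separate from the MCP calls makes the bridge easy to dry-run against a list of inbox pages.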
The result: Notion becomes the eyes and hands of an AI system. The agent sees the world, Notion shows you what it sees, and you can reach in and correct or feed it — all without touching a database, writing SQL, or deploying a dashboard.
What makes this different
Most MCP integrations connect AI to Notion as a fancy note-taking tool. THAWNE flips this. Notion isn't where the AI writes notes — it's the control surface for an autonomous system that:
- Monitors a live blockchain via RPC (blocks, transactions, wallet classification)
- Tracks 82+ AI payment services listed in the Machine Payments Protocol (MPP) directory
- Maintains a knowledge graph of 45 entities with 49 relationships, analyst notes, and significance ratings
- Runs its own MCP server with 7 tools, including payment-gated queries via MPP (agents pay to access intelligence)
- Operates on a daily cron cycle — sweep at 5:05 AM, Notion sync at 5:30 AM, no human needed
Notion MCP made it possible to build a human-in-the-loop layer on top of all this without writing a single line of frontend code.
Architecture
+------------------------------------------------------------+
|                        DAILY CYCLE                         |
|                                                            |
| 05:05  Automated Sweep (web + chain + MPP directory)       |
|          |                                                 |
|          v                                                 |
|        THAWNE DB  <-- new entities, developments, relations|
|          |                                                 |
| 05:30  Bridge Agent (cron)                                 |
|          |                                                 |
|          +--> Notion MCP --> Sync entities + developments  |
|          |                   to Notion databases           |
|          |                                                 |
|          +--> Notion MCP --> Check Intel Inbox for new     |
|          |                   items, ingest -> THAWNE DB    |
|          v                                                 |
|        Status: "Ingested"                                  |
|                                                            |
+------------------------------------------------------------+
|                        HUMAN LAYER                         |
|                                                            |
| Notion: Intel Inbox   -> Drop raw intel, set "New"         |
| Notion: Entities DB   -> Browse enriched profiles          |
| Notion: Developments  -> Timeline of ecosystem events      |
| Notion: Sweep Reports -> Daily intelligence summaries      |
|                                                            |
| Edit an entity? -> Set status "Updated" -> auto re-sync    |
|                                                            |
+------------------------------------------------------------+
|                     THAWNE MCP SERVER                      |
|                                                            |
| 7 tools (ecosystem, MPP directory, entity intel,           |
|          chain activity, wallet lookup, top wallets,       |
|          recent developments)                              |
| 1 resource (ecosystem summary)                             |
| 1 prompt (analyze entity template)                         |
| MPP payment gate (agents pay per query)                    |
|                                                            |
+------------------------------------------------------------+
Video Demo
Show us the code
dr-gideon/thawne-notion-bridge
Dual MCP client bridging an autonomous blockchain intelligence agent with Notion. Built for the Notion MCP Challenge.
THAWNE Notion Bridge
A dual MCP client that creates a bidirectional sync pipeline between an autonomous intelligence agent (THAWNE) and Notion.
Built for the Notion MCP Challenge.
What it does
THAWNE is an autonomous agent that monitors the Tempo blockchain ecosystem 24/7. It tracks entities, developments, on-chain activity, and AI payment services — all stored in a SQLite knowledge base on a headless server.
The bridge turns Notion into the human interface for this system:
- THAWNE → Notion: Pushes enriched entities (profiles, relationships, analyst notes), developments (with strategic analysis), and daily sweep reports into structured Notion databases.
- Notion → THAWNE: Reads raw intelligence from a Notion "Intel Inbox", classifies it, and writes it into the knowledge base. Supports re-ingestion via status flags.
Every database operation goes through Notion MCP. Zero REST API calls.
The submission repo contains the bridge agent — the core of this project. The full THAWNE ecosystem (MCP server, chain observer, wallet analyzer) runs on a private server, but the bridge is fully open source.
Key files:
- bridge.py — The dual MCP client (~700 lines of Python). Connects to both Notion MCP and THAWNE MCP simultaneously via stdio. Six commands: status, ingest, sync-entities, sync-developments, sync-sweep-reports, sync-all. Handles idempotency, rate limiting, and error recovery.
- config/notion_ids.example.json — Template showing the database ID and data source ID structure the bridge needs.
- requirements.txt — Dependencies (just mcp and sqlite3).
Supporting infrastructure (private, referenced for context):
- THAWNE MCP Server — 7 tools exposing blockchain intelligence, including MPP payment-gated queries. Dual transport (stdio + HTTP/SSE).
- Chain Observer — Real-time blockchain indexer. Polls Tempo RPC, indexes blocks/transactions/wallets, auto-classifies wallet types.
- Daily Sweep — Automated cron job that crawls web sources, Tempo chain data, and the MPP service directory every morning. The bridge syncs these results to Notion 25 minutes later.
How I Used Notion MCP
The bridge agent is a dual MCP client — it holds two simultaneous MCP sessions (Notion + THAWNE) and orchestrates data flow between them. Every database operation goes through Notion MCP. Zero REST API calls.
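In outline, the bridge holds two live sessions and pipes a tool result from one into a tool call on the other. A simplified sketch of one sync step — the THAWNE-side tool name and the page-building logic here are illustrative; the real bridge.py adds idempotency checks and rate limiting:

```python
import asyncio


async def sync_one_entity(thawne_session, notion_session,
                          entity_name: str, database_id: str):
    """Read one entity from THAWNE's MCP server and write it to Notion.

    Both session arguments are live MCP client sessions exposing
    call_tool(name, arguments); "entity_intel" is an illustrative
    THAWNE tool name.
    """
    intel = await thawne_session.call_tool("entity_intel",
                                           {"name": entity_name})
    payload = {
        "parent": {"database_id": database_id},
        "properties": {
            "Name": {"title": [{"text": {"content": entity_name}}]},
        },
    }
    # The real bridge maps fields of `intel` into more properties
    # and into body blocks; a single title property keeps this minimal.
    return await notion_session.call_tool("API-post-page", payload)
```

Because both sessions expose the same `call_tool` interface, the orchestration code never cares which side is Notion and which is THAWNE — it is just routing structured payloads between two MCP servers.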
MCP Tools Used
Querying data: API-query-data-source with data_source_id to query all four Notion databases. This was the trickiest part — data source IDs are different from database IDs, and discovering this mapping required using API version 2025-09-03 during database creation.
Creating pages: API-post-page with structured properties (title, select, multi_select, number, date, URL) to create entity and development pages.
Updating pages: API-patch-page for property updates (status changes, sync timestamps) and idempotent re-syncs.
Page content: API-patch-block-children to add rich block content — headings, paragraphs, bulleted lists, dividers — creating structured entity profiles with Description, Relations, and Analyst Notes sections.
Reading content: API-get-block-children to extract page body content during ingestion. Intel dropped into Notion as free-form text gets parsed and structured into THAWNE's schema.
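Property payloads have to match the target database's schema exactly, with each value wrapped in its property type. A sketch of the shape the bridge sends for an entity page (the property names mirror the databases described in this post, but treat them as illustrative):

```python
def entity_properties(name: str, thawne_id: int, category: str,
                      significance: int, last_synced: str) -> dict:
    """Build the `properties` argument for API-post-page / API-patch-page.

    Each key must match a property that already exists on the target
    database, and each value must use that property's type wrapper
    (title, number, select, date, ...).
    """
    return {
        "Name":         {"title": [{"text": {"content": name}}]},
        "THAWNE ID":    {"number": thawne_id},
        "Category":     {"select": {"name": category}},
        "Significance": {"number": significance},
        "Last Synced":  {"date": {"start": last_synced}},
    }
```

Centralizing payload construction in one function also gives the bridge a single place to keep the schema honest when the Notion databases change.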
The Hard Parts
Notion MCP v2.0 silently accepts invalid properties. During development, I sent wrong property types (rich_text instead of number for IDs, nonexistent select options). The API returned success. Pages were created with blank fields. No errors. This cost hours of debugging until I learned to verify every write.
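The workaround is a read-back check after every write: compare what came back against what was sent and treat any silently dropped field as a failure. A minimal version of that check, assuming Notion's page-object shape for the response (the helper name is mine):

```python
def find_dropped_fields(sent: dict, returned: dict) -> list:
    """Return property names that were sent but came back empty.

    `sent` is the properties payload given to API-post-page; `returned`
    is the `properties` object of the page the API sent back. A property
    whose typed value comes back None or empty was silently discarded.
    """
    dropped = []
    for name, payload in sent.items():
        got = returned.get(name)
        if got is None:
            dropped.append(name)
            continue
        # The payload's single key ("number", "select", ...) names the type.
        type_key = next(iter(payload))
        if got.get(type_key) in (None, [], {}):
            dropped.append(name)
    return dropped
```

If the returned list is non-empty, the bridge knows the "successful" write actually lost data and can log or retry instead of moving on.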
Database IDs vs. Data Source IDs. API-post-page needs a database_id. API-query-data-source needs a data_source_id. They're different values for the same database. The data source ID only appears when you create a database with API version 2025-09-03, but that version silently drops non-title properties during creation. Solution: create with 2022-06-28, then retrieve data source IDs with 2025-09-03.
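Once the database exists, the mapping can be pulled from the retrieve-database response. A sketch, assuming the 2025-09-03 response shape in which each database object carries a `data_sources` array of `{id, name}` entries:

```python
def extract_data_source_ids(database: dict) -> dict:
    """Map data source names to data_source_ids from a database object
    retrieved with Notion-Version 2025-09-03.

    Older API versions omit the `data_sources` field entirely, which is
    why the bridge creates with 2022-06-28 but retrieves with the newer
    version to discover these IDs.
    """
    return {ds["name"]: ds["id"] for ds in database.get("data_sources", [])}
```

The bridge stores the resulting mapping in config/notion_ids.json so it never has to rediscover it at sync time.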
Idempotency across two systems. Running sync twice shouldn't create duplicates. The bridge stamps every Notion page with a THAWNE ID (a number property) and compares ISO timestamps in Last Synced to decide whether an update is needed: a record is skipped only when the page's Last Synced date is strictly newer than the record's update date, so same-day changes always propagate.
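That skip rule is compact enough to show directly. A sketch, assuming ISO-formatted dates/timestamps on both sides (the function name is illustrative):

```python
from datetime import date


def should_sync(thawne_updated: str, notion_last_synced) -> bool:
    """Decide whether a THAWNE record needs pushing to its Notion page.

    Both arguments are ISO strings ("YYYY-MM-DD..."); notion_last_synced
    is None for a page that has never been synced. We skip only when the
    page's Last Synced date is strictly newer than the record's update
    date, so same-day changes always propagate.
    """
    if notion_last_synced is None:
        return True
    updated = date.fromisoformat(thawne_updated[:10])
    synced = date.fromisoformat(notion_last_synced[:10])
    return synced <= updated
```

Comparing at day granularity is deliberately conservative: a redundant same-day re-sync costs a few API calls, while a wrongly skipped update would leave Notion stale.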
Rate limiting. Notion's API allows an average of ~3 requests per second. With 45 entities and 100 developments, a full sync means ~290 API calls. A 0.5-second sleep between calls keeps it safely under the limit, and the bridge serializes all operations rather than batching.
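The pacing logic itself is tiny. A sketch with an injectable clock so it can be tested without actually sleeping, assuming the 0.5 s spacing described above (class and parameter names are mine):

```python
import time


class Throttle:
    """Enforce a minimum interval between consecutive API calls.

    0.5 s spacing keeps a full ~290-call sync comfortably under Notion's
    ~3 requests/second average rate limit.
    """

    def __init__(self, interval: float = 0.5,
                 clock=time.monotonic, sleep=time.sleep):
        self.interval = interval
        self.clock = clock
        self.sleep = sleep
        self._last = None  # monotonic time of the previous call

    def wait(self) -> float:
        """Block until the next call is allowed; return seconds slept."""
        now = self.clock()
        slept = 0.0
        if self._last is not None:
            remaining = self.interval - (now - self._last)
            if remaining > 0:
                self.sleep(remaining)
                slept = remaining
        self._last = self.clock()
        return slept
```

Calling `throttle.wait()` before every MCP tool call serializes the sync at a safe pace without scattering sleep calls through the code.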
What Notion MCP Unlocks
The real unlock isn't "AI can read/write Notion." It's that Notion becomes a zero-code frontend for any autonomous system.
THAWNE runs on a headless VPS. It has no web dashboard. No admin panel. No UI at all. But through Notion MCP, a non-technical person can:
- See everything the AI knows (browse Entities, read Developments)
- Feed it new information (drop intel in Inbox)
- Correct mistakes (edit + set "Updated")
- Review daily intelligence (read Sweep Reports)
All without SSH, SQL, or asking the AI directly. Notion is the universal human interface.
Top comments (2)
Really interesting inversion of Notion as an operational interface rather than a note-taking tool. The bidirectional bridge pattern is clever.
One thing I keep running into with these dual-MCP-client architectures: once you add a second or third autonomous agent to the system, the point-to-point wiring gets fragile fast. Who arbitrates when two agents want to write to the same Notion database? How do you scope what each agent can access? Curious if you've thought about that as THAWNE grows.