This is a submission for the Notion MCP Challenge
What I Built
Rio is a voice-controlled multi-agent AI system. You speak a task — Rio's agents plan it, execute it on your desktop, research it on the web, and write the result. Every single step of that process lives in your Notion workspace, written in real time, structured, searchable, and yours.
Notion isn't a plugin here. It's the mind Rio thinks in.
The Problem
AI agents are powerful but invisible. They spin up, do work, and disappear. You get a final answer with no trace of how it got there — no audit trail, no structured memory, nothing you can build on top of.
Rio solves that by making Notion the surface where agents think out loud. Every task claimed, every browser step taken, every report written — it all lands in structured Notion databases as it happens, not after the fact.
Demo
What you'll see:
- A pending task sitting in the Rio Task Queue database
- A voice command to Rio — "Research the top 3 AI agent frameworks and write a comparison"
- The Rio Logs database populating live — one row per agent step, in real time
- The Creative Agent writing a full formatted Notion page as the final deliverable
- The task row flipping to done
How Notion MCP Is Core
There are two distinct ways Notion is used — and both matter.
1. Notion as memory
Every agent writes structured logs as it works. Not at the end — during. The Orchestrator logs when it claims a task. The UI Navigator logs each browser action. The Creative Agent writes the full report as a Notion page.
# Every logger.info() becomes this
await memory.log_step("ui_navigator", "browser_step", action, "done")
# → live Notion row: agent name, event, detail, status, timestamp
2. Notion as a reasoning tool (via MCP)
Agents actively query and write Notion during execution — not just at completion. The Orchestrator polls the Task Queue for new jobs. The Creative Agent searches existing pages before writing.
tools, toolset = await MCPToolset.from_server(
    connection_params=SseServerParams(
        url="https://mcp.notion.com/sse",
        headers={"Authorization": f"Bearer {user_token}"}
    ),
    tool_filter=["notion-search", "notion-create-pages", "query-database"]
)
No .env Tokens. Anyone Can Use This.
Rio uses a full OAuth 2.0 + PKCE flow. A user clicks "Connect Notion," approves access, and their token lives in session memory. No database IDs are hardcoded anywhere. The moment they connect, Rio auto-creates their Rio Logs and Rio Task Queue databases — zero manual Notion setup.
User clicks Connect Notion
→ /oauth/start — PKCE verifier generated, redirect to Notion login
→ /oauth/callback — code exchanged, token stored in session memory
→ auto_setup() — Rio Logs DB + Task Queue DB created in their workspace
→ WebSocket init — agents get NotionMemory + MCPToolset injected
The only things in .env are your app's own client_id and client_secret from notion.so/my-integrations.
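The PKCE step in /oauth/start needs nothing beyond the standard library. A minimal sketch of generating the verifier/challenge pair per RFC 7636's S256 method (function name is illustrative, not from Rio's code):

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    # The verifier stays server-side in the user's session;
    # only the SHA-256 challenge is sent to Notion's authorize URL.
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode("ascii")
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return verifier, challenge
```

At /oauth/callback, the stored verifier is sent alongside the authorization code, letting Notion confirm the token request comes from the same party that started the flow — which is why no client-side secret ever needs to ship.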
Architecture
RIO — SYSTEM ARCHITECTURE
==========================
User
|
Voice Input
|
v
+-------------------------+
| Orchestrator Agent |
| polls Notion Task |
| Queue for pending jobs |
+-------------------------+
| | |
v v v
+------------+ +----------+ +-----------------+
| Live Agent | | UI Navi- | | Creative Agent |
| voice STT | | gator | | research, write |
| + TTS | | browser | | full Notion |
| | | desktop | | pages + reports |
+------------+ +----------+ +-----------------+
| | |
v v v
+---------------------------------------+
| NotionMemory |
| writes every step live to Notion |
| via REST API (httpx async) |
| |
| Rio Logs DB Task Queue DB |
| Agent / Event Task / Status |
| Detail / Status Agent / Result |
+---------------------------------------+
|
v
+---------------------------------------+
| MCPToolset |
| lets agents query + search Notion |
| during execution (SSE transport) |
| |
| notion-search query-database |
| notion-create append-block |
+---------------------------------------+
|
v
+-------------------------+
| User Notion Workspace |
| connected via OAuth |
| PKCE — no env tokens |
+-------------------------+
OAUTH FLOW
----------
User clicks "Connect Notion"
|
v
/oauth/start
generate code_verifier + code_challenge (PKCE S256)
|
v
api.notion.com/v1/oauth/authorize
user approves Rio access in their own browser
|
v
/oauth/callback
exchange code + verifier for access_token
store token in session (in-memory, per user)
|
v
NotionMemory.setup()
auto-creates Rio Logs DB + Task Queue DB
in the user workspace if not already present
|
v
MCPToolset.from_server()
connects to mcp.notion.com/sse
with user access_token — agents ready
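The auto-setup step above maps onto Notion's POST /v1/databases endpoint. A sketch of the payload that could create the Rio Logs database under a page the user shared during OAuth — the column names and status options are assumptions, not Rio's actual schema:

```python
def build_logs_db_payload(parent_page_id: str) -> dict:
    # Body for POST /v1/databases: one call creates the whole
    # Rio Logs schema, so the user never touches Notion setup.
    return {
        "parent": {"type": "page_id", "page_id": parent_page_id},
        "title": [{"type": "text", "text": {"content": "Rio Logs"}}],
        "properties": {
            "Agent": {"title": {}},          # every Notion DB needs one title column
            "Event": {"rich_text": {}},
            "Detail": {"rich_text": {}},
            "Status": {"select": {"options": [
                {"name": "pending"},
                {"name": "running"},
                {"name": "done"},
            ]}},
        },
    }
```

Idempotency comes from searching for an existing "Rio Logs" database first (e.g. via the search endpoint) and only creating one when the search comes back empty.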
Stack: Python · Google ADK · FastAPI · WebSockets · Notion REST API · Notion MCP (SSE) · httpx async · Google Cloud Run
What I Learned
Building agents is easy. Building agents with observable, persistent memory is what actually makes them useful.
The shift from logger.info() to await memory.log_step() sounds trivial — but it completely changes the character of the system. Rio's work is no longer ephemeral. It accumulates. You can go back, search it, build on it, share it. That's what Notion gives it.
The OAuth decision was also non-negotiable. An agent that only works with hardcoded tokens is a demo. An agent that any person can authorize with their own workspace is a product.