Using Notion as a Database Backend for AI Agents
I use Notion as the database for my entire AI agent system. Not Postgres. Not SQLite. Not Supabase. Notion. And before you close this tab — let me explain why it actually works, and where it absolutely doesn't.
Why Notion over a real database
The honest answer: my wife can see it.
She runs an esthetician business and does affiliate marketing. She doesn't know what a SQL query is, and she shouldn't have to. But she checks Notion every day. When my AI agent tracks her packages, scouts trending products, or syncs her calendar — she can see all of it in Notion without me building a frontend.
That's the real value proposition. Notion is a database with a UI that already exists. For a solo builder working nights after the kid goes to bed, "I don't have to build a dashboard" is worth more than "I can do JOINs."
The other reasons:
- Rich text and relations — I can store a deal memo with formatted text, linked contacts, and file attachments in one record. Try doing that in SQLite.
- Free tier is generous — I haven't paid them a cent. API rate limits are the constraint, not pricing.
- API is decent — Create, read, update, filter, sort. Covers 90% of what I need.
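To make "filter, sort" concrete: queries go to the `POST /v1/databases/{id}/query` endpoint as a JSON body. Here's a minimal sketch of building that body — the property names ("Status", "Created") are hypothetical placeholders, not the actual schema from my dashboards:

```python
import json

NOTION_VERSION = "2022-06-28"  # version string Notion's API expects in headers

def build_query_payload(status="Delivered", sort_prop="Created"):
    """Build the JSON body for POST /v1/databases/{id}/query.

    "Status" (a select property) and "Created" are example property
    names; substitute whatever your database schema actually uses.
    """
    return {
        "filter": {
            "property": "Status",
            "select": {"equals": status},
        },
        "sorts": [{"property": sort_prop, "direction": "descending"}],
    }

payload = build_query_payload()
print(json.dumps(payload, indent=2))
```

You POST that payload with your integration token and the `Notion-Version` header, and page through results with the `start_cursor` the API hands back.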
The 7 dashboards
My Notion workspace has 7 dashboards that update automatically:
- Health Dashboard — VPS uptime, last heartbeat, error counts, API usage
- Bets — Every prediction with P&L tracking
- Deals — Investment deals with status, next steps, key dates
- Family — Soccer games, school events, merged calendar
- Orders — Package tracking from email receipt to delivery
- Garden — 26 seed varieties with germination dates and zone assignments
- Home — The meta-dashboard that links to everything
The garden one is my favorite. It tracks when I started each seed indoors, expected germination days, actual sprout dates, and which zone they'll go in. My wife thinks I'm insane for having an AI track my zinnia seedlings. She's probably right.
SDK v3.0 broke everything
In February, the Notion Python SDK shipped v3.0 with a breaking change to database creation. The parent parameter changed from:
```python
# Old (v2.x) - worked fine
parent = {"type": "page_id", "page_id": page_id}

# New (v3.0) - required format
parent = {"page_id": page_id}
```
No "type" key anymore. The SDK now infers the parent type internally. This broke my template creator, dashboard builder, and sales tracker — basically everything that creates databases programmatically. The error message was useless.
I found the fix by reading the SDK changelog. The migration guide was buried in a GitHub discussion, not the docs. An hour of debugging for a one-line fix.
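If you need to support both SDK generations during a migration, one way to isolate the breakage is a tiny helper that builds the parent dict for whichever major version you're running. This is a sketch — `make_parent` is my name for it, and in practice you'd detect the installed version rather than pass it in:

```python
def make_parent(page_id, sdk_major=3):
    """Return the database-creation parent dict for a given SDK major version.

    v2.x required an explicit "type" key; v3.0 infers the parent type
    and the old shape stops working.
    """
    if sdk_major >= 3:
        return {"page_id": page_id}
    return {"type": "page_id", "page_id": page_id}

print(make_parent("abc123", sdk_major=2))
print(make_parent("abc123", sdk_major=3))
```

Funneling every database-creation call through one helper like this means the next breaking change is a one-line fix instead of a grep through three scripts.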
No webhooks — and how I deal with it
This is Notion's biggest limitation: no webhooks. You can't say "when this database changes, call my endpoint." You have to poll.
My approach uses different intervals based on time-sensitivity:
```python
# Fast (every 5 min):    Orders + Health dashboards
# Medium (every 15 min): Deals + Family dashboards
# Daily (8am):           full rebuild of all dashboards
# 12-hour:               complete refresh of everything
```
It's wasteful compared to webhooks. Every poll costs an API call even if nothing changed. But for a personal system with 7 dashboards, the cost is negligible — roughly 200 API calls per day total.
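The tiered schedule above can be sketched as a table of intervals plus a function that decides which dashboards are due on a given tick. The dashboard names and the `due_dashboards` helper are illustrative, not my actual scheduler:

```python
# Poll intervals in minutes, matching the tiers above
POLL_TIERS = {
    "orders": 5, "health": 5,    # fast tier
    "deals": 15, "family": 15,   # medium tier
}

def due_dashboards(minutes_elapsed, tiers=POLL_TIERS):
    """Return the dashboards whose interval divides the elapsed minutes.

    A cron-style driver would call this once a minute; the daily 8am
    rebuild and 12-hour refresh live in separate cron entries.
    """
    return [name for name, interval in tiers.items()
            if minutes_elapsed % interval == 0]

print(due_dashboards(5))   # → ['orders', 'health']
print(due_dashboards(15))  # fast and medium tiers both fire
```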
Rate limits and the workaround
Notion's API rate limit is 3 requests per second. Sounds fine until you're updating 26 garden plants, each needing a page update with 5 properties. That's 26 API calls, and at 3/second you need ~9 seconds minimum.
My fix is boring but effective:
```python
import time

NOTION_RATE_LIMIT = 0.35  # seconds between calls (slightly > 1/3s)

def notion_request(method, url, **kwargs):
    """Throttled wrapper around a requests method (requests.get, .post, ...).

    `headers` carries the Notion auth token and version header,
    defined once at module level.
    """
    time.sleep(NOTION_RATE_LIMIT)
    response = method(url, headers=headers, **kwargs)
    if response.status_code == 429:
        # Back off for as long as Notion asks, then retry once
        retry_after = int(response.headers.get('Retry-After', 1))
        time.sleep(retry_after)
        response = method(url, headers=headers, **kwargs)
    return response
```
A fixed delay between every request, plus retry-on-429. Not fancy. Bulk updates are slow. But it never hits rate limits, and for background automation, "slow but reliable" beats "fast but occasionally fails."
What I'd do differently
If I were starting over, I'd still use Notion — but I'd add a local SQLite cache for time-series data. The bet history doesn't need to live in Notion. It needs somewhere I can run analytics on it. Right now, to calculate win rate, I have to query Notion for every record, page through results, and compute it in Python. A local database would make that instant.
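To show what "instant" looks like: with the bet history mirrored into SQLite, win rate and P&L collapse into one query instead of a paged crawl through the Notion API. A minimal sketch with made-up sample rows (the schema and numbers are illustrative, not my real data):

```python
import sqlite3

# In practice this would be a local file (e.g. bets.db) kept in
# sync with the Notion database; :memory: keeps the sketch self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bets (id INTEGER PRIMARY KEY, won INTEGER, pnl REAL)")
conn.executemany(
    "INSERT INTO bets (won, pnl) VALUES (?, ?)",
    [(1, 12.5), (0, -5.0), (1, 3.0), (0, -2.0)],  # fabricated sample bets
)

# One query replaces "fetch every record, page through, compute in Python"
win_rate, total_pnl = conn.execute(
    "SELECT AVG(won), SUM(pnl) FROM bets").fetchone()
print(f"win rate: {win_rate:.0%}, P&L: {total_pnl:+.1f}")  # → win rate: 50%, P&L: +8.5
```

Notion stays the human-facing view; SQLite becomes the analytics layer underneath it.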
I'd also structure databases more carefully upfront. I have some where I'm storing JSON blobs in a rich_text property because I ran out of property types. That's a code smell that means "you should have used a real database for this part."
But for the core use case — dashboards that humans look at, data that benefits from rich formatting, and a system where the builder and the users are the same household — Notion genuinely works. It's not the right tool for everything. But it's the right tool for this.