Notion API Rate Limits Are Breaking Your Automation — Here's the Real Fix

When 429 Becomes Your Most Common Error

Have you tried automating your life data with the Notion API?

Auto-sync health data from Apple HealthKit. Log Stripe webhook income directly to a database. Sync daily tasks with GitHub Issues.

If you've tried, you know what happens: HTTP 429 Too Many Requests shows up fast.


Notion API Rate Limits: The Precise Numbers

Notion's official docs state:

Average of 3 requests per second per integration token

The word "average" is doing a lot of work there. Short bursts are tolerated, but sustained throughput above 3 req/s triggers 429s.

Here's what that looks like in practice:

| Use Case | Requests Needed | Problem |
| --- | --- | --- |
| Write 30 days of health data | 30+ reqs (1 per day) | 10+ seconds |
| Habit tracker (10 items × 30 days) | 300+ reqs | 100+ seconds |
| Monthly finance import | 50–200 reqs | Burst → 429 |
| Cross-database queries | pages × tables | Hits the cap instantly |

The Block Limit Compounds It

On top of the rate limit, each page has a 1000-block ceiling. Attempt to write dense data — long journals, bulk imports — and you hit a second wall mid-operation.


The Standard "Solutions" and Why They Fall Short

Solution 1: Queuing + Exponential Backoff

import time
import random

def notion_request_with_backoff(fn, max_retries=5):
    """Call fn, retrying with exponential backoff + jitter on HTTP 429."""
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception as e:
            # Retry only on rate-limit errors; anything else propagates.
            if "429" not in str(e):
                raise
            if attempt == max_retries - 1:
                break  # don't sleep after the final failed attempt
            # Waits ~1-2s, 2-3s, 4-5s, ... with jitter to spread retries out
            time.sleep((2 ** attempt) + random.uniform(0, 1))
    raise Exception("Max retries exceeded")
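Here's one way to use it with the requests library against Notion's public REST endpoint. The endpoint and headers follow Notion's documented API; NOTION_TOKEN, DATABASE_ID, and the page payload are illustrative placeholders:

import requests

NOTION_TOKEN = "secret_xxx"        # your integration token
DATABASE_ID = "your-database-id"   # target database
HEADERS = {
    "Authorization": f"Bearer {NOTION_TOKEN}",
    "Notion-Version": "2022-06-28",
    "Content-Type": "application/json",
}

def create_page(payload):
    resp = requests.post("https://api.notion.com/v1/pages",
                         headers=HEADERS, json=payload)
    resp.raise_for_status()  # HTTPError message contains "429" on rate limits
    return resp.json()

payload = {
    "parent": {"database_id": DATABASE_ID},
    "properties": {"Name": {"title": [{"text": {"content": "2024-01-01 log"}}]}},
}
page = notion_request_with_backoff(lambda: create_page(payload))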

The catch: Accumulated wait time. A 30-minute sync job becomes 2–3 hours.

Solution 2: Batch Writing

The Notion API lets you write multiple blocks in a single request. Compress 100 blocks into one request and, in theory, you get 100× the efficiency.

The catch: Payload size limits (2000 blocks/request) + child pages require separate requests.
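A minimal chunking sketch, reusing the backoff wrapper and HEADERS from above. The chunk size here is a deliberately conservative assumption; check Notion's current payload limits before raising it:

import requests

def append_blocks(block_id, blocks, chunk_size=100):
    # PATCH /v1/blocks/{block_id}/children appends a batch of child blocks.
    for i in range(0, len(blocks), chunk_size):
        chunk = blocks[i:i + chunk_size]
        def _send(chunk=chunk):
            resp = requests.patch(
                f"https://api.notion.com/v1/blocks/{block_id}/children",
                headers=HEADERS, json={"children": chunk},
            )
            resp.raise_for_status()
            return resp.json()
        notion_request_with_backoff(_send)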

Solution 3: Cache + Diff Sync

Only write changes since the last sync rather than full rewrites.

The catch: Diff logic gets complex fast. Notion DB versioning is weak — collision handling is a real engineering problem.
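The read side, at least, is straightforward, because Notion's database query endpoint supports a last_edited_time timestamp filter. Pagination and collision handling, the actual hard parts, are left out of this sketch (HEADERS as defined earlier):

import requests

def changed_since(database_id, last_sync_iso):
    """Fetch pages edited since the last sync (first page of results only)."""
    resp = requests.post(
        f"https://api.notion.com/v1/databases/{database_id}/query",
        headers=HEADERS,
        json={"filter": {
            "timestamp": "last_edited_time",
            "last_edited_time": {"on_or_after": last_sync_iso},
        }},
    )
    resp.raise_for_status()
    return resp.json()["results"]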


Why These Workarounds Don't Solve the Underlying Problem

The root issue is that Notion was designed for team document management, not for programmatic data ingestion.

| Team Notion Usage | Personal Life Log |
| --- | --- |
| Humans write documents | Programs write data |
| Dozens of writes/day | Hundreds–thousands of writes/day |
| Someone reads the output | An analytics tool reads it |
| Schema rarely changes | Schema evolves constantly |

Personal life logging needs a database. Notion is not one.


An Architecture That Bypasses Rate Limits by Design

Here's what Jibun Kaisha uses instead:

Apple HealthKit / Webhooks / External APIs
              ↓
  Supabase Edge Function (Deno)
  - Buffering
  - Batch processing
  - Direct PostgreSQL writes
              ↓
  PostgreSQL (effectively unlimited writes)
              ↓
  Flutter Web (Realtime subscriptions)
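For contrast, here's a minimal sketch of the write path using the supabase-py client. The project URL, key, table name, and columns are illustrative, not part of any fixed schema:

from supabase import create_client

supabase = create_client("https://your-project.supabase.co", "your-service-role-key")

# One round trip inserts the whole batch; there's no per-request block
# budget, just ordinary PostgreSQL insert throughput.
rows = [
    {"metric": "steps", "value": 8200, "recorded_at": "2024-01-01"},
    {"metric": "steps", "value": 9100, "recorded_at": "2024-01-02"},
]
supabase.table("health_metrics").insert(rows).execute()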

Supabase throughput for reference:

  • Edge Function: 500 req/sec (free tier)
  • PostgreSQL: 60 pooled connections (PgBouncer)
  • Realtime: connections × channels, practically unlimited

Compare to Notion's 3 req/sec: that's 167× the throughput.

Side-by-Side: Notion API vs. Supabase PostgreSQL

| Dimension | Notion API | Supabase PostgreSQL |
| --- | --- | --- |
| Write speed | 3 req/sec | Thousands of inserts/sec |
| Batch size | 2000 blocks/req | Unlimited |
| Transactions | None | Full ACID |
| Custom indexes | Fixed | Any column |
| Aggregate queries | Impossible | Full SQL |
| Realtime | None | Realtime API |
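To make the "Aggregate queries" row concrete, here's the kind of query the Notion API can't express at all, sketched with psycopg2 against the same hypothetical health_metrics table (the connection string is a placeholder):

import psycopg2

conn = psycopg2.connect("postgresql://user:pass@db.your-project.supabase.co:5432/postgres")
with conn, conn.cursor() as cur:
    # Weekly step averages in a single query, no client-side aggregation.
    cur.execute("""
        SELECT date_trunc('week', recorded_at) AS week, avg(value)
        FROM health_metrics
        WHERE metric = 'steps'
        GROUP BY week
        ORDER BY week
    """)
    for week, avg_steps in cur.fetchall():
        print(week, avg_steps)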

If You Want to Stay on Notion: The Realistic Ceiling

If you're committed to the Notion API for personal automation, here's a practical ceiling:

  • Daily batch jobs only, not real-time sync (see the throttled sketch after this list)
  • ≤ 100 writes/day (3 req/sec × 30 s = 90 reqs, with margin)
  • No cross-database JOINs (the API queries one database per request; following relations costs extra calls)
  • ≤ 500 blocks/page (50% of the 1000-block limit as a safety margin)
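A minimal throttled batch, reusing notion_request_with_backoff from above. The pacing keeps you under the 3 req/s average proactively instead of bouncing off 429s:

import time

def daily_batch(writes, rate=3.0, daily_cap=100):
    """Run a day's queued write callables at a steady pace under Notion's limit."""
    interval = 1.0 / rate  # ~0.33 s between requests
    for fn in writes[:daily_cap]:
        notion_request_with_backoff(fn)  # backoff still catches stray 429s
        time.sleep(interval)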

That works for: simple daily journals, weekly reviews, lightweight logs.

It doesn't work for full life-data automation.


Summary

Notion's 3 req/sec rate limit is entirely reasonable for team document workflows.

For personal automation — syncing health data, finance, habits, learning, and work into one dashboard — it breaks down immediately. Queuing and diff sync are band-aids, not solutions.

What you actually need: an architecture where your data store is a real database with direct write access — not a document tool with a public API bolt-on.

Try Jibun Kaisha free — no API limits, zero setup
