BrewHubPHL

BrewHub PHL: Low Latency, High Caffeine (Notion MCP Operations Ledger)

This is a submission for the Notion MCP Challenge

## What I Built

BrewHubPHL.com isn't just a software project; it's the digital infrastructure for a physical neighborhood coffee and parcel-logistics shop. Running a physical retail space demands strict security, fast transactions, and immutable audit logs, but raw database tables are a poor experience for human store managers.

To bridge the gap between our high-speed PostgreSQL database and our human management team, I built the Hub-to-Notion Operations Ledger.

Whenever a barista completes an order on the POS, or a manager performs a high-security action (like an IRS-compliant payroll override), our database automatically and securely syncs that canonical data directly into a Notion Workspace. This turns Notion into a "Live Manager's Logbook" that updates in real-time, without any manual data entry.

## Show us the code

The magic happens through a highly secure, zero-trust pipeline connecting Supabase, Netlify Functions, and Notion.

Instead of trusting the payload from the database trigger, our Netlify function acts as a secure bridge that re-fetches the canonical data. We also built in graceful degradation: if the Notion API ever goes down, the database trigger catches the exception and still returns NEW, so we never stop selling coffee just because a sync failed.
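To make the bridge pattern concrete, here is a minimal sketch of what that Netlify function does. The names (`notionSyncHandler`, the `deps` injection, the env-var and field names) are my illustration, not our production code, and the event/response shapes are simplified:

```typescript
// Hypothetical sketch of the zero-trust bridge function.
// Event/response shapes are simplified stand-ins for Netlify's.
type SyncEvent = {
  headers: Record<string, string | undefined>;
  body: string | null;
};

type SyncResult = { statusCode: number; body: string };

// Shared-secret check; production code should prefer crypto.timingSafeEqual.
function isAuthorized(
  headers: Record<string, string | undefined>,
  secret: string
): boolean {
  return headers["x-brewhub-secret"] === secret;
}

async function notionSyncHandler(
  event: SyncEvent,
  deps: {
    secret: string;
    // Re-fetch the canonical row from Supabase -- never trust the webhook payload.
    fetchRecord: (table: string, id: string) => Promise<Record<string, unknown> | null>;
    // Upsert the canonical row into the matching Notion database.
    pushToNotion: (table: string, record: Record<string, unknown>) => Promise<void>;
  }
): Promise<SyncResult> {
  if (!isAuthorized(event.headers, deps.secret)) {
    return { statusCode: 401, body: "unauthorized" };
  }
  const payload = JSON.parse(event.body ?? "{}");
  // Only the record_id is used from the payload -- and only as a lookup key.
  const record = await deps.fetchRecord(payload.table, payload.record_id);
  if (!record) return { statusCode: 404, body: "record not found" };
  await deps.pushToNotion(payload.table, record);
  return { statusCode: 200, body: "synced" };
}
```

Because the handler only ever treats the webhook body as a lookup key, a forged payload can at worst trigger a re-sync of real data, never inject fake data into Notion.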

Here is a snippet of our Double-Guarded Postgres Trigger for Auto-Syncing completed orders:
```sql
-- 20260304_schema86b_orders_notion_sync.sql
CREATE OR REPLACE FUNCTION public.fn_trg_orders_notion_sync()
RETURNS trigger
LANGUAGE plpgsql
SECURITY DEFINER SET search_path = public, extensions
AS $$
BEGIN
  -- Defense-in-depth guard: only fire on the transition to 'completed'
  IF (NEW.status = 'completed') AND (OLD.status IS DISTINCT FROM 'completed') THEN
    BEGIN
      -- Fire webhook to Netlify with signed headers and ONLY the record_id
      PERFORM net.http_post(
        url := coalesce(
          current_setting('app.settings.notion_sync_webhook_url', true),
          'https://brewhubphl.com/.netlify/functions/notion-sync'
        ),
        headers := jsonb_build_object(
          'Content-Type', 'application/json',
          'X-BrewHub-Action', 'true',
          'x-brewhub-secret', current_setting('app.settings.internal_sync_secret', true)
        ),
        body := jsonb_build_object('table', 'orders', 'record_id', NEW.id, 'type', 'UPDATE')
      );
    EXCEPTION
      WHEN undefined_function THEN
        RAISE WARNING 'pg_net missing, skipping Notion sync';
      WHEN OTHERS THEN
        RAISE WARNING 'Notion sync trigger failed, but allowing order to complete: %', SQLERRM;
    END;
  END IF;
  RETURN NEW;
END;
$$;

CREATE TRIGGER trg_orders_notion_sync
AFTER UPDATE ON public.orders
FOR EACH ROW
-- Double guard: row-level WHEN check before the function even runs
WHEN (NEW.status = 'completed' AND OLD.status IS DISTINCT FROM 'completed')
EXECUTE FUNCTION public.fn_trg_orders_notion_sync();
```

## How I Used Notion MCP

While the Postgres triggers and Netlify functions populate the data, Notion MCP (Model Context Protocol) acts as our "AI Shift Manager."

By adding the Notion MCP server to our .vscode/mcp.json configuration, our AI concierge ("Elise", powered by Claude) has direct read/write access to the Shop Command Center in Notion.
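For reference, the configuration looks roughly like this. Exact package name, env-var shape, and API version depend on the Notion MCP server release you install, so treat this as an assumed sketch and check the current Notion MCP docs:

```json
{
  "servers": {
    "notion": {
      "command": "npx",
      "args": ["-y", "@notionhq/notion-mcp-server"],
      "env": {
        "OPENAPI_MCP_HEADERS": "{\"Authorization\": \"Bearer ntn_YOUR_TOKEN\", \"Notion-Version\": \"2022-06-28\"}"
      }
    }
  }
}
```

The integration token is scoped to only the Shop Command Center pages, so the AI can read the ledger and draft briefings without touching anything else in the workspace.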

This unlocks an incredible Human-in-the-Loop workflow:

- **The Daily Pulse:** I can ask the AI to "Check the Ledger." The AI uses MCP to read the Sales Ledger database in Notion, summarizes the total revenue and completed orders for the day, and cross-references it with our physical parcel logistics.

- **Security Audits:** The AI scans the Audit Trail page in Notion for any new Manager Overrides (synced via our zero-trust webhooks). If it sees an unusual comp or payroll adjustment, it flags it for my review.

- **Drafting Briefings:** Instead of just outputting text, the AI uses MCP to draft a neat "Morning Briefing" page directly inside my Notion workspace for the opening manager to read the next day.
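Under the hood, the "Daily Pulse" summary is just an aggregation over the synced ledger rows. A toy version of that aggregation, with a simplified row shape I made up rather than Notion's real page/property schema:

```typescript
// LedgerRow is a simplified stand-in for a synced Notion Sales Ledger page.
type LedgerRow = { status: string; total: number };

// Aggregate today's rows into the numbers the "Daily Pulse" reports.
function dailyPulse(rows: LedgerRow[]): { completedOrders: number; revenue: number } {
  const completed = rows.filter((r) => r.status === "completed");
  return {
    completedOrders: completed.length,
    revenue: completed.reduce((sum, r) => sum + r.total, 0),
  };
}
```

The point of MCP is that we never wrote this glue by hand: the AI reads the database through the MCP server and does this arithmetic itself, then writes the result back as a briefing page.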
