This is a submission for the Notion MCP Challenge
What I Built
Nancy is an automated Dribbble job intelligence system. It monitors Dribbble for new design job postings, summarizes them using AI, fires real-time alerts to a Telegram channel, and now, powered by Notion MCP, stores every opportunity in a structured Notion workspace with full application pipeline tracking.
The idea was to give my brother an edge: see a new job posting before almost anyone else in the world, apply immediately, and track the whole pipeline in one place.
Core capabilities:
- Scrapes Dribbble job listings on demand via a REST API trigger
- Summarizes job descriptions using HuggingFace (facebook/bart-large-cnn)
- Sends formatted Telegram alerts with one-tap apply buttons
- Stores all jobs in a Notion database with status tracking
- Reads its own configuration live from Notion — no redeployment needed to change behavior
Tech stack: Python 3.11, FastAPI, BeautifulSoup4, HuggingFace Inference API, python-telegram-bot, notion-client, deployed on Render.
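The keyword and job-type filtering described above can be sketched in a few lines of pure Python. The function and field names here are my own illustrative assumptions, not taken from the repo:

```python
def matches_filters(job: dict, keywords: list[str], job_types: list[str]) -> bool:
    """Return True if a scraped job passes the keyword and job-type filters.

    Sketch only: assumes a job dict with 'title' and 'job_type' keys.
    Empty filter lists mean "match everything".
    """
    text = job.get("title", "").lower()
    keyword_ok = not keywords or any(k in text for k in keywords)
    type_ok = not job_types or job.get("job_type", "") in job_types
    return keyword_ok and type_ok


job = {"title": "Senior Product Designer", "job_type": "Full-time"}
print(matches_filters(job, ["designer", "ux"], ["Full-time", "Contract"]))  # True
```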
Video Demo
Show us the code
GitHub: github.com/juv85/Nancy-v2-alt
The Notion integration lives in notion/notion_integration.py. The key entry points:
```python
# scraper/scraper.py — reads live config from Notion on every run
config = notion_tracker.get_config()
if config.get("active", "true").lower() == "false":
    return {"status": "paused", "detail": "Nancy is paused via Notion config."}

max_pages = int(config.get("max_pages", 2))
keywords = [k.strip().lower() for k in config.get("keywords", "").split(",") if k.strip()]

# After scraping, each new job is saved to Notion and marked notified
if notion_tracker.enabled:
    page_id = notion_tracker.add_job(job)
    notion_tracker.mark_telegram_sent(page_id)
```
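One caveat with config living in Notion: values arrive as plain strings, so `int(config.get("max_pages", 2))` would raise if someone typed a non-number into the cell. A defensive coercion helper (my own sketch, not something in the repo) keeps one bad cell from killing a run:

```python
def safe_int(value, default: int) -> int:
    """Coerce a Notion text property to int, falling back on bad input."""
    try:
        return int(str(value).strip())
    except (TypeError, ValueError):
        return default


print(safe_int("3", 2))      # 3
print(safe_int("three", 2))  # 2
print(safe_int(None, 2))     # 2
```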
To run your own instance, clone the repo and set the environment variables listed in .env.example.
How I Used Notion MCP
Most Notion integrations use Notion as a destination — a place to dump output. Nancy uses it as both input and output: Notion is the control plane Nancy reads from, and the data layer Nancy writes to.
Notion as control plane (input)
Nancy reads a Config database in Notion before every single run:
| Setting | Example | What it does |
|---|---|---|
| `active` | `true` / `false` | Kill switch — pause Nancy without touching code |
| `keywords` | `designer, UX, product` | Only alert on jobs matching these terms |
| `job_types` | `Full-time, Contract` | Filter by employment type |
| `max_pages` | `3` | How many Dribbble pages to scrape per run |
Want Nancy to focus on freelance roles this week? Edit one cell in Notion. Want to pause it while you're traveling? Flip active to false. No terminal, no redeployment.
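Turning those Notion rows into a flat dict is mostly unpacking the API response. The payload shape below is the standard Notion `databases.query` response format; the `Setting` / `Value` property names are my assumption about Nancy's schema:

```python
def parse_config_rows(results: list[dict]) -> dict[str, str]:
    """Flatten Notion Config database rows into a {setting: value} dict.

    Assumes each row has a 'Setting' title property and a 'Value'
    rich_text property (an assumed schema, not confirmed from the repo).
    """
    config = {}
    for page in results:
        props = page["properties"]
        setting = "".join(t["plain_text"] for t in props["Setting"]["title"])
        value = "".join(t["plain_text"] for t in props["Value"]["rich_text"])
        if setting:
            config[setting] = value
    return config


# A trimmed example payload, shaped like notion_client's databases.query() results
sample = [{
    "properties": {
        "Setting": {"title": [{"plain_text": "max_pages"}]},
        "Value": {"rich_text": [{"plain_text": "3"}]},
    }
}]
print(parse_config_rows(sample))  # {'max_pages': '3'}
```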
Notion as data layer (output)
[SCREENSHOT: Nancy Jobs database — Pipeline Board (Kanban) with columns New / Notified / Reviewing / Applied / Archived]
Every job Nancy finds is saved to a Jobs database with full metadata and a status workflow:
New → Notified → Reviewing → Applied → Archived
This turns Notion into an actual application tracker. Nancy handles discovery; you handle decisions. The Pipeline Board makes it immediately obvious where each opportunity stands.
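Because the pipeline stages form a linear workflow, an illegal jump (say, the bot moving a job from New straight back from Applied) can be caught with a tiny transition check. This is a sketch of mine, not something the repo enforces:

```python
STAGES = ["New", "Notified", "Reviewing", "Applied", "Archived"]


def can_advance(current: str, target: str) -> bool:
    """Allow moving only forward through the pipeline, one or more steps."""
    return (current in STAGES and target in STAGES
            and STAGES.index(target) > STAGES.index(current))


print(can_advance("New", "Notified"))   # True
print(can_advance("Applied", "New"))    # False
```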
The full flow
```
/trigger-scraper
  → read config FROM Notion (active? max_pages? keywords? job_types?)
  → if active=false → return "paused"
  → fetch existing job URLs from Notion (deduplication)
  → scrape Dribbble up to max_pages
  → for each new job matching filters:
      → summarize via HuggingFace
      → send Telegram alert
      → save to Notion Jobs DB (Status = New)
      → update Status → Notified, tick Telegram Sent
```
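Stripped of I/O, that run is a short pipeline. The sketch below wires the steps together with injected callables (all names and stubs are illustrative, not the repo's actual signatures) so the control flow is visible and testable:

```python
def run_once(get_config, fetch_known_urls, scrape, summarize, alert, save):
    """One run: config gate, dedup against Notion, then summarize/alert/save."""
    config = get_config()
    if config.get("active", "true").lower() == "false":
        return {"status": "paused"}

    known = fetch_known_urls()
    new_jobs = [j for j in scrape(int(config.get("max_pages", 2)))
                if j["url"] not in known]
    for job in new_jobs:
        job["summary"] = summarize(job["description"])
        alert(job)
        save(job)  # Status = New, then marked Notified after the alert
    return {"status": "ok", "new_jobs": len(new_jobs)}


# Stub wiring to show the shape of a run:
result = run_once(
    get_config=lambda: {"active": "true", "max_pages": "1"},
    fetch_known_urls=lambda: {"https://dribbble.com/jobs/old"},
    scrape=lambda pages: [{"url": "https://dribbble.com/jobs/new",
                           "description": "Design things"}],
    summarize=lambda text: text[:40],
    alert=lambda job: None,
    save=lambda job: None,
)
print(result)  # {'status': 'ok', 'new_jobs': 1}
```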
Two new API endpoints complete the picture:

- `GET /config` — returns the live Notion config on demand
- `GET /jobs` — returns all jobs from Notion, filterable by status
What's Next
To be honest, what Nancy does with Notion today barely scratches the surface of what MCP makes possible.

The vision is an autonomous job application agent. Notion becomes the agent's long-term memory: not just a database of scraped jobs, but a living context store holding the user's identity and metadata about the jobs that matter to them.

With that context, the agent will know which roles are worth applying to, and what to highlight when drafting the application assets.
Thanks for reading
Juvet Manga


