This is a submission for the Notion MCP Challenge
## What I Built
I have a Notion database with 30+ startup ideas. Every single one followed the same pattern: get excited, write down the idea, tell myself "I'll start tomorrow"… and then spend the next 3 hours setting up a repo, googling competitors, creating GitHub issues, and losing all momentum before writing a single line of real code.
By idea #15, I realized the problem wasn't motivation. It was the Initialization Gap: that dead time between having an idea and pushing a first commit. So I built the thing that would have saved me all those wasted hours.
ZeroToRepo is an autonomous AI agent. You check a single checkbox in Notion, and within 2 minutes, it has:
- **Researched your market**: 5-8 live Brave Search queries, AI-analyzed to extract competitors, gaps, and trends
- **Named your startup**: AI-generated brand name + tagline grounded in the research
- **Written a Market Analysis**: saved as a formatted Notion sub-page with proper headings and bold text
- **Built a 4-week strategy**: every task maps to a competitive gap ("Competitors lack X → Week 2: Build Y")
- **Scaffolded a private GitHub repo**: README with competitor table, package.json, src/index.js, all via ghost commits (zero local git)
- **Opened labeled GitHub Issues**: each references the gap it addresses, with priority and owner
- **Synthesized a Project Brief**: competitors, roadmap rationale, GitHub link, and timestamp, all written back to Notion
Here's the thing that makes it actually interesting: the LLM decides what to do. This isn't a hardcoded script that runs step 1, step 2, step 3. It's a Groq-powered agent with 12 tools and function calling. The AI sees the tool descriptions, picks what to call next, and if something fails, it decides how to recover. The AI is the orchestrator.
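The orchestration loop described above can be sketched as follows. This is a minimal illustration, not the project's actual `agent.js`: the `llm` function stands in for a Groq `chat.completions` call with `tools` attached, and the registry shape (`spec`/`run`) is a hypothetical naming.

```javascript
// Sketch of an LLM-as-orchestrator loop: the model picks the next tool,
// and tool failures are fed back so the model decides how to recover.
// `llm` and the `{ spec, run }` registry shape are illustrative names.
async function runAgent(llm, tools, messages) {
  for (let step = 0; step < 20; step++) { // hard cap so a confused model can't loop forever
    const reply = await llm({
      messages,
      tools: Object.values(tools).map((t) => t.spec),
    });
    if (!reply.tool_calls?.length) return reply.content; // model decided it's done
    messages.push(reply);
    for (const call of reply.tool_calls) {
      let result;
      try {
        result = await tools[call.function.name].run(JSON.parse(call.function.arguments));
      } catch (err) {
        result = { error: err.message }; // surface the failure; the LLM chooses retry/skip
      }
      messages.push({ role: "tool", tool_call_id: call.id, content: JSON.stringify(result) });
    }
  }
  throw new Error("agent exceeded step budget");
}
```

The step cap and the error-as-tool-result pattern are what keep an autonomous loop like this bounded and recoverable.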
Two commands. That's it.
```bash
npm run setup   # Interactive wizard: collects keys, tests all 4 connections
npm start       # Watches Notion. Check ☑️ Trigger. Done.
```
## Demo
## Show Us the Code
GitHub: https://github.com/Abeera81/zerotorepo
## Architecture

### The 4-Phase Pipeline

### Agent Tool Registry (12 Tools)
| Tool | Phase | What It Does |
|---|---|---|
| `update_notion_status` | All | Update Notion status via MCP |
| `deep_search` | Research | 5-8 Brave searches (with fallback) |
| `analyze_market` | Research | AI → competitors, gaps, insights |
| `generate_startup_name` | Research | Creative name + tagline |
| `save_market_analysis` | Research | Write to Notion via MCP |
| `generate_strategy` | Strategy | 4-week gap-targeting roadmap |
| `save_strategy_to_notion` | Strategy | Write to Notion via MCP |
| `create_github_repo` | Execution | Private repo + ghost commit |
| `set_github_url` | Execution | Store URL in Notion via MCP |
| `create_github_issues` | Execution | Labeled issues from roadmap |
| `write_project_brief` | Synthesis | Project Brief → Notion via MCP |
| `finalize_idea` | Synthesis | Mark Done, uncheck trigger |
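Each registry entry pairs a function spec (what the LLM sees during function calling) with a local handler (what actually runs). A minimal sketch of one entry in the OpenAI/Groq tools format; the field names and the stubbed handler are illustrative, not the project's exact schema:

```javascript
// One tool-registry entry: `spec` is the JSON-schema description the LLM
// reads to decide when to call the tool; `run` is the local implementation.
const deepSearchTool = {
  spec: {
    type: "function",
    function: {
      name: "deep_search",
      description: "Run 5-8 Brave Search queries about the idea's market and return raw results.",
      parameters: {
        type: "object",
        properties: {
          queries: {
            type: "array",
            items: { type: "string" },
            description: "Search queries covering competitors, pricing, and trends",
          },
        },
        required: ["queries"],
      },
    },
  },
  // Real implementation calls the Brave Search API; stubbed here for shape only.
  run: async ({ queries }) => queries.map((q) => ({ query: q, results: [] })),
};
```

The quality of the `description` fields matters a lot in this pattern: they are the only signal the model has for choosing between the 12 tools.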
### Project Structure
```
zerotorepo/
├── src/
│   ├── index.js        # CLI — polling loop, @clack/prompts TUI
│   ├── agent.js        # 🤖 LLM agent — Groq function calling, 12 tools
│   ├── stateMachine.js # Routes: live → agent, mock → sequential
│   ├── mcp-client.js   # MCP client — spawns Notion MCP server (stdio)
│   ├── mcp-server.js   # ZeroToRepo AS MCP server — 7 tools for AI assistants
│   ├── notion.js       # Notion via MCP — markdown-to-blocks converter
│   ├── research.js     # Brave Search (5-8 queries) + Groq analysis
│   ├── scaffold.js     # GitHub ghost commits + rich README
│   ├── roadmap.js      # 4-week gap-targeting strategy
│   └── brief.js        # Project Brief synthesis
├── scripts/
│   ├── setup.js        # Interactive setup wizard
│   └── reset-db.js     # Reset Notion database
└── prompts/            # LLM system prompts
```
## How I Used Notion MCP
Notion MCP is the central nervous system of ZeroToRepo. Every single Notion operation goes through MCP.
### Consuming the Notion MCP Server as an MCP Client
ZeroToRepo spawns @notionhq/notion-mcp-server (v2.2.1) as a child process via stdio transport. All 22 MCP tools are available; we use five of them:
| MCP Tool | Where We Use It |
|---|---|
| `API-query-data-source` | Poll for triggered ideas every 5 seconds |
| `API-patch-page` | Update status (Researching → Planning → Building → Done) |
| `API-post-page` | Create Market Analysis, Strategy & Roadmap, Project Brief sub-pages |
| `API-get-block-children` | Idempotency checks — skip if sub-page already exists |
| `API-delete-a-block` | Database reset script |
The MCP client handles connection lifecycle (lazy connect on first call, graceful disconnect on SIGINT/SIGTERM) and transparent JSON parsing of tool results.
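The lazy-connect-plus-JSON-parsing behavior can be sketched as a thin wrapper. In this illustration `connectFn` stands in for spawning the Notion MCP server over stdio and returning a connected SDK client; it is injected here purely so the lifecycle logic is self-contained, and the wrapper shape is an assumption rather than the project's exact `mcp-client.js`:

```javascript
// Lazy-connect MCP client wrapper (sketch): connects on first tool call,
// parses text results as JSON when possible, and exposes a close() that
// the real CLI would wire to SIGINT/SIGTERM.
function makeMcpClient(connectFn) {
  let clientPromise = null;
  const get = () => (clientPromise ??= connectFn()); // connect only on first use

  return {
    async callTool(name, args) {
      const client = await get();
      const result = await client.callTool({ name, arguments: args });
      const text = result?.content?.[0]?.text; // MCP tool results carry text content
      try {
        return JSON.parse(text); // transparent JSON parsing of tool results
      } catch {
        return text; // fall back to raw text for non-JSON results
      }
    },
    async close() {
      if (clientPromise) await (await clientPromise).close();
      clientPromise = null;
    },
  };
}
```

Caching the connection promise (rather than the client) also deduplicates concurrent first calls: both await the same in-flight connect.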
Here's a problem I didn't expect: Notion doesn't render markdown in API-created pages. When you write research back to Notion, raw markdown appears as ugly plain text with literal **asterisks** everywhere. So I built a `markdownToNotionBlocks()` converter that transforms markdown into native Notion block types:
```
#  Heading     → heading_1
## Subheading  → heading_2
-  Bullet      → bulleted_list_item
1. Number      → numbered_list_item
>  Quote       → quote
---            → divider
**bold**       → rich_text with bold annotation
[link](url)    → rich_text with link
```
Every sub-page (Market Analysis, Strategy & Roadmap, Project Brief) renders with proper headings, bold text, bullet lists, and links, just like hand-crafted Notion pages.
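The core of such a converter is a line-by-line dispatch on markdown prefixes. A simplified sketch, not the project's full implementation (it skips the inline bold/link annotation parsing the real converter handles):

```javascript
// Simplified markdown-to-Notion-blocks converter: maps each markdown line
// to a native Notion block object (heading_1/2/3, bullets, quotes, dividers).
function markdownToNotionBlocks(md) {
  const rich = (text) => [{ type: "text", text: { content: text } }];
  return md
    .split("\n")
    .filter(Boolean) // drop blank lines
    .map((line) => {
      let m;
      if ((m = line.match(/^(#{1,3})\s+(.*)/))) {
        const type = `heading_${m[1].length}`; // # → heading_1, ## → heading_2, ...
        return { type, [type]: { rich_text: rich(m[2]) } };
      }
      if ((m = line.match(/^-\s+(.*)/)))
        return { type: "bulleted_list_item", bulleted_list_item: { rich_text: rich(m[1]) } };
      if ((m = line.match(/^\d+\.\s+(.*)/)))
        return { type: "numbered_list_item", numbered_list_item: { rich_text: rich(m[1]) } };
      if ((m = line.match(/^>\s+(.*)/)))
        return { type: "quote", quote: { rich_text: rich(m[1]) } };
      if (line.trim() === "---") return { type: "divider", divider: {} };
      return { type: "paragraph", paragraph: { rich_text: rich(line) } };
    });
}
```

The resulting array is what `API-post-page` accepts as `children`, which is why the sub-pages render like hand-authored Notion pages.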
### Exposing ZeroToRepo as an MCP Server
Here's where it gets interesting: ZeroToRepo isn't just an MCP client; it's also an MCP server. Any MCP-compatible AI assistant (Claude Desktop, Cursor, VS Code) can call ZeroToRepo's 7 tools directly:
| Tool | Description |
|---|---|
| `process_idea` | Run the full 4-phase pipeline |
| `research_competitors` | Deep competitive research only |
| `generate_name` | Creative startup name from research |
| `scaffold_repo` | Create GitHub repo with scaffold |
| `generate_roadmap` | Generate & create roadmap issues |
| `generate_brief` | Synthesize project brief |
| `list_notion_ideas` | List all ideas from Notion database |
This means Claude can say: "Hey ZeroToRepo, research 'AI pet care' and scaffold a repo" and it just works. The setup wizard even auto-generates the MCP config for VS Code or Claude Desktop so you don't have to write JSON by hand.
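For reference, an MCP client config for a stdio server like this generally looks something like the following. The path, server key, and env var name here are illustrative placeholders, not the wizard's exact output:

```json
{
  "mcpServers": {
    "zerotorepo": {
      "command": "node",
      "args": ["/path/to/zerotorepo/src/mcp-server.js"],
      "env": {
        "NOTION_TOKEN": "ntn_..."
      }
    }
  }
}
```

The assistant reads this config, spawns the server as a child process, and discovers the 7 tools over stdio.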
### The Full MCP Data Flow
Total MCP calls per pipeline run: ~12-15 (status updates, sub-page creation, URL setting, trigger reset)
## What Makes This Different

### LLM-Driven Orchestration (Not Hardcoded)
Most hackathon projects chain API calls in a fixed sequence. ZeroToRepo's agent uses Groq function calling. The LLM receives 12 tool descriptions and decides which to call next. If deep_search fails, it sees the error and can retry with a different query or skip to analyze_market with fallback data. This is genuine AI agent behavior, not a script.
### Dual MCP Architecture

We're not just consuming Notion MCP; we're also exposing ZeroToRepo as an MCP server. This creates a composable tool ecosystem where AI assistants can chain ZeroToRepo with other MCP servers.
### Gap-Targeted Strategy
The roadmap isn't generic. Every task explicitly references a competitive gap: "Competitors lack real-time sync → Week 1: Build WebSocket infrastructure." Phase 2 reads Phase 1's gap analysis and generates tasks that directly target those gaps.
### Ghost Commits (Zero Local Git)

Repos are created entirely through GitHub's Git Data API (blobs, trees, commits, refs), all remotely. No git clone, no local filesystem, no SSH keys.
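The four-call flow maps directly onto Octokit's `rest.git` namespace: create blobs, assemble a tree, create a commit, then move the branch ref. A sketch with the API client injected so it can be stubbed; the `files` shape and function name are illustrative:

```javascript
// Ghost commit via GitHub's Git Data API: blob → tree → commit → ref,
// with no local git. `git` is an Octokit `rest.git` namespace (or a stub).
async function ghostCommit(git, { owner, repo, branch = "main", files, message, parent }) {
  // 1. Upload each file's content as a blob.
  const treeEntries = await Promise.all(
    files.map(async (f) => ({
      path: f.path,
      mode: "100644", // regular file
      type: "blob",
      sha: (await git.createBlob({ owner, repo, content: f.content, encoding: "utf-8" })).data.sha,
    }))
  );
  // 2. Assemble the blobs into a tree.
  const tree = await git.createTree({ owner, repo, tree: treeEntries });
  // 3. Create the commit pointing at that tree (no parents → root commit).
  const commit = await git.createCommit({
    owner,
    repo,
    message,
    tree: tree.data.sha,
    parents: parent ? [parent] : [],
  });
  // 4. Point the branch at the new commit. Assumes `heads/<branch>` already
  //    exists (e.g. auto-initialized repo); use createRef on a bare repo.
  await git.updateRef({ owner, repo, ref: `heads/${branch}`, sha: commit.data.sha, force: true });
  return commit.data.sha;
}
```

Because everything is content-addressed by SHA, the whole scaffold lands as a single atomic commit.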
### $0 Total Cost

Groq free tier (100k tokens/day), Brave Search free tier (2k queries/month), GitHub API (free with PAT), Notion API (free integration). A full pipeline run uses ~20k tokens, so you get ~5 runs per day for free.
## Tech Stack
| Technology | Role |
|---|---|
| Node.js v20+ | Runtime |
| `@modelcontextprotocol/sdk` | MCP client + server framework |
| `@notionhq/notion-mcp-server` | Notion MCP integration (22 tools) |
| Groq API (Llama-3.3-70B) | LLM function calling, analysis, generation |
| Brave Search API | Real-time competitive intelligence |
| `@octokit/rest` | GitHub repo creation, ghost commits, issues |
| `@clack/prompts` | Beautiful interactive CLI |