Lawrence.io

Building an AI-Powered Backlog Manager with MCP, Gemini 2.5, and Notion

The Problem
As a developer, I often have "lightbulb moments" for projects but struggle with the initial overhead of breaking them down into actionable tasks. I wanted a way to turn a single sentence into a structured, prioritized Kanban board and a set of GitHub issues in seconds.

The Solution: AI-Backlog-Manager
I built an MCP (Model Context Protocol) server that acts as a bridge between high-level project ideas and your development workspace. By leveraging Google Gemini 2.5 Flash, the system decomposes a project idea into critical development phases and pushes them directly into Notion and GitHub simultaneously.

System Architecture
The project follows a decoupled architecture, ensuring that the AI logic, the protocol server, and the database integration remain modular.

The Client: MCP Inspector (for testing and manual triggers).

The Server: A Python-based FastMCP server that orchestrates the workflow.

The Intelligence: Google Gemini 2.5 Flash, utilizing the latest google-genai SDK for high-performance inference.

The Persistence: Notion API for project management and GitHub API for issue tracking.

Technical Deep Dive
The AI Layer (Gemini 2.5 Integration)
I migrated the project to the Gemini 2.5 Flash model to take advantage of improved reasoning and lower latency.

The Challenge: The legacy API endpoints returned 404 errors.

The Fix: I transitioned from the legacy google-generativeai library to the modern google-genai SDK, explicitly configuring the client to target the stable v1 API version.

```python
# Snippet from app/ai_agent.py
import os
from google import genai
from google.genai import types

# Pin the client to the stable v1 API version to avoid 404s
# from legacy endpoints
client = genai.Client(
    api_key=os.getenv("GOOGLE_API_KEY"),
    http_options=types.HttpOptions(api_version="v1"),
)

response = client.models.generate_content(
    model="models/gemini-2.5-flash",
    contents=f"Generate a development backlog for: {idea}. Return ONLY a JSON list...",
)
```
The Protocol (MCP)
Using the Model Context Protocol allowed me to standardize how the AI interacts with local tools. By defining a create_backlog_from_idea tool, I made my local Python environment accessible to any MCP-compatible client, turning a script into a powerful agentic tool.

Backend Robustness & Rate Limiting
As a Backend Engineer, I realized that sending multiple tasks to Notion in quick succession could trigger 429 Rate Limit errors. I implemented a short blocking delay (time.sleep) between sequential API calls to ensure stable delivery.
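A minimal sketch of that paced delivery, assuming the raw Notion REST API via requests (the repo may use the official notion-client SDK instead) and a database schema with "Name" and "Priority" properties, which are assumptions on my part:

```python
import os
import time
import requests

NOTION_TOKEN = os.getenv("NOTION_TOKEN")
DATABASE_ID = os.getenv("NOTION_DATABASE_ID")  # hypothetical env var name

def build_properties(task: dict) -> dict:
    """Map a plain task dict onto Notion's nested property objects.
    Notion expects "title" and "select" wrappers, not plain strings."""
    return {
        "Name": {"title": [{"text": {"content": task["title"]}}]},
        "Priority": {"select": {"name": task["priority"]}},
    }

def push_tasks(tasks: list[dict], delay: float = 0.4) -> None:
    """Create one Notion page per task, pausing between requests to stay
    under Notion's ~3 requests/second limit and avoid 429 responses."""
    headers = {
        "Authorization": f"Bearer {NOTION_TOKEN}",
        "Notion-Version": "2022-06-28",
        "Content-Type": "application/json",
    }
    for task in tasks:
        requests.post(
            "https://api.notion.com/v1/pages",
            headers=headers,
            json={
                "parent": {"database_id": DATABASE_ID},
                "properties": build_properties(task),
            },
            timeout=10,
        )
        time.sleep(delay)  # simple pacing between sequential API calls
```

A fixed sleep is the simplest approach; a production version might instead honor the `Retry-After` header Notion returns on a 429.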

Results
I tested the system with a "Solar Power Monitoring System for a rural clinic" idea.
The result? A fully populated Notion table and GitHub issues created in parallel, with priorities (High/Medium/Low) assigned automatically—all in under 5 seconds.



Lessons Learned
Schema Mapping: Notion’s API is strict about payload shapes. Aligning Python dictionaries with its nested title and select objects was a key debugging milestone.

API Pathfinding: I used discovery scripts to map available models, which helped bypass regional endpoint restrictions.

Prompt Stability: Robust regex-based cleaning was necessary to ensure the LLM's JSON output remained parsable even when wrapped in Markdown blocks.
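The JSON-cleaning step can be sketched like this (the exact regex and function name in the repo may differ; this is a minimal illustration of stripping Markdown fences before parsing):

```python
import json
import re

def extract_json(raw: str) -> list:
    """Strip the Markdown code fences (```json ... ```) that the LLM
    often wraps around its output, then parse the remaining JSON list."""
    cleaned = re.sub(r"^```(?:json)?\s*|\s*```$", "", raw.strip())
    return json.loads(cleaned)

# Typical fenced output from the model:
tasks = extract_json('```json\n[{"title": "Set up repo", "priority": "High"}]\n```')
# tasks is now a plain Python list of task dicts
```

This keeps the downstream Notion/GitHub code insulated from formatting drift in the model's responses.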

🔗 Project Links
GitHub: https://github.com/lumbol77/Notion-ai-backlog-manager

Loom Demo: https://www.loom.com/share/4ec2c141a41a41d49e2d919e4dc02934

#MCP #Python #AI #Notion #Gemini #BuildWithMCP #BackendEngineering
