How I Used Notion MCP to Build an AI Recruiting Assistant
What I Built
HireIQ is an AI-powered recruiting assistant that uses Notion as its entire operational backbone. Point it at a role you're hiring for and it will:
- Spin up a recruiting workspace in Notion automatically
- Write polished, structured job descriptions and save them as Notion pages
- Screen candidates against the role criteria and log results
- Generate offer letters — drafted by AI, stored in Notion
The entire hiring pipeline lives in Notion. HireIQ just drives it.
Video Demo
Show Us the Code
🔗 github.com/himanshu748/dev-challenge-3
How I Used Notion MCP
HireIQ connects to Notion via the remote Notion MCP server at https://mcp.notion.com/sse — attached directly to a HuggingFace MCPClient. This means the AI model doesn't just answer questions about hiring; it actively reads from and writes to Notion as part of every step in the pipeline.
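Concretely, registering the remote server might look something like the sketch below. This is an illustrative configuration only: the key names (`type`, `url`, `headers`) and the bearer-token auth header are assumptions about how the MCPClient is wired up, not the repo's actual code.

```python
import os

# Hypothetical sketch of describing the remote Notion MCP server to the
# HuggingFace MCPClient. Key names and the bearer-token header are
# assumptions; check the repo for the actual wiring.
NOTION_MCP_SERVER = {
    "type": "sse",                        # server-sent events transport
    "url": "https://mcp.notion.com/sse",  # remote Notion MCP endpoint
    "headers": {
        # NOTION_TOKEN must be an OAuth access token, not an internal
        # integration token (see Getting Started below).
        "Authorization": f"Bearer {os.environ.get('NOTION_TOKEN', '')}",
    },
}
```

With the server attached, every tool the Notion MCP endpoint exposes becomes callable by the model during each pipeline step.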
1. Workspace Setup (POST /api/setup)
On first run, HireIQ uses Notion MCP to create the full recruiting workspace structure — databases, pages, and sections — ready to receive jobs and candidates. Zero manual Notion configuration needed.
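As a rough picture, the structure the setup step asks the agent to create might look like the sketch below. The page and database names are hypothetical; the actual layout lives in the repo.

```python
# Hypothetical layout of the recruiting workspace that POST /api/setup
# creates via Notion MCP. Names are illustrative, not the repo's.
RECRUITING_WORKSPACE = {
    "root_page": "HireIQ Recruiting",
    "databases": ["Jobs", "Candidates", "Screening Results", "Offers"],
}

def setup_plan(workspace: dict) -> list[str]:
    """Flatten the layout into the ordered create steps the agent performs."""
    steps = [f"create page: {workspace['root_page']}"]
    steps += [f"create database: {name}" for name in workspace["databases"]]
    return steps
```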
2. Job Description Creation (POST /api/add-job)
The AI generates a complete, professionally formatted job description and writes it directly into Notion via MCP. Every role gets its own page, making the workspace a living hiring hub rather than a static document.
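The handoff from LLM output to Notion can be sketched as a page-creation request like the one below. The field names are illustrative; the real Notion MCP tool schema may name them differently.

```python
def build_job_page_request(title: str, description_md: str, parent_id: str) -> dict:
    # Hypothetical payload shape for a "create a Notion page" tool call;
    # field names are assumptions, not the actual Notion MCP schema.
    return {
        "parent_page_id": parent_id,   # NOTION_PARENT_PAGE_ID from .env
        "title": f"Job: {title}",
        "content": description_md,     # Markdown body generated by the LLM
    }
```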
3. Candidate Screening (POST /api/screen-candidate)
HireIQ reads the job requirements from Notion, evaluates the candidate against them using the LLM, and logs the screening result back into Notion — keeping everything in one place for the hiring team to review.
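The screening step can be pictured as a prompt that fuses the requirements pulled from Notion with the candidate's profile. The wording below is a sketch of my own, not the repo's actual prompt.

```python
def build_screening_prompt(requirements: str, candidate_profile: str) -> str:
    """Combine job requirements (read from Notion) with a candidate profile
    into one screening prompt. Wording is illustrative, not the repo's."""
    return (
        "You are a recruiting assistant. Evaluate the candidate against the "
        "job requirements below and respond with PASS or FAIL plus a short "
        "justification.\n\n"
        f"Job requirements (from Notion):\n{requirements}\n\n"
        f"Candidate profile:\n{candidate_profile}"
    )
```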
4. Offer Letter Generation (POST /api/generate-offer)
Once a candidate clears screening, HireIQ drafts a tailored offer letter and saves it to Notion via MCP. The whole candidate journey — from application to offer — is tracked in one workspace.
What Notion MCP Unlocks
The remote MCP connection (https://mcp.notion.com/sse) is what elevates this beyond a plain Notion API integration. The LLM has Notion MCP attached as a tool, so it can decide when and how to read or write Notion data while reasoning through each task. It's not just API calls wrapped in Python; the model is an active agent that uses Notion as its memory and output layer.
Every write endpoint returns a notion_urls field so the frontend can link directly to the pages the agent just created.
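A minimal sketch of that response envelope, assuming a shape like the one below (only the `notion_urls` field is confirmed above; the rest is illustrative):

```python
def write_response(message: str, notion_urls: list[str]) -> dict:
    # Illustrative response envelope for a write endpoint. Only the
    # "notion_urls" field is confirmed; "message" is an assumption.
    return {"message": message, "notion_urls": notion_urls}
```

The frontend can then render each URL in `notion_urls` as a direct link to the page the agent just created.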
Tech Stack
| Layer | Technology |
|---|---|
| Backend | FastAPI + Python |
| AI / LLM | HuggingFace (Qwen/Qwen2.5-72B-Instruct) via MCPClient |
| Data store | Notion (via remote Notion MCP at https://mcp.notion.com/sse) |
| Frontend | Vanilla HTML/CSS/JS |
Getting Started
```bash
# 1. Clone the repo
git clone https://github.com/himanshu748/dev-challenge-3
cd dev-challenge-3

# 2. Create a virtual environment
python3 -m venv .venv
source .venv/bin/activate

# 3. Install dependencies
pip install -r requirements.txt
```

4. Configure environment variables: create a `.env` file in the project root:

```
HF_API_KEY=hf_...
NOTION_TOKEN=...              # Must be a current OAuth access token (not an internal integration token)
NOTION_PARENT_PAGE_ID=...     # Parent page for the recruiting workspace
HF_MODEL=...                  # Optional; defaults to Qwen/Qwen2.5-72B-Instruct
```

5. Start the server:

```bash
uvicorn app.main:app --reload
```

Open http://127.0.0.1:8000 in your browser.
Note: `NOTION_TOKEN` must be a current OAuth access token for the remote Notion MCP server. A plain internal integration token will not work for the `https://mcp.notion.com/sse` connection.