
Isah Alamin
I Was Tired of Explaining My Project to Every New AI Tab I Opened

Notion MCP Challenge Submission 🧠

This is a submission for the Notion MCP Challenge

What I Built

Let me describe something that happened to me, and I am willing to bet it has happened to you too.

I was deep in a conversation with Claude. Twenty minutes in. I had explained my entire project, the stack, the decisions, the architecture. It finally got it. The momentum was real. The advice was flowing.

And then I needed to actually write the code.

I opened VS Code. I opened GitHub Copilot. And it hit me: Copilot had absolutely no idea what Claude and I had just spent twenty minutes figuring out. Zero. I was back to square one. So I did what every developer does in that moment: I started copying and pasting. Grabbing chunks of the Claude conversation, dumping them into Copilot, basically summarizing my own conversation to a completely different AI just to get it up to speed.

That is not a workflow. That is a tax. And I was paying it every single day.

The real problem is not that AI tools are not smart enough. They are brilliant. The problem is that they do not share memory. Each tool is an island. Claude does not know what Copilot knows. Copilot does not know what Claude knows. And you, the developer, are the only bridge between them. Manually. Every time.

I built the Notion Memory Engine to burn that bridge down and replace it with something permanent.

It is a system that turns Notion into a single shared brain for all your AI tools using Notion MCP. You finish a conversation in Claude, type @memory -create, and Claude writes the full context (decisions, code snippets, next steps) directly into your Notion database. Then you open VS Code, ask GitHub Copilot about the same topic, and it reads from that exact same Notion database and picks up right where Claude left off.

Same context. Different tool. No copy-pasting. No starting over. No tax.

Video Demo

Show us the code

Bits232 / notion-memory-engine

A cross-tool AI memory system using Notion MCP: save context in Claude, continue in VS Code Copilot. Built for the DEV.to x Notion MCP Challenge.

🧠 Notion Memory Engine

One Notion workspace. Multiple AI tools. Zero context loss.


The Problem

You are deep in a conversation with Claude. Twenty minutes in. You have explained your entire project: the stack, the architecture, the decisions. The advice is flowing. The momentum is real.

And then you need to actually write the code.

You open VS Code. You open GitHub Copilot. And it hits you: Copilot has no idea what you and Claude just talked about. So you start copying and pasting. Grabbing context from one tab, dumping it into another. Summarizing your own conversation just to get a different AI up to speed.

This is the invisible tax every developer pays when working with multiple AI tools.


The Solution

The Notion Memory Engine turns Notion into a single shared brain for all your AI tools using the Notion MCP (Model Context Protocol).

  • Claude reads…

The repo contains everything you need to replicate this from scratch:

  • system-prompt.md – the full protocol page to paste into Notion
  • notion-setup.md – how to set up your workspace
  • claude-setup.md – connecting Notion MCP to Claude
  • vscode-setup.md – connecting Notion MCP to VS Code Copilot
  • other-tools.md – extending to Cursor, Windsurf, Claude Desktop, and more
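Each tool gets its own MCP server entry in its config file. As an illustration only (the exact config file location, package name, and environment variable format depend on the tool and on the Notion MCP server version you install; the setup guides in the repo have the authoritative values), a Claude Desktop style entry might look like:

```json
{
  "mcpServers": {
    "notion": {
      "command": "npx",
      "args": ["-y", "@notionhq/notion-mcp-server"],
      "env": {
        "OPENAPI_MCP_HEADERS": "{\"Authorization\": \"Bearer ntn_YOUR_TOKEN\", \"Notion-Version\": \"2022-06-28\"}"
      }
    }
  }
}
```

The token placeholder is yours to fill in from your Notion integration settings; the same server entry, duplicated into each tool's config, is what gives every tool its own independent connection to the one workspace.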

How I Used Notion MCP

Notion MCP is not a feature in this project. It is the entire foundation. Without it, none of this exists.

Here is exactly what it enables, step by step:

1. Notion holds the System Prompt that gives Claude its behavior

The entire memory protocol (the rules, the commands, the safeguards) lives inside a single Notion page called System Prompt. When you start a new conversation, you send Claude one message:

Please read my Notion page titled "System Prompt" and follow the protocol.

Claude reaches into Notion via MCP, reads that page, and responds with the initialization handshake:

🧠 Memory Engine Online
Ready for @memory commands.

That is it. The engine is live. And because the protocol lives in Notion, you can change Claude's behavior anytime just by editing a page: no code, no config files, no redeployment.
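When Claude "reads" that page over MCP, the server fetches the page's blocks from the Notion API and flattens their rich text into the instructions Claude follows. A minimal sketch of that flattening step, using the block shape the Notion API returns (the sample page content here is invented for illustration):

```python
def blocks_to_text(blocks):
    """Flatten Notion API blocks into plain text, one line per block."""
    lines = []
    for block in blocks:
        # Each block nests its content under a key named after its type,
        # with the actual text inside a "rich_text" array.
        content = block.get(block["type"], {})
        text = "".join(rt["plain_text"] for rt in content.get("rich_text", []))
        if text:
            lines.append(text)
    return "\n".join(lines)


# Sample blocks shaped like a Notion API response (content invented).
system_prompt_blocks = [
    {"type": "heading_1",
     "heading_1": {"rich_text": [{"plain_text": "Memory Engine Protocol"}]}},
    {"type": "paragraph",
     "paragraph": {"rich_text": [{"plain_text": "Respond to @memory commands."}]}},
]

print(blocks_to_text(system_prompt_blocks))
```

Editing the page changes what this flattening produces, which is exactly why no redeployment is ever needed.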

The System Prompt page: one Notion page that controls Claude's entire memory behavior

2. Claude saves your conversations directly into Notion on command

When you are deep in a conversation and want to preserve it, you type:

@memory -create medium

Claude looks at the entire conversation, generates a descriptive title, picks a category, writes a full technical summary with code snippets and next steps, and saves it into your Notion database through MCP. Live. While you watch.

Claude initializing after reading the System Prompt from Notion via MCP

Claude confirming it saved the conversation context directly into Notion
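Behind the scenes, "saves it into your Notion database" corresponds to a page-create call against that database. A hedged sketch of the payload shape such a call takes; the property names "Name" and "Category" are my assumptions and must match your own database schema, and YOUR_DATABASE_ID is a placeholder:

```python
def build_memory_entry(title, category, summary, next_steps):
    """Build a Notion page-create payload for one saved conversation.

    "Name" and "Category" are illustrative property names; match them
    to whatever your own Notion database actually uses.
    """
    return {
        "parent": {"database_id": "YOUR_DATABASE_ID"},  # placeholder
        "properties": {
            "Name": {"title": [{"text": {"content": title}}]},
            "Category": {"select": {"name": category}},
        },
        # Page body: the technical summary plus next steps, as paragraphs.
        "children": [
            {"object": "block", "type": "paragraph",
             "paragraph": {"rich_text": [{"text": {"content": summary}}]}},
            {"object": "block", "type": "paragraph",
             "paragraph": {"rich_text": [{"text": {"content": "Next steps: " + next_steps}}]}},
        ],
    }


entry = build_memory_entry(
    "Django Auth Setup", "Backend",
    "Chose JWT auth with short-lived access tokens.",
    "Wire up refresh token rotation.",
)
```

Claude fills in the title, category, and summary itself from the conversation; the MCP server turns the result into exactly this kind of structured write.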

3. The entry appears in Notion instantly

Switch to Notion right after and the new entry is already there: title, category, and the full technical content Claude just wrote.

The entry Claude just created: structured, searchable, permanent

4. Every connected AI tool reads from the same database

This is the moment the whole system clicks.

Every AI tool connects to the same Notion workspace through its own independent MCP connection. The second tool has never seen your Claude conversation. It does not know your project. But when you ask it:

"What do we have saved about our project in Notion?"

It reaches into the same database, finds Claude's entry, and responds with full context ready to continue exactly where you left off. Two completely separate AI tools. One shared brain.

Claude reading the same Notion database: no copy-pasting, no context loss

5. The Unique Title Rule keeps your database clean automatically

Before Claude creates any new entry, it searches the database for a matching title. If one exists, it stops and asks:

"Memory 'Django Auth Setup' already exists. Update instead? (yes/no)"

No duplicates. No clutter. The database stays organized without you doing anything.
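The rule itself is simple enough to model locally. A sketch of the check, using a plain dict to stand in for the Notion database (in the real system the lookup is a title search over MCP):

```python
def create_or_prompt(memory_db, title, entry):
    """Enforce the Unique Title Rule: never create a second entry
    with the same title; ask to update the existing one instead."""
    if title in memory_db:
        return f"Memory '{title}' already exists. Update instead? (yes/no)"
    memory_db[title] = entry
    return f"Saved '{title}'."


db = {}
first = create_or_prompt(db, "Django Auth Setup", "JWT auth decided.")
second = create_or_prompt(db, "Django Auth Setup", "Something else.")
```

The second call never writes; it only returns the prompt, which is what keeps the database clean without any effort from you.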

The four commands

| Command | What it does |
| --- | --- |
| @memory -read [Title] | Load a specific entry back into Claude's context |
| @memory -create short/medium/full | Save the conversation at your chosen detail level |
| @memory -update | Append new progress to an existing entry |
| @memory -search [keyword] | Search all entries by keyword with previews |
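The command grammar above is regular enough to parse with a single pattern. A sketch of how a helper might split an @memory message into command and argument (the helper name is mine, not part of the protocol):

```python
import re

COMMANDS = {"read", "create", "update", "search"}

def parse_memory_command(message):
    """Parse '@memory -<command> [argument]' into (command, argument),
    or return None if the message is not a memory command."""
    m = re.match(r"@memory\s+-(\w+)(?:\s+(.*))?$", message.strip())
    if not m or m.group(1) not in COMMANDS:
        return None
    argument = (m.group(2) or "").strip() or None
    return m.group(1), argument


# Ordinary chat messages simply fall through as None.
cmd = parse_memory_command("@memory -read Django Auth Setup")
```

In the actual engine this parsing happens inside Claude by following the System Prompt page, not in code; the sketch just makes the grammar explicit.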

Why This Matters Beyond the Demo

I want to be direct about something.

Most people treat AI context loss as an inconvenience. I started treating it as a systems problem, and once you see it that way, the solution becomes obvious. You do not fix memory loss by trying to make each AI tool smarter in isolation. You fix it by giving all the tools a shared external memory they can read from and write to.

Notion with MCP is that memory layer.

What makes this different from other memory solutions is that it is tool-agnostic. It does not matter which AI tool you use. If it supports MCP, it connects to the same brain. Today it is Claude and VS Code Copilot. Tomorrow you add Cursor, Windsurf, or the Claude Desktop app and they instantly have access to every conversation you have ever saved. The more tools you connect, the more powerful the system becomes, and you do not have to change anything.

I built this because I was genuinely tired of being the memory layer myself. Every time I switched tools, I was the one bridging the gap β€” manually, repeatedly, forever. The Notion Memory Engine means I never have to do that again.

And neither do you.


Built for the DEV.to x Notion MCP Challenge. Full setup guide and repo above; you can replicate it in under 30 minutes.

Top comments (4)

Ken W Alger

Slick implementation of the Notion MCP! It’s cool to see so many different angles on this challenge. I went the 'Forensic Auditor' route for rare books to see how far we could push the relational data integrity. Good luck with the submission!

dev.to/kenwalger/archival-intellig...

Isah Alamin

Thanks Ken! Your Forensic Auditor approach sounds fascinating; relational data integrity for rare books is such a creative angle. Definitely going to check it out. Good luck to you too!

Balkaran Singh

Really liked the idea.

Isah Alamin

Appreciate that! Thanks for checking it out.