Lakshmi Sravya Vedantham
I built a universal memory layer that works across every AI tool — here's how it works

Every AI tool I use has its own memory silo.

Claude knows I prefer Python. ChatGPT doesn't. Cursor doesn't know what I told Claude yesterday. Every tool starts from scratch.

So I built mem-mesh — a local proxy daemon that gives all your AI tools one shared memory.

How it works

mem-mesh runs a local HTTP server on port 4117. You set one environment variable:

export ANTHROPIC_BASE_URL=http://localhost:4117

From that point, any tool that honors ANTHROPIC_BASE_URL routes its API calls through mem-mesh. On each request, the proxy:

  1. Pulls your stored memories from ~/.mem-mesh/
  2. Injects them into the system prompt of every request
  3. After each response, extracts new memory signals (using Claude Haiku) and stores them
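The injection step (2) is the heart of it. Here's a rough sketch, assuming an Anthropic-style `/v1/messages` JSON body with a top-level `system` field — the function name and prompt framing are mine, not mem-mesh's actual internals:

```python
import json

def inject_memories(body: bytes, memories: str) -> bytes:
    """Prepend stored memories to the system prompt of an
    Anthropic-style request body before forwarding it upstream."""
    payload = json.loads(body)
    preamble = "Known about this user:\n" + memories
    existing = payload.get("system", "")
    # Keep any system prompt the tool already sends; memories go first.
    payload["system"] = preamble + ("\n\n" + existing if existing else "")
    return json.dumps(payload).encode()
```

The real proxy also has to forward auth headers and stream the response back, but the memory injection itself is about this small.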

What gets remembered

mem-mesh categorizes memories into three buckets:

  • preferences — "Prefers Python over JavaScript", "Likes concise answers"
  • facts — "Lives in San Jose", "Works at a startup"
  • corrections — "Function is get_user, not fetch_user"

All stored as plain Markdown in ~/.mem-mesh/. Readable, editable, Git-backable, yours forever.
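A minimal sketch of that storage scheme — one Markdown file per bucket, one bullet per memory. The file names and layout are my guess at what lives inside the memory directory, not the actual format:

```python
from pathlib import Path

BUCKETS = ("preferences", "facts", "corrections")

def remember(bucket: str, note: str, root: Path) -> None:
    """Append one memory as a Markdown bullet in its bucket file."""
    if bucket not in BUCKETS:
        raise ValueError(f"unknown bucket: {bucket}")
    root.mkdir(parents=True, exist_ok=True)
    with (root / f"{bucket}.md").open("a") as f:
        f.write(f"- {note}\n")

def recall(root: Path) -> str:
    """Concatenate every bucket file into one prompt-ready string."""
    parts = []
    for bucket in BUCKETS:
        path = root / f"{bucket}.md"
        if path.exists():
            parts.append(f"## {bucket}\n{path.read_text()}")
    return "\n".join(parts)
```

Because it's plain text, `cat ~/.mem-mesh/*.md` or a Git diff shows you exactly what gets injected into your prompts.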

The CLI

mem-mesh show                    # See all memories
mem-mesh search "Python"         # Find specific memories  
mem-mesh forget "JavaScript"     # Remove matching memories
mem-mesh daemon start            # Start the proxy
mem-mesh daemon stop             # Stop it
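Because the store is plain Markdown, `search` and `forget` can be little more than a line-level grep. A sketch under the assumption of one Markdown file per bucket in the memory directory — not mem-mesh's actual code:

```python
from pathlib import Path

def search_memories(term: str, root: Path) -> list[str]:
    """Case-insensitive match over every Markdown memory file,
    roughly what `mem-mesh search` needs to do."""
    hits = []
    for path in sorted(root.glob("*.md")):
        for line in path.read_text().splitlines():
            if term.lower() in line.lower():
                hits.append(f"{path.stem}: {line.strip()}")
    return hits

def forget_memories(term: str, root: Path) -> int:
    """Drop matching lines in place; returns how many were removed."""
    removed = 0
    for path in sorted(root.glob("*.md")):
        lines = path.read_text().splitlines()
        kept = [ln for ln in lines if term.lower() not in ln.lower()]
        removed += len(lines) - len(kept)
        path.write_text("\n".join(kept) + ("\n" if kept else ""))
    return removed
```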

Why local-first matters

Your conversations go through enough third parties already. mem-mesh runs entirely on your machine. Nothing goes anywhere beyond your configured AI provider. No accounts, no cloud, no subscriptions.

Install

pip install mem-mesh
mem-mesh daemon start
export ANTHROPIC_BASE_URL=http://localhost:4117

That's it. Every AI call now carries your memory.

What's next

mem-mesh currently works with any OpenAI-compatible API endpoint. Next: native support for ChatGPT web via a browser extension, and a mem-mesh sync command to back up memories to a private Git repo.

GitHub: https://github.com/LakshmiSravyaVedantham/mem-mesh

Built this because I was frustrated with fragmented AI context across tools. If you're using 3+ AI tools daily, you'll feel this immediately.
