Or: How I gave my AI tools persistent memory that survives conversation resets and works across multiple platforms
The Problem: AI Tools with Amnesia
As developers, we've gotten used to AI assistants that are incredibly smart but frustratingly forgetful. You have a conversation with Claude about your project architecture, then start a new conversation and have to explain everything from scratch. You switch from Claude Desktop to VS Code and lose all context. Your GitHub Copilot knows nothing about the decisions you just made with Claude.
This constant context switching and re-explanation is exhausting. What if your AI tools could actually remember your setup, your preferences, and your project context across conversations and platforms?
The Vision: Persistent AI Memory Across Tools
I wanted to create a development environment where:
- Claude Desktop remembers our previous conversations
- VS Code has access to the same knowledge
- All AI tools share an understanding of my projects and setup
- Context survives conversation resets and tool switches
- Everything runs cleanly without polluting my host machine
The Solution: MCP Servers + Vagrant + Shared Memory
The breakthrough came with the Model Context Protocol (MCP) and a clever architecture using Vagrant. Here's what I built:
Architecture Overview
Architecture showing Claude Desktop and VS Code connecting to shared MCP servers running in a Vagrant VM
Key Components
1. Vagrant VM as the Foundation
All MCP servers run inside a Vagrant VM, keeping my Mac clean while providing a consistent Linux environment for AI tools.
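A quick smoke test from the host confirms the foundation before any MCP servers are wired up (a sketch; it assumes the Vagrantfile lives in ~/.config/nix, as in the configs below, and that the VM provisions Docker):

```bash
# Bring the VM up and confirm Docker works inside it.
cd ~/.config/nix                      # wherever your Vagrantfile lives
vagrant up
vagrant ssh -- 'docker version && docker run --rm hello-world'
```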
2. Docker Containers for Isolation
Each MCP server runs in its own Docker container, ensuring clean separation and easy management.
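The pattern is the same for every server: a throwaway container speaking MCP over stdio that disappears when the session ends. Running one by hand makes the flags easy to see (this mirrors the memory server from the configs below):

```bash
# -i keeps stdin open so the MCP stdio transport works;
# --rm removes the container the moment the session ends;
# -v persists the memory file outside the container.
docker run -i --rm \
  -v /home/vagrant/.shared-memory:/app/data \
  node:18-alpine \
  sh -c 'MEMORY_FILE_PATH=/app/data/memory.json npx -y @modelcontextprotocol/server-memory'
```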
3. Shared Memory Store
A single JSON file (`/home/vagrant/.shared-memory/memory.json`) stores a knowledge graph that both Claude Desktop and VS Code can access.
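Because the store is just a file on the VM, ordinary tools are enough to inspect or back it up (the record format itself is an internal detail of @modelcontextprotocol/server-memory):

```bash
# Peek at what the tools have remembered, and keep a simple backup.
cd ~/.config/nix   # Vagrantfile directory
vagrant ssh -- 'cat /home/vagrant/.shared-memory/memory.json'
vagrant ssh -- 'cp /home/vagrant/.shared-memory/memory.json /home/vagrant/.shared-memory/memory.json.bak'
```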
4. SSH Bridge
Both Claude Desktop and VS Code connect to the VM via SSH, running their MCP servers remotely but seamlessly.
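The "bridge" is nothing more exotic than piping the MCP server's stdio over an SSH session. You can exercise the same path by hand to verify the plumbing (a sketch, using the same Vagrantfile directory as the configs below):

```bash
# stdin travels over SSH into the container; the server's stdout comes back the same way.
cd ~/.config/nix
vagrant ssh -- 'docker run -i --rm node:18-alpine sh -c "echo bridge works"'
```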
Implementation: The Configuration
Claude Desktop Configuration
Claude Desktop's MCP configuration connects to the VM via SSH and runs each server in a Docker container:
```json
{
  "mcpServers": {
    "memory": {
      "command": "bash",
      "args": [
        "-c",
        "cd ~/.config/nix && vagrant ssh -- 'docker run -i --rm -v /home/vagrant/.shared-memory:/app/data node:18-alpine sh -c \"MEMORY_FILE_PATH=/app/data/memory.json npx -y @modelcontextprotocol/server-memory\"'"
      ]
    },
    "github": {
      "command": "bash",
      "args": [
        "-c",
        "cd ~/.config/nix && vagrant ssh -- 'GITHUB_PERSONAL_ACCESS_TOKEN=$(gh auth token) docker run -i --rm -e GITHUB_PERSONAL_ACCESS_TOKEN node:18-alpine sh -c \"npx -y @modelcontextprotocol/server-github\"'"
      ]
    }
  }
}
```
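On macOS this block typically lives in ~/Library/Application Support/Claude/claude_desktop_config.json. Before handing it to Claude Desktop, you can run the same command yourself to confirm the SSH and Docker plumbing (press Ctrl+C to stop the server):

```bash
# If the memory server starts without errors, the config is wired correctly.
bash -c "cd ~/.config/nix && vagrant ssh -- 'docker run -i --rm -v /home/vagrant/.shared-memory:/app/data node:18-alpine sh -c \"MEMORY_FILE_PATH=/app/data/memory.json npx -y @modelcontextprotocol/server-memory\"'"
```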
VS Code Remote SSH Configuration
VS Code uses Remote SSH to connect to the Vagrant VM, so its MCP servers run directly on the VM without the SSH wrapper:
```json
{
  "servers": {
    "memory": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-v", "/home/vagrant/.shared-memory:/app/data",
        "node:18-alpine",
        "sh", "-c", "MEMORY_FILE_PATH=/app/data/memory.json npx -y @modelcontextprotocol/server-memory"
      ]
    },
    "github": {
      "command": "bash",
      "args": [
        "-c",
        "GITHUB_PERSONAL_ACCESS_TOKEN=$(gh auth token) docker run -i --rm -e GITHUB_PERSONAL_ACCESS_TOKEN node:18-alpine sh -c \"npx -y @modelcontextprotocol/server-github\""
      ]
    },
    "filesystem": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-v", "/home/vagrant/projects:/workspace",
        "node:18-alpine",
        "sh", "-c", "cd /workspace && npx -y @modelcontextprotocol/server-filesystem ."
      ]
    }
  }
}
```
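Depending on your VS Code version, this typically goes in the workspace's .vscode/mcp.json (or the equivalent user-level MCP settings). Since VS Code is attached to the VM over Remote SSH, two things are worth checking from a Remote SSH terminal (a sketch):

```bash
# Inside the VM: the vagrant user needs Docker access, and the mounted paths must exist.
sudo usermod -aG docker vagrant   # log out and back in for this to take effect
mkdir -p /home/vagrant/.shared-memory /home/vagrant/projects
docker ps                         # should succeed without sudo after re-login
```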
The Magic: Shared Memory in Action
The payoff became clear the first time I tested the shared memory system:
- Claude Desktop stores information about my development setup
- VS Code can immediately access that same information
- Both tools share understanding of my projects, preferences, and technical decisions
- Context survives conversation resets and tool switches
Example: Storing Context
In Claude Desktop:
"Remember that I use a Vagrant VM for development with Docker containers for MCP servers"
Example: Retrieving Context
In VS Code:
"What do you know about my development setup?"
VS Code immediately retrieves the stored memory and responds with full context about the Vagrant VM, Docker setup, and MCP configuration.
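There is no magic underneath: both tools read and write the same file through the same volume mount. You can watch the knowledge graph grow from the host after each interaction:

```bash
# Store a fact in Claude Desktop, then confirm the shared file changed;
# VS Code reads the very same path inside the VM.
cd ~/.config/nix   # Vagrantfile directory
vagrant ssh -- 'wc -c /home/vagrant/.shared-memory/memory.json'
vagrant ssh -- 'grep -i "vagrant" /home/vagrant/.shared-memory/memory.json'
```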
Why This Matters
For Individual Developers
- No more re-explaining your setup to AI tools
- Consistent context across different development environments
- Persistent memory that survives restarts and conversation limits
- Clean host machine with powerful AI integration
For Teams
- Shared knowledge base that any team member can access
- Consistent AI assistance across different tools and platforms
- Documentation that grows organically through AI interactions
For the Future
This represents a new paradigm for AI-assisted development where tools maintain a persistent, shared understanding rather than operating in isolation.
Getting Started
Prerequisites
- Vagrant with UTM (for Apple Silicon) or VirtualBox
- Claude Desktop
- VS Code with Remote SSH extension
- Optional: Nix package manager (for reproducible VM configuration)
Basic Setup Steps
1. Set up the Vagrant VM with your preferred configuration
2. Configure Claude Desktop's MCP servers to run via SSH
3. Configure VS Code Remote SSH and its MCP servers
4. Test shared memory by storing and retrieving context (a command-level sketch follows below)
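Condensed into commands, the happy path looks roughly like this (a sketch; the directory names are assumptions, and vagrant ssh-config generates the host entry VS Code's Remote SSH extension needs):

```bash
# 1. Bring up the VM and create the shared directories.
cd ~/.config/nix && vagrant up
vagrant ssh -- 'mkdir -p /home/vagrant/.shared-memory /home/vagrant/projects'

# 2 & 3. Point Claude Desktop and VS Code at the configs shown above,
#        and give VS Code an SSH host entry for the VM.
vagrant ssh-config --host vagrant-mcp >> ~/.ssh/config

# 4. Store a fact in one tool, ask the other about it, and confirm the
#    shared file changed.
vagrant ssh -- 'ls -l /home/vagrant/.shared-memory/'
```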
Challenges and Solutions
Challenge: SSH Connection Management
Running multiple MCP servers via SSH can cause connection conflicts.
Solution: Careful connection management: avoid opening simultaneous SSH sessions, and make sure each one is cleaned up when its MCP server exits.
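One technique that helps here is SSH connection multiplexing (ControlMaster), so every MCP session reuses a single underlying connection instead of opening its own; a sketch, with Vagrant's default port and user (confirm yours with vagrant ssh-config):

```bash
# All SSH sessions to the VM share one TCP connection and persist briefly
# after the last session closes, avoiding connect/teardown churn.
mkdir -p ~/.ssh/sockets
cat >> ~/.ssh/config <<'EOF'
Host vagrant-mcp
  HostName 127.0.0.1
  Port 2222
  User vagrant
  ControlMaster auto
  ControlPath ~/.ssh/sockets/%r@%h-%p
  ControlPersist 10m
EOF
```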
Challenge: Docker Container Lifecycle
Each MCP session spins up its own Docker container, which has to be cleaned up when the session ends.
Solution: Using the `--rm` flag for automatic cleanup and proper volume mounting for data persistence.
Challenge: Memory Consistency
Both tools need to read and write the same memory store without conflicts.
Solution: A single shared file, mounted into each container at a consistent path via a Docker volume.
What's Next
This setup opens up exciting possibilities:
- Integration with more AI tools as MCP adoption grows
- Team-wide shared memory for collaborative AI assistance
- Project-specific knowledge bases that accumulate over time
- Cross-platform AI workflows that maintain full context
The Model Context Protocol is still young but already enables powerful new patterns for AI-assisted development. This shared memory approach is just the beginning.
Conclusion
Building a shared AI memory system transforms how you work with AI tools. Instead of constantly re-explaining your setup and losing context, you get AI assistants that truly understand your environment and remember your decisions.
The combination of MCP servers, Vagrant, and Docker provides a clean, robust foundation for this integration. As the ecosystem matures, I expect more developers to adopt similar patterns for persistent, context-aware AI assistance.
Your AI tools shouldn't have amnesia. Give them the memory they deserve.