Giving My AI Assistant a Long-Term Memory
I talk to Zo every day. Sometimes through the web interface. Sometimes over SMS when I'm away from my desk. Sometimes I'll ask it to remember something small, like how I take my coffee or that my son Brooks needs his EpiPen packed for trips.
Then I'd start a new conversation and watch that information vanish. Every session started fresh. No context. No continuity. Like Groundhog Day, but with an AI assistant.
Last weekend I finally fixed it. Here's what I built and why.
The Problem
Zo runs on Zo Computer, which means each conversation spins up a fresh instance. Clean state, no pollution between sessions, predictable behavior. That's by design. But it also means Zo can't remember anything I tell it.
If I mention on Monday that I prefer dark mode in all editors, then start a completely different conversation on Tuesday, Zo has no idea. I'd have to repeat myself constantly.
The same pattern plays out across most AI tools. Each session is an island.
What I Wanted
Three things:
- Preferences should persist. Dark mode, editor settings, how I structure my projects.
- Facts should stick. My husband's name, where I was born, who my contacts are.
- Context should be automatic. I didn't want to manually trigger a memory save every time I mentioned something worth keeping.
And it needed to work across all my channels. Web chat, SMS, email. If I tell Zo something via text message, the next web chat should already know about it.
The Architecture: Three Layers
I built a hybrid memory system. Each layer backs up the others, so if one is unavailable, memory still works.
Supermemory (cloud via MCP). A cloud memory service accessed through the Model Context Protocol. Zo calls it via mcporter, an MCP CLI tool. It stores preferences, facts, and profile-level information. Because it's cloud-based, it's accessible from anywhere. If I text Zo from my phone, it can recall what I said in a web chat last week.
Local SQLite with FTS5. Runs directly on Zo Computer. Uses SQLite's FTS5 extension for fast full-text memory search, indexed via OpenClaw's memory system. Works even if Supermemory is unreachable, and keeps everything private on my machine.
MEMORY.md (static context). A markdown file loaded at the start of every session via an AGENTS.md directive. It's the ground truth. If both cloud and local systems fail, this file still exists and still gets read.
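For reference, the AGENTS.md directive is just a standing instruction that gets read every session. Mine looks roughly like this (the wording and path are illustrative, not the exact file contents):

```markdown
## Memory
At the start of every session, read /home/workspace/MEMORY.md and load its
contents into context before responding to the user.
```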
Redundancy by design. If Supermemory is unreachable, local SQLite catches it. If SQLite has issues, MEMORY.md is still there.
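The redundancy logic itself is small. A sketch of how the layers can be queried together so a failed layer simply contributes nothing (the type and function names here are illustrative, not the actual script's API):

```typescript
// Each memory layer exposes one recall call; any of them may fail.
type MemoryLayer = {
  name: string;
  recall: (query: string) => Promise<string[]>;
};

// Query every layer at once and merge whatever comes back.
async function recallAll(layers: MemoryLayer[], query: string): Promise<string[]> {
  const settled = await Promise.allSettled(layers.map((l) => l.recall(query)));
  const results: string[] = [];
  for (const outcome of settled) {
    // Rejected promises (layer down, timeout) are simply skipped.
    if (outcome.status === "fulfilled") results.push(...outcome.value);
  }
  return [...new Set(results)]; // drop exact duplicates across layers
}
```

`Promise.allSettled` is the key choice here: unlike `Promise.all`, one unreachable layer doesn't fail the whole recall.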
How It Works in Practice
At the start of every session, Zo runs this automatically:
```bash
bun /home/workspace/Skills/zo-memory/scripts/recall.ts "user preferences projects family ongoing tasks"
```
That query hits both @supermemory and local SQLite in parallel, combines the results, and loads everything into context before I type my first message. The MEMORY.md file also gets auto-loaded via the AGENTS.md directive.
Under the hood, the recall script calls Supermemory via MCP:
```typescript
// Ask Supermemory for matches via the mcporter CLI; give up after 30s.
const result = Bun.spawnSync([
  'mcporter', 'call', 'supermemory.recall',
  `query=${q}`
], { timeout: 30000 });
```
And queries local SQLite using FTS5:
```sql
SELECT path, start_line, text, bm25(chunks_fts) AS score
FROM chunks_fts
WHERE chunks_fts MATCH 'user preferences'
ORDER BY score
LIMIT 10;
```
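For that query to work, the index needs an FTS5 virtual table shaped something like this. This is my guess at the schema, inferred from the columns the query selects, not OpenClaw's actual definition:

```sql
CREATE VIRTUAL TABLE chunks_fts USING fts5(
  path UNINDEXED,       -- source file of the chunk
  start_line UNINDEXED, -- where the chunk starts in that file
  text                  -- the chunk body; the only column searched
);
```

One FTS5 quirk worth knowing: `bm25()` returns negative scores where more negative means a better match, so ordering by score ascending puts the best results first.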
When Zo learns something new about me, it saves to all three systems automatically:
```bash
bun /home/workspace/Skills/zo-memory/scripts/save.ts "Andrea prefers dark mode in all editors" --tag preference
```
The save script calls Supermemory via MCP:
```typescript
// Push the new memory to Supermemory via the mcporter CLI; give up after 30s.
const result = Bun.spawnSync([
  'mcporter', 'call', 'supermemory.memory',
  `content=${text}`, `action=save`
], { timeout: 30000 });
```
Then appends to a local memory file and triggers a reindex via OpenClaw.
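The local half of the save is just an append plus a reindex. A sketch, where the line format and the commented-out reindex command are my assumptions, not OpenClaw's real CLI:

```typescript
import { appendFileSync } from "node:fs";

// Format one memory entry as a single timestamped, tagged markdown list line.
function formatMemoryLine(text: string, tag: string, when: Date): string {
  return `- [${when.toISOString()}] (${tag}) ${text}`;
}

// Append the entry to the local memory file, then trigger a reindex so
// the FTS5 index picks it up.
function saveLocal(text: string, tag: string, memoryFile: string): void {
  appendFileSync(memoryFile, formatMemoryLine(text, tag, new Date()) + "\n");
  // Bun.spawnSync(["openclaw", "memory", "reindex"]); // hypothetical command
}
```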
I set up rules so this happens automatically. If I tell Zo something worth remembering, it saves. I don't have to prompt it. Memory should be invisible infrastructure, not a manual chore.
The Test
After building it, I wanted to verify it actually worked across channels.
I opened a web chat with Zo and mentioned something specific I hadn't said before. Then I texted Zo from my phone and asked if it remembered.
It did. The information had gone from web chat to Supermemory to a text message recall. Three days later, it still remembered.
What I'd Do Differently
The recall script can be slow. Supermemory is a cloud service, so there's network latency. Local SQLite is fast but requires reindexing after each save. If I were doing this again at scale, I'd add a caching layer.
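That caching layer could be as small as a TTL map in front of the recall call. A sketch of what I have in mind, not something the current scripts contain:

```typescript
// Cache recall results for a short TTL so repeated queries in one
// session skip the round trip to Supermemory.
class RecallCache {
  private entries = new Map<string, { at: number; results: string[] }>();
  constructor(private ttlMs: number) {}

  get(query: string): string[] | undefined {
    const hit = this.entries.get(query);
    if (!hit) return undefined;
    if (Date.now() - hit.at > this.ttlMs) {
      this.entries.delete(query); // expired; fall back to a real recall
      return undefined;
    }
    return hit.results;
  }

  set(query: string, results: string[]): void {
    this.entries.set(query, { at: Date.now(), results });
  }
}
```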
I also didn't set up deduplication at first. If I told Zo the same preference three times, it saved three copies. The fix was a simple script change: before saving to any system, it now checks for both an exact match and semantic similarity, and if it finds a duplicate it skips the save with a clear message.
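The dedup check can be sketched as an exact-match test plus a cheap similarity score. Here I use token overlap (Jaccard) as a stand-in for the real semantic comparison; the threshold and function names are illustrative:

```typescript
// Token-overlap (Jaccard) similarity: a crude stand-in for semantic
// similarity, but enough to catch near-identical phrasings.
function similarity(a: string, b: string): number {
  const ta = new Set(a.toLowerCase().split(/\s+/));
  const tb = new Set(b.toLowerCase().split(/\s+/));
  const shared = [...ta].filter((t) => tb.has(t)).length;
  return shared / (ta.size + tb.size - shared);
}

// Skip the save when an existing memory is an exact or near match.
function isDuplicate(candidate: string, existing: string[], threshold = 0.8): boolean {
  return existing.some(
    (m) => m === candidate || similarity(m, candidate) >= threshold,
  );
}
```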
Why This Matters
AI assistants forget because we build them that way. Clean state is easier to reason about. No carry-over means no contamination. But for an assistant I use daily, across multiple channels, that trade-off stopped making sense.
I wanted something that knew me. Now it does.
