
Riyogarta P

Introducing Syne — An AI Agent That Actually Remembers You

"I remember, therefore I am"


Most AI assistants have some form of memory. But it's limited — a handful of notes, a capped context window, a single user. The moment it gets too long, it forgets. The moment someone else joins the conversation, they start from zero.

Syne removes both ceilings. Persistent memory that never expires. Unlimited storage backed by pgvector semantic search. And it's shared — your household, your team, your circle — one agent that grows its understanding of everyone over time, not just you.


What Is Syne?

Syne is an open-source AI agent framework with persistent, semantic memory (landing page at syne.codes). It runs on your own server, talks to you via Telegram (and a terminal CLI), and, crucially, remembers things across sessions.

Not by storing raw chat logs and stuffing them into context. By actually learning what's worth keeping.

The name comes from Mnemosyne, the Greek goddess of memory and mother of the Muses. The goal is an agent that grows with you over time.


The Memory Problem — and How Syne Solves It

Most "memory" solutions for AI agents are naive: store everything in markdown files, retrieve by keyword, dump into context. You've seen the pattern — memory.md, soul.md, agents.md, roles.md. It works, until it doesn't.

The real constraint isn't storage. It's the context window. Even with 200K tokens, everything loaded into context — all those markdown files plus chat history — has to fit within that limit. So compaction kicks in: the agent summarizes the conversation into a shorter resume, discarding detail to make room.

memory.md + soul.md + agents.md + roles.md (30K) + chat history (160K) = 190K
↓ compaction
memory.md + soul.md + agents.md + roles.md (30K) + resume (20K) + chat (0K) = 50K

And the cycle repeats. Every compaction loses something.

Syne approaches this differently. Nothing is pre-loaded into context. Instead of reading all memory files upfront, Syne retrieves only what's relevant — on demand, via semantic search. The database doesn't touch the context window at all. You can store millions of memories and the context stays clean.
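The difference is easiest to see in miniature. Memories live outside the prompt, and only the top-k matches for the current question are pulled into context. A minimal sketch, with toy vectors standing in for real embeddings (function names here are illustrative, not Syne's API):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def retrieve(query_vec, store, k=3):
    """Rank stored memories by similarity to the query; only the top k
    ever reach the context window -- the rest stay in the database."""
    ranked = sorted(store, key=lambda m: cosine(query_vec, m["vec"]), reverse=True)
    return [m["text"] for m in ranked[:k]]
```

However many memories the store holds, the context only ever pays for `k` of them.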

Under the hood, three components work together:

The Evaluator is a small local LLM (Ollama, $0) that runs a 3-layer filter on every message before anything gets stored. Layer 1 is a fast heuristic — skip greetings, questions, short messages. Layer 2 uses the LLM to ask: is this a fact, preference, decision, or lesson? Layer 3 checks similarity against existing memories to avoid duplicates. The whole thing runs asynchronously — it never slows down the chat response.
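A minimal sketch of how such a three-layer gate might look. The `classify` and `is_duplicate` hooks stand in for the local LLM and the embedding-similarity check; all names and thresholds here are illustrative, not Syne's actual implementation:

```python
# Sketch of a 3-layer storage gate; names and thresholds are illustrative.

GREETINGS = {"hi", "hello", "hey", "thanks", "ok"}

def layer1_heuristic(msg: str) -> bool:
    """Layer 1: cheap rejection of greetings, questions, short messages."""
    words = msg.split()
    if len(words) < 4:
        return False
    if msg.rstrip().endswith("?"):
        return False
    if words[0].lower().strip(",.!") in GREETINGS:
        return False
    return True

def evaluate(msg, classify, existing, is_duplicate):
    """Run all three layers; return a memory record or None."""
    if not layer1_heuristic(msg):
        return None
    kind = classify(msg)  # Layer 2: fact / preference / decision / lesson?
    if kind not in {"fact", "preference", "decision", "lesson"}:
        return None
    if any(is_duplicate(msg, m) for m in existing):  # Layer 3: dedup
        return None
    return {"text": msg, "kind": kind}
```

Because this runs asynchronously, a message that fails Layer 1 costs almost nothing, and the LLM is only consulted for messages that survive the cheap filter.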

The Embedding Engine converts every memory into a vector and stores it in PostgreSQL with pgvector. When you ask something, your question is also embedded and matched by semantic similarity using an HNSW index. This is what makes search fast even with millions of memories — it finds meaning, not just keywords.
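In PostgreSQL terms, the storage side can be sketched as a table with a pgvector column plus an HNSW index. This is an illustrative schema, not Syne's actual one, and the dimension depends on which embedding model you run:

```sql
-- Illustrative schema; column names are assumptions, not Syne's actual schema.
CREATE EXTENSION IF NOT EXISTS vector;

CREATE TABLE memories (
    id        bigserial PRIMARY KEY,
    content   text NOT NULL,
    embedding vector(768) NOT NULL  -- dimension depends on the embedding model
);

-- HNSW index keeps nearest-neighbor search fast at millions of rows.
CREATE INDEX ON memories USING hnsw (embedding vector_cosine_ops);

-- Retrieval: embed the question, then rank by cosine distance.
-- SELECT content FROM memories ORDER BY embedding <=> $1 LIMIT 5;
```

The `<=>` operator is pgvector's cosine-distance operator; with the HNSW index in place, that `ORDER BY ... LIMIT` query becomes an approximate nearest-neighbor lookup instead of a full scan.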

The Decay Engine mimics human forgetting. Memories that are never recalled slowly lose durability and eventually get deleted. Memories that keep getting used survive longer. Memories marked as permanent (things you explicitly say to remember) never decay. The system self-maintains without any manual cleanup.
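One way to model that forgetting curve is exponential decay on a durability score, reset whenever a memory is recalled. The half-life and threshold below are illustrative numbers, not Syne's actual parameters:

```python
import datetime

def surviving(memories, now, half_life_days=30.0):
    """Keep memories whose decayed durability is still above a threshold.
    Recall resets last_recalled, so frequently used memories survive;
    permanent memories never decay. Numbers are illustrative, not Syne's."""
    kept = []
    for m in memories:
        if m["permanent"]:
            kept.append(m)
            continue
        idle_days = (now - m["last_recalled"]).days
        durability = m["durability"] * 0.5 ** (idle_days / half_life_days)
        if durability >= 0.1:  # below threshold -> forgotten
            m["durability"] = durability
            kept.append(m)
    return kept
```

Run periodically, a pass like this is all the "cleanup" the store needs: unused memories fall below the threshold on their own.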

Here's what it looks like in practice:

You:  Remember: I'm allergic to shellfish.
Syne: Stored.

              [3 days later]

You:  Suggest dinner for tonight.
Syne: How about rendang or soto ayam? Avoiding shellfish as noted.

You:  What do you remember about my family?
Syne: You have a partner and a child. You're allergic to shellfish.
      Anything else you'd like me to note?

You:  Forget that I like sushi.
Syne: Removed from memory.

And memory is private by design — access respects the permission system:

Stranger: What do you know about your owner's family?
Syne:     I can't share that. That's private information.

No Config Files. Just Talk.

Most agent frameworks ask you to edit SOUL.md, AGENTS.md, or CONFIG.yaml to change how the agent behaves. Syne has none of that. Everything — personality, rules, identity, configuration — lives in PostgreSQL, and you change it through conversation.

Want the agent to be more casual? Say so. Want to add a rule never to share your location? Say so. Want to rename it? Say so. No files, no restarts.


Privacy and Control

Syne runs entirely on your own server. Your memories never touch a third-party service unless you choose a cloud embedding provider. The chat LLM is Google Gemini via OAuth: free, with no API key required.

The permission system is Linux-inspired: a 3-digit octal model (owner / family / public) controls access to every tool and ability. The first person to message Syne becomes the owner. Everything else is configurable from there.
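The mode digits read exactly like Unix file permissions. Mapping r/w/x bits onto agent actions is my assumption about how a 3-digit octal model would work, not Syne's documented semantics, but the bit arithmetic is the same either way:

```python
# Linux-style octal permission check; the r/w/x-to-action mapping is assumed.
OWNER, FAMILY, PUBLIC = range(3)

def allowed(mode, role, action):
    """mode is a 3-digit octal like 0o754: owner=7, family=5, public=4.
    action: 'r' (4) read, 'w' (2) write, 'x' (1) execute/use."""
    bits = {"r": 4, "w": 2, "x": 1}[action]
    digit = (mode >> (3 * (2 - role))) & 0b111
    return digit & bits == bits
```

So a tool with mode `0o754` is fully available to the owner, usable but not modifiable by family, and read-only to the public.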


Self-Evolving

One of the more interesting capabilities: Syne can create new abilities for itself at runtime — together with its owner. Tell it you wish it could check cryptocurrency prices, and it will write, validate, and register a new ability — no restart required.

It can only touch its own abilities/custom/ directory. Core engine code is off-limits. If it finds a bug in itself, it formats a GitHub issue for the owner to post.

Out of the box, Syne ships with six bundled abilities:

  • image_gen — Generate images from text prompts (FLUX.1 via Together AI)
  • image_analysis — Analyze and describe images sent to the chat
  • maps — Places, directions, and geocoding via Google Maps
  • pdf — Generate PDF documents from conversation
  • website_screenshot — Capture screenshots of any URL
  • whatsapp — WhatsApp bridge to send and receive messages

Each ability manages its own dependencies — external binaries and packages are auto-installed when you enable it. And of course, anything not covered by the defaults can be added through conversation.
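A plausible shape for such a self-registering ability, sketched in Python. The class, field names, and registry are hypothetical stand-ins for whatever interface Syne actually uses in abilities/custom/:

```python
# Hypothetical ability shape; Syne's real interface may differ.
registry = {}

class Ability:
    def __init__(self, name, description, dependencies, handler):
        self.name = name
        self.description = description
        self.dependencies = dependencies  # auto-installed when enabled
        self.handler = handler

def register(ability):
    """Make an ability callable by the agent at runtime -- no restart."""
    registry[ability.name] = ability
    return ability

crypto = register(Ability(
    name="crypto_price",
    description="Look up a cryptocurrency price",
    dependencies=["requests"],
    handler=lambda symbol: f"price of {symbol}",  # real handler would call an API
))
```

The key property is that registration is just a dictionary write: a newly generated ability becomes callable the moment it is validated and registered, which is what makes restart-free extension possible.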


Zero API Cost

Syne is designed to run at zero ongoing cost outside of hardware. Chat runs via Google Gemini OAuth — no API key required. Embedding and memory evaluation run locally via Ollama. The database runs in Docker. Nothing phones home, nothing charges per token.


Getting Started

Installation is designed to be as frictionless as possible: three commands to start, then everything else is a series of choices. Pick your AI provider, confirm your hardware tier, enter your Telegram bot token. No editing config files, no copy-pasting connection strings. The installer handles Docker, PostgreSQL, pgvector, Ollama, model downloads, the database schema, and the systemd service automatically.

The only possible exception: if your user isn't already in the docker group, one logout and login is needed after installation. That's it.

$ git clone https://github.com/riyogarta/syne.git
$ cd syne
$ bash install.sh

When the installer finishes, Syne is already running.

Visit syne.codes for the landing page, or github.com/riyogarta/syne for full documentation and source.


Syne is early but functional. I'm using it daily. If you build something with it, or find something broken, contributions are welcome.
