<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Razmik Ayvazyan</title>
    <description>The latest articles on DEV Community by Razmik Ayvazyan (@ayvazyan10).</description>
    <link>https://dev.to/ayvazyan10</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1003707%2F65b79567-8cd3-47fe-a24b-0bbad03a64ed.jpeg</url>
      <title>DEV Community: Razmik Ayvazyan</title>
      <link>https://dev.to/ayvazyan10</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ayvazyan10"/>
    <language>en</language>
    <item>
      <title>I Built an Open-Source Brain for AI Models — Here's How Engram Works</title>
      <dc:creator>Razmik Ayvazyan</dc:creator>
      <pubDate>Wed, 25 Mar 2026 12:23:24 +0000</pubDate>
      <link>https://dev.to/ayvazyan10/i-built-an-open-source-brain-for-ai-models-heres-how-engram-works-38m1</link>
      <guid>https://dev.to/ayvazyan10/i-built-an-open-source-brain-for-ai-models-heres-how-engram-works-38m1</guid>
      <description>&lt;p&gt;Every AI tool I use forgets everything the moment a session ends. I tell Claude Code about my tech stack, switch to Ollama the next day, and start from zero.&lt;/p&gt;

&lt;p&gt;So I built Engram — a persistent memory layer that gives any AI model human-like memory.&lt;/p&gt;




&lt;h2&gt;What is Engram?&lt;/h2&gt;

&lt;p&gt;Engram is a universal AI brain. It stores what your AIs learn, retrieves the right memories at the right time, and presents them as context — automatically.&lt;/p&gt;

&lt;p&gt;Connect it once and every AI you use shares a single, growing brain.&lt;/p&gt;




&lt;h2&gt;How it works&lt;/h2&gt;

&lt;p&gt;When a connected AI receives a query, Engram:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Embeds&lt;/strong&gt; the query into a 384-dim vector (locally, via ONNX — no API, no cost)
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Searches&lt;/strong&gt; the vector index for similar past memories
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Expands&lt;/strong&gt; via the knowledge graph to related concepts
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scores&lt;/strong&gt; by similarity + recency + importance + access frequency
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Checks&lt;/strong&gt; for contradictions with existing memories
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Injects&lt;/strong&gt; assembled context into the AI's prompt
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The AI responds with full awareness of everything it has ever learned.&lt;/p&gt;
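
&lt;p&gt;The scoring step above can be sketched as a weighted blend of the four signals. The weights and field names here are illustrative assumptions, not Engram's actual internals:&lt;/p&gt;

```typescript
// Illustrative memory shape; field names are assumptions, not Engram's schema.
interface ScoredMemory {
  similarity: number;  // cosine similarity from the vector search, 0..1
  ageDays: number;     // days since the memory was last touched
  importance: number;  // assigned weight, 0..1
  accessCount: number; // how often this memory has been recalled
}

// Blend similarity, recency, importance, and access frequency into one score.
// Weights are illustrative; recency decays exponentially with age.
function score(m: ScoredMemory): number {
  const recency = Math.exp(-m.ageDays / 30);       // fades over weeks
  const frequency = Math.log1p(m.accessCount) / 5; // diminishing returns
  return 0.5 * m.similarity + 0.2 * recency + 0.2 * m.importance + 0.1 * frequency;
}
```

&lt;p&gt;Shifting the weights moves the brain between favoring what matches and favoring what is fresh or important.&lt;/p&gt;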




&lt;h2&gt;Three memory types&lt;/h2&gt;

&lt;p&gt;Engram mirrors the three memory systems of human cognition:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Type&lt;/th&gt;
&lt;th&gt;What it stores&lt;/th&gt;
&lt;th&gt;Example&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Episodic&lt;/td&gt;
&lt;td&gt;Events, conversations&lt;/td&gt;
&lt;td&gt;"User asked about deployment on March 15"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Semantic&lt;/td&gt;
&lt;td&gt;Facts, knowledge graph&lt;/td&gt;
&lt;td&gt;concept: TypeScript → "typed superset of JS"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Procedural&lt;/td&gt;
&lt;td&gt;Patterns, skills&lt;/td&gt;
&lt;td&gt;"When asked about DB migrations → use drizzle-kit"&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
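
&lt;p&gt;In code, the three types fit naturally as a tagged union. A minimal sketch with assumed field names, not Engram's actual schema:&lt;/p&gt;

```typescript
// Tagged union mirroring the three memory types; shapes are illustrative.
type MemoryType = "episodic" | "semantic" | "procedural";

interface Memory {
  type: MemoryType;
  content: string;
  createdAt: string; // ISO timestamp
}

const examples: Memory[] = [
  { type: "episodic",   content: "User asked about deployment on March 15", createdAt: "2026-03-15T10:00:00Z" },
  { type: "semantic",   content: "TypeScript is a typed superset of JS",    createdAt: "2026-03-15T10:01:00Z" },
  { type: "procedural", content: "For DB migrations, use drizzle-kit",     createdAt: "2026-03-15T10:02:00Z" },
];
```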




&lt;h2&gt;The features that set it apart&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Memory decay&lt;/strong&gt; — Uses the Ebbinghaus forgetting curve. Memories that aren't accessed fade over time. Important ones get consolidated from episodes into semantic facts — like sleep consolidation in the brain.&lt;/p&gt;
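
&lt;p&gt;The forgetting curve itself is one line: retention decays as exp(-t / S), where the stability S grows each time a memory is reinforced. A sketch with illustrative constants, not Engram's actual decay parameters:&lt;/p&gt;

```typescript
// Ebbinghaus-style retention: R = exp(-t / S).
// t = days since last access, S = stability (grows with each recall).
function retention(daysSinceAccess: number, stability: number): number {
  return Math.exp(-daysSinceAccess / stability);
}

// A memory recalled often earns a larger stability, so it fades more slowly.
const fresh = retention(7, 30); // a week untouched, but well reinforced
const stale = retention(30, 5); // a month untouched, rarely reinforced
```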

&lt;p&gt;&lt;strong&gt;Contradiction detection&lt;/strong&gt; — When you store "we use PostgreSQL" but there's already a memory saying "we use MongoDB," Engram detects it and offers resolution strategies (keep newest, keep most important, keep both, etc.).&lt;/p&gt;
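
&lt;p&gt;Those resolution strategies can be sketched as a small dispatch. Field names are illustrative, not Engram's schema:&lt;/p&gt;

```typescript
// Sketch of the "keep newest" / "keep most important" / "keep both"
// strategies described above; field names are illustrative.
interface Fact { content: string; storedAt: number; importance: number; }

type Strategy = "keep_newest" | "keep_most_important" | "keep_both";

function resolve(a: Fact, b: Fact, strategy: Strategy): Fact[] {
  if (strategy === "keep_newest") {
    return a.storedAt > b.storedAt ? [a] : [b];
  }
  if (strategy === "keep_most_important") {
    return a.importance > b.importance ? [a] : [b];
  }
  return [a, b]; // keep_both: retain the contradiction for the user to settle
}
```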

&lt;p&gt;&lt;strong&gt;Knowledge graph&lt;/strong&gt; — Memories aren't just vectors. They form a graph with typed edges (is_a, causes, contradicts, relates_to, etc.). Recall traverses this graph to find context pure vector search would miss.&lt;/p&gt;
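
&lt;p&gt;A single-pass, one-hop version of that graph expansion might look like this. The edge names follow the post; the traversal itself is an illustrative sketch, not Engram's recall code:&lt;/p&gt;

```typescript
// Typed edges as named in the post; the expansion below is illustrative.
type EdgeType = "is_a" | "causes" | "contradicts" | "relates_to";

interface Edge { from: string; to: string; type: EdgeType; }

// Given memories found by vector search, pull in their graph neighbors,
// surfacing context that pure similarity search would miss.
function expand(seeds: string[], edges: Edge[]): string[] {
  const result = new Set(seeds);
  for (const e of edges) {
    if (result.has(e.from)) result.add(e.to);
  }
  return Array.from(result);
}
```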

&lt;p&gt;&lt;strong&gt;Plugin system&lt;/strong&gt; — 6 lifecycle hooks (onStore, onRecall, onForget, onDecay, onStartup, onShutdown). Build extensions without touching core code.&lt;/p&gt;
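
&lt;p&gt;Those six hooks suggest a plugin surface along these lines. Only the hook names come from Engram; the interface shape is an assumption:&lt;/p&gt;

```typescript
// Plugin surface built from the six lifecycle hooks named above.
// The interface shape is an illustrative assumption, not Engram's actual API.
interface EngramPlugin {
  name: string;
  onStore?: (content: string) => void;
  onRecall?: (query: string) => void;
  onForget?: (memoryId: string) => void;
  onDecay?: () => void;
  onStartup?: () => void;
  onShutdown?: () => void;
}

// Example: a plugin that records every store operation.
const seen: string[] = [];
const auditLog: EngramPlugin = {
  name: "audit-log",
  onStore: (content) => { seen.push(content); },
};
```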




&lt;h2&gt;Quick start&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Install the CLI&lt;/span&gt;
npm i &lt;span class="nt"&gt;-g&lt;/span&gt; @engram-ai-memory/cli

&lt;span class="c"&gt;# Store a memory&lt;/span&gt;
engram store &lt;span class="s2"&gt;"User prefers TypeScript"&lt;/span&gt; &lt;span class="nt"&gt;--type&lt;/span&gt; semantic

&lt;span class="c"&gt;# Ask the brain&lt;/span&gt;
engram recall &lt;span class="s2"&gt;"What language does the user prefer?"&lt;/span&gt; &lt;span class="nt"&gt;--raw&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;Claude Code integration&lt;/h2&gt;

&lt;p&gt;Add to &lt;code&gt;~/.claude/settings.json&lt;/code&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"mcpServers"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"engram"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"command"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"node"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"args"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"/path/to/engram/packages/mcp/dist/server.js"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"env"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"ENGRAM_DB_PATH"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"/path/to/engram.db"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;18 tools appear automatically — &lt;code&gt;store_memory&lt;/code&gt;, &lt;code&gt;recall_context&lt;/code&gt;, &lt;code&gt;check_contradictions&lt;/code&gt;, &lt;code&gt;decay_sweep&lt;/code&gt;, and more.&lt;/p&gt;




&lt;h2&gt;The 3D dashboard&lt;/h2&gt;

&lt;p&gt;Engram ships with a React Three Fiber visualization dashboard with 5 view modes.&lt;/p&gt;

&lt;p&gt;Memories appear as glowing neurons, connections as edges, contradictions in orange. You can store, delete, tag, search, and resolve contradictions directly from the UI.&lt;/p&gt;




&lt;h2&gt;Architecture&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;@engram-ai-memory/core&lt;/code&gt; — The brain engine (NeuralBrain, embeddings, graph, retrieval)
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;@engram-ai-memory/mcp&lt;/code&gt; — 18 MCP tools for Claude Code
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;@engram-ai-memory/cli&lt;/code&gt; — Terminal interface
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;@engram-ai-memory/vis&lt;/code&gt; — 3D visualization helpers
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;SQLite by default (zero config), PostgreSQL optional for teams.&lt;/p&gt;




&lt;h2&gt;Performance&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Metric&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Recall latency (100 memories)&lt;/td&gt;
&lt;td&gt;~18ms&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Store throughput&lt;/td&gt;
&lt;td&gt;~120 mem/s&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Embedding&lt;/td&gt;
&lt;td&gt;~8ms/text (local ONNX)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Cached startup (1k memories)&lt;/td&gt;
&lt;td&gt;~45ms&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;Links&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;GitHub: &lt;a href="https://github.com/ayvazyan10/engram" rel="noopener noreferrer"&gt;https://github.com/ayvazyan10/engram&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Docs + playground: &lt;a href="https://engram.am" rel="noopener noreferrer"&gt;https://engram.am&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;npm: &lt;a href="https://www.npmjs.com/org/engram-ai-memory" rel="noopener noreferrer"&gt;https://www.npmjs.com/org/engram-ai-memory&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;MIT licensed. Star it if you find it useful.&lt;br&gt;&lt;br&gt;
PRs welcome — especially new adapters for LM Studio, llama.cpp, and other AI tools.&lt;/p&gt;

</description>
      <category>opensource</category>
      <category>openclaw</category>
      <category>claudecode</category>
      <category>mcp</category>
    </item>
  </channel>
</rss>
