<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: AATEL</title>
    <description>The latest articles on DEV Community by AATEL (@mirko_perrone_9dbd3752227).</description>
    <link>https://dev.to/mirko_perrone_9dbd3752227</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3815650%2Fdec5f2df-f0c3-469b-838f-d1adafb1f714.jpeg</url>
      <title>DEV Community: AATEL</title>
      <link>https://dev.to/mirko_perrone_9dbd3752227</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mirko_perrone_9dbd3752227"/>
    <language>en</language>
    <item>
      <title>Writing code is easy now. Making it matter is not.</title>
      <dc:creator>AATEL</dc:creator>
      <pubDate>Sat, 11 Apr 2026 06:12:57 +0000</pubDate>
      <link>https://dev.to/mirko_perrone_9dbd3752227/--473d</link>
      <guid>https://dev.to/mirko_perrone_9dbd3752227/--473d</guid>
      <description>&lt;p&gt;&lt;em&gt;Vibe coding&lt;/em&gt; hasn’t flattened the playing field.&lt;br&gt;
It has simply replaced the technical hierarchy with a new one — built on celebrity, hype, and amplification capital.&lt;/p&gt;

&lt;p&gt;🔗 &lt;a href="https://medium.com/@aatel.license/ai-has-democratized-coding-just-not-for-everyone-555d2767fce8" rel="noopener noreferrer"&gt;https://medium.com/@aatel.license/ai-has-democratized-coding-just-not-for-everyone-555d2767fce8&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Paraphrasing Orwell in &lt;em&gt;Animal Farm&lt;/em&gt;:&lt;br&gt;
&lt;em&gt;Artificial intelligence has democratized coding for everyone, but for some it has democratized it more than for others.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;#AI #Coding #Programming #Tech #Innovation #DigitalTransformation #FutureOfWork #Developer #SoftwareDevelopment #MachineLearning #Automation #TechTrends #AIRevolution #Productivity #VibeCoding #BuildInPublic #ArtificialIntelligence #Anthropic&lt;/p&gt;

</description>
    </item>
    <item>
      <title>AI Has Democratized Coding, Just Not for Everyone</title>
      <dc:creator>AATEL</dc:creator>
      <pubDate>Fri, 10 Apr 2026 12:09:09 +0000</pubDate>
      <link>https://dev.to/mirko_perrone_9dbd3752227/httpsmediumcomaatellicenseai-has-democratized-coding-just-not-for-everyone-555d2767fce8-10pn</link>
      <guid>https://dev.to/mirko_perrone_9dbd3752227/httpsmediumcomaatellicenseai-has-democratized-coding-just-not-for-everyone-555d2767fce8-10pn</guid>
      <description>&lt;p&gt;🔗 &lt;a href="https://medium.com/@aatel.license/ai-has-democratized-coding-just-not-for-everyone-555d2767fce8" rel="noopener noreferrer"&gt;Read the full article on medium.com&lt;/a&gt;&lt;/p&gt;


</description>
    </item>
    <item>
      <title>I Built a Database That Works Like Human Memory — No SQLite, No ORM, Zero External Dependencies</title>
      <dc:creator>AATEL</dc:creator>
      <pubDate>Thu, 19 Mar 2026 16:25:27 +0000</pubDate>
      <link>https://dev.to/mirko_perrone_9dbd3752227/i-built-a-database-that-works-like-human-memory-no-sqlite-no-orm-zero-external-dependencies-14e8</link>
      <guid>https://dev.to/mirko_perrone_9dbd3752227/i-built-a-database-that-works-like-human-memory-no-sqlite-no-orm-zero-external-dependencies-14e8</guid>
      <description>&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;## TL;DR

- Append-only binary database engine written from scratch in Python
- No SQLite, no ORM, no external dependencies
- Each memory has a concept + one of 15 emotions + optional media
- LLM cognitive layer: perceive (extract structure from raw text), ask (RAG over your memories), reflect (emotional arc), dream (free association), introspect (psychological portrait)
- Provider-agnostic: any LLM via .env file, including LM Studio/Ollama locally
- Filesystem-aware media storage: hard links on ext4/NFS/NTFS, reflinks on btrfs/APFS, atomic copy on FAT32
- AATEL license, Python 3.12+
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;There's a question I couldn't stop thinking about: what if a database refused to let you update or delete anything?&lt;/p&gt;

&lt;p&gt;Not as a technical limitation. Not even as a reliability feature, like event sourcing. As a &lt;em&gt;semantic&lt;/em&gt; choice — because some data should be immutable by nature.&lt;/p&gt;

&lt;p&gt;Human memory is the obvious model. You can't &lt;code&gt;UPDATE&lt;/code&gt; what you experienced. You can't &lt;code&gt;DELETE&lt;/code&gt; a memory. Every experience accumulates, layered over time, each one carrying its own emotional weight. The same concept — "Debt", "Family", "Work" — means something completely different at 25 versus at 45. That difference, that arc through time, is the data. It's not something to normalize away.&lt;/p&gt;

&lt;p&gt;So I built &lt;strong&gt;MNHEME&lt;/strong&gt;: a database engine that enforces this constraint at the lowest level, with an LLM layer on top that understands memory the way we actually think about it.&lt;/p&gt;

&lt;p&gt;Here's how it works.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Constraint: True Immutability
&lt;/h2&gt;

&lt;p&gt;Most databases that claim "immutability" actually mean "audit log plus current state." You can still mutate the current state — you just record the history.&lt;/p&gt;

&lt;p&gt;MNHEME is different. There is no current state to mutate. There's only the log.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;Memory&lt;/code&gt; dataclass is &lt;code&gt;frozen=True&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="nd"&gt;@dataclass&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;frozen&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;Memory&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;memory_id&lt;/span&gt;  &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;
    &lt;span class="n"&gt;concept&lt;/span&gt;    &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;      &lt;span class="c1"&gt;# "Debt", "Family", "Travel"
&lt;/span&gt;    &lt;span class="n"&gt;feeling&lt;/span&gt;    &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;      &lt;span class="c1"&gt;# one of 15 defined emotions
&lt;/span&gt;    &lt;span class="n"&gt;media_type&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;      &lt;span class="c1"&gt;# TEXT, IMAGE, VIDEO, AUDIO, DOC
&lt;/span&gt;    &lt;span class="n"&gt;content&lt;/span&gt;    &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;
    &lt;span class="n"&gt;note&lt;/span&gt;       &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;
    &lt;span class="n"&gt;tags&lt;/span&gt;       &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;tuple&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;...]&lt;/span&gt;
    &lt;span class="n"&gt;timestamp&lt;/span&gt;  &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;
    &lt;span class="n"&gt;checksum&lt;/span&gt;   &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;      &lt;span class="c1"&gt;# SHA-256 of content
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;frozen=True&lt;/code&gt; means Python will raise &lt;code&gt;FrozenInstanceError&lt;/code&gt; if you try to modify any field after creation. And the &lt;code&gt;MemoryDB&lt;/code&gt; class has no &lt;code&gt;update()&lt;/code&gt; method, no &lt;code&gt;delete()&lt;/code&gt; method — not just "not implemented," literally absent from the codebase.&lt;/p&gt;
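&lt;p&gt;A minimal, self-contained sketch of that guarantee (not MNHEME's actual class; the field names here are illustrative):&lt;/p&gt;

```python
from dataclasses import dataclass, FrozenInstanceError

@dataclass(frozen=True)
class Record:
    concept: str
    feeling: str

r = Record("Debt", "fear")
try:
    r.feeling = "calm"      # any assignment after creation raises
except FrozenInstanceError:
    pass                    # the instance is untouched
```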




&lt;h2&gt;
  
  
  The Storage Engine: Binary Log From Scratch
&lt;/h2&gt;

&lt;p&gt;I didn't want SQLite because SQLite is fundamentally a mutable store. I wanted to build the append-only constraint into the file format itself.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;.mnheme&lt;/code&gt; file is a binary log of records:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;┌──────────────┬──────────┬───────────────────┐
│  MAGIC (4B)  │ SIZE (4B)│  PAYLOAD (N bytes)│
└──────────────┴──────────┴───────────────────┘

MAGIC   = [0x4D, 0x4E, 0x45, 0xE0]  — record signature
SIZE    = uint32 big-endian          — payload length
PAYLOAD = JSON UTF-8                 — the memory data
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Every write:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Serializes the record to JSON&lt;/li&gt;
&lt;li&gt;Prepends the MAGIC + SIZE header&lt;/li&gt;
&lt;li&gt;Appends the entire frame in a single &lt;code&gt;write()&lt;/code&gt; call&lt;/li&gt;
&lt;li&gt;Calls &lt;code&gt;os.fsync()&lt;/code&gt; before returning&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This means every write is crash-safe. If the process dies mid-write, the record is truncated — detectable by the missing MAGIC bytes on the next record. Truncated records are silently skipped on startup.&lt;/p&gt;
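&lt;p&gt;The frame format and the recovery rule can be sketched in a few lines (a simplified stand-in for the real engine; the function names are mine):&lt;/p&gt;

```python
import io
import json
import struct

MAGIC = bytes([0x4D, 0x4E, 0x45, 0xE0])   # record signature

def append_record(f, payload: dict) -> int:
    """Append one MAGIC + SIZE + JSON frame; return its byte offset."""
    data = json.dumps(payload).encode("utf-8")
    offset = f.seek(0, io.SEEK_END)
    # header and payload go out in a single write() call
    f.write(MAGIC + struct.pack(">I", len(data)) + data)
    f.flush()
    # on a real file object: os.fsync(f.fileno()) before returning
    return offset

def scan_records(f):
    """Yield (offset, payload); stop at a truncated or corrupt tail."""
    f.seek(0)
    while True:
        offset = f.tell()
        header = f.read(8)
        if len(header) != 8 or header[:4] != MAGIC:
            break  # partial frame left by a crash: silently skipped
        (size,) = struct.unpack(">I", header[4:])
        data = f.read(size)
        if len(data) != size:
            break
        yield offset, json.loads(data)
```

&lt;p&gt;Because the header travels in the same &lt;code&gt;write()&lt;/code&gt; call as the payload, a reader only ever sees whole frames or a detectable partial tail.&lt;/p&gt;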

&lt;p&gt;&lt;strong&gt;Indexes in RAM&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;On startup, the file is scanned once. For each record, we store its byte offset in several dictionaries:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;concept_index&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Debt&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;offset1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;offset2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;...]&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="n"&gt;feeling_index&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;fear&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;offset1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;offset3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;...]&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="n"&gt;tag_index&lt;/span&gt;     &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;bank&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;offset1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;...]&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When you call &lt;code&gt;recall("Debt")&lt;/code&gt;, the index returns the offsets, and &lt;code&gt;read_at(offset)&lt;/code&gt; seeks directly to each record — reading only the bytes you need. &lt;code&gt;count()&lt;/code&gt; never touches the file at all.&lt;/p&gt;

&lt;p&gt;The result: &lt;code&gt;count()&lt;/code&gt; runs at 2.7 million ops/second from RAM. &lt;code&gt;recall(concept, limit=10)&lt;/code&gt; reads exactly 10 records, taking about 1.5ms.&lt;/p&gt;
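&lt;p&gt;The index structures behind those numbers are plain dictionaries rebuilt from the startup scan; a toy version (a hypothetical helper, not the real class):&lt;/p&gt;

```python
from collections import defaultdict

class IndexSet:
    """RAM-only indexes rebuilt from one startup scan (illustrative)."""

    def __init__(self):
        self.concept_index = defaultdict(list)
        self.feeling_index = defaultdict(list)
        self.total = 0

    def add(self, offset, payload):
        # store only byte offsets, never the records themselves
        self.concept_index[payload["concept"]].append(offset)
        self.feeling_index[payload["feeling"]].append(offset)
        self.total += 1

    def count(self):
        return self.total  # never touches the file

    def feeling_distribution(self):
        return {k: len(v) for k, v in self.feeling_index.items()}
```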




&lt;h2&gt;
  
  
  The Filesystem Layer: Inode-Aware Media Storage
&lt;/h2&gt;

&lt;p&gt;For attachments (images, audio, video, documents), I wanted deduplication without copying files.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;FsProbe&lt;/code&gt; class identifies the filesystem and probes its actual capabilities at boot — not by trusting the filesystem name, but by &lt;em&gt;actually trying&lt;/em&gt; &lt;code&gt;os.link()&lt;/code&gt;, &lt;code&gt;ioctl(FICLONE)&lt;/code&gt;, and &lt;code&gt;os.symlink()&lt;/code&gt; in the target directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;probe&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;FsProbe&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;/data/mnheme_files&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;caps&lt;/span&gt;  &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;probe&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;detect&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="c1"&gt;# caps.can_hardlink → True (verified by actually creating a hard link)
# caps.can_reflink  → False (ioctl FICLONE returned EOPNOTSUPP)
# caps.strategy     → LinkStrategy.HARDLINK
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The strategy chosen:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Filesystem&lt;/th&gt;
&lt;th&gt;Strategy&lt;/th&gt;
&lt;th&gt;Bytes written&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;ext4, ZFS, NFS&lt;/td&gt;
&lt;td&gt;Hard link&lt;/td&gt;
&lt;td&gt;0 (same inode)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;btrfs, xfs+reflink&lt;/td&gt;
&lt;td&gt;Reflink (CoW)&lt;/td&gt;
&lt;td&gt;0 (shared blocks)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;NTFS&lt;/td&gt;
&lt;td&gt;Hard link&lt;/td&gt;
&lt;td&gt;0 (same inode)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;FAT32, HDFS&lt;/td&gt;
&lt;td&gt;Atomic copy&lt;/td&gt;
&lt;td&gt;full file size&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;For deduplication: the pool is content-addressed by SHA-256. The same image attached to 100 different memories = one physical file, 100 hard links, one inode. &lt;code&gt;nlink&lt;/code&gt; counter shows exactly how many memories reference it.&lt;/p&gt;




&lt;h2&gt;
  
  
  The LLM Layer: A Brain for the Database
&lt;/h2&gt;

&lt;p&gt;This is where it gets interesting. The LLM isn't the primary interface — it's a semantic processing layer that understands memory as humans experience it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;perceive()&lt;/code&gt;&lt;/strong&gt; — raw input to structured memory&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;r&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;brain&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;perceive&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;I opened the letter from the bank. My hands were shaking.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# The LLM extracted:
&lt;/span&gt;&lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;extracted_concept&lt;/span&gt;  &lt;span class="c1"&gt;# "Debt"
&lt;/span&gt;&lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;extracted_feeling&lt;/span&gt;  &lt;span class="c1"&gt;# "fear"
&lt;/span&gt;&lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;extracted_tags&lt;/span&gt;     &lt;span class="c1"&gt;# ["bank", "body", "anxiety"]
&lt;/span&gt;&lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;enriched_content&lt;/span&gt;   &lt;span class="c1"&gt;# psychologically enriched version of the text
&lt;/span&gt;
&lt;span class="c1"&gt;# The Memory is already saved in MemoryDB — immutable.
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;code&gt;ask()&lt;/code&gt;&lt;/strong&gt; — RAG over personal memory&lt;/p&gt;

&lt;p&gt;The LLM first extracts keywords and concepts from the question, retrieves relevant memories from the database, then answers using &lt;em&gt;only those memories&lt;/em&gt; as context. If the memories don't contain the answer, it says so.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;ans&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;brain&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ask&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;How do I feel about money?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="c1"&gt;# Searches memories tagged Debt, Finance, etc.
# Answers from what's actually stored, not from training data
&lt;/span&gt;&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ans&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;confidence_note&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;  &lt;span class="c1"&gt;# "Certainty: high — direct evidence from memories"
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;code&gt;reflect()&lt;/code&gt;&lt;/strong&gt; — emotional arc analysis&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;ref&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;brain&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;reflect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Debt&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="c1"&gt;# Feeds all "Debt" memories in chronological order to the LLM
# Gets back an analysis of the emotional journey
&lt;/span&gt;&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ref&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;arc&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;  &lt;span class="c1"&gt;# "from visceral dread to earned serenity"
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;code&gt;dream()&lt;/code&gt;&lt;/strong&gt; — free association across distant memories&lt;/p&gt;

&lt;p&gt;Samples memories from different emotional states, asks the LLM to find unexpected connections. Loosely inspired by memory consolidation during sleep.&lt;/p&gt;
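&lt;p&gt;The sampling step behind &lt;code&gt;dream()&lt;/code&gt; might look like this (a hypothetical sketch; the post doesn't show the real selection logic):&lt;/p&gt;

```python
import random

def sample_for_dream(feeling_index, read_at, per_feeling=2, seed=None):
    """Pull a few memories from each emotional state so the LLM can
    free-associate across distant contexts (illustrative helper)."""
    rng = random.Random(seed)
    picks = []
    for feeling, offsets in feeling_index.items():
        k = min(per_feeling, len(offsets))
        picks.extend(read_at(off) for off in rng.sample(offsets, k))
    rng.shuffle(picks)   # break the per-feeling grouping
    return picks
```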

&lt;p&gt;&lt;strong&gt;&lt;code&gt;introspect()&lt;/code&gt;&lt;/strong&gt; — psychological portrait&lt;/p&gt;

&lt;p&gt;Feeds the full distribution of concepts and feelings, plus recent memories, and asks for a psychological portrait: dominant patterns, unresolved tensions, emotional resources.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Provider System: Truly Vendor-Agnostic
&lt;/h2&gt;

&lt;p&gt;I wanted the LLM layer to work with any provider without changing code. The solution: a &lt;code&gt;.env&lt;/code&gt; file and pure &lt;code&gt;urllib&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Local — no API key&lt;/span&gt;
&lt;span class="nv"&gt;LM_STUDIO_URL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:1234/v1/chat/completions
&lt;span class="nv"&gt;LM_STUDIO_MODEL&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;local-model
&lt;span class="nv"&gt;LM_STUDIO_RPM&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;60

&lt;span class="c"&gt;# Cloud&lt;/span&gt;
&lt;span class="nv"&gt;GROQ_API_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;gsk_...
&lt;span class="nv"&gt;ANTHROPIC_API_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;sk-ant-...

&lt;span class="nv"&gt;USE_MULTI_PROVIDER&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nb"&gt;true&lt;/span&gt;   &lt;span class="c"&gt;# cascade fallback if primary fails&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Any variable ending in &lt;code&gt;_URL&lt;/code&gt; + &lt;code&gt;_MODEL&lt;/code&gt; activates a provider. Anthropic is the only special case — detected by URL pattern, uses the native Anthropic format. Everything else is OpenAI-compatible.&lt;/p&gt;
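&lt;p&gt;That discovery convention can be sketched as a pure function over the environment (hypothetical; the real loader may differ in details):&lt;/p&gt;

```python
def discover_providers(env: dict) -> list:
    """Any FOO_URL + FOO_MODEL pair in the environment becomes a provider."""
    providers = []
    for key, url in env.items():
        if key.endswith("_URL"):
            name = key[: -len("_URL")]
            model = env.get(name + "_MODEL")
            if model:
                providers.append({
                    "name": name,
                    "url": url,
                    "model": model,
                    "rpm": int(env.get(name + "_RPM", "60")),
                    # Anthropic is detected by URL pattern, not by a flag
                    "native_anthropic": "anthropic.com" in url,
                })
    return providers
```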

&lt;p&gt;Rate limiting is per-provider (token bucket). Retry uses exponential backoff on 429 and 5xx. With &lt;code&gt;USE_MULTI_PROVIDER=true&lt;/code&gt;, if one provider fails, the next in priority order is tried automatically.&lt;/p&gt;
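&lt;p&gt;A token bucket of the kind described is small enough to show in full (a generic sketch, not the project's implementation):&lt;/p&gt;

```python
import time

class TokenBucket:
    """Per-provider rate limiter: at most `rpm` requests per minute."""

    def __init__(self, rpm: int):
        self.capacity = rpm
        self.tokens = float(rpm)
        self.rate = rpm / 60.0          # tokens refilled per second
        self.last = time.monotonic()

    def try_acquire(self) -> bool:
        now = time.monotonic()
        elapsed = now - self.last
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False                    # caller backs off or falls through
```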

&lt;p&gt;No SDK. No &lt;code&gt;pip install anthropic&lt;/code&gt;. Just HTTP.&lt;/p&gt;




&lt;h2&gt;
  
  
  Benchmark Results
&lt;/h2&gt;

&lt;p&gt;2,000 records, Python 3.12, 9p filesystem:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;remember() with fsync:      1.8ms    552 ops/s
remember() without fsync:   0.2ms  4,632 ops/s   (8.4× faster)
count() — pure RAM:         ~0ms   2,774,322 ops/s
feeling_distribution():     0.003ms  277,865 ops/s
recall(concept, limit=10):  1.5ms    636 ops/s
search() full-text:         40ms      24 ops/s    (~49k records/s)
search(limit=5):            0.1ms   8,348 ops/s   (stops at 5th match)
Cold start (2k records):    40ms     —            (49k rec/s indexed)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;File size: ~374 bytes/record → ~36MB for 100k records, ~357MB for 1M.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Deeper Question
&lt;/h2&gt;

&lt;p&gt;Is "append-only as a semantic constraint" useful beyond memory systems?&lt;/p&gt;

&lt;p&gt;Most phenomena we model are actually immutable events that we artificially collapse into mutable state. A bank transaction doesn't change — we just keep running totals. A sensor reading doesn't update — we just display the latest one. User behavior doesn't mutate — we summarize it.&lt;/p&gt;
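&lt;p&gt;In code, the shift is from storing a balance to deriving it (a toy illustration):&lt;/p&gt;

```python
# Immutable event log: past deposits and withdrawals never change.
events = [("deposit", 100), ("withdraw", 30), ("deposit", 50)]

def balance(log):
    """The 'current state' is just a fold over the append-only log."""
    total = 0
    for kind, amount in log:
        total += amount if kind == "deposit" else -amount
    return total

# balance(events) == 120; appending more events never rewrites history
```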

&lt;p&gt;I wonder how many data models would be simpler if they started append-only and added mutability only where genuinely needed, rather than starting with full mutability and then trying to add audit trails, history, and immutability as afterthoughts.&lt;/p&gt;

&lt;p&gt;MNHEME is one data point in that experiment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;GitHub:&lt;/strong&gt; &lt;a href="https://github.com/aatel-license/mnheme" rel="noopener noreferrer"&gt;https://github.com/aatel-license/mnheme&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Python 3.12+. AATEL License. Zero external dependencies.&lt;/strong&gt;&lt;/p&gt;




</description>
      <category>ai</category>
      <category>database</category>
      <category>datascience</category>
      <category>llm</category>
    </item>
    <item>
      <title>Why AATEL Was Created: A Pact of Respect for Software</title>
      <dc:creator>AATEL</dc:creator>
      <pubDate>Mon, 09 Mar 2026 23:50:56 +0000</pubDate>
      <link>https://dev.to/mirko_perrone_9dbd3752227/why-aatel-was-created-a-pact-of-respect-for-software-4in6</link>
      <guid>https://dev.to/mirko_perrone_9dbd3752227/why-aatel-was-created-a-pact-of-respect-for-software-4in6</guid>
      <description>&lt;h2&gt;
  
  
  Why AATEL Was Created: A Pact of Respect for Software
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Software is an act of generosity. Protecting it means protecting our creativity."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  The world has changed
&lt;/h2&gt;

&lt;p&gt;When code gets shared online, it is shared with purpose: to help other developers, solve real problems, or spark someone else's creativity. Shared because of a belief in the power of community — in the idea that knowledge grows when it flows freely between people.&lt;/p&gt;

&lt;p&gt;But today, that gift is being taken without asking.&lt;/p&gt;

&lt;p&gt;Code is harvested by automated pipelines to train commercial AI systems, or repurposed for ends that have nothing to do with peace, progress, or the human spirit that created it in the first place.&lt;/p&gt;

&lt;p&gt;MIT. Apache. GPL. None of them were written for this world.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;This is not a punishment. It is a safeguard.&lt;/strong&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  What is AATEL
&lt;/h2&gt;

&lt;p&gt;The AATEL (Anti-AI Training Ethical License) was created to restore balance between openness and respect.&lt;/p&gt;

&lt;p&gt;The goal is not to lock away code. It is to ensure that whoever uses it does so with intention and integrity.&lt;/p&gt;

&lt;p&gt;It is the &lt;strong&gt;first ethical source license&lt;/strong&gt;. Not open-source. Not proprietary. A third way that did not exist before.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Variant&lt;/th&gt;
&lt;th&gt;For&lt;/th&gt;
&lt;th&gt;Base&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;aatel v2.1&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Software / code&lt;/td&gt;
&lt;td&gt;MIT-derived&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;aatel-IC v2.1&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Internet content (articles, docs, datasets)&lt;/td&gt;
&lt;td&gt;Original&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  The three pillars
&lt;/h2&gt;

&lt;h3&gt;
  
  
  🧠 Respect for ingenuity
&lt;/h3&gt;

&lt;p&gt;AI systems may not train on the work without explicit, informed permission.&lt;/p&gt;

&lt;p&gt;Code is a human artifact — the product of thought, time, and creativity. It is not free fuel for algorithms.&lt;/p&gt;

&lt;h3&gt;
  
  
  ☮️ Peace and ethics
&lt;/h3&gt;

&lt;p&gt;The work must never be used to build weapons, military surveillance systems, or any technology designed to harm.&lt;/p&gt;

&lt;p&gt;These are not abstract principles. They are enforceable terms.&lt;/p&gt;

&lt;h3&gt;
  
  
  🤝 Reciprocal sustainability
&lt;/h3&gt;

&lt;p&gt;If a corporation profits from the work, it is only fair that it gives something back.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Revenue &amp;lt; €1M / Non-profit / Individual  →  Free (notification only)
Revenue ≥ €1M (Level A)                  →  Fixed monthly fee
High revenue (Level B)                    →  Fixed fee + % of turnover
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The €1M threshold is fixed. Small projects, indie developers, open communities — always free.&lt;/p&gt;
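&lt;p&gt;The tier logic reduces to a few comparisons (a hypothetical sketch: the Level B threshold below is a placeholder, since the post doesn't pin it down, and the license text itself is authoritative):&lt;/p&gt;

```python
HIGH_REVENUE_EUR = 10_000_000   # placeholder; the actual Level B cutoff
                                # is defined in the license, not here

def aatel_tier(annual_revenue_eur: float, nonprofit: bool = False) -> str:
    """Classify a licensee per the fee schedule quoted above (sketch)."""
    if nonprofit:
        return "free (notification only)"
    if annual_revenue_eur >= HIGH_REVENUE_EUR:
        return "Level B: fixed fee + % of turnover"
    if annual_revenue_eur >= 1_000_000:
        return "Level A: fixed monthly fee"
    return "free (notification only)"
```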




&lt;h2&gt;
  
  
  The pay-per-response mechanism (AATEL-IC)
&lt;/h2&gt;

&lt;p&gt;AI search engines and answer engines index content once, then use it to generate responses millions of times — with no traffic ever returning to the source. The old web model is broken.&lt;/p&gt;

&lt;p&gt;The Golden Rule:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;1 indexing × 1,000,000 responses = payment for 1,000,000 uses — not one access.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Built-in anti-loophole provisions cover missing tracking data, absent &lt;code&gt;robots.txt&lt;/code&gt;, and fragmented corporate structures attempting to evade liability.&lt;/p&gt;




&lt;h2&gt;
  
  
  How to adopt it
&lt;/h2&gt;

&lt;p&gt;Three steps:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# 1. Download the template&lt;/span&gt;
curl &lt;span class="nt"&gt;-O&lt;/span&gt; https://raw.githubusercontent.com/aatel-license/aatel-license.github.io/main/aatel_v2_1_template_IT.txt

&lt;span class="c"&gt;# 2. Fill in your details&lt;/span&gt;
&lt;span class="c"&gt;# {{YEAR}}, {{NAME}}, {{EMAIL}}, {{CITY}}, {{AMOUNT}} ...&lt;/span&gt;

&lt;span class="c"&gt;# 3. Add to your repo&lt;/span&gt;
&lt;span class="nb"&gt;mv &lt;/span&gt;aatel_v2_1_template_IT.txt LICENSE
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then share it. The more developers who adopt AATEL, the harder it becomes for large operators to ignore independent creators.&lt;/p&gt;




&lt;h2&gt;
  
  
  A final word
&lt;/h2&gt;

&lt;p&gt;Open source was never meant to mean unconditional. It was built on trust — the trust that what we give will be used to build something better, not to erase us from the equation.&lt;/p&gt;

&lt;p&gt;AATEL is a small act of reclaiming that trust.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;PS: AATEL is not against AI per se.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;🔗 GitHub: &lt;a href="https://github.com/aatel-license/aatel-license.github.io" rel="noopener noreferrer"&gt;https://github.com/aatel-license/aatel-license.github.io&lt;/a&gt;&lt;br&gt;&lt;br&gt;
🌐 Site: &lt;a href="https://aatel.org" rel="noopener noreferrer"&gt;https://aatel.org&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>programming</category>
      <category>opensource</category>
      <category>discuss</category>
    </item>
  </channel>
</rss>
