<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: joygram</title>
    <description>The latest articles on DEV Community by joygram (@joygram).</description>
    <link>https://dev.to/joygram</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3850334%2F9674ebfd-1ce9-4c52-9085-0dc427f46cb6.jpg</url>
      <title>DEV Community: joygram</title>
      <link>https://dev.to/joygram</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/joygram"/>
    <language>en</language>
    <item>
      <title>AI Lost in the Fragmented IDL Pile: Building a Single Context Hub with 'Unified AST'</title>
      <dc:creator>joygram</dc:creator>
      <pubDate>Wed, 01 Apr 2026 05:14:15 +0000</pubDate>
      <link>https://dev.to/joygram/ai-lost-in-the-fragmented-idl-pile-building-a-single-context-hub-with-unified-ast-36dg</link>
      <guid>https://dev.to/joygram/ai-lost-in-the-fragmented-idl-pile-building-a-single-context-hub-with-unified-ast-36dg</guid>
      <description>&lt;p&gt;In the previous article, we discussed the importance of semantic modeling to specify the identity of data and prevent hallucinations in AI agents (Cursor, Windsurf, etc.) or MCP protocols. However, when we try to apply this in practice, we hit the largest and coldest wall of reality: &lt;strong&gt;Fragmented Legacy&lt;/strong&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"The backend team uses Thrift, while the client uses old code in Protobuf. Plus, design data is scattered in Excel. When we ask an AI agent to write the entire system code, it mixes these three and makes a mess. Do we have to throw everything away and rewrite with a new standard?"&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Semantic modeling design started from this desperate question. Instead of tearing everything down and rebuilding, a different architecture was adopted: the &lt;strong&gt;Unified AST (Abstract Syntax Tree) Hub&lt;/strong&gt;, where all these fragmented schemas converge into one large knowledge graph.&lt;/p&gt;

&lt;h2&gt;1. The Tragedy of Migration: AI Cannot Bridge Fragmented Contexts&lt;/h2&gt;

&lt;p&gt;Many development teams attempt to migrate with the thought, "Let's move to a better IDL standard all at once," but fail. The reason isn't a lack of human technical skill, but the &lt;strong&gt;impossibility of simultaneous transition&lt;/strong&gt; across systems.&lt;/p&gt;

&lt;p&gt;AI agents also get lost in this swamp of fragmentation.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Disconnected Metadata&lt;/strong&gt;: When modifying a server packet, the AI cannot simultaneously attend to the validation rules of the Excel data associated with that packet.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Failure in Context Switching&lt;/strong&gt;: When the AI generates a Protobuf structure after reading Thrift, it misses subtle differences in field semantics (Optional vs. Required), creating fatal bugs.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;What the AI needs isn't just access to the code. It needs a &lt;strong&gt;Single Context Hub&lt;/strong&gt; that proves all these scattered pieces of data actually point to the same business logic.&lt;/p&gt;

&lt;h2&gt;2. The Solution: A Semantic Dialect Engine Based on 'Unified AST'&lt;/h2&gt;

&lt;p&gt;To solve this fragmentation, a compiler was designed to read different IDLs (Protobuf, Thrift, etc.) with independent parsers and internally merge these disparate languages into a common &lt;strong&gt;Unified AST&lt;/strong&gt;.&lt;/p&gt;
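&lt;p&gt;The merge step can be sketched roughly as follows. This is a minimal Python illustration, not DeukPack's actual implementation; the names (&lt;code&gt;UnifiedAST&lt;/code&gt;, &lt;code&gt;FieldNode&lt;/code&gt;, and the shape of the parser output) are all hypothetical.&lt;/p&gt;

```python
# A minimal sketch (all names hypothetical, not DeukPack's actual API) of
# merging the output of independent dialect parsers into one shared tree.
from dataclasses import dataclass, field

@dataclass
class FieldNode:
    name: str
    type_name: str
    origin: str                                 # which dialect it came from
    attrs: dict = field(default_factory=dict)   # dialect-specific metadata

@dataclass
class TypeNode:
    name: str
    fields: list

class UnifiedAST:
    def __init__(self):
        self.types = {}   # maps a qualified name to its TypeNode

    def merge(self, dialect, parsed_types):
        """Fold one dialect parser's output into the shared tree."""
        for type_name, raw_fields in parsed_types.items():
            qualified = f"{dialect}::{type_name}"
            node = TypeNode(qualified, [])
            for raw in raw_fields:
                node.fields.append(FieldNode(
                    raw["name"], raw["type"], dialect, raw.get("attrs", {})))
            self.types[qualified] = node

hub = UnifiedAST()
hub.merge("protobuf", {"Item": [
    {"name": "item_id", "type": "int64", "attrs": {"field_number": 1}}]})
hub.merge("thrift", {"Player": [
    {"name": "name", "type": "string", "attrs": {"optional": True}}]})
# Both dialects now live side by side in one queryable tree.
```

&lt;p&gt;Each parser stays independent; only the merge step knows about the shared tree, which is what lets new dialects be added without touching the existing ones.&lt;/p&gt;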

&lt;h3&gt;① Non-Destructive Hybrid Co-existence&lt;/h3&gt;

&lt;p&gt;There's no need to abandon legacy code for the AI era. The semantic engine encapsulates existing &lt;code&gt;.proto&lt;/code&gt; or &lt;code&gt;.thrift&lt;/code&gt; files as safe External Dialects.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// ai_context.deuk
// 1. Import existing Protobuf files as-is (absolutely NO modifications)
include "legacy/item_base.proto"

// 2. Wrap legacy types in a 'meta-cognitive intent' shared by humans and AI
table&amp;lt;item_base::Item&amp;gt; = { key: "item_id" }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Thanks to this design, a team can keep its existing infrastructure running with virtually no disruption while instantly providing the AI agent with a firm context: "&lt;strong&gt;The &lt;code&gt;Item&lt;/code&gt; message in Protobuf is actually the baseline design information (Table) coming from Excel.&lt;/strong&gt;"&lt;/p&gt;

&lt;h3&gt;② Context Resolver&lt;/h3&gt;

&lt;p&gt;This is more than simple text parsing. The context resolver lifts all characteristics of the base language—such as Protobuf's field numbering system or Thrift's Optional attributes—into the Unified AST without losing them. Types promoted to the Unified AST in this way gain a perfectly equal "status."&lt;/p&gt;
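&lt;p&gt;A toy version of this "lifting" in Python (the function name and input shapes are assumptions for illustration): the point is that dialect-specific traits such as Protobuf's field numbers or Thrift's requiredness survive as normalized attributes rather than being discarded.&lt;/p&gt;

```python
# A toy "context resolver". Names and input shapes are assumptions made
# for illustration; the idea is that dialect traits are lifted, not lost.

def lift(dialect, raw):
    """Return a unified field record that keeps every source trait."""
    unified = {
        "name": raw["name"],
        "type": raw["type"],
        "source_dialect": dialect,
        "traits": {},
    }
    if dialect == "protobuf":
        # Protobuf's wire identity is the field number, not the name.
        unified["traits"]["field_number"] = raw["number"]
        unified["traits"]["presence"] = (
            "optional" if raw.get("optional") else "implicit")
    elif dialect == "thrift":
        # Thrift distinguishes optional/required/default requiredness.
        unified["traits"]["field_id"] = raw["id"]
        unified["traits"]["presence"] = raw.get("requiredness", "default")
    return unified

pb = lift("protobuf", {"name": "item_id", "type": "int64", "number": 1})
th = lift("thrift", {"name": "item_id", "type": "i64", "id": 1,
                     "requiredness": "optional"})
# Both records now carry equal status, with their original traits intact.
```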

&lt;h2&gt;3. Architecture Diagram: A Single Source of Truth (SSOT) for AI&lt;/h2&gt;

&lt;p&gt;The Unified AST design passes fragmented formats through a single intelligent hub.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;   [Legacy &amp;amp; New Sources]          [Semantic Hub]             [AI &amp;amp; Output Code]

   legacy.proto (Protobuf)  ----+                            +-- C# (Unity Client)
                                |                            |
   legacy.thrift (Thrift)   ----+--&amp;gt; [ Unified AST ] --------+-- TypeScript (Web)
                                |       (Engine)             |
   new_logic.deuk (Schema)  ----+                            +-- C++ (Server)
                                                             |
                                                             +-- Agent/MCP Rule Sets
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, a single hub pierces through all areas. When a server packet is modified, not only is the client model updated synchronously, but the validation rules for Excel data are also connected. To AI coding assistants (Cursor, Copilot, etc.), what is given as context is not fragmented &lt;code&gt;.proto&lt;/code&gt; files and Excel spreadsheets, but a perfectly refined &lt;strong&gt;Single Source of Truth (SSOT)&lt;/strong&gt;.&lt;/p&gt;
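&lt;p&gt;The fan-out can be illustrated with a toy generator in Python (the target names and output strings are purely illustrative, not DeukPack's real code generators): every downstream artifact is derived from the same node, so none of them can drift independently.&lt;/p&gt;

```python
# A toy fan-out demonstrating the SSOT idea; target names and output
# strings are illustrative, not DeukPack's real code generators.

def emit_all(node):
    """Derive every downstream artifact from one schema node."""
    fields = ", ".join(f["name"] for f in node["fields"])
    return {
        "csharp_client": f"class {node['name']} with fields: {fields}",
        "typescript_web": f"interface {node['name']} with fields: {fields}",
        "cpp_server": f"struct {node['name']} with fields: {fields}",
        "agent_rules": f"{node['name']} is a {node['kind']}; fields: {fields}",
    }

packet = {"name": "PurchaseRequest", "kind": "entity",
          "fields": [{"name": "item_id"}, {"name": "count"}]}
outputs = emit_all(packet)
# Every artifact reflects the same field set; adding a field to the node
# updates the client, web, server, and agent rule outputs in one pass.
```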

&lt;h2&gt;Conclusion: Architecture Reduces AI's Cognitive Load&lt;/h2&gt;

&lt;p&gt;The Unified AST architecture was not born simply to create a "pretty IDL."&lt;br&gt;
It is an extreme engineering response to the question: "&lt;strong&gt;How can we prevent humans and AIs from falling into the swamp of fragmented legacy systems and allow them to control the entire system on a single knowledge graph?&lt;/strong&gt;"&lt;/p&gt;

&lt;p&gt;By unifying system fragmentation and opening a clear path for the machine, human developers can finally focus solely on the essence of the business. That is the greatest value proven by this hybrid architecture.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Continues in the next article:&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
[Why AI Servers Die from OOM: Designing a Zero-Allocation Protocol] explores the technical reality of how data passed through this Unified AST is transmitted over the wire at hyper-speed—without wasting a single byte of memory allocation at runtime.&lt;/p&gt;




&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Project DeukPack&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
This article series is based on the design notes of &lt;strong&gt;DeukPack&lt;/strong&gt;, an open-source infrastructure created to prevent data fragmentation.  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;GitHub&lt;/strong&gt;: &lt;a href="https://github.com/deukpack/DeukPack" rel="noopener noreferrer"&gt;DeukPack OSS&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;

</description>
      <category>architecture</category>
      <category>devlog</category>
      <category>ai</category>
      <category>mcp</category>
    </item>
    <item>
      <title>Designing a Control Plane for AI Agents: The Power of Schemas with 'Intent'</title>
      <dc:creator>joygram</dc:creator>
      <pubDate>Mon, 30 Mar 2026 06:54:41 +0000</pubDate>
      <link>https://dev.to/joygram/from-frustration-to-language-the-birth-of-the-deukpack-idl-6m7</link>
      <guid>https://dev.to/joygram/from-frustration-to-language-the-birth-of-the-deukpack-idl-6m7</guid>
      <description>&lt;p&gt;This article explores an architectural challenge at the threshold of the MCP (Model Context Protocol) era: "How can AI agents grasp and handle our data without misunderstanding its intent?"&lt;/p&gt;

&lt;p&gt;In the past, merely using formats like Protobuf or JSON to exchange data between clients and servers was sufficient. Human developers could read the code and infer the context. However, the entity reading code and suggesting business logic is rapidly shifting to AI agents like GitHub Copilot and Cursor.&lt;/p&gt;

&lt;p&gt;This introduces a critical problem.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"I told the AI agent to write the shop purchase logic (Request: &lt;strong&gt;Use this data model to write the shop purchase logic&lt;/strong&gt;), but the agent indiscriminately mixed client-only UI data with backend-only database tables."&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;1. The Cause of Failure: Data Lacks 'Identity'&lt;/h2&gt;

&lt;p&gt;In most traditional IDL (Interface Definition Language) environments, all objects are treated as equal &lt;code&gt;struct&lt;/code&gt;s or &lt;code&gt;message&lt;/code&gt;s.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Fixed design data managed by designers in Excel (e.g., item specs)&lt;/li&gt;
&lt;li&gt;Transactional data exchanged between server and DB (e.g., current gold balance)&lt;/li&gt;
&lt;li&gt;Temporary rendering data for the client UI&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Under the old paradigm, these three types of data show no visual distinction in code. Human engineers rely on names or documentation to understand the context, but to an AI agent, they are all just "identical byte blobs containing a few fields." Naturally, because their structures are identical, the AI mixes them up and ruins the logic.&lt;/p&gt;
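&lt;p&gt;A quick Python illustration of the problem (the class names are invented for this example): the three roles below are byte-for-byte the same shape, so a purely structural reader has nothing but the name to tell them apart.&lt;/p&gt;

```python
# Three roles, one shape: invented class names, identical structure.
from dataclasses import dataclass, astuple

@dataclass
class ItemSpecRow:    # fixed design data maintained in Excel
    id: int
    value: int

@dataclass
class GoldBalance:    # transactional state between server and DB
    id: int
    value: int

@dataclass
class UiBadge:        # temporary client-side rendering data
    id: int
    value: int

a, b, c = ItemSpecRow(1, 100), GoldBalance(1, 100), UiBadge(1, 100)
same_shape = astuple(a) == astuple(b) == astuple(c)
# same_shape is True: structurally, nothing separates the three roles.
```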

&lt;h2&gt;2. The Solution: Structures Capable of 'Meta-cognition'&lt;/h2&gt;

&lt;p&gt;Traditional keywords like &lt;code&gt;struct&lt;/code&gt; or &lt;code&gt;message&lt;/code&gt; are faithful in defining the technical form of data (arrays, integers, strings), but they fail to capture the intent behind how that data is used within the system.&lt;/p&gt;

&lt;p&gt;Instead of just introducing new keywords, a "meta-cognitive" approach is needed—one where, even if the underlying data structure remains identical, &lt;strong&gt;semantic encapsulation&lt;/strong&gt; minimizes cognitive errors for both humans and AI. For example, redefining a standard &lt;code&gt;struct&lt;/code&gt; as a &lt;code&gt;record&lt;/code&gt; or &lt;code&gt;table&lt;/code&gt; based on its nature implies its "principles of use" at the syntax level.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// record: Technically a struct, but informs humans and AI that it is an 'immutable value state' shared across the system.
record Vector3 {
  1&amp;gt; float x
  2&amp;gt; float y
  3&amp;gt; float z
}

// entity: Specifies that it's not just a data clump, but a 'living actor' with a unique ID flowing through the system.
entity PlayerContext {
  1&amp;gt; string session_id = ""
  2&amp;gt; int32 level = 1
}

// table: Defines 'design data' that serves as an external reference and must not be modified at runtime.
table&amp;lt;ItemSpec&amp;gt; = { key: "item_id" }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When such a structure is introduced, the system pipeline automatically provides the AI with a powerful metadata guide: "This data is defined as a &lt;code&gt;table&lt;/code&gt;, so do not generate logic that modifies its values at runtime." This moves beyond technical type checking; it is the process of communicating the designer's &lt;strong&gt;Intent&lt;/strong&gt; to the machine.&lt;/p&gt;
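&lt;p&gt;One way such a pipeline could derive that guide is sketched below in Python. The kind names mirror the example above, but the function and rule text are illustrative assumptions, not a fixed DeukPack format.&lt;/p&gt;

```python
# A sketch of turning declared kinds into agent-facing guidance. The kind
# names mirror the schema example; the rule text is an illustrative assumption.

RULES_BY_KIND = {
    "record": "Immutable value state; never mutate an instance in place.",
    "entity": "A living actor; mutate only through state-change APIs.",
    "table": "Read-only design data; never generate code that writes to it.",
}

def rules_for(schema):
    """Map each declared type to the rule an agent receives as context."""
    return {name: RULES_BY_KIND[kind] for name, kind in schema.items()}

guide = rules_for({"Vector3": "record", "PlayerContext": "entity",
                   "ItemSpec": "table"})
# guide["ItemSpec"] tells the agent the data is read-only before it writes code.
```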

&lt;h2&gt;3. Structural Evolution: From "Code Generator" to "AI Control Plane"&lt;/h2&gt;

&lt;p&gt;The era of simply parsing bytes is over. With semantic declaration structures applied, the infrastructure no longer acts as a simple 'data pump' but serves as the &lt;strong&gt;Control Plane&lt;/strong&gt; for the entire architecture.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Imbuing Context to Agents&lt;/strong&gt;: When an AI bot connects to the server via MCP, the context that "this object is an &lt;code&gt;entity&lt;/code&gt; and must only be manipulated via state-change APIs" is injected immediately.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Defending Against Hallucination&lt;/strong&gt;: When the AI attempts to write code modifying fields of design data (&lt;code&gt;table&lt;/code&gt;), the IDE or compiler blocks the malfunction at the source, stating, "This object is read-only reference data."&lt;/li&gt;
&lt;/ol&gt;
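&lt;p&gt;As a minimal illustration of the second point, a runtime guard (a Python sketch, not DeukPack's actual enforcement mechanism) can reject such writes before they corrupt anything:&lt;/p&gt;

```python
# A minimal runtime guard illustrating the "defend against hallucination"
# idea. This is a sketch, not DeukPack's actual enforcement mechanism.

class TableView:
    """Wraps design data and rejects any attempt to overwrite a row."""
    def __init__(self, rows):
        self._rows = dict(rows)

    def __getitem__(self, key):
        return self._rows[key]

    def __setitem__(self, key, value):
        raise TypeError("table data is read-only reference data")

item_spec = TableView({"potion": {"price": 50}})
price = item_spec["potion"]["price"]       # reading is fine
try:
    item_spec["potion"] = {"price": 1}     # an AI-generated write attempt
except TypeError as err:
    blocked = str(err)
# blocked now holds the rejection message instead of a corrupted table
```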

&lt;h2&gt;Conclusion: Let the Data Speak Its Intent&lt;/h2&gt;

&lt;p&gt;In the era of AI-driven development, a data communication framework must not remain a mere delivery courier moving data from A to B. It must forcibly require, from the Syntax Layer upwards, that the data itself clearly declares: &lt;strong&gt;"For what purpose was I created, and how must I be handled?"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Only when this Identity is guaranteed can human engineers and AI agents accept each other's intents without misunderstanding, expanding the system at remarkable speed.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Continues in the next article:&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
In [AI Lost in the Fragmented IDL Pile: Building a Single Context Hub with 'Unified AST'], we examine the technical groundwork needed to gather highly fragmented legacy systems (Protobuf, Thrift, Excel) into one place and deliver them to the AI before these identity cards can be issued.&lt;/p&gt;




&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Project DeukPack&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
This article series is based on the design notes of &lt;strong&gt;DeukPack&lt;/strong&gt;, an open-source infrastructure created to block data fragmentation and guarantee AI agent reliability.  &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;GitHub&lt;/strong&gt;: &lt;a href="https://github.com/deukpack/DeukPack" rel="noopener noreferrer"&gt;DeukPack OSS&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;

</description>
      <category>architecture</category>
      <category>programming</category>
      <category>devlogs</category>
    </item>
  </channel>
</rss>
