<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Arek Mazur</title>
    <description>The latest articles on DEV Community by Arek Mazur (@panhiszpan).</description>
    <link>https://dev.to/panhiszpan</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3820866%2F5b376bff-b873-49b4-88f8-5c99b7c42442.jpg</url>
      <title>DEV Community: Arek Mazur</title>
      <link>https://dev.to/panhiszpan</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/panhiszpan"/>
    <language>en</language>
    <item>
      <title>I Built a Smart Meeting Lamp So My Wife Knows When I'm in a Call</title>
      <dc:creator>Arek Mazur</dc:creator>
      <pubDate>Wed, 13 May 2026 17:54:41 +0000</pubDate>
      <link>https://dev.to/panhiszpan/i-built-a-smart-meeting-lamp-so-my-wife-knows-when-im-in-a-call-3j6a</link>
      <guid>https://dev.to/panhiszpan/i-built-a-smart-meeting-lamp-so-my-wife-knows-when-im-in-a-call-3j6a</guid>
      <description>&lt;p&gt;There are two types of interruptions during remote work:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the important ones&lt;/li&gt;
&lt;li&gt;and the "quick question" that somehow starts exactly when your camera turns on during a meeting&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So I decided to build a small smart lamp that sits outside my office and instantly shows whether:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;my camera is active&lt;/li&gt;
&lt;li&gt;my microphone is active&lt;/li&gt;
&lt;li&gt;or I'm free and available&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The goal was simple:&lt;br&gt;
before entering the office, my wife can see at a glance whether I'm:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;on a call&lt;/li&gt;
&lt;li&gt;talking to someone&lt;/li&gt;
&lt;li&gt;recording something&lt;/li&gt;
&lt;li&gt;or finally available for coffee&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And honestly... it turned into a surprisingly fun ESPHome + Home Assistant project.&lt;/p&gt;


&lt;h1&gt;
  
  
  Final Result
&lt;/h1&gt;

&lt;p&gt;The lamp changes colors automatically based on my MacBook state:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;State&lt;/th&gt;
&lt;th&gt;Color&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Camera active&lt;/td&gt;
&lt;td&gt;Red&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Microphone active&lt;/td&gt;
&lt;td&gt;Orange&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Neither active&lt;/td&gt;
&lt;td&gt;Green&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Everything is fully local and integrated with Home Assistant.&lt;/p&gt;


&lt;h1&gt;
  
  
  Demo Video
&lt;/h1&gt;
&lt;h2&gt;
  
  
    &lt;iframe src="https://www.youtube.com/embed/c9omEMKit84"&gt;
  &lt;/iframe&gt;

&lt;/h2&gt;
&lt;h1&gt;
  
  
  Technologies Used
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;ESP32-S3&lt;/li&gt;
&lt;li&gt;ESPHome&lt;/li&gt;
&lt;li&gt;Home Assistant&lt;/li&gt;
&lt;li&gt;WS2812 RGB LED ring&lt;/li&gt;
&lt;li&gt;Tinkercad&lt;/li&gt;
&lt;li&gt;Bambu Lab 3D printer&lt;/li&gt;
&lt;/ul&gt;


&lt;h1&gt;
  
  
  Hardware Components
&lt;/h1&gt;
&lt;h2&gt;
  
  
  ESP32-S3 Board
&lt;/h2&gt;

&lt;p&gt;I used a tiny ESP32-S3 development board with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;USB-C&lt;/li&gt;
&lt;li&gt;WiFi&lt;/li&gt;
&lt;li&gt;Bluetooth&lt;/li&gt;
&lt;li&gt;native USB support&lt;/li&gt;
&lt;li&gt;enough power for ESPHome projects&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Its tiny footprint is honestly impressive.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fffs5abyrqwjkmnotjs7b.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fffs5abyrqwjkmnotjs7b.jpg" alt=" " width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;h2&gt;
  
  
  WS2812 LED Ring
&lt;/h2&gt;

&lt;p&gt;The LED ring contains 24 individually addressable RGB LEDs.&lt;/p&gt;

&lt;p&gt;This means every LED can be controlled independently using a single DATA pin.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fco1ihzuu3k6xyzhfvifd.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fco1ihzuu3k6xyzhfvifd.jpeg" alt=" " width="800" height="1067"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;h2&gt;
  
  
  Supporting Components
&lt;/h2&gt;

&lt;p&gt;I also used:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;470Ω resistor&lt;/li&gt;
&lt;li&gt;470µF capacitor&lt;/li&gt;
&lt;li&gt;prototype PCB&lt;/li&gt;
&lt;li&gt;female headers&lt;/li&gt;
&lt;/ul&gt;


&lt;h1&gt;
  
  
  Why the Resistor and Capacitor?
&lt;/h1&gt;

&lt;p&gt;The resistor protects the first LED's DATA input from voltage spikes and dampens reflections on the signal line.&lt;/p&gt;

&lt;p&gt;The capacitor smooths power delivery to the LED ring, buffering the sudden current draw when many LEDs change brightness at once and preventing random flickering or resets.&lt;/p&gt;

&lt;p&gt;Recommended setup:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;470Ω resistor between ESP GPIO and LED DATA&lt;/li&gt;
&lt;li&gt;470µF capacitor between 5V and GND&lt;/li&gt;
&lt;/ul&gt;


&lt;h1&gt;
  
  
  Wiring Diagram
&lt;/h1&gt;

&lt;p&gt;Connections used in the project:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;LED Ring&lt;/th&gt;
&lt;th&gt;ESP32-S3&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;5V&lt;/td&gt;
&lt;td&gt;5V&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;GND&lt;/td&gt;
&lt;td&gt;GND&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;DI&lt;/td&gt;
&lt;td&gt;GPIO4 through 470Ω resistor&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Capacitor:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;positive -&amp;gt; 5V&lt;/li&gt;
&lt;li&gt;negative -&amp;gt; GND&lt;/li&gt;
&lt;/ul&gt;


&lt;h1&gt;
  
  
  Soldering the Hardware
&lt;/h1&gt;

&lt;p&gt;I assembled everything on a small prototype PCB.&lt;/p&gt;

&lt;p&gt;The ESP32-S3 is mounted on female headers, which makes it removable and easy to replace later.&lt;/p&gt;

&lt;p&gt;The LED ring wires are also detachable. &lt;/p&gt;
&lt;h2&gt;
  
  
  Prototype PCB
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmsl6w93r38k0y9j8r2r9.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmsl6w93r38k0y9j8r2r9.jpeg" alt=" " width="800" height="1067"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Top Side
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpzrujif8secar02m2c8v.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpzrujif8secar02m2c8v.jpeg" alt=" " width="800" height="1067"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Bottom Side
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9gldszdo6lpj8m7ni5vd.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9gldszdo6lpj8m7ni5vd.jpeg" alt=" " width="800" height="1067"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;h1&gt;
  
  
  Installing ESPHome
&lt;/h1&gt;

&lt;p&gt;I installed ESPHome locally on macOS using &lt;code&gt;pipx&lt;/code&gt;.&lt;/p&gt;
&lt;h2&gt;
  
  
  Install pipx
&lt;/h2&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;brew &lt;span class="nb"&gt;install &lt;/span&gt;pipx
pipx ensurepath
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Restart your terminal after installation so the PATH changes take effect.&lt;/p&gt;
&lt;h2&gt;
  
  
  Install ESPHome
&lt;/h2&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pipx &lt;span class="nb"&gt;install &lt;/span&gt;esphome
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Verify installation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;esphome version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h1&gt;
  
  
  Creating the ESPHome Project
&lt;/h1&gt;

&lt;p&gt;Create project directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;mkdir&lt;/span&gt; &lt;span class="nt"&gt;-p&lt;/span&gt; ~/esphome
&lt;span class="nb"&gt;cd&lt;/span&gt; ~/esphome
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create configuration file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;nano led-ring.yaml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h1&gt;
  
  
  ESPHome Configuration
&lt;/h1&gt;

&lt;p&gt;The configuration below creates a WiFi-connected RGB lamp exposed directly to Home Assistant.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;esphome&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;led-ring&lt;/span&gt;
  &lt;span class="na"&gt;friendly_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;LED Ring&lt;/span&gt;

&lt;span class="na"&gt;esp32&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;board&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;esp32-s3-devkitc-1&lt;/span&gt;
  &lt;span class="na"&gt;framework&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;arduino&lt;/span&gt;

&lt;span class="na"&gt;wifi&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;ssid&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YOUR_WIFI"&lt;/span&gt;
  &lt;span class="na"&gt;password&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YOUR_PASSWORD"&lt;/span&gt;

&lt;span class="na"&gt;logger&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;

&lt;span class="na"&gt;api&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;

&lt;span class="na"&gt;ota&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;platform&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;esphome&lt;/span&gt;

&lt;span class="na"&gt;light&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;platform&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;esp32_rmt_led_strip&lt;/span&gt;
    &lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;LED&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Ring"&lt;/span&gt;
    &lt;span class="na"&gt;pin&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;GPIO4&lt;/span&gt;
    &lt;span class="na"&gt;num_leds&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;24&lt;/span&gt;
    &lt;span class="na"&gt;rgb_order&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;GRB&lt;/span&gt;
    &lt;span class="na"&gt;chipset&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;WS2812&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
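
&lt;p&gt;One optional improvement: instead of hard-coding WiFi credentials, ESPHome can read them from a &lt;code&gt;secrets.yaml&lt;/code&gt; file in the same directory via &lt;code&gt;!secret&lt;/code&gt;, which keeps credentials out of any config you share or commit (the key names below are just a convention):&lt;/p&gt;

```yaml
# secrets.yaml (same directory as led-ring.yaml; keep it out of version control)
wifi_ssid: "YOUR_WIFI"
wifi_password: "YOUR_PASSWORD"

# led-ring.yaml then references the secrets by key
wifi:
  ssid: !secret wifi_ssid
  password: !secret wifi_password
```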






&lt;h1&gt;
  
  
  Flashing the Firmware
&lt;/h1&gt;

&lt;p&gt;Connect the ESP32-S3 over USB-C and run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;esphome run led-ring.yaml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;ESPHome then:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;compiles the firmware&lt;/li&gt;
&lt;li&gt;uploads it to the board&lt;/li&gt;
&lt;li&gt;connects the device to WiFi&lt;/li&gt;
&lt;li&gt;exposes it to Home Assistant&lt;/li&gt;
&lt;/ol&gt;
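
&lt;p&gt;Worth knowing: the first flash has to go over USB, but once the device is on WiFi, the same command can update it over the air. If auto-discovery doesn't find it, you can point at the device explicitly (the &lt;code&gt;led-ring.local&lt;/code&gt; hostname comes from the &lt;code&gt;name&lt;/code&gt; field in the config):&lt;/p&gt;

```shell
# OTA update after the initial USB flash
esphome run led-ring.yaml --device led-ring.local
```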




&lt;h1&gt;
  
  
  Home Assistant Integration
&lt;/h1&gt;

&lt;p&gt;Once the device connected to WiFi, Home Assistant discovered it automatically.&lt;/p&gt;

&lt;p&gt;The LED ring appeared as:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;light.led_ring
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And that was enough to control it from automations.&lt;/p&gt;
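
&lt;p&gt;Before wiring up the automation, it's worth a quick manual test from Developer Tools to confirm the entity responds (the blue color here is arbitrary):&lt;/p&gt;

```yaml
service: light.turn_on
target:
  entity_id: light.led_ring
data:
  rgb_color: [0, 0, 255]
  brightness: 200
```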




&lt;h1&gt;
  
  
  Detecting Camera and Microphone Usage
&lt;/h1&gt;

&lt;p&gt;The Home Assistant macOS companion app exposes sensors like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;binary_sensor.areks_macbook_pro_camera_in_use
binary_sensor.areks_macbook_pro_microphone_in_use
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That means Home Assistant knows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;when my camera is active&lt;/li&gt;
&lt;li&gt;when my microphone is active&lt;/li&gt;
&lt;li&gt;and can trigger automations instantly&lt;/li&gt;
&lt;/ul&gt;




&lt;h1&gt;
  
  
  Automation Logic
&lt;/h1&gt;

&lt;p&gt;The logic is intentionally simple:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Condition&lt;/th&gt;
&lt;th&gt;Lamp Color&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Camera ON&lt;/td&gt;
&lt;td&gt;Red&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Microphone only&lt;/td&gt;
&lt;td&gt;Orange&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Neither&lt;/td&gt;
&lt;td&gt;Green&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;This is the automation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;alias&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Office Call Lamp - Camera and Microphone Indicator&lt;/span&gt;
&lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Red when camera is in use, orange when only microphone is in use, green when neither is in use.&lt;/span&gt;
&lt;span class="na"&gt;mode&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;restart&lt;/span&gt;

&lt;span class="na"&gt;trigger&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;platform&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;state&lt;/span&gt;
    &lt;span class="na"&gt;entity_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;binary_sensor.areks_macbook_pro_camera_in_use&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;binary_sensor.areks_macbook_pro_microphone_in_use&lt;/span&gt;

&lt;span class="na"&gt;action&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;choose&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;conditions&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;condition&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;state&lt;/span&gt;
            &lt;span class="na"&gt;entity_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;binary_sensor.areks_macbook_pro_camera_in_use&lt;/span&gt;
            &lt;span class="na"&gt;state&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;on"&lt;/span&gt;
        &lt;span class="na"&gt;sequence&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;service&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;light.turn_on&lt;/span&gt;
            &lt;span class="na"&gt;target&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
              &lt;span class="na"&gt;entity_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;light.led_ring&lt;/span&gt;
            &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
              &lt;span class="na"&gt;rgb_color&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="nv"&gt;255&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;0&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;0&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
              &lt;span class="na"&gt;brightness&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;255&lt;/span&gt;

      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;conditions&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;condition&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;state&lt;/span&gt;
            &lt;span class="na"&gt;entity_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;binary_sensor.areks_macbook_pro_camera_in_use&lt;/span&gt;
            &lt;span class="na"&gt;state&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;off"&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;condition&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;state&lt;/span&gt;
            &lt;span class="na"&gt;entity_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;binary_sensor.areks_macbook_pro_microphone_in_use&lt;/span&gt;
            &lt;span class="na"&gt;state&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;on"&lt;/span&gt;
        &lt;span class="na"&gt;sequence&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;service&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;light.turn_on&lt;/span&gt;
            &lt;span class="na"&gt;target&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
              &lt;span class="na"&gt;entity_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;light.led_ring&lt;/span&gt;
            &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
              &lt;span class="na"&gt;rgb_color&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="nv"&gt;255&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;120&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;0&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
              &lt;span class="na"&gt;brightness&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;255&lt;/span&gt;

    &lt;span class="na"&gt;default&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;service&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;light.turn_on&lt;/span&gt;
        &lt;span class="na"&gt;target&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;entity_id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;light.led_ring&lt;/span&gt;
        &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;rgb_color&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="nv"&gt;0&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;255&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="nv"&gt;0&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
          &lt;span class="na"&gt;brightness&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;120&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
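
&lt;p&gt;One small refinement worth considering: &lt;code&gt;light.turn_on&lt;/code&gt; accepts a &lt;code&gt;transition&lt;/code&gt; parameter, so each color change can fade instead of snapping. Adding it to each &lt;code&gt;data&lt;/code&gt; block would look like this (the 0.5 s value is just a suggestion):&lt;/p&gt;

```yaml
data:
  rgb_color: [255, 0, 0]
  brightness: 255
  transition: 0.5
```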






&lt;h1&gt;
  
  
  Designing the Enclosure in Tinkercad
&lt;/h1&gt;

&lt;p&gt;Once the electronics worked, I wanted the project to actually look good on the desk instead of resembling a cyberpunk breadboard experiment.&lt;/p&gt;

&lt;p&gt;So I designed a simple two-part enclosure in Tinkercad.&lt;/p&gt;

&lt;p&gt;The enclosure consists of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;top shell with LED ring opening&lt;/li&gt;
&lt;li&gt;bottom shell holding PCB and ESP32-S3&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The design keeps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;USB-C accessible&lt;/li&gt;
&lt;li&gt;LEDs visible&lt;/li&gt;
&lt;li&gt;electronics hidden&lt;/li&gt;
&lt;li&gt;assembly simple&lt;/li&gt;
&lt;/ul&gt;




&lt;h1&gt;
  
  
  Tinkercad Design
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcw6pfkayoi8wn6dp28r1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcw6pfkayoi8wn6dp28r1.png" alt=" " width="800" height="438"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h1&gt;
  
  
  3D Printing
&lt;/h1&gt;

&lt;p&gt;The enclosure was printed on a Bambu Lab printer.&lt;/p&gt;

&lt;h2&gt;
  
  
  Printed Parts
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F02i5t9ki0cdzf03ywghv.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F02i5t9ki0cdzf03ywghv.jpeg" alt=" " width="800" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Final Assembly
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flbp7z7tvv66nw91svvy4.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flbp7z7tvv66nw91svvy4.jpeg" alt=" " width="800" height="1067"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h1&gt;
  
  
  What I Like About This Project
&lt;/h1&gt;

&lt;p&gt;What started as:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"please don't walk into the office during calls"&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;ended up becoming:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a fun ESPHome project&lt;/li&gt;
&lt;li&gt;a useful Home Assistant automation&lt;/li&gt;
&lt;li&gt;a custom hardware build&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The best part is how little code is actually needed.&lt;/p&gt;

&lt;p&gt;ESPHome handles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;firmware&lt;/li&gt;
&lt;li&gt;networking&lt;/li&gt;
&lt;li&gt;Home Assistant integration&lt;/li&gt;
&lt;li&gt;OTA updates&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Which means you can focus on the actual idea instead of low-level firmware work.&lt;/p&gt;




&lt;h1&gt;
  
  
  Final Thoughts
&lt;/h1&gt;

&lt;p&gt;ESPHome + ESP32-S3 is honestly one of the best combinations for DIY smart home projects.&lt;/p&gt;

&lt;p&gt;The development experience is fast, modern and surprisingly reliable.&lt;/p&gt;

&lt;p&gt;And the result is a genuinely useful device that solves a real everyday problem.&lt;/p&gt;

&lt;p&gt;Or at least reduces the probability of:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"Are you on a call right now?"&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>automation</category>
      <category>iot</category>
      <category>showdev</category>
      <category>sideprojects</category>
    </item>
    <item>
      <title>RAG finds chunks. TrailGraph finds answers. Here's the difference.</title>
      <dc:creator>Arek Mazur</dc:creator>
      <pubDate>Thu, 26 Mar 2026 10:59:51 +0000</pubDate>
      <link>https://dev.to/panhiszpan/rag-finds-chunks-trailgraph-finds-answers-heres-the-difference-1plb</link>
      <guid>https://dev.to/panhiszpan/rag-finds-chunks-trailgraph-finds-answers-heres-the-difference-1plb</guid>
      <description>&lt;p&gt;Imagine asking your AI assistant: "How does lead qualification work?"&lt;/p&gt;

&lt;p&gt;RAG searches the vector store, pulls the top 5 chunks by cosine similarity, and hands them to the model. You might get three paragraphs about leads, one about opportunity stages, and one about case escalation — because they all share similar keywords. The model does its best to stitch them together. Sometimes it works. Sometimes you get a confidently wrong answer built from pieces that were never meant to be combined.&lt;/p&gt;
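
&lt;p&gt;That retrieval step is easy to picture in code. Here's a toy sketch of top-k retrieval by cosine similarity (real systems use learned embeddings and a vector store; the three-dimensional vectors below are made up for illustration):&lt;/p&gt;

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query_vec, chunks, k=2):
    # Rank every chunk by similarity to the query and keep the best k.
    scored = [(cosine(query_vec, vec), text) for text, vec in chunks]
    scored.sort(reverse=True)
    return [text for _, text in scored[:k]]

chunks = [
    ("Lead qualification has four steps...", [0.9, 0.1, 0.0]),
    ("Opportunity stages are...",            [0.7, 0.3, 0.0]),
    ("Case escalation policy...",            [0.1, 0.1, 0.9]),
]
query = [0.8, 0.2, 0.0]  # "How does lead qualification work?"
print(top_k(query, chunks))  # the two sales-related chunks rank first
```

&lt;p&gt;Note what's missing: nothing tells the model that the second chunk belongs to a different branch of the knowledge tree than the first.&lt;/p&gt;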

&lt;p&gt;To be clear — RAG is a proven, powerful pattern. For unstructured documents, broad search, and fast retrieval, it's hard to beat. But when knowledge has clear hierarchy and multi-level relationships, retrieval by similarity alone can lose the structure that makes the answer meaningful.&lt;/p&gt;

&lt;p&gt;So I built &lt;strong&gt;TrailGraph&lt;/strong&gt; — an AI agent that navigates a knowledge graph step by step. It explores nodes, scores them for relevance, follows the most promising path, and only reads the full content when it's confident it found the right answer. No embeddings. No vector search. Just an LLM with a single tool and a graph of markdown files.&lt;/p&gt;




&lt;h2&gt;
  
  
  Where I ran into limits with standard retrieval
&lt;/h2&gt;

&lt;p&gt;RAG was designed for retrieval. You embed documents, query a vector store, get back the top N chunks ranked by cosine similarity. It's fast, it scales, and for many use cases it's the right approach.&lt;/p&gt;

&lt;p&gt;But working with Salesforce knowledge — which is inherently hierarchical — I kept running into situations where similarity-based retrieval felt like the wrong tool for the job:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Structure gets flattened.&lt;/strong&gt; A Salesforce knowledge base is a tree: CRM → Sales → Lead Process → Lead Qualification. With chunked retrieval, that hierarchy disappears — every chunk is equally distant from every other chunk.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;No feedback loop.&lt;/strong&gt; Top-k gives you one shot. There's no way for the model to say "this chunk isn't quite right, show me something adjacent." You get your k chunks and that's it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Context mixing.&lt;/strong&gt; When retrieved chunks come from different branches of knowledge, the model has no way to know they're unrelated. This isn't a RAG-specific flaw — any retrieval method can surface unrelated content — but flat retrieval makes it harder to prevent.&lt;/p&gt;




&lt;h2&gt;
  
  
  The idea: what if the LLM navigated knowledge like a human?
&lt;/h2&gt;

&lt;p&gt;When a domain expert answers a question, they don't search all documents at once. They start from a general area, narrow down, check if they're on the right track, and drill into the specifics.&lt;/p&gt;

&lt;p&gt;TrailGraph gives the LLM this exact workflow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Pick an entry point&lt;/strong&gt; based on the question (CRM, Security, Integrations)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Explore&lt;/strong&gt; — see the node's key points, children, and related nodes&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Score&lt;/strong&gt; — assign a relevance score (0–100) and decide where to go next&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Drill down&lt;/strong&gt; — follow the most promising path through children&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Focus&lt;/strong&gt; — when confidence is high enough, retrieve the full content and answer&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The LLM never sees the full graph. It only sees the local view of the current node — just like navigating a real knowledge base.&lt;/p&gt;
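
&lt;p&gt;To make the loop concrete, here's a toy sketch in Python. In TrailGraph the LLM does the scoring and picks the next node; below, a crude keyword-overlap heuristic stands in for the model, a hard-coded dict stands in for the markdown graph, and the walk always drills to a leaf (the real agent can stop earlier once confidence is high):&lt;/p&gt;

```python
# Toy stand-in for the knowledge graph: node name to summary and children.
GRAPH = {
    "CRM":                {"summary": "sales and service processes", "children": ["Sales", "Service"]},
    "Sales":              {"summary": "lead process and opportunities", "children": ["Lead_qualification"]},
    "Service":            {"summary": "case handling and escalation", "children": []},
    "Lead_qualification": {"summary": "rules and stages for lead qualification", "children": []},
}

def score(question, node):
    # Crude relevance score 0-100: share of question words found in the summary.
    words = set(question.lower().split())
    summary = set(GRAPH[node]["summary"].split())
    return int(100 * len(words.intersection(summary)) / len(words))

def navigate(question, entry):
    # Greedy walk: at each node, follow the highest-scoring child until a leaf.
    current = entry
    while GRAPH[current]["children"]:
        current = max(GRAPH[current]["children"], key=lambda c: score(question, c))
    return current

print(navigate("how does lead qualification work", "CRM"))  # Lead_qualification
```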




&lt;h2&gt;
  
  
  How it works
&lt;/h2&gt;

&lt;h3&gt;
  
  
  The knowledge graph
&lt;/h3&gt;

&lt;p&gt;Each node is a markdown file with metadata:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;&lt;span class="gh"&gt;# Lead Qualification&lt;/span&gt;

summary: Detailed internal rules, stages, criteria and responsibilities
         for lead qualification.
parent: sales/Sales.md
children: []
related: [sales/Opportunity_management.md]
key_points: ["Mandatory data must be complete before qualification starts",
             "Duplicate check is required before conversion",
             "Business fit and ownership must be confirmed"]

&lt;span class="gu"&gt;## Content&lt;/span&gt;

Lead qualification in our company consists of four main steps:
&lt;span class="p"&gt;1.&lt;/span&gt; Verify that all mandatory lead data fields are complete.
&lt;span class="p"&gt;2.&lt;/span&gt; Check whether the company and contact already exist in the system.
&lt;span class="p"&gt;3.&lt;/span&gt; Assess business fit and confirm ownership of the lead.
&lt;span class="p"&gt;4.&lt;/span&gt; Decide the outcome: convert, reject, or request more information.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;summary&lt;/code&gt; and &lt;code&gt;key_points&lt;/code&gt; are what the LLM sees during exploration. The full &lt;code&gt;## Content&lt;/code&gt; section is only revealed when the model commits to this node as the answer.&lt;/p&gt;

&lt;p&gt;This separation is intentional — it forces the model to navigate rather than skim.&lt;/p&gt;
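
&lt;p&gt;Splitting a node into those two views takes very little code. A minimal sketch, assuming single-line metadata fields (the real files may wrap values across lines):&lt;/p&gt;

```python
def load_node(text):
    # Everything before "## Content" is metadata; everything after is the
    # full content, revealed only when the agent commits to this node.
    meta_part, _, content = text.partition("## Content")
    meta = {}
    for line in meta_part.splitlines():
        if ":" in line and not line.startswith("#"):
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta, content.strip()

node = """# Lead Qualification

summary: Rules and stages for lead qualification.
parent: sales/Sales.md

## Content

Lead qualification consists of four main steps."""

meta, content = load_node(node)
print(meta["summary"])   # exploration view
print(content)           # full content, shown only on commit
```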

&lt;h3&gt;
  
  
  The graph structure
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;entry_points/
├── CRM.md ──────────┬── sales/Sales.md
│                     │      ├── Lead_process.md
│                     │      │      └── Lead_qualification.md
│                     │      └── Opportunity_management.md
│                     │             └── Opportunity_stages.md
│                     └── service/Service.md
│                            ├── Case_handling.md
│                            └── Escalation_process.md
├── Security.md
└── Integrations.md
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
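&lt;p&gt;For intuition, the hierarchy above can be written as a plain adjacency map and walked depth-first. This is only a sketch of the navigation problem, not how TrailGraph stores the graph (the real links live in each node's &lt;code&gt;parent&lt;/code&gt; and &lt;code&gt;children&lt;/code&gt; fields):&lt;/p&gt;

```python
# The hierarchy above as a plain adjacency map (child names as shown in the tree).
GRAPH = {
    "CRM.md": ["sales/Sales.md", "service/Service.md"],
    "sales/Sales.md": ["Lead_process.md", "Opportunity_management.md"],
    "Lead_process.md": ["Lead_qualification.md"],
    "Opportunity_management.md": ["Opportunity_stages.md"],
    "service/Service.md": ["Case_handling.md", "Escalation_process.md"],
}

def path_to(target, node="CRM.md", trail=None):
    """Depth-first search for target, returning the hop sequence."""
    trail = (trail or []) + [node]
    if node == target:
        return trail
    for child in GRAPH.get(node, []):
        found = path_to(target, child, trail)
        if found:
            return found
    return None

print(path_to("Lead_qualification.md"))
# → ['CRM.md', 'sales/Sales.md', 'Lead_process.md', 'Lead_qualification.md']
```

Every answer sits at the end of a short hop sequence like this; the agent's job is to find that sequence by reasoning instead of blind search.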



&lt;h3&gt;
  
  
  The tool
&lt;/h3&gt;

&lt;p&gt;The agent has one tool: &lt;code&gt;get_knowledge_context&lt;/code&gt;. It accepts a node path, a view mode, a score, and a reason.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;run&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;node&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;view&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;exploration&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;reason&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;""&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;last_score&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;best_score&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;best_score&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;best_node&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;node&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;ANSWER_THRESHOLD&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;view&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;focused&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;node&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;visited&lt;/span&gt; &lt;span class="ow"&gt;and&lt;/span&gt; &lt;span class="n"&gt;view&lt;/span&gt; &lt;span class="o"&gt;!=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;focused&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;already_visited&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;suggestion&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Explore other candidates.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;visited&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;node&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;hop_count&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;view&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;focused&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;disabled&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;

    &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;build_node_info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;node&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;view&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="c1"&gt;# ...
&lt;/span&gt;    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Key behaviors:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Two views&lt;/strong&gt;: &lt;code&gt;exploration&lt;/code&gt; returns key points, children, and related nodes. &lt;code&gt;focused&lt;/code&gt; returns the full content.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Score-driven transitions&lt;/strong&gt;: when the model assigns a score &amp;gt;= 95, the tool automatically switches to &lt;code&gt;focused&lt;/code&gt; view — no matter what the model requested.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Self-disabling&lt;/strong&gt;: after returning focused content, the tool disables itself. The model has what it needs; no more graph navigation.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dead-end detection&lt;/strong&gt;: if a node has no children and no related nodes, the tool flags it as a dead end and triggers a fallback.&lt;/li&gt;
&lt;/ul&gt;
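&lt;p&gt;A condensed, standalone version of the tool makes these behaviors easy to verify (state handling mirrored from the snippet above; &lt;code&gt;build_node_info&lt;/code&gt; is stubbed out):&lt;/p&gt;

```python
ANSWER_THRESHOLD = 95  # same threshold as in the snippet above

class KnowledgeTool:
    def __init__(self):
        self.visited, self.disabled = [], False

    def run(self, node, view="exploration", score=0):
        # Score-driven transition: high confidence forces the full content.
        if score >= ANSWER_THRESHOLD:
            view = "focused"
        # Revisits in exploration mode are rejected to prevent loops.
        if node in self.visited and view != "focused":
            return {"already_visited": True}
        self.visited.append(node)
        # Self-disabling: once focused content is out, navigation ends.
        if view == "focused":
            self.disabled = True
        return {"node": node, "view": view}

tool = KnowledgeTool()
print(tool.run("sales/Sales.md", score=70))         # stays in exploration view
print(tool.run("Lead_qualification.md", score=98))  # forced to focused view
print(tool.disabled)                                # True: no more navigation
```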

&lt;h3&gt;
  
  
  The agent loop
&lt;/h3&gt;

&lt;p&gt;The agent itself is completely generic — it knows nothing about knowledge graphs:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;Agent&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;prompt_vars&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;verbose&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;False&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;OpenRouterClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getenv&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;OPENROUTER_MODEL&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tools&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;tools&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tool_map&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;tool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;tool&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;tool&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="c1"&gt;# ...
&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_step&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;user_input&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;tool&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;tool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;reset&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;iteration&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;MAX_TOOL_ITERATIONS&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="n"&gt;active_tools&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;tool&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;zip&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tool_schemas&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                           &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;tool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;disabled&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
            &lt;span class="n"&gt;message&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;complete&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;active_tools&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;tool_calls&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Agent: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;''&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="k"&gt;return&lt;/span&gt;

            &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;tool_call&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;tool_calls&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;tool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;run&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;tool_args&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="c1"&gt;# ...
&lt;/span&gt;                &lt;span class="n"&gt;fallback&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;tool&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;should_fallback&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
                &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;fallback&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;fallback&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                    &lt;span class="k"&gt;return&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;All the intelligence lives in the tool. The agent just runs the loop, passes messages, and checks stop conditions. You could swap &lt;code&gt;GetKnowledgeContext&lt;/code&gt; for a completely different tool and the agent would work the same way.&lt;/p&gt;
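&lt;p&gt;That decoupling only requires tools to share a tiny surface. A hypothetical minimal example (the method names &lt;code&gt;reset&lt;/code&gt;, &lt;code&gt;run&lt;/code&gt;, and &lt;code&gt;should_fallback&lt;/code&gt; follow the snippets above; the class itself is not from the project):&lt;/p&gt;

```python
class EchoTool:
    """Any object with this surface can be dropped into the agent loop."""
    name = "echo"

    def __init__(self):
        self.disabled = False

    def reset(self):
        # Called by the agent before each task.
        self.disabled = False

    def run(self, text=""):
        # The agent dispatches tool calls here via tool_map.
        return {"echo": text}

    def should_fallback(self):
        # The agent checks this after every call; None means keep going.
        return None

tool = EchoTool()
tool.reset()
print(tool.run(text="hello"))   # {'echo': 'hello'}
print(tool.should_fallback())   # None (no fallback needed)
```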

&lt;h3&gt;
  
  
  The flow
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;User: "How does lead qualification work?"
  │
  ├─→ LLM selects entry point: CRM.md
  │     → exploration view: sees Sales and Service as children
  │
  ├─→ LLM picks Sales.md (score: 70)
  │     → exploration view: sees Lead_process, Opportunity_management
  │
  ├─→ LLM picks Lead_process.md (score: 80)
  │     → exploration view: sees Lead_qualification as child
  │
  ├─→ LLM picks Lead_qualification.md (score: 98)
  │     → score &amp;gt;= 95 → automatic switch to focused view
  │     → full content returned, tool disables itself
  │
  └─→ LLM writes final answer based on focused content
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;4 hops. Each one narrowing the search space. Graph traversal guided by LLM reasoning.&lt;/p&gt;




&lt;h2&gt;
  
  
  Trade-offs: when does each approach make sense?
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;/th&gt;
&lt;th&gt;RAG&lt;/th&gt;
&lt;th&gt;TrailGraph&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Retrieval&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Semantic similarity (top-k)&lt;/td&gt;
&lt;td&gt;Graph traversal (step by step)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Structure awareness&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Chunks are independent&lt;/td&gt;
&lt;td&gt;Nodes have parents, children, relations&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Context control&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Fixed (top-k chunks)&lt;/td&gt;
&lt;td&gt;Dynamic — model decides what to explore&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Latency&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Fast (1 query)&lt;/td&gt;
&lt;td&gt;Slower (multiple LLM calls per question)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Scalability&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Proven at scale with vector DBs&lt;/td&gt;
&lt;td&gt;Depends on graph depth and structure&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Setup complexity&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Moderate (embeddings, vector store)&lt;/td&gt;
&lt;td&gt;Low (just markdown files)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Best for&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Unstructured docs, broad search&lt;/td&gt;
&lt;td&gt;Structured domains, multi-level knowledge&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;These aren't competing approaches — they solve different problems. RAG shines when you have large volumes of unstructured content and need fast, broad retrieval. Graph traversal shines when knowledge has explicit hierarchy and relationships. The roadmap for TrailGraph actually combines both: RAG for entry point selection, graph traversal for deep navigation.&lt;/p&gt;




&lt;h2&gt;
  
  
  What I learned building this
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Scoring is everything
&lt;/h3&gt;

&lt;p&gt;The system prompt includes a scoring guide that shapes how the model navigates:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;0–59&lt;/strong&gt;: Not relevant — stop exploring this path&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;60–94&lt;/strong&gt;: Partially relevant — keep exploring children&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;95–100&lt;/strong&gt;: Highly relevant — switch to focused view&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Getting these thresholds right took multiple iterations. Set the answer threshold too low (e.g. 85) and the model stops at intermediate nodes like "Sales" instead of drilling into "Lead Qualification." Set it too high and the model never commits.&lt;/p&gt;

&lt;p&gt;The key insight: &lt;strong&gt;only assign high scores to leaf nodes&lt;/strong&gt;. If a node has children, there's always a more specific answer deeper in the graph. This rule is baked into the system prompt.&lt;/p&gt;
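&lt;p&gt;Both rules fit in a small decision helper. In TrailGraph the model applies them from the prompt rather than from code, so this is only an illustrative sketch of the policy:&lt;/p&gt;

```python
def next_action(score, has_children):
    """Map a relevance score to a navigation decision."""
    if score >= 95:
        # Leaf rule: a node with children always has a more specific
        # answer deeper in the graph, so never commit to it.
        return "drill_down" if has_children else "answer"
    if score >= 60:
        return "explore_children"
    return "abandon_path"

print(next_action(98, has_children=False))  # answer
print(next_action(98, has_children=True))   # drill_down
print(next_action(70, has_children=True))   # explore_children
print(next_action(30, has_children=False))  # abandon_path
```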

&lt;h3&gt;
  
  
  The model will hallucinate paths
&lt;/h3&gt;

&lt;p&gt;Early versions had the model inventing node paths like &lt;code&gt;Integrations/Salesforce_SAP_Integration.md&lt;/code&gt; — files that don't exist. The fix was a hard rule in the system prompt:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Never invent node paths. Only navigate to nodes explicitly listed in &lt;code&gt;children&lt;/code&gt; or &lt;code&gt;related&lt;/code&gt; of a previous tool response.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Combined with dead-end detection in the tool (no children + no related = fallback), this eliminated path hallucination entirely.&lt;/p&gt;
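&lt;p&gt;The same rule can also be enforced in code as a safety net behind the prompt. A sketch (the allowed-set bookkeeping here is hypothetical, not lifted from TrailGraph):&lt;/p&gt;

```python
class PathGuard:
    """Reject node paths the model was never shown."""

    def __init__(self, entry_points):
        self.allowed = set(entry_points)

    def register(self, tool_response):
        # Every child/related path in a tool response becomes navigable.
        for key in ("children", "related"):
            self.allowed.update(tool_response.get(key, []))

    def check(self, node):
        return node in self.allowed

guard = PathGuard(["CRM.md", "Security.md", "Integrations.md"])
guard.register({"children": ["sales/Sales.md"], "related": []})

print(guard.check("sales/Sales.md"))                              # True
print(guard.check("Integrations/Salesforce_SAP_Integration.md"))  # False: hallucinated
```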




&lt;h2&gt;
  
  
  What's next
&lt;/h2&gt;

&lt;p&gt;TrailGraph is functional but minimal. Here's what's on the roadmap:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Near-term:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Multi-path exploration&lt;/strong&gt; — instead of following a single path, explore the top N candidates in parallel (beam search)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scoring module&lt;/strong&gt; — extract scoring logic into a dedicated module to make strategies swappable&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Longer-term:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Document parser&lt;/strong&gt; — a tool that converts external documents (PDF, DOCX, Confluence) into the &lt;code&gt;.md&lt;/code&gt; node format, making it easier to populate the graph&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SQLite metadata store&lt;/strong&gt; — store node metadata (summary, children, related, key_points) in a database while keeping full content in markdown files. Graph traversal happens against the DB; file I/O only on &lt;code&gt;view=focused&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;RAG-based entry point selection&lt;/strong&gt; — replace the fixed entry point list with a semantic search step. When a question arrives, RAG returns the top N most relevant nodes as candidate entry points. If the graph traversal from candidate &lt;em&gt;i&lt;/em&gt; scores below threshold, automatically fall back to candidate &lt;em&gt;i+1&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multi-query decomposition&lt;/strong&gt; — split complex questions into sub-questions, run each through the full traversal independently, and consolidate into a single answer&lt;/li&gt;
&lt;/ul&gt;
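&lt;p&gt;The entry-point fallback can be sketched as a loop over ranked candidates. Here &lt;code&gt;retrieve&lt;/code&gt; and &lt;code&gt;traverse&lt;/code&gt; are hypothetical stand-ins for the semantic search step and the graph agent:&lt;/p&gt;

```python
FALLBACK_THRESHOLD = 95  # assumed to match the answer threshold

def answer_with_fallback(question, retrieve, traverse):
    """Try ranked entry points until a traversal commits to an answer."""
    for entry in retrieve(question):  # top-N candidate entry points, best first
        best_score, answer = traverse(question, entry)
        if best_score >= FALLBACK_THRESHOLD:
            return answer
    return None  # no candidate produced a confident answer

# Toy stand-ins to show the control flow:
def retrieve(question):
    return ["service/Service.md", "sales/Sales.md"]

def traverse(question, entry):
    # Pretend only the sales subtree contains the answer.
    return (98, "Lead qualification has four steps.") if "sales" in entry else (40, None)

print(answer_with_fallback("How does lead qualification work?", retrieve, traverse))
```

The first candidate scores below the threshold, so the loop silently moves on to the second, which commits.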

&lt;p&gt;The common thread across these items: RAG and graph traversal aren't mutually exclusive. The roadmap deliberately converges on a hybrid, with semantic search choosing where to start and graph traversal handling the deep navigation.&lt;/p&gt;




&lt;h2&gt;
  
  
  Try it
&lt;/h2&gt;

&lt;p&gt;TrailGraph is open source. The knowledge base covers Salesforce sales and service processes, but the architecture works for any structured domain.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/panhiszpandev/TrailGraph.git
&lt;span class="nb"&gt;cd &lt;/span&gt;TrailGraph
python &lt;span class="nt"&gt;-m&lt;/span&gt; venv .venv &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nb"&gt;source&lt;/span&gt; .venv/bin/activate
pip &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; requirements.txt
&lt;span class="nb"&gt;cp&lt;/span&gt; .env.example .env
&lt;span class="c"&gt;# Add your OpenRouter API key&lt;/span&gt;
python main.py &lt;span class="nt"&gt;--verbose&lt;/span&gt; &lt;span class="nt"&gt;--task&lt;/span&gt; &lt;span class="s2"&gt;"How does lead qualification work?"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Watch the verbose output. You'll see the agent hop through the graph, scoring each node, narrowing down step by step — the way a human would navigate knowledge.&lt;/p&gt;




&lt;h2&gt;
  
  
  Acknowledgments
&lt;/h2&gt;

&lt;p&gt;This project was inspired by ideas from the &lt;a href="https://www.aidevs.pl/" rel="noopener noreferrer"&gt;AI_devs&lt;/a&gt; course, which pushed me to think beyond standard RAG patterns and explore agent-based architectures.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;TrailGraph is a side project. Feedback, ideas, and PRs are welcome.&lt;br&gt;
&lt;a href="https://github.com/panhiszpandev/TrailGraph" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>llm</category>
      <category>rag</category>
      <category>agents</category>
    </item>
  </channel>
</rss>
