<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Uwe c</title>
    <description>The latest articles on DEV Community by Uwe c (@uwe_c_39d9ab7d16ff8dfe67e).</description>
    <link>https://dev.to/uwe_c_39d9ab7d16ff8dfe67e</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3748851%2F39f148c8-4549-44ef-bfb1-9973f141cd71.png</url>
      <title>DEV Community: Uwe c</title>
      <link>https://dev.to/uwe_c_39d9ab7d16ff8dfe67e</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/uwe_c_39d9ab7d16ff8dfe67e"/>
    <language>en</language>
    <item>
      <title>Screenshots cost 25,000 tokens. I got it down to 750.</title>
      <dc:creator>Uwe c</dc:creator>
      <pubDate>Mon, 09 Mar 2026 12:41:57 +0000</pubDate>
      <link>https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/screenshots-cost-25000-tokens-i-got-it-down-to-750-3p8d</link>
      <guid>https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/screenshots-cost-25000-tokens-i-got-it-down-to-750-3p8d</guid>
      <description>&lt;p&gt;A single screenshot in an AI conversation eats &lt;strong&gt;25,000 tokens&lt;/strong&gt;. That's more than most code searches, file reads, and tool calls combined.&lt;/p&gt;

&lt;p&gt;I've been building &lt;a href="https://github.com/CSCSoftware/AiDex" rel="noopener noreferrer"&gt;AiDex&lt;/a&gt;, an MCP server that gives AI coding assistants a persistent code index. But screenshots kept blowing up the context — even after I'd optimized everything else.&lt;/p&gt;

&lt;p&gt;So I asked myself: &lt;strong&gt;does an AI really need 16 million colors to read an error message?&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;The answer is no&lt;/h2&gt;

&lt;p&gt;Most screenshots in AI context are for reading text — error messages, stack traces, UI labels, terminal output. Black and white at half resolution is plenty.&lt;/p&gt;

&lt;p&gt;v1.13 adds two new parameters to &lt;code&gt;aidex_screenshot&lt;/code&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aidex_screenshot({ scale: 0.5, colors: 2 })
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it. Half resolution, two colors.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Result: 108 KB → 5 KB. 95% reduction.&lt;/strong&gt;&lt;/p&gt;
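To see why two colors are enough for text, here is a minimal sketch of the underlying idea: threshold quantization of grayscale pixels. This is illustrative only (AiDex delegates the actual conversion to platform image tools, and the function name here is hypothetical).

```javascript
// Illustrative sketch, not AiDex source: map each grayscale pixel (0-255)
// to pure black or pure white using a fixed threshold.
function quantizeToBilevel(grayPixels, threshold = 128) {
  // Pixels at or above the threshold become white (255), the rest black (0).
  return grayPixels.map((p) => (p >= threshold ? 255 : 0));
}

// Dark glyph pixels stay black, light background stays white,
// so text remains readable with just two levels.
console.log(quantizeToBilevel([250, 247, 12, 8, 251])); // [ 255, 255, 0, 0, 255 ]
```

Shading and hue disappear, but edges (which is what character shapes are) survive, which is why the token cost drops without hurting text reading.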

&lt;h2&gt;The numbers&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;/th&gt;
&lt;th&gt;Raw Screenshot&lt;/th&gt;
&lt;th&gt;Optimized&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;File size&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;100-500 KB&lt;/td&gt;
&lt;td&gt;5-15 KB&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Tokens&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;5,000-25,000&lt;/td&gt;
&lt;td&gt;250-750&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Colors&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;16 million&lt;/td&gt;
&lt;td&gt;2 (black &amp;amp; white)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Text readable?&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The text stays perfectly readable. You lose the pretty colors, but the AI doesn't care about your VS Code theme.&lt;/p&gt;

&lt;h2&gt;The clever part: AI decides the quality&lt;/h2&gt;

&lt;p&gt;The tool description tells AI assistants to optimize automatically:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Start aggressive&lt;/strong&gt; — &lt;code&gt;scale: 0.5, colors: 2&lt;/code&gt; (B&amp;amp;W, half size)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Can't read it?&lt;/strong&gt; — Retry with &lt;code&gt;colors: 16&lt;/code&gt; (adds shading)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Still unclear?&lt;/strong&gt; — &lt;code&gt;scale: 0.75&lt;/code&gt; or full color&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Remember&lt;/strong&gt; — Cache what works per app for the session&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;So the AI learns: "Terminal screenshots work fine in B&amp;amp;W, but that Figma mockup needs 256 colors." No configuration needed.&lt;/p&gt;
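That escalation-and-caching strategy can be sketched as follows. This is an assumed reading of the tool description's behavior, not AiDex source; takeShot and canRead stand in for the real screenshot call and the model's own readability judgment.

```javascript
// Sketch of the suggested escalation loop (hypothetical names throughout).
const qualityLadder = [
  { scale: 0.5, colors: 2 },    // aggressive default: B and W, half size
  { scale: 0.5, colors: 16 },   // add shading
  { scale: 0.75, colors: 256 }, // near-full quality fallback
];
const perAppCache = new Map(); // app name -> settings that worked

function captureReadable(app, takeShot, canRead) {
  const start = perAppCache.get(app);
  const ladder = start ? [start, ...qualityLadder] : qualityLadder;
  for (const settings of ladder) {
    const image = takeShot(settings);
    if (canRead(image)) {
      perAppCache.set(app, settings); // remember what worked this session
      return image;
    }
  }
  // Nothing passed; return the highest-quality attempt.
  return takeShot(qualityLadder[qualityLadder.length - 1]);
}
```

On the next screenshot of the same app, the cached settings are tried first, so the cheap path wins without re-probing.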

&lt;h2&gt;Available options&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Parameter&lt;/th&gt;
&lt;th&gt;Values&lt;/th&gt;
&lt;th&gt;Use case&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;scale&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;0.1 - 1.0&lt;/td&gt;
&lt;td&gt;Resolution multiplier; 0.5 halves each dimension. Most HiDPI screens render at 2-3x anyway, so text stays legible.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;colors: 2&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;B&amp;amp;W&lt;/td&gt;
&lt;td&gt;Text, terminals, logs&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;colors: 4&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;4 shades&lt;/td&gt;
&lt;td&gt;Text with light UI elements&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;colors: 16&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;16 colors&lt;/td&gt;
&lt;td&gt;Full UI, icons visible&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;colors: 256&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;256 colors&lt;/td&gt;
&lt;td&gt;Near-photographic quality&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h2&gt;Cross-platform&lt;/h2&gt;

&lt;p&gt;Works on Windows (PowerShell + System.Drawing), macOS (sips + ImageMagick), and Linux (ImageMagick). The screenshot is captured at full quality first, then post-processed — so you never lose data.&lt;/p&gt;
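As a rough illustration of how that per-platform post-processing could be dispatched (the exact commands AiDex runs may differ; ImageMagick's -resize and -colors flags are real, but this wiring is an assumption):

```javascript
// Hypothetical sketch: build a post-processing command per platform.
// ImageMagick runs on all three; the real implementation may use sips on
// macOS or PowerShell/System.Drawing on Windows instead.
function buildPostprocessArgs(platform, scale, colors, inFile, outFile) {
  const pct = `${Math.round(scale * 100)}%`;
  if (platform === "darwin" || platform === "linux" || platform === "win32") {
    return ["magick", inFile, "-resize", pct, "-colors", String(colors), outFile];
  }
  throw new Error(`unsupported platform: ${platform}`);
}

console.log(buildPostprocessArgs("linux", 0.5, 2, "shot.png", "shot-small.png"));
```

Because the reduction is a separate step on an already-captured full-quality image, a retry at higher quality only needs to re-run this step, not re-capture.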

&lt;h2&gt;Part of a bigger picture&lt;/h2&gt;

&lt;p&gt;This is part of &lt;a href="https://github.com/CSCSoftware/AiDex" rel="noopener noreferrer"&gt;AiDex&lt;/a&gt;, an MCP server with 27 tools for AI coding assistants. The core idea: &lt;strong&gt;stop wasting tokens on navigation.&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Code search&lt;/strong&gt; — 50 tokens instead of 2,000+ (vs grep)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Method signatures&lt;/strong&gt; — See all methods without reading the file&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Global search&lt;/strong&gt; — Search across ALL your projects at once (v1.11)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Zero-config&lt;/strong&gt; — &lt;code&gt;npm install -g aidex-mcp&lt;/code&gt; and it just works (v1.12)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Screenshot optimization&lt;/strong&gt; — 95% smaller images for LLM context (v1.13)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Try it&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-g&lt;/span&gt; aidex-mcp
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's literally it. Auto-setup detects your AI tools (Claude Code, Claude Desktop, Cursor, Windsurf, Gemini CLI, VS Code Copilot) and registers AiDex.&lt;/p&gt;

&lt;p&gt;Then ask your AI to take a screenshot:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aidex_screenshot({ mode: "active_window", scale: 0.5, colors: 2 })
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;p&gt;&lt;strong&gt;GitHub:&lt;/strong&gt; &lt;a href="https://github.com/CSCSoftware/AiDex" rel="noopener noreferrer"&gt;CSCSoftware/AiDex&lt;/a&gt; | &lt;strong&gt;npm:&lt;/strong&gt; &lt;a href="https://www.npmjs.com/package/aidex-mcp" rel="noopener noreferrer"&gt;aidex-mcp&lt;/a&gt; | MIT licensed&lt;/p&gt;

&lt;p&gt;What's your experience with screenshot token costs? I'd love to hear if you've found other ways to reduce context usage.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>showdev</category>
      <category>opensource</category>
      <category>productivity</category>
    </item>
    <item>
      <title>[Boost]</title>
      <dc:creator>Uwe c</dc:creator>
      <pubDate>Tue, 03 Feb 2026 16:55:44 +0000</pubDate>
      <link>https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/-oo3</link>
      <guid>https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/-oo3</guid>
      <description>&lt;p&gt;

&lt;/p&gt;
&lt;div class="ltag__link--embedded"&gt;
  &lt;div class="crayons-story "&gt;
  &lt;a href="https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/how-i-cut-ai-context-usage-by-50x-with-a-tree-sitter-code-index-plm" class="crayons-story__hidden-navigation-link"&gt;How I cut AI context usage by 50x with a Tree-sitter code index&lt;/a&gt;


  &lt;div class="crayons-story__body crayons-story__body-full_post"&gt;
    &lt;div class="crayons-story__top"&gt;
      &lt;div class="crayons-story__meta"&gt;
        &lt;div class="crayons-story__author-pic"&gt;

          &lt;a href="/uwe_c_39d9ab7d16ff8dfe67e" class="crayons-avatar  crayons-avatar--l  "&gt;
            &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3748851%2F39f148c8-4549-44ef-bfb1-9973f141cd71.png" alt="uwe_c_39d9ab7d16ff8dfe67e profile" class="crayons-avatar__image"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
        &lt;div&gt;
          &lt;div&gt;
            &lt;a href="/uwe_c_39d9ab7d16ff8dfe67e" class="crayons-story__secondary fw-medium m:hidden"&gt;
              Uwe c
            &lt;/a&gt;
            &lt;div class="profile-preview-card relative mb-4 s:mb-0 fw-medium hidden m:inline-block"&gt;
              
                Uwe c
                
              
              &lt;div id="story-author-preview-content-3222663" class="profile-preview-card__content crayons-dropdown branded-7 p-4 pt-0"&gt;
                &lt;div class="gap-4 grid"&gt;
                  &lt;div class="-mt-4"&gt;
                    &lt;a href="/uwe_c_39d9ab7d16ff8dfe67e" class="flex"&gt;
                      &lt;span class="crayons-avatar crayons-avatar--xl mr-2 shrink-0"&gt;
                        &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3748851%2F39f148c8-4549-44ef-bfb1-9973f141cd71.png" class="crayons-avatar__image" alt=""&gt;
                      &lt;/span&gt;
                      &lt;span class="crayons-link crayons-subtitle-2 mt-5"&gt;Uwe c&lt;/span&gt;
                    &lt;/a&gt;
                  &lt;/div&gt;
                  &lt;div class="print-hidden"&gt;
                    
                      Follow
                    
                  &lt;/div&gt;
                  &lt;div class="author-preview-metadata-container"&gt;&lt;/div&gt;
                &lt;/div&gt;
              &lt;/div&gt;
            &lt;/div&gt;

          &lt;/div&gt;
          &lt;a href="https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/how-i-cut-ai-context-usage-by-50x-with-a-tree-sitter-code-index-plm" class="crayons-story__tertiary fs-xs"&gt;&lt;time&gt;Feb 2&lt;/time&gt;&lt;span class="time-ago-indicator-initial-placeholder"&gt;&lt;/span&gt;&lt;/a&gt;
        &lt;/div&gt;
      &lt;/div&gt;

    &lt;/div&gt;

    &lt;div class="crayons-story__indention"&gt;
      &lt;h2 class="crayons-story__title crayons-story__title-full_post"&gt;
        &lt;a href="https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/how-i-cut-ai-context-usage-by-50x-with-a-tree-sitter-code-index-plm" id="article-link-3222663"&gt;
          How I cut AI context usage by 50x with a Tree-sitter code index
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;div class="crayons-story__tags"&gt;
            &lt;a class="crayons-tag crayons-tag--filled  " href="/t/showdev"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;showdev&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/ai"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;ai&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/opensource"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;opensource&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/productivity"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;productivity&lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="crayons-story__bottom"&gt;
        &lt;div class="crayons-story__details"&gt;
          &lt;a href="https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/how-i-cut-ai-context-usage-by-50x-with-a-tree-sitter-code-index-plm" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left"&gt;
            &lt;div class="multiple_reactions_aggregate"&gt;
              &lt;span class="multiple_reactions_icons_container"&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/sparkle-heart-5f9bee3767e18deb1bb725290cb151c25234768a0e9a2bd39370c382d02920cf.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
              &lt;/span&gt;
              &lt;span class="aggregate_reactions_counter"&gt;2&lt;span class="hidden s:inline"&gt; reactions&lt;/span&gt;&lt;/span&gt;
            &lt;/div&gt;
          &lt;/a&gt;
            &lt;a href="https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/how-i-cut-ai-context-usage-by-50x-with-a-tree-sitter-code-index-plm#comments" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left flex items-center"&gt;
              Comments


              2&lt;span class="hidden s:inline"&gt; comments&lt;/span&gt;
            &lt;/a&gt;
        &lt;/div&gt;
        &lt;div class="crayons-story__save"&gt;
          &lt;small class="crayons-story__tertiary fs-xs mr-2"&gt;
            3 min read
          &lt;/small&gt;
            
              &lt;span class="bm-initial"&gt;
                

              &lt;/span&gt;
              &lt;span class="bm-success"&gt;
                

              &lt;/span&gt;
            
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/div&gt;

&lt;/div&gt;




</description>
      <category>ai</category>
      <category>showdev</category>
      <category>opensource</category>
      <category>productivity</category>
    </item>
    <item>
      <title>How I cut AI context usage by 50x with a Tree-sitter code index</title>
      <dc:creator>Uwe c</dc:creator>
      <pubDate>Mon, 02 Feb 2026 19:45:36 +0000</pubDate>
      <link>https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/how-i-cut-ai-context-usage-by-50x-with-a-tree-sitter-code-index-plm</link>
      <guid>https://dev.to/uwe_c_39d9ab7d16ff8dfe67e/how-i-cut-ai-context-usage-by-50x-with-a-tree-sitter-code-index-plm</guid>
      <description>&lt;p&gt;I watched Claude burn through half its context window just to find where a function is defined.&lt;/p&gt;

&lt;p&gt;Ask any AI coding assistant "Where is PlayerHealth defined?" and here's what happens:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;It runs grep "PlayerHealth" → 200 matches across 40 files&lt;/li&gt;
&lt;li&gt;It reads File1.cs, File2.cs, File3.cs...&lt;/li&gt;
&lt;li&gt;2000+ tokens gone, 5+ tool calls — for a simple lookup&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Do that 10 times and your context window is toast. Not from coding — from navigation.&lt;/p&gt;

&lt;h2&gt;grep is the wrong tool for AI&lt;/h2&gt;

&lt;p&gt;Think about it: grep searches text. Search for &lt;code&gt;log&lt;/code&gt; and you'll match catalog, logarithm, blog, every comment mentioning "log", and every string containing it. The AI has to read through all that noise to find the actual &lt;code&gt;log&lt;/code&gt; function.&lt;/p&gt;

&lt;p&gt;What if the AI could just query an index instead?&lt;/p&gt;

&lt;h2&gt;AiDex: one query, exact answer&lt;/h2&gt;

&lt;p&gt;I built &lt;a href="https://github.com/CSCSoftware/AiDex" rel="noopener noreferrer"&gt;AiDex&lt;/a&gt; — an MCP server that pre-indexes your codebase using Tree-sitter and gives AI assistants instant access.&lt;/p&gt;

&lt;p&gt;Before (grep):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;grep "PlayerHealth" → 200 matches, AI reads 5 files
→ 2000+ tokens, 5 tool calls, 10+ seconds
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;After (AiDex):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aidex_query({ term: "PlayerHealth" })
→ Engine.cs:45, Player.cs:23, UI.cs:156
→ ~50 tokens, 1 tool call, 3ms
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;50x less context. One call instead of five.&lt;/p&gt;

&lt;p&gt;Tree-sitter parses your code into an AST — it knows what's a function, what's a class, what's a variable. AiDex indexes only identifiers. So &lt;code&gt;log&lt;/code&gt; finds only &lt;code&gt;log&lt;/code&gt;, not catalog.&lt;/p&gt;

&lt;h2&gt;What you get&lt;/h2&gt;

&lt;p&gt;Instant code search — find any function, class, or variable:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aidex_query({ term: "render", mode: "starts_with" })
→ renderFrame (engine.ts:45), renderUI (app.ts:120)
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Method signatures — see all methods without reading the file:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aidex_signature({ file: "src/engine.ts" })
→ class Engine { ... }
→ function renderFrame(delta: number): void
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Time-based filtering — "what changed in the last 2 hours?":&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aidex_query({ term: "render", modified_since: "2h" })
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Cross-project search, session notes that persist between chats, a task backlog for tracking TODOs and bugs, and an interactive browser viewer at localhost:3333.&lt;/p&gt;

&lt;h2&gt;🆕 New since v1.11: Global Search, Zero-Config, Screenshot Optimization&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Global Search (v1.11)&lt;/strong&gt; — Search across ALL your projects at once:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aidex_global_query({ term: "TransparentWindow", mode: "contains" })
→ Found in: LibWebAppGpu (3 hits), DebugViewer (1 hit)
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Perfect for "Have I ever written X?" — one call searches 150+ projects.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Zero-Config Setup (v1.12)&lt;/strong&gt; — Just install, everything auto-configures:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm install -g aidex-mcp
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;That's it. No more manual setup. Auto-detects Claude Code, Claude Desktop, Cursor, Windsurf, Gemini CLI, and VS Code Copilot, and registers AiDex with all of them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Screenshot Optimization (v1.13)&lt;/strong&gt; — Screenshots 95% smaller for LLM context:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aidex_screenshot({ scale: 0.5, colors: 2 })
→ 108 KB → 5 KB (95% saved!)
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Most screenshots in AI context are for reading text — error messages, logs, UI labels. You don't need 16 million colors for that. New &lt;code&gt;scale&lt;/code&gt; and &lt;code&gt;colors&lt;/code&gt; parameters reduce file size dramatically while keeping text readable. The AI starts with aggressive settings (B&amp;amp;W, half size), retries with more colors if unreadable, and remembers what works per app.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;/th&gt;
&lt;th&gt;Raw Screenshot&lt;/th&gt;
&lt;th&gt;Optimized (scale=0.5, colors=2)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;File size&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~100-500 KB&lt;/td&gt;
&lt;td&gt;~5-15 KB&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Tokens consumed&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~5,000-25,000&lt;/td&gt;
&lt;td&gt;~250-750&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Text readable?&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h2&gt;Performance&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Project&lt;/th&gt;
&lt;th&gt;Files&lt;/th&gt;
&lt;th&gt;Index time&lt;/th&gt;
&lt;th&gt;Query time&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Small&lt;/td&gt;
&lt;td&gt;~20&lt;/td&gt;
&lt;td&gt;&amp;lt;1s&lt;/td&gt;
&lt;td&gt;1-5ms&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Medium&lt;/td&gt;
&lt;td&gt;~100&lt;/td&gt;
&lt;td&gt;&amp;lt;1s&lt;/td&gt;
&lt;td&gt;1-5ms&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Large&lt;/td&gt;
&lt;td&gt;~500+&lt;/td&gt;
&lt;td&gt;~2s&lt;/td&gt;
&lt;td&gt;1-10ms&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Single SQLite file. No cloud, no telemetry. Everything runs locally.&lt;/p&gt;

&lt;p&gt;11 languages: C#, TypeScript, JavaScript, Rust, Python, C, C++, Java, Go, PHP, Ruby&lt;/p&gt;

&lt;h2&gt;Setup in 10 seconds&lt;/h2&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm install -g aidex-mcp
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;That's it. Auto-setup detects your AI tools and registers AiDex with them. Works with Claude Code, Claude Desktop, Cursor, Windsurf, Gemini CLI, VS Code Copilot, and anything else that speaks MCP.&lt;/p&gt;

&lt;h2&gt;How it works&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Tree-sitter parses each file into an AST&lt;/li&gt;
&lt;li&gt;Extractor walks the AST, collects identifiers + method signatures&lt;/li&gt;
&lt;li&gt;SQLite stores everything (WAL mode, fast reads)&lt;/li&gt;
&lt;li&gt;MCP server exposes 27 tools via stdio transport&lt;/li&gt;
&lt;li&gt;Incremental updates — only changed files get re-indexed&lt;/li&gt;
&lt;/ol&gt;
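Step 5 can be sketched with a minimal mtime comparison (illustrative; the real bookkeeping lives in the SQLite index, and these names are hypothetical):

```javascript
// Track the modification time each file had when it was last indexed,
// and re-index only files that are new or have changed since then.
const lastIndexed = new Map(); // file path -> mtime (ms) at last index run

function filesNeedingReindex(currentMtimes) {
  const stale = [];
  for (const [file, mtime] of currentMtimes) {
    const prev = lastIndexed.get(file);
    if (prev === undefined || mtime > prev) stale.push(file);
  }
  return stale;
}

lastIndexed.set("a.ts", 1000);
lastIndexed.set("b.ts", 1000);
const stale = filesNeedingReindex(new Map([["a.ts", 1000], ["b.ts", 2000], ["c.ts", 500]]));
console.log(stale); // only b.ts (changed) and c.ts (new)
```

Skipping unchanged files is what keeps re-index times in the sub-second range after the initial run.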

&lt;h2&gt;The bottom line&lt;/h2&gt;

&lt;p&gt;AI assistants are getting better at writing code but still waste most of their context on finding code. Persistent indexing fixes that. The AI gets instant, precise answers and spends its tokens on what matters — actually building things.&lt;/p&gt;

&lt;p&gt;I've been using it daily for months. The difference is immediately noticeable, especially on longer sessions.&lt;/p&gt;

&lt;p&gt;Give it a try and let me know what you think — what's your biggest pain point with AI context usage?&lt;/p&gt;

&lt;p&gt;GitHub: &lt;a href="https://github.com/CSCSoftware/AiDex" rel="noopener noreferrer"&gt;https://github.com/CSCSoftware/AiDex&lt;/a&gt;&lt;br&gt;
  npm: npm install -g aidex-mcp&lt;br&gt;
  MCP Registry: io.github.CSCSoftware/aidex&lt;/p&gt;




&lt;p&gt;Open source, MIT licensed. 27 tools, 11 languages. Contributions welcome.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Updated March 2026: Added Global Search (v1.11), Zero-Config Setup (v1.12), and Screenshot Optimization (v1.13).&lt;/em&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>showdev</category>
      <category>opensource</category>
      <category>productivity</category>
    </item>
  </channel>
</rss>
