As developers, we've all been there: you join a new project or inherit a legacy "spaghetti" codebase, and it takes days (or weeks) just to understand the architecture.
With the rise of LLMs, it seemed the problem was solved. But then came context-window fatigue:
- Sending a whole repo to a cloud AI is expensive.
- Uploading proprietary code to a third-party server is a privacy nightmare.
- Most AI assistants don't "see" the big picture (the architecture).
That's why I spent the last few months building Carto Explorer.
The Concept: Local-First Intelligence
I wanted a tool that lived on my machine, indexed my code in seconds, and only talked to the AI when it truly understood the context.
The Tech Stack
- Backend: Rust (using Tauri v2). I chose Rust for its raw file-indexing performance and memory safety.
- Frontend: React with Tailwind CSS.
- Visuals: React Flow for the interactive architecture maps.
- AI: Gemini 2.0 (Pro & Flash) via user-provided API keys.
How it works (The Technical Bits)
1. High-Speed Local Indexing
Using Rust, Carto scans thousands of files in seconds. It extracts imports, exports, routes, and logic definitions. This creates a "structural map" that stays 100% on your machine.
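To make the idea concrete, here is a minimal sketch (in plain Rust, not Carto's actual code) of what "extracting imports" from a JS/TS file might look like. The function name and the quote-splitting heuristic are my own simplifications; the real indexer is more sophisticated.

```rust
/// Hypothetical simplification of structural extraction:
/// pull module specifiers out of JS/TS-style `import` lines.
fn extract_imports(source: &str) -> Vec<String> {
    source
        .lines()
        .filter_map(|line| {
            let line = line.trim();
            if line.starts_with("import ") {
                // The module specifier sits between the first pair of quotes.
                line.split(&['"', '\''][..]).nth(1).map(String::from)
            } else {
                None
            }
        })
        .collect()
}

fn main() {
    let src = r#"
import React from "react";
import api from "./lib/api";
const x = 1;
"#;
    // Prints ["react", "./lib/api"]
    println!("{:?}", extract_imports(src));
}
```

Running this per file and storing the results is what produces the local "structural map"; no file contents ever leave the machine at this stage.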
2. Interactive Architecture Mapping (C4-Level)
Instead of just a file tree, Carto generates a visual graph. It groups files into logical domains (like "auth-service" or "database-layer"), allowing you to see dependencies at a glance.
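As a rough illustration of the grouping step, here is a hedged Rust sketch that clusters file paths by their top-level directory. Carto's real grouping uses more signals than just the path, so treat the `src/`-prefix heuristic below as an assumption for the example.

```rust
use std::collections::BTreeMap;

/// Hypothetical sketch: cluster file paths into "domains"
/// using the first directory under `src/` as the domain name.
fn group_by_domain(paths: &[&str]) -> BTreeMap<String, Vec<String>> {
    let mut domains: BTreeMap<String, Vec<String>> = BTreeMap::new();
    for path in paths {
        let domain = path
            .trim_start_matches("src/")
            .split('/')
            .next()
            .unwrap_or("root")
            .to_string();
        domains.entry(domain).or_default().push(path.to_string());
    }
    domains
}

fn main() {
    let files = ["src/auth/login.ts", "src/auth/token.ts", "src/db/pool.ts"];
    for (domain, members) in group_by_domain(&files) {
        // Prints: auth: 2 file(s)  /  db: 1 file(s)
        println!("{domain}: {} file(s)", members.len());
    }
}
```

Each domain then becomes a node in the React Flow graph, with edges drawn from the import relationships found during indexing.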
3. Smart RAG (Reducing Token Costs)
This is the part I'm most proud of. Instead of sending the whole file to the LLM, Carto's engine surgically selects only the relevant snippets based on the architectural map. In my tests, this reduces token consumption by up to 80%.
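A toy version of that selection step, assuming we already have code split into blocks (Carto's real engine ranks candidates against the architectural map rather than doing plain keyword matching):

```rust
/// Hypothetical sketch: keep only the blocks that mention a query term,
/// instead of shipping whole files to the LLM.
fn select_snippets<'a>(blocks: &[&'a str], query: &str) -> Vec<&'a str> {
    blocks
        .iter()
        .copied()
        .filter(|block| query.split_whitespace().any(|term| block.contains(term)))
        .collect()
}

fn main() {
    let blocks = [
        "fn refresh_token(user: &User) -> Token { /* ... */ }",
        "fn render_sidebar() { /* ... */ }",
        "struct Token { value: String }",
    ];
    let selected = select_snippets(&blocks, "Token refresh");
    let before: usize = blocks.iter().map(|b| b.len()).sum();
    let after: usize = selected.iter().map(|b| b.len()).sum();
    // Only the two token-related blocks survive; the sidebar code is dropped.
    println!("kept {} of {} bytes", after, before);
}
```

The token savings come from exactly this kind of pruning: everything the map says is irrelevant never enters the prompt.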
Lessons Learned building with Tauri v2
Building a commercial-grade desktop app with Tauri v2 was a journey.
The security model is tight, which is great for a privacy-focused tool. Dealing with local SQLite databases and real-time file watching in Rust was challenging, but the performance gains over Electron are massive: Carto uses barely 150 MB of RAM, even with large repos.
Open for Feedback!
I’m a solo developer from Colombia 🇨🇴, and I’ve just launched the v1.0 stable of Carto Explorer.
I’d love to hear from the community:
- How are you handling codebase onboarding today?
- Do you prefer local-first tools over cloud-based AI assistants?
You can check out the project here: https://www.cartolabs.io
Happy coding! 🚀