Hey dev community!
Like many of you, I've been riding the AI wave, trying to integrate LLMs into my daily coding workflow. And while tools like GitHub Copilot and ChatGPT are incredibly powerful, I often felt like I was fighting them. I was spending more time crafting the perfect prompt and copy-pasting context than actually coding.
I wanted AI to be a predictable, configurable, and code-aware tool that adapted to my workflow, not the other way around. That's why I built Clarion.
The Clarion Philosophy: A Garage, Not a Race Car
Many AI development tools are like buying a high-performance, pre-built race car. They're fast and ready to go for specific, common tasks.
Clarion, however, is like being given a professional garage with an engine, chassis, and a massive toolkit. It empowers you to build a race car, a dragster, or an off-road vehicle tailored to your exact specifications.
It's an open-source, AI-powered co-development platform that lets you build, customize, and deploy specialized AI agents for any coding task.
From Frustration to Features
I built Clarion to solve the key problems I faced when using general-purpose AI for development:
1. Problem: Unreliable, Unpredictable Outputs
Ever ask an AI to generate a JSON object and get back a markdown-formatted paragraph instead? It's frustrating and breaks any attempt at automation.
Clarion's Solution: Predictable Structured Output
Clarion lets you define a strict JSON schema for your agent's response. Whether you use the visual builder or write the schema yourself, the agent is constrained to return a reliable, parseable output every single time.
{
  "type": "object",
  "properties": {
    "summary": {
      "type": "string",
      "description": "A summary of the file changes to be performed."
    },
    "file_changes": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "action": { "enum": ["create", "modify", "delete"] },
          "path": { "type": "string" },
          "new_content": { "type": "string" }
        }
      }
    }
  }
}
An example schema for file operations.
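To show why a strict schema pays off, here's a minimal Go sketch of consuming a response that conforms to the schema above. The struct and field names are illustrative, not Clarion's actual API; they simply mirror the schema.

package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// Illustrative types mirroring the schema above; not Clarion's actual API.
type FileChange struct {
	Action     string `json:"action"` // "create", "modify", or "delete"
	Path       string `json:"path"`
	NewContent string `json:"new_content"`
}

type AgentResponse struct {
	Summary     string       `json:"summary"`
	FileChanges []FileChange `json:"file_changes"`
}

func main() {
	// Pretend this came back from the LLM, already constrained by the schema.
	raw := `{"summary":"Add a health-check route","file_changes":[{"action":"create","path":"server/routes/health.go","new_content":"package routes\n"}]}`

	var resp AgentResponse
	if err := json.Unmarshal([]byte(raw), &resp); err != nil {
		log.Fatalf("response did not match the expected shape: %v", err)
	}
	fmt.Println(resp.Summary)
	for _, fc := range resp.FileChanges {
		fmt.Printf("%s %s\n", fc.Action, fc.Path)
	}
}

Because the shape is guaranteed, the unmarshal either succeeds cleanly or fails loudly, which is exactly what you want when the output feeds automation.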
2. Problem: The Nightmare of Context Management
Manually copy-pasting files into a prompt window is slow, error-prone, and hits context limits fast.
Clarion's Solution: Granular Codebase Context
Clarion gives you precise control over the AI's context. Using powerful glob patterns, you can define exactly which files and directories are included or excluded. It even provides a real-time preview of the files that will be sent to the LLM, so you know exactly what the agent sees.
# Example of include/exclude globs
includeGlobs:
- "src/**/*.tsx"
- "server/routes/*.go"
excludeGlobs:
- "**/*.test.ts"
- "**/node_modules/**"
3. Problem: Generic AI is... Well, Generic
One-size-fits-all system prompts lead to one-size-fits-all results. For specialized tasks like writing tests, refactoring code, or generating documentation, you need a specialist.
Clarion's Solution: Customizable AI Agents
You can create, manage, and customize an entire team of AI agents. Each agent has its own unique system prompt, context filters, output schema, and LLM configuration. You can build a "React Component Specialist," a "Go Test Writer," or a "Python Docstring Expert"—whatever your workflow demands.
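To make that concrete, an agent definition is essentially just configuration data. The struct below is a hypothetical sketch of the pieces each agent bundles together; the field names and model string are assumptions, not Clarion's real types.

package main

import (
	"encoding/json"
	"fmt"
)

// Hypothetical agent definition; illustrative only, not Clarion's real config format.
type AgentConfig struct {
	Name         string          `json:"name"`          // e.g. "Go Test Writer"
	SystemPrompt string          `json:"system_prompt"` // the specialist's instructions
	IncludeGlobs []string        `json:"include_globs"` // context filters
	ExcludeGlobs []string        `json:"exclude_globs"`
	OutputSchema json.RawMessage `json:"output_schema"` // JSON schema the response must satisfy
	Model        string          `json:"model"`         // LLM configuration
	Temperature  float64         `json:"temperature"`
}

func main() {
	testWriter := AgentConfig{
		Name:         "Go Test Writer",
		SystemPrompt: "You write table-driven Go tests for the code you are given.",
		IncludeGlobs: []string{"**/*.go"},
		ExcludeGlobs: []string{"**/vendor/**"},
		OutputSchema: json.RawMessage(`{"type":"object"}`),
		Model:        "gpt-4o", // placeholder model name
		Temperature:  0.2,
	}
	out, _ := json.MarshalIndent(testWriter, "", "  ")
	fmt.Println(string(out))
}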
Under the Hood: The Tech Stack
For those curious about the internals, Clarion is a local-first desktop application:
- Backend: Written in Go, chosen for its performance, concurrency, and robust standard library. It handles all file system interactions and AI orchestration, and serves a REST API to the frontend (a minimal sketch follows this list).
- Frontend: Built with Tauri, React, TypeScript, and styled with Tailwind CSS. Tauri allows us to create a lightweight, secure, and performant desktop app using web technologies.
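As a rough illustration of that split (the route, payload, and port here are assumptions, not Clarion's actual API), the Go backend can hand data to the Tauri/React frontend over plain net/http:

package main

import (
	"encoding/json"
	"log"
	"net/http"
)

func main() {
	// Illustrative endpoint only; Clarion's real routes may differ.
	http.HandleFunc("/api/agents", func(w http.ResponseWriter, r *http.Request) {
		agents := []map[string]string{
			{"name": "React Component Specialist"},
			{"name": "Go Test Writer"},
		}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(agents)
	})
	log.Println("backend listening on :8080") // port is arbitrary for this sketch
	log.Fatal(http.ListenAndServe(":8080", nil))
}

The frontend then fetches these endpoints like any other web app, which is part of what keeps the Tauri side lightweight.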
Get Started and Build Your Own AI Agents
Clarion is open-source (Apache 2.0) and ready for you to try.
- Clone the repo:

git clone https://github.com/ClarionDev/clarion.git
cd clarion

- Install dependencies:

# Install Go and Node.js first!
go mod tidy
cd clarion-frontend && npm install

- Run the app (backend):

# In the project root
go run main.go

- Run the app (frontend):

# In another terminal, from clarion-frontend/
npm run tauri dev
Let's Build a Better AI Workflow
My goal with Clarion is to transform LLMs from powerful but unreliable text generators into predictable, configurable tools for developers. If that sounds like something you've been looking for, I'd love for you to check it out.
⭐ Star us on GitHub: https://github.com/ClarionDev/clarion
Contributions, feedback, and ideas are always welcome. Let me know what you think!