Ryan
Grom — Free, Open-Source AI Coding Assistant for VS Code (Ollama, LM Studio, Anthropic, and More)

I've been building Grom, a free and open-source VS Code extension that brings agentic AI coding to your machine. No telemetry, no mandatory account, no subscription. If you use Ollama or LM Studio, nothing ever leaves your machine.


What is it?

Grom is a chat + agentic coding extension that lives in the VS Code sidebar. You can talk to it like a regular AI assistant, or switch it into BUILD mode where it reads files, writes code, searches your codebase, and runs terminal commands autonomously — pausing for your approval before anything destructive happens.


Key features

Chat

  • Streaming chat with PLAN and BUILD modes
  • Multiple sessions, compact history, export to Markdown
  • Persistent memory — custom instructions injected into every chat
  • Per-session system prompt override

Agentic loop

  • Built-in file tools: read, write, delete, search, list directory, run terminal
  • MCP (Model Context Protocol) server support
  • Per-action approval for destructive operations — nothing gets written or deleted without you seeing it first
  • Diff-aware undo — after the agent writes files, an undo button appears so you can revert any or all of the changes
  • Task log showing every tool call with args and results
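The per-action approval flow above can be sketched as a gate in front of the tool dispatcher. This is an illustrative sketch, not Grom's actual code: the tool names and the `askUser` callback are hypothetical stand-ins.

```typescript
type ToolCall = { name: string; args: Record<string, string> };

// Operations that mutate the workspace pause for explicit user approval.
// (Hypothetical set; the real extension decides this per tool.)
const DESTRUCTIVE = new Set(["write_file", "delete_file", "run_terminal"]);

function runToolCall(call: ToolCall, askUser: (c: ToolCall) => boolean): string {
  // Non-destructive tools (read, search, list) run without interruption;
  // destructive ones are shown to the user first.
  if (DESTRUCTIVE.has(call.name) && !askUser(call)) {
    return `skipped ${call.name}: user rejected`;
  }
  // ...dispatch to the real tool implementation here
  return `ran ${call.name}`;
}
```

Routing every mutation through one chokepoint like this is also what makes a diff-aware undo possible: the same gate that asks for approval can record the pre-write file contents.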

RAG

  • Your codebase is automatically indexed with BM25 + optional semantic embeddings via Ollama
  • Relevant files are attached to every message without you having to think about it
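For readers unfamiliar with BM25: it ranks documents by weighting query-term frequency against term rarity and document length. A toy version of that ranking looks like this (illustrative only, with standard k1/b defaults; Grom's indexer will differ in tokenization and scale):

```typescript
// Rank documents against a query with BM25; returns doc indices, best first.
function bm25Rank(query: string, docs: string[], k1 = 1.2, b = 0.75): number[] {
  const toks = docs.map((d) => d.toLowerCase().split(/\W+/).filter(Boolean));
  const avgdl = toks.reduce((sum, t) => sum + t.length, 0) / docs.length;
  const qTerms = query.toLowerCase().split(/\W+/).filter(Boolean);

  const scores = toks.map((docToks) => {
    let score = 0;
    for (const term of qTerms) {
      const df = toks.filter((d) => d.includes(term)).length; // doc frequency
      if (df === 0) continue;
      const idf = Math.log((docs.length - df + 0.5) / (df + 0.5) + 1);
      const tf = docToks.filter((w) => w === term).length; // term frequency
      // Length normalization: long docs need more matches to score the same.
      score += (idf * tf * (k1 + 1)) /
        (tf + k1 * (1 - b + (b * docToks.length) / avgdl));
    }
    return score;
  });

  return scores
    .map((s, i) => [s, i] as [number, number])
    .sort((a, c) => c[0] - a[0])
    .map(([, i]) => i);
}
```

Pairing a lexical ranker like this with optional embeddings is a common hybrid-retrieval design: BM25 catches exact identifiers, embeddings catch paraphrases.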

Inline autocomplete

  • Ghost-text completions as you type
  • Adaptive debounce — slows down automatically when you rarely accept suggestions
  • Word-by-word partial accept
  • Per-language model routing
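One way to implement the adaptive debounce described above is to scale the delay by the observed acceptance rate. The constants and class shape here are illustrative guesses, not Grom's actual tuning:

```typescript
// Widen the completion delay as the user's acceptance rate drops.
class AdaptiveDebounce {
  private shown = 0;
  private accepted = 0;

  constructor(private minMs = 150, private maxMs = 1500) {}

  recordShown() { this.shown++; }
  recordAccepted() { this.accepted++; }

  // Linear interpolation: always-accepted -> minMs, never-accepted -> maxMs.
  delayMs(): number {
    if (this.shown === 0) return this.minMs;
    const rate = this.accepted / this.shown;
    return Math.round(this.minMs + (1 - rate) * (this.maxMs - this.minMs));
  }
}
```

The payoff is fewer wasted model calls: a user who ignores ghost text stops paying latency and compute for suggestions they were never going to accept.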

@ Context mentions

  • @filename — attach any workspace file
  • @selection — currently selected code in the editor
  • @git — your uncommitted diff
  • @terminal — recent terminal output
  • @problems — all VS Code errors and warnings
  • @url:https://... — fetch and attach a web page
  • @docs — search indexed documentation sources
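A mention syntax like the one above can be parsed with a single pass over the prompt. The regex and the returned shape here are illustrative, not Grom's real tokenizer (which would also need to handle quoting and escapes):

```typescript
type Mention = { kind: string; arg?: string };

// Extract @-mentions: known keywords, @url:... targets, or file paths.
function parseMentions(prompt: string): Mention[] {
  const KEYWORDS = ["selection", "git", "terminal", "problems", "docs"];
  const out: Mention[] = [];
  for (const m of prompt.matchAll(/@(url:\S+|[\w.\/-]+)/g)) {
    const token = m[1];
    if (token.startsWith("url:")) out.push({ kind: "url", arg: token.slice(4) });
    else if (KEYWORDS.includes(token)) out.push({ kind: token });
    else out.push({ kind: "file", arg: token }); // anything else: workspace path
  }
  return out;
}
```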

Providers

Works with pretty much everything:

  • Local: Ollama, LM Studio, Open Code
  • Cloud: Anthropic (Claude), OpenAI (GPT-4o), Groq, Mistral, Gemini
  • Custom: any OpenAI-compatible or Anthropic-compatible endpoint

API keys are stored in the OS keychain — never in settings.json.
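"OpenAI-compatible" here means any server that accepts a `POST /v1/chat/completions` request with the standard body shape. As a sketch, the model name and localhost URL below are Ollama's defaults, used purely as examples:

```typescript
// Build the standard OpenAI-style chat request body.
function buildChatRequest(model: string, prompt: string) {
  return {
    model,
    messages: [{ role: "user" as const, content: prompt }],
    stream: false,
  };
}

// Usage against a local Ollama server (uncomment to try):
// const res = await fetch("http://localhost:11434/v1/chat/completions", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(buildChatRequest("qwen2.5-coder", "hello")),
// });
```

Because so many backends speak this one wire format, a single request builder covers Ollama, LM Studio, and most hosted providers; only the base URL and auth header change.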


Why I built it

I wanted something like Cursor or GitHub Copilot but with full control over where my code goes. Most existing extensions either require a cloud subscription, send your code to a third party by default, or don't support local models well. Grom is designed so that local-first is the default, and cloud providers are opt-in extras.


Quick start

  1. Install from the VS Code Marketplace
  2. Install Ollama and run ollama pull qwen2.5-coder
  3. Open the Grom panel from the activity bar and start chatting

For cloud providers, select one from the dropdown and paste your API key when prompted — it gets stored in the OS keychain immediately.


Links

It's early days — v0.3.5 is out now. Would love feedback, bug reports, or feature requests via GitHub Discussions.
