freerave
I Built an AI README Generator Inside VS Code (BYOK)

Let's be honest: Writing documentation is the last thing any developer wants to do. After spending weeks architecting a project, squashing bugs, and writing beautiful code, staring at a blank README.md file feels like a chore.

I wanted to fix this. Not by switching to another browser tab to chat with an AI, but by building the AI directly into the editor.

Today, I’m excited to announce the release of DotReadme v1.1.0, a VS Code extension that writes, audits, and formats your documentation for you.


What is DotReadme?

DotReadme is a documentation optimizer for VS Code. It features a Live Multi-Platform Simulator and a Quality Audit Score, and with the v1.1.0 update it also gains Context-Aware AI and a Smart TOC Generator.

1. Auto-Generate Missing Sections

Forgot an "Installation" or "Usage" guide? Highlight an empty space, right-click, and select Generate Missing Section. The AI reads your project's context and generates perfect Markdown.
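I won't reproduce DotReadme's internals here, but "reading your project's context" can be sketched as a small prompt builder. Every name below (`ProjectContext`, `buildSectionPrompt`) is a hypothetical illustration, not the extension's actual code:

```typescript
// Hypothetical sketch: turn basic project metadata into a prompt
// for generating a missing README section.
interface ProjectContext {
  name: string;        // e.g. from package.json "name"
  description: string; // e.g. from package.json "description"
  files: string[];     // e.g. notable file names from the workspace
}

function buildSectionPrompt(section: string, ctx: ProjectContext): string {
  return [
    `Write the "${section}" section of a README in Markdown.`,
    `Project: ${ctx.name} — ${ctx.description}`,
    `Relevant files: ${ctx.files.join(", ")}`,
    `Return only the Markdown for that section.`,
  ].join("\n");
}
```

The resulting string is what gets sent to the model, so the quality of the generated section depends heavily on how much context you pack in here.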

2. Rewrite for Tone

If your text sounds too casual, select it, choose Rewrite for Tone -> Professional, and watch it transform instantly.
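Conceptually, a tone rewrite is just a different instruction prepended to the selected text. A minimal sketch (the tone names and wording are my assumptions, not DotReadme's actual prompts):

```typescript
// Hypothetical mapping from a tone choice to a rewrite instruction.
type Tone = "professional" | "friendly" | "concise";

const toneInstructions: Record<Tone, string> = {
  professional: "Rewrite the following text in a formal, professional voice.",
  friendly: "Rewrite the following text in a warm, approachable voice.",
  concise: "Rewrite the following text as tersely as possible without losing meaning.",
};

function buildTonePrompt(tone: Tone, text: string): string {
  return `${toneInstructions[tone]}\n\nText:\n${text}`;
}
```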

3. 1-Click Table of Contents

Writing TOCs manually with GitHub anchor links is painful. DotReadme now does this in one click.

Here is how the generated TOC looks under the hood. The extension wraps it in hidden HTML comment markers (omitted below) so it can update the list in place later without duplicating it:

```markdown
## Table of Contents
* [Getting Started](#getting-started)
* [Installation](#installation)
* [API Reference](#api-reference)
```
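For reference, a GitHub-style anchor is roughly the heading lowercased, with punctuation stripped and spaces turned into hyphens. A rough sketch (GitHub's real slug algorithm handles more edge cases, like duplicate headings):

```typescript
// Approximate GitHub heading-anchor slug: lowercase, drop punctuation,
// collapse whitespace into hyphens.
function toAnchor(heading: string): string {
  return heading
    .trim()
    .toLowerCase()
    .replace(/[^\w\s-]/g, "") // drop punctuation
    .replace(/\s+/g, "-");    // spaces -> hyphens
}

// Render one TOC bullet for a heading.
function tocLine(heading: string): string {
  return `* [${heading}](#${toAnchor(heading)})`;
}
```

For example, `tocLine("API Reference")` yields `* [API Reference](#api-reference)`, matching the list above.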

Built for Privacy: The BYOK Approach
As developers, we are naturally skeptical of pasting our API keys into random extensions, or worse, sending our proprietary code to a third-party backend server.

That’s why DotReadme uses a Bring Your Own Key (BYOK) architecture.

There is no middleman server. The extension makes API calls directly from your local VS Code instance to the AI provider of your choice (Google Gemini, Anthropic Claude, or OpenAI GPT).

You simply configure your settings like this:

```json
{
  "dotreadme.ai.provider": "gemini",
  "dotreadme.ai.geminiApiKey": "AIzaSy...",
  "dotreadme.ai.model": "gemini-2.5-flash"
}
```
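Under the hood, the `dotreadme.ai.provider` value just needs to map to a provider implementation. Here is a sketch of what such a lookup could look like; the registry is hypothetical, the non-Gemini default model names are placeholders, and a real extension would read these settings via `vscode.workspace.getConfiguration`:

```typescript
// Hypothetical provider lookup keyed by the "dotreadme.ai.provider" setting.
interface ProviderInfo {
  id: string;
  defaultModel: string;
}

const registry = new Map<string, ProviderInfo>([
  ["gemini", { id: "gemini", defaultModel: "gemini-2.5-flash" }],
  ["anthropic", { id: "anthropic", defaultModel: "claude-placeholder" }],
  ["openai", { id: "openai", defaultModel: "gpt-placeholder" }],
]);

function resolveProvider(providerSetting: string): ProviderInfo {
  const info = registry.get(providerSetting);
  if (!info) {
    // Fail loudly on a typo rather than silently falling back
    throw new Error(`Unknown AI provider: ${providerSetting}`);
  }
  return info;
}
```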

Under the Hood: The Provider Implementation
To make this extensible, I built a clean interface for AI providers. Here is a simplified look at how the extension communicates directly with Google's Gemini API using TypeScript:

```typescript
import { IAiProvider, AiResponse } from "../types";

export class GeminiProvider implements IAiProvider {
  readonly id = "gemini";
  readonly defaultModel = "gemini-2.5-flash";

  async complete(apiKey: string, prompt: string, model: string): Promise<AiResponse> {
    const url = `https://generativelanguage.googleapis.com/v1beta/models/${model}:generateContent?key=${apiKey}`;

    try {
      const response = await fetch(url, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          contents: [{ parts: [{ text: prompt }] }],
          generationConfig: { maxOutputTokens: 1024 },
        }),
      });

      // Surface HTTP errors instead of silently returning an empty result
      if (!response.ok) {
        return { success: false, error: `Gemini API returned HTTP ${response.status}` };
      }

      const data = await response.json();
      const text = data.candidates?.[0]?.content?.parts?.[0]?.text ?? "";

      return { success: true, result: text.trim() };
    } catch (error) {
      // Network failures, JSON parse errors, etc.
      return {
        success: false,
        error: error instanceof Error ? error.message : "API request failed.",
      };
    }
  }
}
```

This decoupled architecture means I can easily add Local AI support (like Ollama) in the next release!
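To illustrate why the interface makes this easy, here is a hypothetical sketch of what an Ollama provider could look like, targeting Ollama's local HTTP API (`POST /api/generate` on `localhost:11434`). This is not shipped code; the default model name is a placeholder:

```typescript
// Hypothetical local-AI provider sketch for Ollama's HTTP API.
interface AiResponse {
  success: boolean;
  result?: string;
  error?: string;
}

export class OllamaProvider {
  readonly id = "ollama";
  readonly defaultModel = "llama3"; // placeholder default

  // Local models need no API key, so the first parameter is unused.
  async complete(_apiKey: string, prompt: string, model: string): Promise<AiResponse> {
    try {
      const response = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        // stream: false returns one JSON object instead of NDJSON chunks
        body: JSON.stringify({ model, prompt, stream: false }),
      });

      if (!response.ok) {
        return { success: false, error: `Ollama returned HTTP ${response.status}` };
      }

      const data = await response.json();
      return { success: true, result: (data.response ?? "").trim() };
    } catch {
      // Typically means the local Ollama server isn't running
      return { success: false, error: "Could not reach the local Ollama server." };
    }
  }
}
```

Because the shape matches `GeminiProvider`, the rest of the extension wouldn't need to change at all.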

Try it out!
If you want to stop wasting time on documentation and let the AI do the heavy lifting:

1. Search for DotReadme in the VS Code Extensions view.

2. Or grab it directly from the VS Code Marketplace or Open VSX.

I would love to hear your feedback! What AI model are you currently using for your daily tasks, and what feature should I add to DotReadme next? Let me know in the comments!
