Elvin Ega
Supercharging AI Development with code-contextify

As AI-powered coding becomes the norm, one challenge keeps popping up — how do we give AI models complete context about our codebase? If you’ve ever tried pasting a dozen files into ChatGPT, Claude, or Gemini, you know the pain.

That’s exactly where code-contextify comes in.

This nifty CLI tool takes your entire codebase, cleans it up, and turns it into a single AI-ready text file — perfect for sending to any large language model.



🚀 What is code-contextify?

Simply put, code-contextify scans your project, applies smart filters (respects .gitignore, skips binaries, removes irrelevant noise), and outputs a formatted, readable text file that contains:

  • 📊 Project statistics (total files, total size)
  • 🌳 Directory tree with excluded file markers
  • 📄 Full file contents (only relevant code and docs)

This means that instead of AI guessing about your project, you can hand over full, precise context in one shot.
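To get a feel for what "smart filtering" means in practice, here is a minimal sketch of the idea in JavaScript. This is my own illustration, not code-contextify's actual implementation — the pattern lists and matching rules are assumptions for demonstration:

```javascript
// Hypothetical sketch of gitignore-style filtering. The exclude list
// and binary-extension list here are illustrative, not the tool's own.
const DEFAULT_EXCLUDES = ["node_modules", "dist", ".git"];
const BINARY_EXTS = [".png", ".jpg", ".zip", ".exe"];

function shouldInclude(filePath, excludes = DEFAULT_EXCLUDES) {
  // Skip any file inside an excluded directory segment
  const segments = filePath.split("/");
  if (segments.some((s) => excludes.includes(s))) return false;
  // Skip common binary extensions
  return !BINARY_EXTS.some((ext) => filePath.endsWith(ext));
}

console.log(shouldInclude("src/components/Header.js"));    // true
console.log(shouldInclude("node_modules/react/index.js")); // false
```

The real tool goes further (it parses your actual .gitignore), but the principle is the same: decide per-path whether a file belongs in the AI context.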


💡 Why Developers Love It

  • Full AI Context — Feed GPT, Claude, or Gemini the entire codebase
  • Perfect for Code Reviews — Share with teammates without sending raw repos
  • Great for Debugging — AI can see the full picture
  • Automated Filtering — No more handpicking files
  • Privacy First — Runs locally, no data leaves your machine

📦 Installation

```shell
npm install -g code-contextify
```

Or without installing:

```shell
npx code-contextify /path/to/project
```

⚡ Basic Usage

```shell
# Convert current directory into AI context
code-contextify

# Convert a specific directory
code-contextify /path/to/project

# Custom output file
code-contextify my-context.txt
```

Example:

```shell
code-contextify review-context.txt --filter "node_modules,dist,*.min.js"
```

This generates review-context.txt containing only the files that matter.


📊 What the Output Looks Like

```text
Project Statistics:
Total Files: 42
Total Size: 1.25 MB

Folder Structure (Tree)
=======================
Legend: ✗ = Excluded
├── src/
│   ├── components/
│   │   ├── Header.js
│   │   └── Footer.js
│   └── utils/
├── node_modules/ ✗
├── package-lock.json (123.45 KB) ✗
└── README.md (2.34 KB)
```

Followed by:

```text
src/components/Header.js
------------------------
import React from 'react';
// ... actual file content
```
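If you ever want to reproduce this per-file section layout yourself (say, for a custom export), it is trivial to generate. A small helper of my own, not part of code-contextify:

```javascript
// Hypothetical helper that mimics the per-file section layout above:
// path, a dashed underline matching the path's length, then content.
function formatSection(path, content) {
  const underline = "-".repeat(path.length);
  return `${path}\n${underline}\n${content}\n`;
}

console.log(formatSection("src/components/Header.js", "import React from 'react';"));
```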

📏 Counting Tokens Before Sending to AI

Before sending your context file to an AI model, it’s smart to check token usage — especially for models with smaller context windows.

You can use QuizGecko’s Token Counter to paste your generated file and see how many tokens it contains.
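If you'd rather estimate locally, a common rule of thumb for GPT-style tokenizers is roughly four characters per token for English text and code. A quick sketch (an approximation only, not an exact tokenizer):

```javascript
// Rough token estimate: ~4 characters per token is a common heuristic
// for GPT-style tokenizers. For exact counts against a specific model,
// use a real tokenizer such as the tiktoken library.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

const sample = "const x = 42;".repeat(1000);
console.log(`~${estimateTokens(sample)} tokens`);
```

Run it against your generated context file (e.g. `fs.readFileSync('project-context.txt', 'utf8')`) before pasting into a model with a small context window.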


🎯 Sending to Any AI Model

Once you have your .txt file, you can paste it directly into an AI chat or send it via API.

Example with OpenAI API:

```javascript
import fs from 'fs';
import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const context = fs.readFileSync('project-context.txt', 'utf8');

const response = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [
    { role: "system", content: "You are a senior developer reviewing a project." },
    { role: "user", content: `Here is the full project context:\n${context}` }
  ]
});

console.log(response.choices[0].message.content);
```

You can swap gpt-4 for a newer OpenAI model, or adapt the same pattern to Claude, Gemini, or any other LLM API.
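Since each provider's SDK shapes its request slightly differently, it helps to assemble the prompt once and reuse it. A small provider-agnostic sketch of my own (the system prompt is illustrative):

```javascript
// Build the review prompt once, then hand it to whichever client you
// use. This is my own sketch, not tied to any one SDK's API surface.
function buildReviewMessages(context) {
  return [
    { role: "system", content: "You are a senior developer reviewing a project." },
    { role: "user", content: `Here is the full project context:\n${context}` },
  ];
}

const messages = buildReviewMessages("(file contents here)");
console.log(messages.length); // 2
```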


📥 Try It Now

Stop pasting files one by one — let your AI see everything at once.

Turn your codebase into conversation — one context file at a time.
