The problem with jumping into an unfamiliar codebase with AI
You've just been added to a repo. You open it up, fire up Claude Code or Copilot, and immediately start asking questions. The AI confidently tells you to use getServerSideProps. The project uses the App Router. You correct it. Five minutes later it suggests a pattern that conflicts with how the existing Prisma setup works. You correct it again.
This isn't the AI being bad at its job — it just doesn't know anything about this codebase. And if you're new too, you're both guessing.
The usual workaround is a CLAUDE.md or .cursorrules file. But someone has to write that, it goes stale, and now there's a tax on the whole team to maintain documentation for a robot.
What ContextEngine does
ContextEngine is a CLI tool that scans your project and auto-generates context files for Claude Code, Cursor, Copilot, and Codex — in the exact format each tool expects.
It reads your actual config files and dependencies (package.json, tsconfig.json, prisma/schema.prisma, framework config files, etc.) and produces opinionated, framework-specific guidance that reflects current best practices for your detected stack. Not generic boilerplate — real, specific instructions like "this project uses the App Router, colocate page.tsx files inside app/, use Server Components by default."
Runs entirely offline. No account. No setup. Free, MIT licensed.
4-step setup walkthrough
Step 1: Run it against the repo
npx @strifero/contextengine
That's it for installation. npx pulls it fresh each time. Run this from the root of the project you're onboarding to.
Step 2: Watch it detect your stack
You'll see output like this as it scans:
✔ Detected: Next.js 14 (App Router)
✔ Detected: TypeScript
✔ Detected: Prisma ORM
✔ Detected: Tailwind CSS
✔ Detected: Vitest
Generating context files...
No prompts, no questions. It reads what's already there.
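Conceptually, detection like this is a lookup over the dependency manifest. Here's a minimal sketch in TypeScript, assuming a hypothetical detectStack helper and detection table (not ContextEngine's actual internals):

```typescript
import { readFileSync } from "node:fs";

// Illustrative mapping from package.json dependency names to stack labels.
// The table and function names here are assumptions, not the tool's real code.
const KNOWN_STACKS: Record<string, string> = {
  next: "Next.js",
  typescript: "TypeScript",
  prisma: "Prisma ORM",
  tailwindcss: "Tailwind CSS",
  vitest: "Vitest",
};

function detectStack(packageJsonPath: string): string[] {
  const pkg = JSON.parse(readFileSync(packageJsonPath, "utf8"));
  // dependencies and devDependencies both count as evidence of the stack.
  const deps = { ...pkg.dependencies, ...pkg.devDependencies };
  return Object.keys(KNOWN_STACKS)
    .filter((name) => name in deps)
    .map((name) => KNOWN_STACKS[name]);
}
```

The real tool also inspects tsconfig.json, prisma/schema.prisma, and framework config files, so a dependency-name lookup is only the first layer of what it does.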
Step 3: Check what got generated
ContextEngine writes files to the locations where each tool automatically looks for them:
.claude/
  CLAUDE.md                  ← picked up by Claude Code
  skills/
    typescript/SKILL.md
    react/SKILL.md
    ...
.cursor/rules/
  typescript.mdc             ← picked up by Cursor
  react.mdc
  ...
.github/
  copilot-instructions.md    ← picked up by Copilot
Use --tool cursor to generate only Cursor files, --tool copilot for Copilot only, or --tool all to generate everything in one pass.
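The --tool flag maps onto that layout. A rough sketch of how such routing could look, with illustrative names rather than the tool's real code:

```typescript
// Where each assistant expects its context file, per the layout above.
// Type and function names are assumptions for illustration only.
type Tool = "claude" | "cursor" | "copilot";

const OUTPUT_PATHS: Record<Tool, string> = {
  claude: ".claude/CLAUDE.md",
  cursor: ".cursor/rules", // one .mdc file per detected technology
  copilot: ".github/copilot-instructions.md",
};

function outputsFor(tool: Tool | "all"): string[] {
  // "all" fans out to every target; a single tool gets just its own path.
  return tool === "all" ? Object.values(OUTPUT_PATHS) : [OUTPUT_PATHS[tool]];
}
```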
Open any of the files. You'll see structured guidance specific to your detected stack — routing conventions, where key files live, which patterns to use and avoid.
Step 4: Commit it (optional but recommended)
git add .claude/ .cursor/ .github/copilot-instructions.md
git commit -m "add AI context files via contextengine"
Now every contributor who opens the repo gets the same starting point with their AI tool of choice. When the stack evolves, re-run with --update to pull in fresh content without touching anything you've manually edited:
npx @strifero/contextengine --update
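A common way to regenerate content while preserving hand edits is to fence the generated portion with markers and rewrite only what sits between them. A sketch of that idea; the marker strings and function here are assumptions, not necessarily how ContextEngine implements it:

```typescript
// Hypothetical markers delimiting the machine-generated section of a file.
const BEGIN = "<!-- contextengine:begin -->";
const END = "<!-- contextengine:end -->";

// Replace only the marked generated section, leaving everything a human
// wrote outside the markers untouched.
function updateGenerated(existing: string, fresh: string): string {
  const start = existing.indexOf(BEGIN);
  const end = existing.indexOf(END);
  if (start === -1 || end === -1) {
    // No markers yet: append a marked section instead of overwriting.
    return `${existing}\n${BEGIN}\n${fresh}\n${END}\n`;
  }
  return (
    existing.slice(0, start + BEGIN.length) +
    `\n${fresh}\n` +
    existing.slice(end)
  );
}
```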
What this looks like in practice
Here's a real example. Before running ContextEngine, I asked Claude Code where to put a new API endpoint in a Next.js 14 project:
Me: Where should I add a new API endpoint for fetching user data?
Claude (before): You can create a new file in pages/api/users.ts and export a default handler function...
Wrong. App Router project. After running ContextEngine and letting Claude read the generated .claude/CLAUDE.md:
Me: Where should I add a new API endpoint for fetching user data?
Claude (after): Create a Route Handler at app/api/users/route.ts. Export a named GET function. Since this project uses Prisma, you can import your client from lib/prisma.ts — based on the schema I can see the User model has...
Same question. Completely different answer — and a correct one.
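For reference, the Route Handler shape Claude is describing looks roughly like this. The Prisma client is stubbed in-memory so the snippet runs on its own; in the real project you'd import the client from lib/prisma.ts instead:

```typescript
// Sketch of app/api/users/route.ts in an App Router project.
// Route Handlers export named HTTP methods (GET, POST, ...) rather than
// a default pages/api handler, and can return a web-standard Response.
// This in-memory stub stands in for the real Prisma client.
const prisma = {
  user: {
    async findMany() {
      return [{ id: 1, name: "Ada", email: "ada@example.com" }];
    },
  },
};

export async function GET(): Promise<Response> {
  const users = await prisma.user.findMany();
  return Response.json(users); // JSON body, 200 by default
}
```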
Pricing
ContextEngine is free. There's one tier: $0.
No account required, no usage limits, no telemetry. Runs entirely on your machine — no code or project data is sent anywhere.
If you're constantly re-explaining your stack at the start of every AI session, or you're onboarding to a codebase where nobody's written a CLAUDE.md, give it a try:
npx @strifero/contextengine
Takes about 30 seconds. Source is on GitHub: https://github.com/strifero/ContextEngine. I'm curious what stacks people are using it on; drop a comment if you run into anything it misses.