You open a chat with your AI assistant, paste 2,000 lines of legacy code, and type "refactor this." The model responds with a beautiful rewrite that breaks every integration point in the project.
The problem: you dumped context without framing it. The model doesn't know what matters, what's fragile, or what the implicit rules are.
The fix is what I call a Context Handshake — a structured onboarding document you feed the model before asking it to do anything.
## The Template
```markdown
# Context Handshake: [Project Name]

## What This Project Does (2 sentences)
An Express API that handles payment processing for our SaaS.
Talks to Stripe, PostgreSQL, and a legacy SOAP billing service.

## Tech Stack
- Node.js 18, Express 4, TypeScript
- PostgreSQL 15 via Prisma ORM
- Stripe SDK v14
- Legacy SOAP client (hand-rolled, do NOT touch)

## Architecture (3 bullet points)
- Routes → Controllers → Services → Repositories
- All DB access through Prisma (no raw SQL)
- SOAP client is in `lib/legacy/` — treat as black box

## The Sacred Cows (things you must not change)
1. `lib/legacy/soap-client.ts` — works, nobody understands it, don't touch
2. `middleware/auth.ts` — shared with 3 other services via npm package
3. Database migration files — append only, never edit existing
4. The `amount` field is in cents everywhere. EVERYWHERE.

## Naming Conventions
- Services: `XxxService` class with static methods
- Routes: `/api/v2/resource` (v2 prefix, plural nouns)
- Tests: `*.spec.ts` colocated with source files

## Recent Context
- Currently migrating from callbacks to async/await (60% done)
- PR #287 is in review — changes to `PaymentService`
- Known bug: timeout on large batch refunds (>50 items)
```
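The conventions in the template can be made concrete for the model. Here is a minimal sketch of what code following them looks like: a hypothetical `RefundService` (the class name and fee schedule are illustrative, not from a real codebase) using the `XxxService` static-method pattern, with every monetary value kept as integer cents.

```typescript
// Hypothetical example of the conventions above:
// `XxxService` class with static methods, amounts as integer cents.
class RefundService {
  // Example fee schedule (2.9% + 30¢), illustrative only.
  // Rounded to whole cents -- never converted to dollars.
  static calculateFeeCents(amountCents: number): number {
    return Math.round(amountCents * 0.029) + 30;
  }

  static netRefundCents(amountCents: number): number {
    return amountCents - RefundService.calculateFeeCents(amountCents);
  }
}
```

A snippet like this can even live in the handshake itself, under Naming Conventions, as a one-glance style reference.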
## Why 10 Minutes?
You write this once per project. It takes 10 minutes because you already know all of this — it's just not written down anywhere the AI can read.
After the first write, maintaining it takes 30 seconds per session: update "Recent Context" and go.
## How I Use It
Start of every session:
```
Here is the context handshake for this project. Read it, confirm you
understand the constraints, then I'll give you the task.
```
The model responds with a summary of what it understood. If it gets something wrong, correct it before you start working. That's the "handshake" — mutual confirmation.
Then the task:
```
Refactor `PaymentController.processRefund()` to use async/await.
Remember: amounts in cents, don't touch the SOAP client, follow the Service pattern.
```
Now the model has guardrails. It won't convert `amount` to dollars. It won't "helpfully" rewrite the SOAP client. It won't create a `refund_handler.ts` that breaks your naming convention.
## Building the Handshake From Scratch
Don't have 10 minutes? Ask the AI to help:
```
I'm going to paste the top-level directory listing and 2-3 key files
from a legacy project. Based on what you see, draft a Context Handshake
document with these sections:

- What This Project Does
- Tech Stack
- Architecture
- Sacred Cows (your best guesses — I'll correct)
- Naming Conventions

Here's the file tree:
[paste `tree -L 2` output]

Here's the main entry point:
[paste index.ts or app.ts]
```
The model will get about 70% of it right. You correct the rest. Still faster than writing from scratch.
## Scaling to Teams
Put the handshake file in the repo as `AI_CONTEXT.md` (or `.ai/context.md`). Now every developer on the team — and every AI tool they use — starts from the same foundation.
Update it in PRs that change architecture. Review it like you'd review a README change.
## What Changes
Before the handshake:
- Spend 5 minutes per session re-explaining the project
- Model makes wrong assumptions → you catch and correct → waste time
- Every teammate describes the project differently to their AI
After:
- 30-second copy-paste at session start
- Model respects constraints from the first interaction
- Consistent AI behavior across the team
Legacy codebases are hard enough. Don't make your AI assistant guess what the rules are — tell it upfront.