# AI Coding Tip 009 - Compact Your Context
Stop the memory rot
TL;DR: You can keep your AI sharp by forcing it to summarize and prune what it remembers (a.k.a. compacting).
## Common Mistake ❌
You keep a single, long conversation open for hours.
You feed the AI with every error log and every iteration of your code.
Eventually, the AI starts to ignore your early instructions or hallucinate nonexistent functions.
## Problems Addressed
- Context Decay: The AI loses track of your original goals in the middle of a long chat.
- Hallucinations: The model fills memory gaps with invented details or outdated logic.
- Token Waste: You pay for the AI to re-read useless error logs from three hours ago.
- Reduced Reasoning: A bloated context makes the AI less smart and more prone to simple mistakes.
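To make the token-waste point concrete, here is a back-of-envelope sketch. The per-line token count, log length, and turn count are illustrative assumptions, not measurements.

```python
# Rough cost of re-sending a stale 200-line log on every turn of a chat.
# ~15 tokens per log line and a 30-turn conversation are guesses for illustration.
log_tokens = 200 * 15        # one pasted log ≈ 3,000 tokens
turns = 30                   # the log is re-read on every single turn
wasted = log_tokens * turns
print(wasted)                # 90000 input tokens spent on one stale log
```

That is tokens you pay for on every request, and attention the model cannot spend on your actual task.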
## How to Do It 🛠️
- Restart often: You can start a new chat once you finish a sub-task.
- Request a State Summary: Before you close a conversation, ask the AI to summarize the current decisions and plan.
- Add Human Checkpoints: After the summary, confirm you are still on track.
- Use Markdown Docs: Keep a small `context.md` file with your current stack and rules.
- Prune the Logs: Paste only the relevant 5 lines of a stack trace instead of the whole 200-line output.
- Divide and conquer: Break large tasks into smaller ones, invoking their own skills with local tokens and a fresh context.
- Divide the responsibility: A general doesn't need to know what every soldier is doing on the battlefield.
- Create and Persist as a Skill: Once you have taught the AI something, refactor that knowledge and its business rules into a reusable skill.
- Keep an Eye on the Context Size: Most tools have visual indicators of the window consumption.
- Use Local Persistence: Some tools allow sharing memory among agents and their sub-agents.
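The "Prune the Logs" step can be scripted. Below is a minimal sketch that keeps only the stack-trace frames from your own code plus the final error line; the `myapp/` path marker and the sample traceback are hypothetical.

```python
# Hypothetical helper: trim a traceback before pasting it into the chat.
# "myapp/" is an assumed marker for your project's source paths.

def prune_traceback(log: str, project_marker: str = "myapp/") -> str:
    """Keep frames from our own code plus the final exception line."""
    lines = log.splitlines()
    kept = [line for line in lines if project_marker in line]
    kept.append(lines[-1])  # the actual error message
    return "\n".join(kept)

raw = """Traceback (most recent call last):
  File "/usr/lib/python3.11/runner.py", line 88, in run
  File "/home/dev/myapp/auth.py", line 12, in login
ValueError: unknown user"""

print(prune_traceback(raw))
```

The AI sees the two lines that matter instead of framework noise it will happily hallucinate around.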
## Benefits 🎯
- You get more accurate code suggestions.
- You avoid divergence between your instructions and the AI's output.
- You follow the AI's train of thought.
- You spend less time correcting the AI's hallucinations.
- The AI follows your project constraints more strictly and stays focused on your tasks.
## Context 🧠
Large Language Models have limited attention.
Long context windows are a trap.
Many modern models offer very large context windows.
In practice, they ignore much of that context, to your frustration.
Even with huge windows, models prioritize the beginning and the end of the prompt and lose track of the middle.
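Manual compaction exploits exactly this: keep your rules (the system prompt) and the most recent messages, and drop the stale middle first. Here is a minimal sketch under stated assumptions: a rough 4-characters-per-token estimate and a generic message-dict shape; neither comes from any specific API.

```python
# Sketch of context compaction: keep the system prompt (your rules)
# and the newest messages that fit a rough token budget.
# The 4-chars-per-token ratio is a common approximation, not exact.

def compact(messages, budget_tokens=1000):
    est = lambda m: len(m["content"]) // 4 + 1
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    used = sum(est(m) for m in system)
    kept = []
    for m in reversed(rest):  # walk from newest to oldest
        if used + est(m) > budget_tokens:
            break             # stale middle/early messages get dropped
        kept.append(m)
        used += est(m)
    return system + list(reversed(kept))
```

Old middle messages go first, which matches where models pay the least attention anyway.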
## Prompt Reference
Bad Prompt
Here is the 500-line log of my failed build.
Also, remember that we changed the database schema
three hours ago in this chat.
Add the unit tests as I described above.
Now, refactor the whole component.
Good Prompt
I am starting a new session. Here is the current state:
We use *PostgreSQL* with the `Users` table schema [ID, Email].
The `AuthService` interface is [login(), logout()].
Refactor the `LoginComponent` to use these.
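The good prompt above can even be generated from a small state dict you maintain between sessions. This is illustrative only; the field names and `build_session_prompt` helper are made up for the sketch.

```python
# Hypothetical state dict mirroring the "Good Prompt" above.
# Field names are invented for illustration.
STATE = {
    "stack": "PostgreSQL",
    "schema": "Users [ID, Email]",
    "interfaces": "AuthService [login(), logout()]",
    "task": "Refactor LoginComponent to use these.",
}

def build_session_prompt(state: dict) -> str:
    """Assemble a fresh-session prompt from the persisted state."""
    lines = ["I am starting a new session. Here is the current state:"]
    lines += [f"- {key}: {value}" for key, value in state.items()]
    return "\n".join(lines)

print(build_session_prompt(STATE))
```

Keeping this dict in a `context.md`-style file means every new session starts compact instead of inheriting hours of noise.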
## Considerations ⚠️
You must ensure you don't purge essential context.
If you prune too much, the AI might suggest libraries that conflict with your current setup.
Review the compacted information before you discard the original conversation.
## Type
[X] Semi-Automatic
## Limitations ⚠️
You can use this tip manually in any chat interface.
If you use advanced agents like Claude Code or Cursor, they might handle some of this automatically, but manual pruning is still more reliable.
## Tags 🏷️
- Context
## Level
[X] Intermediate
## Related Tips
AI Coding Tip 010 - Create Skill from Conversation
## Conclusion
You are the curator of the AI's memory.
If you let the context rot, the code will rot, too.
Keep it clean and compact. 🧹
## More Information ℹ️
Lost in the Middle: How Language Models Use Long Context
LLMLingua: Prompt Compression for LLMs
How to Manage Context in AI Coding
Prompt Engineering Guide: Context Management
Claude Context Window Best Practices
## Also Known As 🎭
- Context Pruning
- Token Management
- Prompt Compression
## Tools 🧰
- Claude Code
- Cursor
- Windsurf
## Disclaimer 📢
The views expressed here are my own.
I am a human who writes as best as possible for other humans.
I use AI proofreading tools to improve some texts.
I welcome constructive criticism and dialogue.
I shape these insights through 30 years in the software industry, 25 years of teaching, and writing over 500 articles and a book.
This article is part of the AI Coding Tip series.
