You're deep in a coding session. Your AI assistant was crushing it for the first hour, understanding your requirements, following your coding style, and implementing features cleanly. Then suddenly, it's like talking to a goldfish.
Every new request introduces bugs. It ignores the constraints you set at the beginning. You find yourself repeating the same instructions over and over, wondering: "Are you even listening to me?"
If this sounds familiar, you're not alone. And more importantly, you're not going crazy.
Tokenization
Here's what's actually happening behind the scenes. AI doesn't process text like humans do. Before it can understand your words, everything gets converted into tokens. Think of it like feeding dollar bills into an arcade token machine, except you're feeding in words instead of money.
For example:
- Input: "hello world"
- Output: 2 tokens with OpenAI's tokenizers ("hello" + " world"); other models may split it differently
Generally, one token equals about 3/4 of a word or 4 characters. Different models use different tokenization algorithms, which is why the same text might produce different token counts across providers.
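That rule of thumb is easy to turn into a quick estimator. The sketch below uses the ~4-characters-per-token heuristic only; it is a ballpark, not a real tokenizer (libraries like tiktoken give exact counts for OpenAI models):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4-characters-per-token rule of thumb."""
    # Real tokenizers split on subword units (BPE), so actual counts differ.
    # For "hello world" the heuristic says ~3; OpenAI's tokenizers say 2.
    return max(1, round(len(text) / 4))

print(estimate_tokens("hello world"))  # 11 chars -> 3 (estimate)
```

Good enough for budgeting a prompt, but use the model's own tokenizer when the exact count matters.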
If you're using LLMs through an API or a CLI tool, you're paying per token:
- GPT-4o: $3/million input tokens, $15/million output tokens
- GPT-4: $30/million input tokens, $60/million output tokens

(Prices change frequently, so check your provider's current pricing page.)
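The per-token math is simple: tokens divided by a million, times the per-million rate. A minimal sketch, using the prices above as illustrative placeholders (real rates change often):

```python
# Illustrative per-million-token prices in USD; check current provider pricing.
PRICES = {
    "gpt-4o": {"input": 3.00, "output": 15.00},
    "gpt-4":  {"input": 30.00, "output": 60.00},
}

def session_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """USD cost for one session: tokens / 1M * price per million tokens."""
    p = PRICES[model]
    return (input_tokens / 1_000_000) * p["input"] + \
           (output_tokens / 1_000_000) * p["output"]

# A long coding session: 500k tokens in, 100k tokens out.
print(f"${session_cost('gpt-4o', 500_000, 100_000):.2f}")  # $3.00
```

Run the same numbers against GPT-4 and the session costs $21.00 instead, which is why model choice matters for long conversations.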
But cost isn't the only concern.
Context Rot
Picture those tokens dropping onto a conveyor belt with fixed capacity. As you feed more words in, older tokens get pushed forward. When the belt fills up, tokens at the front fall off—and get forgotten.
That perfect prompt you crafted 20 minutes ago? Those crucial error messages you shared? If they've been pushed off the conveyor belt, they're gone from the AI's memory.
This is context rot, and it explains why your coding assistant seems to develop amnesia mid-conversation.
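The conveyor belt behaves like a fixed-size buffer that silently drops its oldest items. Here's a toy sketch where the window holds messages instead of tokens (real context windows are token-based, and each provider's truncation policy differs, so treat this as an analogy, not an implementation):

```python
from collections import deque

# A toy context window that holds at most 4 messages.
# Real windows are measured in tokens, not messages.
context = deque(maxlen=4)

for msg in ["system prompt", "your constraints", "error message",
            "request 1", "request 2", "request 3"]:
    context.append(msg)  # once full, the deque silently drops the oldest item

print(list(context))
# The system prompt and constraints have fallen off the front of the belt:
# ['error message', 'request 1', 'request 2', 'request 3']
```

Notice that nothing warns you when the early messages fall off; from the model's side, they simply never existed.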
What You Can Do About It
The good news? Once you understand what's happening, you can work with it instead of against it. I've compiled strategies that have saved my sanity during long coding sessions.
Coming up next: Practical techniques to manage context rot and keep your AI assistant focused throughout your entire development workflow.
This post is part of my "Learning Out Loud" series where I share developer insights from real coding experiences. You can also watch the video version on LinkedIn. Follow for more practical AI development tips.


