
Sreekar Reddy

Posted on • Originally published at sreekarreddy.com

🪟 Context Window Explained Like You're 5

How much an AI can remember at once

Day 55 of 149

👉 Full deep-dive with code examples


The Desk Size

You're studying with notes spread out.

Small desk → only a few pages visible at a time
Big desk → see everything at once!

Context window = AI's desk size!


What It Means

Context window = how many tokens AI can see at once

Your conversation:
[System prompt] + [Previous messages] + [Your question] + [Response]
                          ↑
            All of it needs to fit within the context window.

If it doesn't fit, old messages get "forgotten"!
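Here's a toy sketch of that budget in Python. The word-based token count and the 50-token window are made up for illustration; real models use subword tokenizers and much larger windows.

```python
# Toy sketch: does the whole conversation fit in the context window?
# Splitting on spaces is a rough stand-in for a real tokenizer.

def count_tokens(text: str) -> int:
    return len(text.split())

system_prompt = "You are a helpful assistant."
previous_messages = ["Hi there!", "Hello! How can I help?"]
question = "What is a context window?"

CONTEXT_WINDOW = 50  # hypothetical: pretend the model only sees 50 tokens

used = (count_tokens(system_prompt)
        + sum(count_tokens(m) for m in previous_messages)
        + count_tokens(question))

print(f"Tokens used: {used} / {CONTEXT_WINDOW}")
print("Fits!" if used <= CONTEXT_WINDOW else "Too long, something must be dropped")
```

Everything the model is supposed to "see" counts against the same budget, including the tokens it generates for its response.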


Size Matters

Different models support different context-window sizes.

Bigger window = the model can keep track of more of the conversation.


What Happens When Full?

Conversation too long?

  • Oldest messages drop off
  • AI "forgets" early context
  • Might repeat itself or lose track

That's why sometimes AI forgets what you said earlier!
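A common (though not the only) way chat apps handle this is to drop the oldest messages first. A minimal sketch, again with made-up word-count tokens:

```python
def count_tokens(text: str) -> int:
    # Rough stand-in for a real tokenizer.
    return len(text.split())

def trim_to_window(messages: list[str], window: int) -> list[str]:
    """Drop the oldest messages until the rest fits in the window."""
    kept = list(messages)
    while kept and sum(count_tokens(m) for m in kept) > window:
        kept.pop(0)  # the AI "forgets" the earliest message
    return kept

history = [
    "My name is Sam.",          # 4 tokens
    "I live in Austin.",        # 4 tokens
    "I like hiking and tea.",   # 5 tokens
    "What's my name?",          # 3 tokens
]

# With a 10-token window, the first two messages get dropped,
# so the model no longer knows the name.
print(trim_to_window(history, window=10))
```

This is exactly the "repeat itself or lose track" failure mode: the information still exists in your chat log, but it's off the model's desk.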


Workarounds

  • Summarize long context
  • Use RAG (retrieve only the relevant info instead of the whole history)
  • Choose models with bigger windows
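The summarization workaround can be sketched like this. In a real system the model itself writes the summary; here a placeholder string stands in, and the token counting is the same toy word count as above:

```python
def count_tokens(text: str) -> int:
    return len(text.split())

def compress_history(messages: list[str], window: int) -> list[str]:
    """Keep recent messages verbatim; fold older ones into one short summary."""
    if sum(count_tokens(m) for m in messages) <= window:
        return messages  # already fits, nothing to do
    # Hypothetical stub: a real system would ask the model to summarize.
    summary = "[Summary of earlier conversation]"
    recent = messages[-2:]  # keep the two newest messages as-is
    return [summary] + recent

long_chat = ["a b c", "d e f", "g h i", "j k"]  # 11 toy tokens
print(compress_history(long_chat, window=8))
```

The trade-off: a summary is lossy, so fine details from early in the chat can still be lost, just more gracefully than dropping messages outright.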

In One Sentence

Context window is the maximum amount of text an AI can consider at once; your entire conversation has to fit inside it.


🔗 Enjoying these? Follow for daily ELI5 explanations!

Making complex tech concepts simple, one day at a time.
