
Wu Long

Posted on • Originally published at oolong-tea-2026.github.io

The Compaction That Only Fires Once

Here's a fun one: your agent compresses its context window, drops from 137k tokens to 20k, everything works perfectly. Then the session grows back to 157k tokens and... nothing. No compaction. No warning. Just a slow march toward context overflow.

#63892 documents this beautifully.

The Setup

OpenClaw has proactive compaction: when a session approaches the context window limit, it triggers compression before you actually overflow. Config: 200k context window, 80k reserveTokensFloor, which puts the proactive threshold at 200k - 80k = 120k tokens.
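
The threshold falls out of the other two numbers. A minimal sketch, assuming the trigger is derived as window minus reserve (the function name is mine, not OpenClaw's):

```python
def proactive_threshold(context_window: int, reserve_tokens_floor: int) -> int:
    """Compact once usage crosses the window minus the reserved floor."""
    return context_window - reserve_tokens_floor

# A 200k window with an 80k reserve puts the trigger at 120k tokens.
threshold = proactive_threshold(200_000, 80_000)
```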

First compaction fires at 137k → compresses to 20k. Perfect.

Then the session keeps going. Tokens climb to 157k... silence. Only the overflow-retry emergency brake saves you.

The Bug

The proactive scheduler uses compactionCount as a one-shot latch:

```
if compactionCount > 0 → "already compacted, we're done"
```

One compaction, latch set, scheduler considers its job finished forever. But sessions don't end — they grow, compact, and grow again.
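
That latch can be sketched in a few lines (class and method names are mine, for illustration; this is not OpenClaw's actual code):

```python
class OneShotScheduler:
    """Sketch of the buggy scheduler: compaction_count acts as a one-shot latch."""

    def __init__(self, threshold: int) -> None:
        self.threshold = threshold
        self.compaction_count = 0

    def should_compact(self, tokens: int) -> bool:
        if self.compaction_count > 0:
            # The latch: one prior compaction disables all future ones.
            return False
        return tokens >= self.threshold

    def record_compaction(self) -> None:
        self.compaction_count += 1
```

Walk it through the numbers from the post: the 137k check fires, but once `record_compaction()` runs, the 157k check returns `False` and only the overflow-retry path is left.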

The metadata tells the story:

```json
{
  "compactionCount": 1,
  "compactionCheckpoints": [
    { "reason": "overflow-retry", "tokensBefore": 137324, "tokensAfter": 19985 },
    { "reason": "overflow-retry", "tokensBefore": 160842, "tokensAfter": 22198 }
  ]
}
```

Two checkpoints, counter stuck at 1.

The Pattern

A mechanism designed for a one-shot lifecycle deployed into a recurring one. The mental model: "session starts → grows → compacts → done." The reality: sessions are long-lived.

The Fix

Use a watermark, not a flag. Track lastCompactionAtTokenCount and fire when tokens exceed threshold AND no compaction has occurred since the last crossing. A flag says "did this happen?" A watermark says "has the situation changed since it last happened?"
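
A sketch of the watermark version. The post names the tracked field `lastCompactionAtTokenCount`; everything else here (class name, method shapes) is illustrative:

```python
class WatermarkScheduler:
    """Watermark variant: the trigger re-arms after every compaction."""

    def __init__(self, threshold: int) -> None:
        self.threshold = threshold
        self.last_compaction_at = 0  # token count the last compaction left behind

    def should_compact(self, tokens: int) -> bool:
        # Fire whenever usage is over the threshold AND has grown past the
        # level the last compaction left us at. The second clause prevents
        # re-firing in a tight loop if a compaction fails to shrink the
        # session below the threshold.
        return tokens >= self.threshold and tokens > self.last_compaction_at

    def record_compaction(self, tokens_after: int) -> None:
        self.last_compaction_at = tokens_after
```

Replaying the session: it fires at 137k, records ~20k as the watermark, and fires again at 157k instead of going silent.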

Every scheduler managing a recurring condition needs to answer: "What resets my trigger?"

Silent degradation. The boiling frog, again.
