Why This Blog Matters
Most cloud overruns don’t come from AI models or Kubernetes complexity.
They come from small inefficiencies in your .NET code — the kind of hidden multipliers that chew up CPU, memory, and dollars every day.
Microsoft has quietly shipped features that don't get the keynote spotlight but can decide whether your App Service scales down or your CFO calls about a $30K overage.
This post highlights one such overlooked gem: Frozen Collections in .NET 8.
The Hidden Hero: Frozen Collections
Dictionaries and hash sets are staples of .NET. But at scale, they carry a hidden tax:
- GC churn from repeated allocations
- CPU wasted re-hashing keys on every lookup
- App Service scaling triggered by inefficient memory pressure
Frozen Collections (Microsoft Docs) flip the model:
- Immutable → once built, they never change (thread-safe by default)
- Pre-hashed → expensive work is done once at build time
- Throughput wins → Microsoft benchmarks show FrozenDictionary achieves ~1.9 ns/lookup vs ~2.5 ns for Dictionary<TKey,TValue> (~25% faster steady-state)
- Memory efficiency → lower footprint, fewer GC pauses
👉 Think of it like password hashing: Dictionary re-hashes every login, while FrozenDictionary hashes once and reuses the result forever.
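To make that concrete, here is a minimal sketch of the pattern: build the frozen collection once, then serve reads from it. The keys and values are illustrative, not from the case study below.

```csharp
using System;
using System.Collections.Frozen;
using System.Collections.Generic;

// Source data, e.g. loaded from config at startup.
var source = new Dictionary<string, string>
{
    ["tenant-a"] = "partition-01",
    ["tenant-b"] = "partition-02",
};

// Pay the build cost once; the result is immutable and thread-safe.
FrozenDictionary<string, string> routes = source.ToFrozenDictionary();

// Steady-state lookups hit the optimized, pre-computed structure.
Console.WriteLine(routes["tenant-a"]);             // partition-01
Console.WriteLine(routes.ContainsKey("tenant-z")); // False
```

Because the instance never mutates after construction, it can be shared across threads without locks.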
Real-World Case: Cosmos DB Partition Routing
Workload Context:
- ~12K requests/sec
- ~10K partition keys (static)
- Routing service hosted on Azure App Service P1v3 (2 cores)
- Partition map rebuilt once at startup
Observed Behavior (Before):
With Dictionary<string,string>:
- CPU averaged 85%
- p99 latency ~180 ms
- Service scaled from 4 → 6 instances
- Cost impact: +2 × P1v3 = $560/month = $6,720/year
After Swap (FrozenDictionary):
using System.Collections.Frozen;

private static readonly FrozenDictionary<string, string> PartitionMap =
    routingList.ToFrozenDictionary(x => x.Key, x => x.Value);
Results:
- 💡 In this single service: $6,720/year saved.
- 💡 Across five routing-heavy services with similar lookup patterns: ~$33K/year saved.
Profiling Evidence (dotnet-counters)
Test setup: 30-minute steady load at 12K RPS using k6, 10K partition keys.
- % Time in GC: 12.3% → 5.8%
- Alloc Rate: 350 MB/s → 120 MB/s
- CPU Usage: 85% → 67%
👉 Lower % Time in GC + lower allocation rate = fewer pauses and more CPU available for real work.
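If you want to watch the same counters on your own service, the numbers above map to the built-in System.Runtime counters; a minimal invocation looks like this (the process id is a placeholder for your running app):

```shell
# One-time install of the diagnostics tool.
dotnet tool install --global dotnet-counters

# Attach to the running process and stream GC/CPU/allocation counters.
dotnet-counters monitor --process-id <pid> --counters System.Runtime
```

Look for "% Time in GC since last GC", "Allocation Rate", and "CPU Usage" in the output.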
Microsoft Benchmark Data (Build vs Lookup Trade-Off)
Source: .NET 8 performance benchmarks
- Startup: ~4× slower to build, but that cost is paid only once.
- Lookups: faster and cheaper for the lifetime of the collection.
For static sets of ≥10K entries, Frozen wins. For small or frequently mutating sets, stick with Dictionary.
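You can get a rough feel for the trade-off with a quick measurement like the sketch below. This is not a rigorous benchmark (use BenchmarkDotNet for real numbers); the key count and key shapes are illustrative.

```csharp
using System;
using System.Collections.Frozen;
using System.Diagnostics;
using System.Linq;

// 10K static entries, mirroring the partition-map scenario.
var source = Enumerable.Range(0, 10_000)
    .ToDictionary(i => $"key-{i}", i => $"value-{i}");

// One-time build cost: this is where FrozenDictionary is slower.
var sw = Stopwatch.StartNew();
var frozen = source.ToFrozenDictionary();
sw.Stop();
Console.WriteLine($"Build (one-time): {sw.Elapsed.TotalMilliseconds:F2} ms");

// Steady-state cost: 1M lookups against the frozen structure.
sw.Restart();
long hits = 0;
for (int i = 0; i < 1_000_000; i++)
    if (frozen.ContainsKey($"key-{i % 10_000}")) hits++;
sw.Stop();
Console.WriteLine($"{hits} lookups in {sw.Elapsed.TotalMilliseconds:F2} ms");
```

Run the same loop against the plain Dictionary to compare; the build gap shrinks to irrelevance once the collection lives for hours under load.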
Visuals (Conceptual)
App Service Scaling (Before vs After)
- Dictionary: 6 Instances at peak load
- FrozenDictionary: 4 Instances at peak load
GC Time Reduction
- From 12.3% → 5.8%
When to Use (and When Not To)
✅ Best suited for
- Configurations that don’t change after startup
- Routing tables (e.g., Cosmos DB partition maps)
- Feature flags / metadata lookups
- Multi-threaded services (immutability removes locking overhead)
❌ Avoid if
- Collections need frequent updates
- Dataset is tiny (startup overhead > benefit)
- Startup latency is critical (cold path scenarios)
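To make the ✅ cases concrete, here is a small sketch of a feature-flag lookup using FrozenSet. The flag names and the case-insensitive comparer are illustrative assumptions, not part of the case study.

```csharp
using System;
using System.Collections.Frozen;

// Flag list loaded once at startup (e.g. from config) and never mutated,
// so a frozen set is a natural fit.
string[] enabledFlags = { "new-checkout", "dark-mode" };
FrozenSet<string> flags =
    enabledFlags.ToFrozenSet(StringComparer.OrdinalIgnoreCase);

Console.WriteLine(flags.Contains("Dark-Mode"));   // True (case-insensitive)
Console.WriteLine(flags.Contains("beta-search")); // False
```

The same shape works for metadata lookups: any read-only set checked on a hot path.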
The Bigger Lesson
As engineers, we often jump to flashy fixes: caching layers, Redis tuning, Kubernetes autoscaling.
But sometimes, the highest ROI comes from the most boring one-liner in your codebase.
Frozen Collections don’t just clean up code. They cut real dollars off your bill.
⚡ Coming Next in This Series
Frozen Collections are just one of Microsoft’s “quiet million-savers.”
- Part 2 — LoggerMessage Source Generator: How precompiled logging cut one team’s App Insights bill by 30%
- Part 3 — Cosmos DB Analytical Store TTL: How a retailer saved $3,000/month with one config
Follow to catch the full series.
Takeaway
If you’re running high-throughput .NET apps in 2025, ask yourself:
- Am I paying CPU + GC tax for collections that never change?
- Do I need to scale out, or can I scale smarter?
Because sometimes the difference between burning money and saving it is just one line:
myList.ToFrozenDictionary(x => x.Id, x => x.Value);
It’s not glamorous — but it’s the kind of decision that compounds into thousands of dollars at scale.
Call to Action
At Microsoft, we obsess over deleting the right line of code.
This one deletion (of a Dictionary) saved us $6,720/year in one service, and ~$33K/year across the fleet.
👉 What’s the most expensive line of code you’ve ever deleted?
Drop it in the comments — I’ll feature the top 3 in Part 2 of this series.
In enterprise systems, efficiency isn’t optional — it compounds into millions at scale.
Frozen Collections are one of those tools you overlook until you see the bill. Don’t wait for that moment.