Java LLD: Mastering LRU and LFU Cache Design for Machine Coding
Designing a production-grade cache is the "Hello World" of Low-Level Design (LLD) interviews at Tier-1 companies like Amazon and Apple. It tests your ability to balance data structures, time complexity, and thread safety within a single, cohesive system.
I built javalld.com while prepping for senior roles — complete LLD problems with execution traces, not just theory.
The Mistake Most Candidates Make
- Using Suboptimal Structures: Relying on `ArrayList` or `PriorityQueue` for eviction, which results in $O(N)$ or $O(\log N)$ time complexity instead of the required $O(1)$.
- Ignoring Thread Safety: Writing a "dry" implementation that fails in a multi-threaded environment, or using global `synchronized` blocks that kill performance.
- Over-Engineering LRU: Implementing a manual doubly linked list for LRU when Java's `LinkedHashMap` already provides the foundation for an $O(1)$ solution.
The Right Approach
- Core Mental Model: Use a `HashMap` for $O(1)$ lookups and a doubly linked list (or frequency buckets) to track access order or frequency for $O(1)$ eviction.
- Key Entities: `CacheNode`, `DoublyLinkedList`, `FrequencyMap`, `ReentrantLock`.
- Why it beats the naive approach: It decouples the data storage from the eviction policy, ensuring that adding, removing, and updating entries never scales with the size of the cache.
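A minimal sketch of this mental model applied to LRU — the class and method names here (`LRUCache`, `unlink`, `addAtHead`) are my own illustration, not lifted from the site's solution:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: HashMap gives O(1) lookup; a doubly linked list with sentinel
// head/tail nodes gives O(1) reordering and eviction.
class LRUCache {
    private static class Node {
        int key, value;
        Node prev, next;
        Node(int key, int value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<Integer, Node> map = new HashMap<>();
    private final Node head = new Node(0, 0); // sentinel: most-recently-used end
    private final Node tail = new Node(0, 0); // sentinel: least-recently-used end

    LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public Integer get(int key) {
        Node node = map.get(key);
        if (node == null) return null;
        unlink(node);
        addAtHead(node); // mark as most recently used
        return node.value;
    }

    public void put(int key, int value) {
        Node node = map.get(key);
        if (node != null) {
            node.value = value;
            unlink(node);
            addAtHead(node);
            return;
        }
        if (map.size() == capacity) {
            Node lru = tail.prev; // evict the least recently used entry
            unlink(lru);
            map.remove(lru.key);
        }
        node = new Node(key, value);
        map.put(key, node);
        addAtHead(node);
    }

    private void unlink(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void addAtHead(Node node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }
}
```

Note how eviction touches only `tail.prev` — no scan, no heap, so every operation stays $O(1)$ regardless of cache size.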
The Key Insight (Code)
For LFU (Least Frequently Used), the secret is maintaining a map of "Frequency Buckets." When an item is accessed, it moves to the next bucket in $O(1)$.
```java
// Core LFU logic: moving a node to the next frequency bucket
private void updateFrequency(Node node) {
    int freq = node.frequency;
    freqMap.get(freq).remove(node); // O(1)
    if (freqMap.get(freq).isEmpty() && freq == minFrequency) {
        minFrequency++;
    }
    node.frequency++;
    freqMap.computeIfAbsent(node.frequency, k -> new DoublyLinkedList())
           .addAtHead(node); // O(1)
}
```
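To see that method in context, here is a self-contained LFU sketch built around it. The surrounding fields and helpers (`cache`, `DoublyLinkedList`, `removeTail`) are assumptions consistent with the snippet, not the site's exact code:

```java
import java.util.HashMap;
import java.util.Map;

class LFUCache {
    private static class Node {
        int key, value, frequency = 1;
        Node prev, next;
        Node(int key, int value) { this.key = key; this.value = value; }
    }

    // Per-frequency bucket: insertion order doubles as recency order,
    // so the tail of the lowest bucket is the eviction victim.
    private static class DoublyLinkedList {
        final Node head = new Node(0, 0), tail = new Node(0, 0);
        int size = 0;
        DoublyLinkedList() { head.next = tail; tail.prev = head; }
        void addAtHead(Node n) {
            n.next = head.next; n.prev = head;
            head.next.prev = n; head.next = n;
            size++;
        }
        void remove(Node n) {
            n.prev.next = n.next; n.next.prev = n.prev;
            size--;
        }
        boolean isEmpty() { return size == 0; }
        Node removeTail() { Node n = tail.prev; remove(n); return n; }
    }

    private final int capacity;
    private int minFrequency = 0;
    private final Map<Integer, Node> cache = new HashMap<>();
    private final Map<Integer, DoublyLinkedList> freqMap = new HashMap<>();

    LFUCache(int capacity) { this.capacity = capacity; }

    public Integer get(int key) {
        Node node = cache.get(key);
        if (node == null) return null;
        updateFrequency(node);
        return node.value;
    }

    public void put(int key, int value) {
        if (capacity == 0) return;
        Node node = cache.get(key);
        if (node != null) {
            node.value = value;
            updateFrequency(node);
            return;
        }
        if (cache.size() == capacity) {
            // Evict the oldest node in the lowest-frequency bucket.
            Node evicted = freqMap.get(minFrequency).removeTail();
            cache.remove(evicted.key);
        }
        node = new Node(key, value);
        cache.put(key, node);
        minFrequency = 1; // a brand-new entry always starts at frequency 1
        freqMap.computeIfAbsent(1, k -> new DoublyLinkedList()).addAtHead(node);
    }

    // The core bucket-promotion logic from the snippet above.
    private void updateFrequency(Node node) {
        int freq = node.frequency;
        freqMap.get(freq).remove(node); // O(1)
        if (freqMap.get(freq).isEmpty() && freq == minFrequency) {
            minFrequency++;
        }
        node.frequency++;
        freqMap.computeIfAbsent(node.frequency, k -> new DoublyLinkedList())
               .addAtHead(node); // O(1)
    }
}
```

The subtle part is `minFrequency`: it only ever increments inside `updateFrequency` and resets to 1 on every fresh insert, which is what keeps eviction $O(1)$ without scanning for the minimum.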
Key Takeaways
- LRU Shortcut: In Java, you can implement an LRU cache in minutes by extending `LinkedHashMap` with access order enabled and overriding `removeEldestEntry()`. For thread safety, guard it with a single `ReentrantLock`: with access order on, even `get()` mutates the map, so a read/write lock gains you nothing.
- LFU Complexity: LFU requires a `Map<Integer, DoublyLinkedList>` where the key is the frequency; this allows you to find the "least frequent" and, within that bucket, the "oldest" item simultaneously.
- Concurrency: Use `ReentrantLock` for fine-grained control, or a `Semaphore` if you need to limit the number of threads concurrently accessing the cache.
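A sketch of the `LinkedHashMap` shortcut (the wrapper class `BoundedLruCache` is my own name for illustration). The third constructor argument turns on access order, and `removeEldestEntry()` evicts automatically on insert:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.locks.ReentrantLock;

// LRU via LinkedHashMap: accessOrder=true keeps entries in access order,
// so the "eldest" entry is always the least recently used one.
class BoundedLruCache<K, V> {
    private final Map<K, V> map;
    private final ReentrantLock lock = new ReentrantLock();

    BoundedLruCache(int capacity) {
        // 0.75f is the default load factor; the third argument enables access order.
        this.map = new LinkedHashMap<>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity; // evict once we exceed capacity
            }
        };
    }

    public V get(K key) {
        lock.lock();
        try {
            return map.get(key); // get() reorders entries, so it needs the lock too
        } finally {
            lock.unlock();
        }
    }

    public void put(K key, V value) {
        lock.lock();
        try {
            map.put(key, value);
        } finally {
            lock.unlock();
        }
    }
}
```

This is dramatically less code than the manual list, which is exactly why interviewers appreciate candidates who know the trade-off: mention `LinkedHashMap` first, then build the manual version only if asked.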
Full working implementation with execution trace available at https://javalld.com/problems/cache-design