Machine coding Master

Posted on • Originally published at javalld.com

Java LLD: Mastering LRU and LFU Cache Design for Machine Coding

Designing a production-grade cache is the "Hello World" of Low-Level Design (LLD) interviews at Tier-1 companies like Amazon and Apple. It tests your ability to balance data structures, time complexity, and thread safety within a single, cohesive system.

I built javalld.com while prepping for senior roles — complete LLD problems with execution traces, not just theory.

The Mistake Most Candidates Make

  • Using Suboptimal Structures: Relying on ArrayList or PriorityQueue for eviction, which results in $O(N)$ or $O(\log N)$ time complexity instead of the required $O(1)$.
  • Ignoring Thread Safety: Writing a single-threaded implementation that fails under concurrent access, or wrapping everything in a coarse global synchronized block that kills performance.
  • Over-Engineering LRU: Implementing a manual Doubly Linked List for LRU when Java’s LinkedHashMap already provides the foundation for an $O(1)$ solution.

The Right Approach

  • Core Mental Model: Use a HashMap for $O(1)$ lookups and a DoublyLinkedList (or frequency buckets) to track access order or frequency for $O(1)$ eviction.
  • Key Entities: CacheNode, DoublyLinkedList, FrequencyMap, ReentrantLock.
  • Why it beats the naive approach: It decouples the data storage from the eviction policy, ensuring that adding, removing, and updating entries never scales with the size of the cache.
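To make that mental model concrete, here is a minimal single-threaded LRU sketch following the HashMap-plus-doubly-linked-list pattern described above. The class and field names (LRUCache, CacheNode, sentinel head/tail) are illustrative choices, not a standard API:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal single-threaded LRU sketch: HashMap for O(1) lookup,
// doubly linked list with sentinel head/tail for O(1) recency updates.
class LRUCache {
    private static class CacheNode {
        int key, value;
        CacheNode prev, next;
        CacheNode(int key, int value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<Integer, CacheNode> map = new HashMap<>();
    private final CacheNode head = new CacheNode(0, 0); // sentinel: most-recent end
    private final CacheNode tail = new CacheNode(0, 0); // sentinel: least-recent end

    LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public Integer get(int key) {
        CacheNode node = map.get(key);
        if (node == null) return null;
        moveToHead(node); // an access makes this entry the most recently used
        return node.value;
    }

    public void put(int key, int value) {
        CacheNode node = map.get(key);
        if (node != null) {
            node.value = value;
            moveToHead(node);
            return;
        }
        if (map.size() == capacity) { // evict the least recently used entry
            CacheNode lru = tail.prev;
            unlink(lru);
            map.remove(lru.key);
        }
        CacheNode fresh = new CacheNode(key, value);
        map.put(key, fresh);
        addAtHead(fresh);
    }

    private void unlink(CacheNode n) { n.prev.next = n.next; n.next.prev = n.prev; }

    private void addAtHead(CacheNode n) {
        n.next = head.next; n.prev = head;
        head.next.prev = n; head.next = n;
    }

    private void moveToHead(CacheNode n) { unlink(n); addAtHead(n); }
}
```

Note that every operation is a constant number of pointer updates and HashMap calls, which is exactly why this beats ArrayList or PriorityQueue-based eviction.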

The Key Insight (Code)

For LFU (Least Frequently Used), the secret is maintaining a map of "Frequency Buckets." When an item is accessed, it moves to the next bucket in $O(1)$.

// Core LFU logic: move a node from its current frequency bucket to the next one
private void updateFrequency(Node node) {
    int freq = node.frequency;
    freqMap.get(freq).remove(node); // O(1): unlink from its current bucket
    // If this node was the last entry in the minimum-frequency bucket,
    // the new minimum frequency is one higher.
    if (freqMap.get(freq).isEmpty() && freq == minFrequency) {
        minFrequency++;
    }
    node.frequency++;
    freqMap.computeIfAbsent(node.frequency, k -> new DoublyLinkedList())
           .addAtHead(node); // O(1): most recent within the new bucket
}
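The snippet above assumes a Node that records its own frequency and a DoublyLinkedList bucket with O(1) operations. A minimal sketch of those supporting types (field and method names are assumptions chosen to match the snippet, not a standard library API) could look like this:

```java
// Supporting types assumed by updateFrequency(): a Node carrying its own
// frequency, and a sentinel-based DoublyLinkedList bucket with O(1) ops.
class Node {
    int key, value;
    int frequency = 1; // new entries start in the frequency-1 bucket
    Node prev, next;
    Node(int key, int value) { this.key = key; this.value = value; }
}

class DoublyLinkedList {
    private final Node head = new Node(0, 0); // sentinel
    private final Node tail = new Node(0, 0); // sentinel
    private int size = 0;

    DoublyLinkedList() { head.next = tail; tail.prev = head; }

    void addAtHead(Node n) { // O(1): newest entry within this frequency bucket
        n.next = head.next; n.prev = head;
        head.next.prev = n; head.next = n;
        size++;
    }

    void remove(Node n) { // O(1): unlink a node we already hold a reference to
        n.prev.next = n.next; n.next.prev = n.prev;
        size--;
    }

    Node removeTail() { // O(1): oldest entry in this bucket, used for eviction
        if (size == 0) return null;
        Node n = tail.prev;
        remove(n);
        return n;
    }

    boolean isEmpty() { return size == 0; }
}
```

Because each bucket is ordered by insertion at the head, evicting from the tail of the minFrequency bucket removes the item that is both least frequent and least recently used, resolving ties the way the interviewer expects.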

Key Takeaways

  • LRU Shortcut: In Java, you can implement a thread-safe LRU cache in minutes by extending LinkedHashMap and overriding removeEldestEntry(), wrapped in a ReentrantReadWriteLock.
  • LFU Complexity: LFU requires a Map<Integer, DoublyLinkedList> where the key is the frequency; this allows you to find the "least frequent" and "oldest" item simultaneously.
  • Concurrency: Use ReentrantLock for fine-grained control or Semaphore if you need to limit the number of concurrent threads accessing the cache resources.
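The LinkedHashMap shortcut from the first takeaway can be sketched as follows. One caveat worth knowing: with accessOrder = true, even get() mutates the internal ordering, so a plain read lock is not sufficient; this sketch therefore takes the write lock on both paths (the class name SimpleLruCache is illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

// LRU via LinkedHashMap: accessOrder = true keeps entries in access order,
// and overriding removeEldestEntry() evicts automatically past capacity.
class SimpleLruCache<K, V> {
    private final Map<K, V> map;
    private final ReadWriteLock lock = new ReentrantReadWriteLock();

    SimpleLruCache(int capacity) {
        // The third constructor argument (accessOrder = true) is what makes this LRU.
        this.map = new LinkedHashMap<>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity;
            }
        };
    }

    public V get(K key) {
        // get() reorders entries when accessOrder is true, so it is a write.
        lock.writeLock().lock();
        try { return map.get(key); } finally { lock.writeLock().unlock(); }
    }

    public void put(K key, V value) {
        lock.writeLock().lock();
        try { map.put(key, value); } finally { lock.writeLock().unlock(); }
    }
}
```

This is the version to reach for when the interviewer asks for LRU in Java and time is short; switch to the hand-rolled doubly linked list only if they explicitly forbid LinkedHashMap.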

Full working implementation with execution trace available at https://javalld.com/problems/cache-design