LRU (Least Recently Used) Cache || With Implementation

Understanding LRU Cache: Efficient Data Management

What is LRU Cache?

LRU stands for Least Recently Used and is a cache replacement policy that evicts the least recently accessed item when the cache reaches its maximum capacity. This ensures that the most recently used data stays in the cache, providing faster access times for the data you are most likely to need again.

LRU caches store a limited number of items: when the cache is full and a new item needs to be added, the least recently accessed item is evicted to make room for the new one. Because every read and write must update this recency ordering, an efficient mechanism for maintaining it is crucial to the cache's performance.

How Does LRU Cache Work?

An LRU cache combines a doubly linked list and a hashmap to manage the data efficiently:

  • Doubly Linked List: This list tracks the items in the cache in the order they were accessed, with the most recently used item at the front and the least recently used at the back. The advantage of using a doubly linked list is that items can be quickly moved to the front or removed from the back in constant time.

  • Hashmap: The hashmap stores key-value pairs, enabling constant-time retrieval and update operations. It provides fast access to items based on their keys.

When an item is accessed, it is moved to the front of the list, marking it as the most recently used. If the cache is full and a new item needs to be added, the item at the end of the list (the least recently used) is evicted.
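To make this interplay concrete before building it by hand, here is a minimal sketch using the C++ standard library (the name SimpleLRU and its layout are my own illustration, not part of the implementation below): std::list plays the role of the doubly linked list, and the hashmap stores iterators into it so that every operation runs in constant time.

#include <list>
#include <unordered_map>

class SimpleLRU {
    size_t cap;
    std::list<std::pair<int, int>> items;  // (key, value); front = most recently used
    std::unordered_map<int, std::list<std::pair<int, int>>::iterator> pos;

public:
    explicit SimpleLRU(size_t capacity) : cap(capacity) {}

    int get(int key) {
        auto it = pos.find(key);
        if (it == pos.end()) return -1;
        items.splice(items.begin(), items, it->second);  // move to front in O(1)
        return it->second->second;
    }

    void put(int key, int value) {
        auto it = pos.find(key);
        if (it != pos.end()) {
            it->second->second = value;                      // update in place
            items.splice(items.begin(), items, it->second);  // mark as most recent
            return;
        }
        if (items.size() == cap) {  // full: evict the least recently used
            pos.erase(items.back().first);
            items.pop_back();
        }
        items.emplace_front(key, value);
        pos[key] = items.begin();
    }
};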

Problem Statement

We need to design a data structure that behaves like an LRU cache, adhering to the following operations:

  1. LRUCache(int capacity): Initializes the LRU cache with a positive size capacity.
  2. int get(int key): Returns the value of the key if it exists in the cache; otherwise, returns -1.
  3. void put(int key, int value): Updates the value of the key if it exists; otherwise, adds the key-value pair to the cache. If the cache exceeds its capacity, the least recently used item is evicted.

LeetCode Problem Link: LRU Cache
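As a quick illustration of these rules, here is a trace for a cache of capacity 2 (using the LRUCache class implemented below; the comments show the cache contents, most recently used first):

LRUCache cache(2);
cache.put(1, 1);  // {1=1}
cache.put(2, 2);  // {2=2, 1=1}
cache.get(1);     // returns 1      -> {1=1, 2=2}
cache.put(3, 3);  // evicts key 2   -> {3=3, 1=1}
cache.get(2);     // returns -1 (evicted)
cache.put(4, 4);  // evicts key 1   -> {4=4, 3=3}
cache.get(1);     // returns -1 (evicted)
cache.get(3);     // returns 3      -> {3=3, 4=4}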

Approach: Doubly Linked List + Hashmap

To implement the LRU cache efficiently, we use a doubly linked list and a hashmap. The doubly linked list allows for constant-time insertion and removal of nodes, while the hashmap provides fast access to the nodes. Let’s dive into the implementation.

Node Structure

The Node represents an entry in the doubly linked list:

class Node {
public:
    int data;    // value stored for this cache entry
    int key;     // cache key, kept here so eviction can find the hashmap entry
    Node* prev;
    Node* next;
    Node(int x, int y) {
        data = x;
        key = y;
        prev = nullptr;
        next = nullptr;
    }
};

Each node contains:

  • data: The value associated with the key.
  • key: The unique identifier for the cache entry.
  • prev and next: Pointers to the previous and next nodes in the list.

LRUCache Class

Here’s the class that implements the LRU Cache:

#include <unordered_map>
using namespace std;

class LRUCache {
public:
    unordered_map<int, pair<int, Node*>> mp;  // key -> {value, node in the list}
    int cap;
    Node* head;  // dummy node in front of the most recently used entry
    Node* tail;  // dummy node behind the least recently used entry

    LRUCache(int capacity) {
        cap = capacity;
        head = new Node(-1, -1);
        tail = new Node(-1, -1);
        head->next = tail;
        tail->prev = head;
    }

    int get(int key) {
        if (mp.find(key) == mp.end()) {
            return -1;  // Key not found in cache
        } else {
            int ans = mp[key].first;
            Node* temp = mp[key].second;
            remove(temp);                       // Unlink from current position
            mp[key].second = insert(ans, key);  // Re-insert at the front
            return ans;
        }
    }

    void put(int key, int value) {
        if (mp.find(key) == mp.end()) {
            if ((int)mp.size() < cap) {
                mp[key] = {value, insert(value, key)};  // Room left: just insert
            } else {
                int k = tail->prev->key;  // Key of the least recently used item
                remove(tail->prev);       // Evict its node
                mp.erase(k);              // Remove it from the hashmap
                mp[key] = {value, insert(value, key)};  // Insert the new item
            }
        } else {
            Node* temp = mp[key].second;
            remove(temp);  // Unlink the stale node
            mp[key] = {value, insert(value, key)};  // Re-insert with the new value
        }
    }

private:
    // Unlink a node from the list and free it.
    void remove(Node* temp) {
        temp->prev->next = temp->next;
        temp->next->prev = temp->prev;
        delete temp;
    }

    // Create a node holding (val, key), link it right after head
    // (the most recently used position), and return it.
    Node* insert(int val, int key) {
        Node* newnode = new Node(val, key);
        Node* temp = head->next;
        head->next = newnode;
        newnode->next = temp;
        newnode->prev = head;
        temp->prev = newnode;
        return newnode;
    }
};


Explanation of Methods

  • Constructor (LRUCache): Initializes the cache with a given capacity. The head and tail are dummy nodes that simplify the insertion and deletion operations.

  • get(int key): If the key exists in the cache, it returns the value and moves the key to the front of the list (indicating it was recently used). If the key is not found, it returns -1.

  • put(int key, int value): If the key is not already in the cache, it checks whether the cache has space. If so, it simply inserts the new key-value pair at the front. If the cache is full, it first evicts the least recently used item (the node just before the tail dummy) and then inserts the new pair. If the key already exists, it updates the value and moves the item to the front of the list.
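One detail the snippet above leaves out, as is common in LeetCode-style solutions, is cleanup: the two dummy nodes and any entries still in the list are never freed. If you reuse the class outside a judge environment, a destructor along these lines (a sketch against the class layout above) releases them:

~LRUCache() {
    Node* cur = head;
    while (cur != nullptr) {  // walk the chain, freeing dummies and live entries
        Node* nxt = cur->next;
        delete cur;
        cur = nxt;
    }
}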

Advantages of LRU Cache

  1. Improved Performance: Frequently accessed data is served from the fast cache instead of being fetched from slower backing storage, reducing retrieval times.
  2. Resource Efficiency: The eviction policy ensures that only relevant data stays in the cache, making efficient use of memory.
  3. Simplicity: The LRU caching principle is straightforward to implement and understand, making it ideal for many applications.

Applications of LRU Caches

LRU caches are used in a variety of systems:

  • Web Caching: Frequently accessed web pages and resources can be cached for faster retrieval, reducing load times.
  • Database Management: Caches query results or data blocks to minimize disk I/O operations.
  • Operating Systems: Page-replacement algorithms commonly use LRU or approximations of it to decide which memory pages to keep in RAM.
  • CPU Caching: Modern CPUs use multiple levels of cache (L1, L2, L3), typically with LRU-like replacement policies, to store frequently accessed instructions and data.

Conclusion

LRU Caches are a powerful tool for managing limited storage while ensuring that the most relevant data is readily accessible. Whether you are working on a web application, database system, or operating system, understanding and implementing an LRU cache can significantly improve the performance of your system.

Don’t forget to follow for more insightful blogs on data structures and algorithms.

