
Learn Data Structure and Algorithm in JavaScript | Part 14

Edison Pebojot · 26 min read


Prerequisite

If you're reading this article right now, please consider reading Part 01: Big-O Notation, Part 02: JavaScript Unique Parts, and Part 03: JavaScript Numbers through Part 13: Linked Lists first.

Part Table of Contents Description
01 Big-O Notation By the end of this chapter, you will understand how to analyze an implementation of an algorithm with respect to both time (execution time) and space (memory consumed).
02 JavaScript Unique Parts Big-O is important for analyzing and comparing the efficiencies of algorithms. The analysis of Big-O starts by looking at the code and applying rules to simplify the notation, because a plain linear or quadratic label is not always enough on its own.
03 JavaScript Numbers This part 3 will focus on JavaScript number operations, number representation, Number objects, common number algorithms, and random number generation.
04 JavaScript Strings This part 4 will focus on strings, JavaScript String object, and the String objectโ€™s built-in functions. You will learn how to access, compare, decompose, and search strings for commonly used real-life purposes. In addition, the chapter will explore string encoding, decoding, encryption, and decryption.
05 JavaScript Arrays As a JavaScript developer, you will use the array often; it is the most commonly used data structure. Arrays in JavaScript come with a lot of built-in methods. By the end of this part, you will understand arrays and choose the right method
06 JavaScript Object This part will focus on what JavaScript objects are, how they are declared, and how their properties can be changed. In addition, this part will cover how JavaScript classes are implemented using prototypal inheritance. Also this part will be short.
07 JavaScript Memory Management A variable takes up some memory. In C, the programmer allocates and deallocates memory manually. In contrast, modern JavaScript engines have garbage collectors that delete unused variables. However, there are unexpected pitfalls that developers can fall into. This part will show these pitfalls and present techniques to help the garbage collector minimize memory problems.
08 Recursion This part 8 introduces the concept of recursion and recursive algorithms(Remember they are different, we will discuss them later(๐Ÿ˜‰)). We will also discuss the definition of recursion and fundamental rules for recursive algorithms.
09 Sets This part focuses on the concepts of sets from both a mathematical definition and on the implementation. Also, Common set operations, as well as their implementations, are covered in great detail (๐Ÿ’ก).
10 Searching and Sorting This part focuses on searching and sorting arrays. By the end of this part, you will understand how to use sorting and searching algorithms for arrays. This part is a bit complicated for beginners, so lean on the visual aids as much as possible; they are your friends.
11 Hash Tables A hash table is a fixed-sized data structure in which the size is defined at the start. This part 11 explains how hash tables work, and the method of generating a unique key. By the end of this part 11, you will understand various hashing techniques and know how to implement a hash table. (๐Ÿ˜ƒ)
12 Stacks and Queues This part 12 covers stacks and queues(pronounce as kyooz (๐Ÿ”ˆ)) not (kwewe) okay? hehehe (๐Ÿ˜…); both are data structures used in the implementation of complex data structures. You'll learn what the stacks and queues are, how they're used, when they're used, and how to implement them (๐Ÿ˜ƒ) Let's go! (๐Ÿ”ฅ๐Ÿ”ฅ๐Ÿ”ฅ)
13 Linked Lists A linked list is a data structure in which each node (or element) points to another node. Unlike arrays, which have a fixed size, a linked list is a dynamic data structure. By the end of this part 13, you will understand how to implement and work with linked lists. And oh! (๐Ÿ˜ฎ) There are two types of linked lists: singly (โžก๏ธ) and doubly (โ†”๏ธ). Letโ€™s examine the singly linked list first.(๐Ÿ˜ƒ) Let's go! (๐Ÿ”ฅ๐Ÿ”ฅ๐Ÿ”ฅ)
14 Caching Caching is the process of storing data in temporary memory so that it can be easily retrieved for later use if it is required again. As an example, a database keeps data cached to avoid re-reading the hard drive, and a web browser caches web images to avoid re-downloading them. In this part, two caching techniques will be discussed: LFU and LRU caching.
15 Trees A general tree data structure is composed of nodes with children nodes. The top node is called the root node. This part 15 will explore different types of trees such as binary trees, binary search trees, and self-balancing binary search trees. First, this part 15 will cover what trees are and how they are structured. Then, it will cover methods of traversing(crossing or taking a zigzag path) the tree data structure in detail. Finally, you will learn about binary search trees and self-balancing binary search trees to understand how to store searchable data. (๐Ÿ˜ƒ)
16 Heaps A heap is an important data structure that returns the highest or lowest element in O(1) time. This part 16 will focus on explaining how heaps are implemented as well as how to work with them. One example is heap sort, which is a sorting algorithm based on heaps.
17 Graphs In this Part 17, you will learn graph basics, including fundamental terminology and graph types. The Part 17 will also cover working with these different graph types and methods of representing graphs in data structures. Finally, algorithms for traversing, searching, and sorting graphs are explored to solve problems such as finding the shortest path between two graph nodes. (๐Ÿ‘)
18 Advanced Strings Part 18 will cover more complex string algorithms than those covered in the previous parts. Now that you have learned about other data structures, these should be easier to comprehend. Specifically, Part 18 will focus on string searching algorithms.
19 Dynamic Programming Dynamic programming involves breaking down problems into their subproblems. By solving the subproblems and saving those results in memory so they can be reused whenever a repeated subproblem appears, the algorithmic complexity decreases significantly. To explain dynamic programming, we will re-examine the Fibonacci sequence that was discussed in Part 8. Then Part 19 will cover the rules of dynamic programming and walk you through some examples to make the concepts more concrete.
20 Bit Manipulation Bit manipulation is an advanced topic that JavaScript developers typically do not need to know. However, you should learn a bit about bit manipulation if you want to implement high-performance server-side code. Understanding bit manipulation requires some knowledge of digital logic. Any introductory course in discrete math or circuits would be helpful to understand these concepts.

Part 14: Caching


Caching is the process of storing data in temporary memory so that it can be easily retrieved for later use if it is required again. As an example, a database keeps data cached to avoid re-reading the hard drive, and a web browser caches web images to avoid re-downloading them. In this part, two caching techniques will be discussed: LFU (least frequently used) and LRU (least recently used) caching.

Reminder: I assume you have already read Part 13: Linked Lists (see the table of contents above), because I don't want to re-explain the same concepts again and again.

Understanding Caching


Cache design generally considers two factors: temporal (time) locality, meaning recently accessed items are likely to be accessed again soon, and spatial (space) locality, meaning items stored near recently accessed items are also likely to be accessed. The optimal (best) caching algorithm would replace the cache entry whose next use lies farthest in the future with the new element. That would require calculating, for every item, when it will next be accessed.

Note: It should be obvious that this is impossible to implement in practice, because it requires looking into the future.
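To make the idea concrete, below is a minimal sketch of this optimal policy (often called Bélády's algorithm), assuming the entire future access sequence were somehow known in advance. The function name evictOptimal and the futureAccesses array are hypothetical and purely for illustration; they are not part of the caches built in this part.

// Hypothetical sketch: evict the cached key whose next use is farthest in the future.
// It assumes the full future access sequence is known, which is exactly why the
// optimal policy cannot be implemented in practice.
function evictOptimal(cacheKeys, futureAccesses) {
    var victim = null;
    var farthest = -1;
    for (var i = 0; i < cacheKeys.length; i++) {
        var key = cacheKeys[i];
        var nextUse = futureAccesses.indexOf(key); // -1 means the key is never used again
        if (nextUse === -1) {
            return key; // never used again: the best possible victim
        }
        if (nextUse > farthest) {
            farthest = nextUse;
            victim = key;
        }
    }
    return victim;
}

// Example: the cache holds keys 1, 2, 3 and the future accesses are 3, 1, 3, 1
console.log(evictOptimal([1, 2, 3], [3, 1, 3, 1])); // Prints "2" (never accessed again)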

Least Frequently Used Caching



Least frequently used (LFU) caching is an algorithm used by operating systems to manage memory. The system tracks the number of times a block is referenced in memory. The easiest implementation of an LFU cache assigns a counter to every block loaded into the cache and increments that counter every time a reference is made. When the cache exceeds its limit, the system searches for the block with the lowest counter and removes it from the cache.

However, LFU is not ideal when an item is referenced repeatedly for only a short period of time: its inflated counter keeps it in the cache and forces the system to evict other blocks that may actually be used more going forward. Because of this, pure LFU is uncommon, but some hybrid systems use the core LFU concept. One example is a mobile keyboard app: suggested words are a good fit for LFU caching, since the user likely uses the same words often (frequently).
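Before building the O(1) version below, here is a minimal sketch of the counter-per-block idea described above, using plain Maps and a linear scan to find the lowest counter on eviction. The SimpleLFU name and its shape are hypothetical, for illustration only; the article's real implementation follows.

// Minimal counter-based LFU sketch (O(n) eviction), for illustration only
function SimpleLFU(capacity) {
    this.capacity = capacity;
    this.values = new Map(); // key -> value
    this.counts = new Map(); // key -> reference counter
}

// Return the value and count the reference
SimpleLFU.prototype.get = function (key) {
    if (!this.values.has(key)) return null;
    this.counts.set(key, this.counts.get(key) + 1);
    return this.values.get(key);
};

// Insert a value, evicting the least frequently used key if the cache is full
SimpleLFU.prototype.set = function (key, value) {
    if (!this.values.has(key) && this.values.size >= this.capacity) {
        var victim = null, lowest = Infinity;
        this.counts.forEach(function (count, k) { // linear scan for the lowest counter
            if (count < lowest) { lowest = count; victim = k; }
        });
        this.values.delete(victim);
        this.counts.delete(victim);
    }
    this.values.set(key, value);
    this.counts.set(key, (this.counts.get(key) || 0) + 1);
};

// Usage
var simpleLFU = new SimpleLFU(2);
simpleLFU.set('a', 1);
simpleLFU.set('b', 2);
simpleLFU.get('a');              // 'a' now has the higher counter
simpleLFU.set('c', 3);           // evicts 'b' (lowest counter)
console.log(simpleLFU.get('b')); // Prints "null"
console.log(simpleLFU.get('a')); // Prints "1"

Notice that eviction requires scanning every counter, which is O(n). The doubly-linked-list design below is what brings eviction down to O(1).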

The LFU cache uses a doubly linked list to remove elements in O(1) time. The doubly linked list node for the LFU also has a freqCount property, which represents how many times it has been accessed since it was first inserted. Let's see an example below:

Example:

// LFU Node in Doubly Linked Lists
function LFUNode(key, value) {
    this.prev = null;
    this.next = null;
    this.key = key;
    this.data = value;
    this.freqCount = 1;
}

The LFU cache has two hash tables: keys and freq. freq has frequencies as keys (1 to n, where n is the top frequency), and each of its values is an instance of a doubly linked list class. keys stores each doubly linked list node for O(1) retrieval. The classes for the doubly linked list and the LFU cache are defined here:

Example:

// LFU Node in Doubly Linked Lists
function LFUNode(key, value) {
    this.prev = null;
    this.next = null;
    this.key = key;
    this.data = value;
    this.freqCount = 1;
}

// LFU Doubly Linked Lists
function LFUDoublyLinkedList() {
    this.head = new LFUNode('buffer head', null);
    this.tail = new LFUNode('buffer tail', null);
    this.head.next = this.tail;
    this.tail.prev = this.head;
    this.size = 0;
}

// LFU Cache
function LFUCache(capacity) {
    this.keys = {}; // stores LFUNode
    this.freq = {}; // stores LFUDoublyLinkedList
    this.capacity = capacity;
    this.minFreq = 0;
    this.size = 0;
}

The LFUDoublyLinkedList class also requires implementations for insertion and removal. However, only insertion at the head and removal at the tail are needed. The implementation is the same as that of the doubly linked list class shown in Part 13 (Linked Lists):

Example:

// LFU Node in Doubly Linked Lists
function LFUNode(key, value) {
    this.prev = null;
    this.next = null;
    this.key = key;
    this.data = value;
    this.freqCount = 1;
}

// LFU Doubly Linked Lists
function LFUDoublyLinkedList() {
    this.head = new LFUNode('buffer head', null);
    this.tail = new LFUNode('buffer tail', null);
    this.head.next = this.tail;
    this.tail.prev = this.head;
    this.size = 0;
}

// LFU Cache
function LFUCache(capacity) {
    this.keys = {}; // stores LFUNode
    this.freq = {}; // stores LFUDoublyLinkedList
    this.capacity = capacity;
    this.minFreq = 0;
    this.size = 0;
}

// Insert at Head
LFUDoublyLinkedList.prototype.insertAtHead = function (node) {
    node.next = this.head.next;
    this.head.next.prev = node;
    this.head.next = node;
    node.prev = this.head;
    this.size++;
}

// Remove at Tail
LFUDoublyLinkedList.prototype.removeAtTail = function () {
    var oldTail = this.tail.prev;
    var prev = this.tail.prev;
    prev.prev.next = this.tail;
    this.tail.prev = prev.prev;
    this.size--;
    return oldTail;
}

// Remove Node
LFUDoublyLinkedList.prototype.removeNode = function (node) {
    node.prev.next = node.next;
    node.next.prev = node.prev;
    this.size--;
}

Implementing set for the LFU cache has two cases: inserting a new item and replacing an old item. If the cache is not full, the new node can be inserted into freq's doubly linked list of frequency 1. If the cache is at capacity, the tail item in the doubly linked list of the minimum frequency is deleted first, and then the new node is inserted.

Note: If the element already exists, its node is moved to the head of the list for its new frequency. Finally, the minimum frequency variable, minFreq, is updated accordingly so the cache knows which item should be evicted (removed) in the future.

Example:

// LFU Node in Doubly Linked Lists
function LFUNode(key, value) {
    this.prev = null;
    this.next = null;
    this.key = key;
    this.data = value;
    this.freqCount = 1;
}

// LFU Doubly Linked Lists
function LFUDoublyLinkedList() {
    this.head = new LFUNode('buffer head', null);
    this.tail = new LFUNode('buffer tail', null);
    this.head.next = this.tail;
    this.tail.prev = this.head;
    this.size = 0;
}

// LFU Cache
function LFUCache(capacity) {
    this.keys = {}; // stores LFUNode
    this.freq = {}; // stores LFUDoublyLinkedList
    this.capacity = capacity;
    this.minFreq = 0;
    this.size = 0;
}

// Insert at Head
LFUDoublyLinkedList.prototype.insertAtHead = function (node) {
    node.next = this.head.next;
    this.head.next.prev = node;
    this.head.next = node;
    node.prev = this.head;
    this.size++;
}

// Remove at Tail
LFUDoublyLinkedList.prototype.removeAtTail = function () {
    var oldTail = this.tail.prev;
    var prev = this.tail.prev;
    prev.prev.next = this.tail;
    this.tail.prev = prev.prev;
    this.size--;
    return oldTail;
}

// Remove Node
LFUDoublyLinkedList.prototype.removeNode = function (node) {
    node.prev.next = node.next;
    node.next.prev = node.prev;
    this.size--;
}

// Set
LFUCache.prototype.set = function (key, value) {
    var node = this.keys[key]

    if (node == undefined) {
        node = new LFUNode(key, value);

        this.keys[key] = node;

        if (this.size != this.capacity) {
            // Insert new without deleting old
            if (this.freq[1] === undefined) {
                this.freq[1] = new LFUDoublyLinkedList();
            }
            this.freq[1].insertAtHead(node);
            this.size++;
        } else {
            // Delete old and insert new
            var oldTail = this.freq[this.minFreq].removeAtTail();
            delete this.keys[oldTail.key];

            if (this.freq[1] === undefined) {
                this.freq[1] = new LFUDoublyLinkedList();
            }

            this.freq[1].insertAtHead(node);
        }
        this.minFreq = 1;
    } else {
        var oldFreqCount = node.freqCount;
        node.data = value;
        node.freqCount++;


        this.freq[oldFreqCount].removeNode(node);

        if(this.freq[node.freqCount] === undefined) {
            this.freq[node.freqCount] = new LFUDoublyLinkedList();
        }

        this.freq[node.freqCount].insertAtHead(node);

        // If the old frequency list is now empty and it was the minimum, bump minFreq
        if (oldFreqCount == this.minFreq && this.freq[oldFreqCount].size == 0) {
            this.minFreq++;
        }
    }
}

To implement get, the cache needs to return existing nodes in O(1) time and increment the access counter. If the element does not exist in the cache, null is returned. Otherwise, the frequency of the element is increased, the item is brought to the head of the doubly linked list for its new frequency, and the minimum frequency variable, minFreq, is adjusted accordingly.

Example:

// LFU Node in Doubly Linked Lists
function LFUNode(key, value) {
    this.prev = null;
    this.next = null;
    this.key = key;
    this.data = value;
    this.freqCount = 1;
}

// LFU Doubly Linked Lists
function LFUDoublyLinkedList() {
    this.head = new LFUNode('buffer head', null);
    this.tail = new LFUNode('buffer tail', null);
    this.head.next = this.tail;
    this.tail.prev = this.head;
    this.size = 0;
}

// LFU Cache
function LFUCache(capacity) {
    this.keys = {}; // stores LFUNode
    this.freq = {}; // stores LFUDoublyLinkedList
    this.capacity = capacity;
    this.minFreq = 0;
    this.size = 0;
}

// Insert at Head
LFUDoublyLinkedList.prototype.insertAtHead = function (node) {
    node.next = this.head.next;
    this.head.next.prev = node;
    this.head.next = node;
    node.prev = this.head;
    this.size++;
}

// Remove at Tail
LFUDoublyLinkedList.prototype.removeAtTail = function () {
    var oldTail = this.tail.prev;
    var prev = this.tail.prev;
    prev.prev.next = this.tail;
    this.tail.prev = prev.prev;
    this.size--;
    return oldTail;
}

// Remove Node
LFUDoublyLinkedList.prototype.removeNode = function (node) {
    node.prev.next = node.next;
    node.next.prev = node.prev;
    this.size--;
}

// Set
LFUCache.prototype.set = function (key, value) {
    var node = this.keys[key]

    if (node == undefined) {
        node = new LFUNode(key, value);

        this.keys[key] = node;

        if (this.size != this.capacity) {
            // Insert new without deleting old
            if (this.freq[1] === undefined) {
                this.freq[1] = new LFUDoublyLinkedList();
            }
            this.freq[1].insertAtHead(node);
            this.size++;
        } else {
            // Delete old and insert new
            var oldTail = this.freq[this.minFreq].removeAtTail();
            delete this.keys[oldTail.key];

            if (this.freq[1] === undefined) {
                this.freq[1] = new LFUDoublyLinkedList();
            }

            this.freq[1].insertAtHead(node);
        }
        this.minFreq = 1;
    } else {
        var oldFreqCount = node.freqCount;
        node.data = value;
        node.freqCount++;


        this.freq[oldFreqCount].removeNode(node);

        if (this.freq[node.freqCount] === undefined) {
            this.freq[node.freqCount] = new LFUDoublyLinkedList();
        }

        this.freq[node.freqCount].insertAtHead(node);

        // If the old frequency list is now empty and it was the minimum, bump minFreq
        if (oldFreqCount == this.minFreq && this.freq[oldFreqCount].size == 0) {
            this.minFreq++;
        }
    }
}

// Get
LFUCache.prototype.get = function (key) {
    var node = this.keys[key];

    if (node == undefined) {
        return null;
    } else {
        var oldFreqCount = node.freqCount;
        node.freqCount++;

        this.freq[oldFreqCount].removeNode(node);

        if (this.freq[node.freqCount] === undefined) {
            this.freq[node.freqCount] = new LFUDoublyLinkedList();
        }

        this.freq[node.freqCount].insertAtHead(node);

        // If the old frequency list is now empty and it was the minimum, bump minFreq
        if (oldFreqCount == this.minFreq && this.freq[oldFreqCount].size == 0) {
            this.minFreq++;
        }

        return node.data;
    }
}

With all the functions defined above, the following code shows an example of using the LFU cache:

Example:

// LFU Node in Doubly Linked Lists
function LFUNode(key, value) {
    this.prev = null;
    this.next = null;
    this.key = key;
    this.data = value;
    this.freqCount = 1;
}

// LFU Doubly Linked Lists
function LFUDoublyLinkedList() {
    this.head = new LFUNode('buffer head', null);
    this.tail = new LFUNode('buffer tail', null);
    this.head.next = this.tail;
    this.tail.prev = this.head;
    this.size = 0;
}

// LFU Cache
function LFUCache(capacity) {
    this.keys = {}; // stores LFUNode
    this.freq = {}; // stores LFUDoublyLinkedList
    this.capacity = capacity;
    this.minFreq = 0;
    this.size = 0;
}

// Insert at Head
LFUDoublyLinkedList.prototype.insertAtHead = function (node) {
    node.next = this.head.next;
    this.head.next.prev = node;
    this.head.next = node;
    node.prev = this.head;
    this.size++;
}

// Remove at Tail
LFUDoublyLinkedList.prototype.removeAtTail = function () {
    var oldTail = this.tail.prev;
    var prev = this.tail.prev;
    prev.prev.next = this.tail;
    this.tail.prev = prev.prev;
    this.size--;
    return oldTail;
}

// Remove Node
LFUDoublyLinkedList.prototype.removeNode = function (node) {
    node.prev.next = node.next;
    node.next.prev = node.prev;
    this.size--;
}

// Set
LFUCache.prototype.set = function (key, value) {
    var node = this.keys[key];

    if (node == undefined) {
        node = new LFUNode(key, value);

        this.keys[key] = node;

        if (this.size != this.capacity) {
            // Insert without deleting
            if (this.freq[1] === undefined) {
                this.freq[1] = new LFUDoublyLinkedList();
            }
            this.freq[1].insertAtHead(node);
            this.size++;
        } else {
            // Delete and insert
            var oldTail = this.freq[this.minFreq].removeAtTail();
            delete this.keys[oldTail.key];

            if (this.freq[1] === undefined) {
                this.freq[1] = new LFUDoublyLinkedList();
            }

            this.freq[1].insertAtHead(node);
        }
        this.minFreq = 1;
    } else {
        var oldFreqCount = node.freqCount;
        node.data = value;
        node.freqCount++;


        this.freq[oldFreqCount].removeNode(node);

        if (this.freq[node.freqCount] === undefined) {
            this.freq[node.freqCount] = new LFUDoublyLinkedList();
        }

        this.freq[node.freqCount].insertAtHead(node);

        // If the old frequency list is now empty and it was the minimum, bump minFreq
        if (oldFreqCount == this.minFreq && this.freq[oldFreqCount].size == 0) {
            this.minFreq++;
        }
    }
}

// Get
LFUCache.prototype.get = function (key) {
    var node = this.keys[key];

    if (node == undefined) {
        return null;
    } else {
        var oldFreqCount = node.freqCount;
        node.freqCount++;

        this.freq[oldFreqCount].removeNode(node);

        if (this.freq[node.freqCount] === undefined) {
            this.freq[node.freqCount] = new LFUDoublyLinkedList();
        }

        this.freq[node.freqCount].insertAtHead(node);

        // If the old frequency list is now empty and it was the minimum, bump minFreq
        if (oldFreqCount == this.minFreq && this.freq[oldFreqCount].size == 0) {
            this.minFreq++;
        }

        return node.data;
    }
}

// Log 1
LFUCache.prototype.log1 = function () {
    var key = Object.keys(this.freq);
    var freq = this.freq["1"];
    var a = freq.head.next.data;
    var b = freq.head.next.next.data;
    var c = freq.head.next.next.next.data;
    var d = freq.head.next.next.next.next.data;
    var e = freq.head.next.next.next.next.next.data;
    return "State1: " + key + ": " + a + " <-> " + b + " <-> " + c + " <-> " + d + " <-> " + e;
}

// Log 2
LFUCache.prototype.log2 = function () {
    var key = Object.keys(this.freq)[0];
    var freq1 = this.freq["1"];
    var freq2 = this.freq["4"];
    var a = freq1.head.next.data;
    var b = freq1.head.next.next.data;
    var c = freq1.head.next.next.next.data;
    var d = freq1.head.next.next.next.next.data;
    var frequency = freq2.head.next.freqCount;
    var value = freq2.head.next.data;

    return "State2: " + key + ": " + a + " <-> " + b + " <-> " + c + " <-> " + d + ", " + frequency + ": " + value;
}

// Log 3
LFUCache.prototype.log3 = function () {
    var key = Object.keys(this.freq)[0];
    var freq1 = this.freq["1"];
    var freq2 = this.freq["4"];
    var a = freq1.head.next.data;
    var b = freq1.head.next.next.data;
    var c = freq1.head.next.next.next.data;
    var d = freq1.head.next.next.next.next.data;
    var frequency = freq2.head.next.freqCount;
    var value = freq2.head.next.data;

    return "State3: " + key + ": " + a + " <-> " + b + " <-> " + c + " <-> " + d + ", " + frequency + ": " + value;
}

// Log 4
LFUCache.prototype.log4 = function () {
    var key = Object.keys(this.freq)[0];
    var freq1 = this.freq["1"];
    var freq2 = this.freq["2"];
    var freq3 = this.freq["4"];
    var a = freq1.head.next.data;
    var b = freq1.head.next.next.data;
    var c = freq1.head.next.next.next.data;
    var frequency2 = freq2.head.next.freqCount;
    var node2 = freq2.head.next.data;
    var frequency4 = freq3.head.next.freqCount;
    var node4 = freq3.head.next.data;

    return "State4: " + key + ": " + a + " <-> " + b + " <-> " + c + ", " + frequency4 + ": " + node4 + ", " + frequency2 + ": " + node2;
}

// Instance(example) of a LFUCache class
var myLFU = new LFUCache(5);

// Set
myLFU.set(1, 1); // State of myLFU.freq: {1: 1}
myLFU.set(2, 2); // State of myLFU.freq: {1: 2 <-> 1}
myLFU.set(3, 3); // State of myLFU.freq: {1: 3 <-> 2 <-> 1}
myLFU.set(4, 4); // State of myLFU.freq: {1: 4 <-> 3 <-> 2 <-> 1}
myLFU.set(5, 5); // State of myLFU.freq: {1: 5 <-> 4 <-> 3 <-> 2 <-> 1}

// Result 1
console.log(myLFU.log1()); // Prints "{1: 5 <-> 4 <-> 3 <-> 2 <-> 1}"

// Get
myLFU.get(1); // Returns "1" |  State of myLFU.freq: {1: 5 <-> 4 <-> 3 <-> 2, 2: 1}
myLFU.get(1); // Returns "1" |  State of myLFU.freq: {1: 5 <-> 4 <-> 3 <-> 2, 3: 1}
myLFU.get(1); // Returns "1" |  State of myLFU.freq: {1: 5 <-> 4 <-> 3 <-> 2, 4: 1}

// Result 2
console.log(myLFU.log2()); // Prints "{1: 5 <-> 4 <-> 3 <-> 2, 4: 1}"

// Set
myLFU.set(6, 6); // State of myLFU.freq: {1: 6 <-> 5 <-> 4 <-> 3, 4: 1}

// Result 3
console.log(myLFU.log3()); // Prints "{1: 6 <-> 5 <-> 4 <-> 3, 4: 1}"

// Get
myLFU.get(6); // Returns "6" |  State of myLFU.freq: {1: 5 <-> 4 <-> 3, 4: 1, 2: 6}

// Result 4
console.log(myLFU.log4()); // Prints "{1: 5 <-> 4 <-> 3, 4: 1, 2: 6}"

// Note: The log functions are optional. You can customize them to produce any output you want


Least Recently Used Caching



Least recently used (LRU) caching is a caching algorithm that evicts the oldest (least recently used) item first. When an item in the cache is accessed, it moves to the back (newest position) of the list. When the cache is full and a new item needs to be stored, the item at the front (oldest position) is removed, and the new item is placed at the back.
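As a quick aside, this "move to the back on access, evict from the front" behavior can be sketched with JavaScript's insertion-ordered Map. The SimpleLRU name below is hypothetical and only illustrates the ordering idea; the rest of this part builds the doubly-linked-list version so the mechanics are explicit.

// Illustrative LRU sketch using a Map (Maps iterate in insertion order)
function SimpleLRU(capacity) {
    this.capacity = capacity;
    this.map = new Map();
}

// Return the value and move the key to the back (newest position)
SimpleLRU.prototype.get = function (key) {
    if (!this.map.has(key)) return null;
    var value = this.map.get(key);
    this.map.delete(key); // re-inserting moves the key to the back
    this.map.set(key, value);
    return value;
};

// Insert a value, evicting the front (oldest) key when over capacity
SimpleLRU.prototype.set = function (key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
        var oldestKey = this.map.keys().next().value; // front of the Map = oldest
        this.map.delete(oldestKey);
    }
};

// Usage
var simpleLRU = new SimpleLRU(2);
simpleLRU.set(1, 1);
simpleLRU.set(2, 2);
simpleLRU.get(1);              // key 1 becomes the newest
simpleLRU.set(3, 3);           // evicts key 2 (the oldest)
console.log(simpleLRU.get(2)); // Prints "null"
console.log(simpleLRU.get(1)); // Prints "1"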

The implementation of this algorithm requires keeping track of which node was used when. To accomplish this, the LRU cache is implemented using a doubly linked list and hash table.

A doubly linked list is used to keep track of the order of use: the oldest node sits right after the head, and the newest node sits right before the tail. Each time new data is inserted it is added at the back, and once the capacity is exceeded the oldest data (right after the head) is evicted (removed). Let's see an example below:



Figure 14-1. LRU cache

Figure 14-1 shows a diagram of an LRU cache with a size of 5. To implement the LRU cache, the node is defined similarly to the doubly linked list node in Part 13, as shown in the following code block:

// Doubly Linked List Node
function DLLNode(key, data) {
    this.key = key;
    this.data = data;
    this.next = null;
    this.prev = null;
}

The LRU cache can be initialized by passing the capacity parameter. capacity defines how many nodes are allowed to be in the cache. Let's see an example below.

Example:

// Doubly Linked List Node
function DLLNode(key, data) {
    this.key = key;
    this.data = data;
    this.next = null;
    this.prev = null;
}

// LRU Cache
function LRUCache(capacity) {
    this.keys = {};
    this.capacity = capacity;
    this.head = new DLLNode("", null);
    this.tail = new DLLNode("", null);
    this.head.next = this.tail;
    this.tail.prev = this.head;
}

Since the LRU cache uses a doubly linked list, two functions for removing a node and adding a node to the tail will be defined here:

// Doubly Linked List Node
function DLLNode(key, data) {
    this.key = key;
    this.data = data;
    this.next = null;
    this.prev = null;
}

// LRU Cache
function LRUCache(capacity) {
    this.keys = {};
    this.capacity = capacity;
    this.head = new DLLNode("", null);
    this.tail = new DLLNode("", null);
    this.head.next = this.tail;
    this.tail.prev = this.head;
}

// Remove
LRUCache.prototype.removeNode = function (node) {
    var prev = node.prev,
        next = node.next;
    prev.next = next;
    next.prev = prev;
}

// Add
LRUCache.prototype.addNode = function (node) {
    var realTail = this.tail.prev;
    realTail.next = node;

    this.tail.prev = node;
    node.prev = realTail;
    node.next = this.tail;
}

Two more functions need to be defined: get and set. Whenever get is called, the LRU caching scheme moves that node to the back of the doubly linked list (right before the tail buffer), since it is now the most recently used node. For setting nodes via set, the keys property on the LRU cache stores the node so that get can retrieve it in O(1) time. However, if the cache is at full capacity, the node farthest from the tail, the one right after the head and therefore the least recently used, is evicted. Let's see an example below.

Example:

// Doubly Linked List Node
function DLLNode(key, data) {
    this.key = key;
    this.data = data;
    this.next = null;
    this.prev = null;
}

// LRU Cache
function LRUCache(capacity) {
    this.keys = {};
    this.capacity = capacity;
    this.head = new DLLNode("", null);
    this.tail = new DLLNode("", null);
    this.head.next = this.tail;
    this.tail.prev = this.head;
}

// Remove
LRUCache.prototype.removeNode = function (node) {
    var prev = node.prev,
        next = node.next;
    prev.next = next;
    next.prev = prev;
}

// Add
LRUCache.prototype.addNode = function (node) {
    var realTail = this.tail.prev;
    realTail.next = node;

    this.tail.prev = node;
    node.prev = realTail;
    node.next = this.tail;
}

// Get
LRUCache.prototype.get = function (key) {
    var node = this.keys[key];
    if (node == undefined) {
        return null;
    } else {
        this.removeNode(node);
        this.addNode(node);
        return node.data;
    }
}

// Set
LRUCache.prototype.set = function (key, value) {
    var node = this.keys[key];

    if (node) {
        this.removeNode(node);
    }

    var newNode = new DLLNode(key, value);

    this.addNode(newNode);
    this.keys[key] = newNode;

    // Evict a node
    if (Object.keys(this.keys).length > this.capacity) {
        var realHead = this.head.next;
        this.removeNode(realHead);
        delete this.keys[realHead.key];
    }
}

Finally, the following is an example of an LRU cache of size 5:

Example:

// Doubly Linked List Node
function DLLNode(key, data) {
    this.key = key;
    this.data = data;
    this.next = null;
    this.prev = null;
}

// LRU Cache
function LRUCache(capacity) {
    this.keys = {};
    this.capacity = capacity;
    this.head = new DLLNode("", null);
    this.tail = new DLLNode("", null);
    this.head.next = this.tail;
    this.tail.prev = this.head;
}

// Remove
LRUCache.prototype.removeNode = function (node) {
    var prev = node.prev,
        next = node.next;
    prev.next = next;
    next.prev = prev;
}

// Add
LRUCache.prototype.addNode = function (node) {
    var realTail = this.tail.prev;
    realTail.next = node;

    this.tail.prev = node;
    node.prev = realTail;
    node.next = this.tail;
}

// Get
LRUCache.prototype.get = function (key) {
    var node = this.keys[key];
    if (node == undefined) {
        return null;
    } else {
        this.removeNode(node);
        this.addNode(node);
        return node.data;
    }
}

// Set
LRUCache.prototype.set = function (key, value) {
    var node = this.keys[key];

    if (node) {
        this.removeNode(node);
    }

    var newNode = new DLLNode(key, value);

    this.addNode(newNode);
    this.keys[key] = newNode;

    // Evict a node
    if (Object.keys(this.keys).length > this.capacity) {
        var realHead = this.head.next;
        this.removeNode(realHead);
        delete this.keys[realHead.key];
    }
}

// Log 1
LRUCache.prototype.log1 = function () {
    var a = this.head.next.data;
    var b = this.head.next.next.data;
    var c = this.head.next.next.next.data;
    var d = this.head.next.next.next.next.data;
    var e = this.head.next.next.next.next.next.data;

    return a + " <-> " + b + " <-> " + c + " <-> " + d + " <-> " + e;
}

// Log 2
LRUCache.prototype.log2 = function () {
    var a = this.head.next.data;
    var b = this.head.next.next.data;
    var c = this.head.next.next.next.data;
    var d = this.head.next.next.next.next.data;
    var e = this.head.next.next.next.next.next.data;

    return a + " <-> " + b + " <-> " + c + " <-> " + d + " <-> " + e;
}

// Log 3
LRUCache.prototype.log3 = function () {
    var a = this.head.next.data;
    var b = this.head.next.next.data;
    var c = this.head.next.next.next.data;
    var d = this.head.next.next.next.next.data;
    var e = this.head.next.next.next.next.next.data;

    return a + " <-> " + b + " <-> " + c + " <-> " + d + " <-> " + e;
}

// Instance(example) of the LRU Class
var myLRU = new LRUCache(5);

// Set
myLRU.set(1, 1); // 1
myLRU.set(2, 2); // 1 <-> 2
myLRU.set(3, 3); // 1 <-> 2 <-> 3
myLRU.set(4, 4); // 1 <-> 2 <-> 3 <-> 4
myLRU.set(5, 5); // 1 <-> 2 <-> 3 <-> 4 <-> 5

// Result
console.log(myLRU.log1());  // Prints "1 <-> 2 <-> 3 <-> 4 <-> 5"

// Get
myLRU.get(1); // 2 <-> 3 <-> 4 <-> 5 <-> 1
myLRU.get(2); // 3 <-> 4 <-> 5 <-> 1 <-> 2

// Result
console.log(myLRU.log2()); // Prints "3 <-> 4 <-> 5 <-> 1 <-> 2"

// Set
myLRU.set(6, 6); // 4 <-> 5 <-> 1 <-> 2 <-> 6
myLRU.set(7, 7); // 5 <-> 1 <-> 2 <-> 6 <-> 7
myLRU.set(8, 8); // 1 <-> 2 <-> 6 <-> 7 <-> 8

// Result
console.log(myLRU.log3()); // Prints "1 <-> 2 <-> 6 <-> 7 <-> 8"

// Note: The log functions are optional. You can customize them to produce any output you want


Summary


This part covered two main caching ideas: least frequently used (LFU) and least recently used (LRU). It also discussed the concept of an optimal (best) caching algorithm, which is impossible to implement but gives an idea of what you would want to approximate. LFU caching sounds great because it uses frequency to determine which node should be evicted (removed), but in most cases LRU better captures temporal (time) locality. There are other caching algorithms, such as not recently used (NRU) and first in, first out (FIFO), but most of them perform worse in the general case.

Note: In real-life system behavior, LRU is the most effective algorithm in most cases. Table 14-1 summarizes the caching algorithms.

Algorithm Comment
Optimal Impossible to implement
Least Frequently Used Bad for temporal locality
Least Recently Used Uses a doubly linked list and a hash map
Table 14-1. Caching Summary
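
To make the difference in Table 14-1 concrete, here is a small side-by-side comparison, assuming the LFUCache and LRUCache classes defined earlier in this part are in scope. An item that was used frequently in the past but not recently survives in the LFU cache, while the LRU cache evicts it:

// Same access pattern against both caches, each with a capacity of 2
var lfuDemo = new LFUCache(2);
lfuDemo.set(1, 'a');
lfuDemo.get(1);
lfuDemo.get(1);              // key 1 is now the most frequently used
lfuDemo.set(2, 'b');
lfuDemo.set(3, 'c');         // cache full: evicts key 2 (lowest frequency)
console.log(lfuDemo.get(1)); // Prints "a" (kept because of its high frequency)
console.log(lfuDemo.get(2)); // Prints "null" (evicted)

var lruDemo = new LRUCache(2);
lruDemo.set(1, 'a');
lruDemo.get(1);
lruDemo.get(1);              // key 1 was used often, but a while ago
lruDemo.set(2, 'b');
lruDemo.set(3, 'c');         // cache full: evicts key 1 (least recently used)
console.log(lruDemo.get(1)); // Prints "null" (evicted despite its past popularity)
console.log(lruDemo.get(2)); // Prints "b" (kept because it was used more recently)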


Up Next: Part 15: Trees (August 27-28, 2020)



Discussion

Edison Pebojot (Author)

I might not be active next week because of so many things on my calendar, but I'm going to help you find out what's widely used when it comes to trees: the general tree structure, binary trees, tree traversal, binary search trees, and AVL trees. Feel free to suggest in the comments what else ought to be learned when it comes to trees. Goodbye and happy learning!