Abdullah Muhammad

Published on May 17, 2026 · 5 min read


Introduction

We looked at the LRU cache in a past article. Today, we will re-visit the LRU cache and explore many of its variants.

Caching is an important concept in web development. In particular, caching can be useful for storing web browser information as well as database query results.

In the case of databases, caching can work to prevent load and lag in large systems where thousands of queries can be made to a database.

We can use caching to store query results to allow for easy access instead of running thousands of calls to a database.

We can let users retrieve this cached information and set a validity window for the cached data; once that window expires, we evict the entry, make another database request to fetch the data in real time, and re-store the fresh result in the cache.
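To make the time-frame idea concrete, here is a minimal sketch of a time-based (TTL) cache in TypeScript. The TimedCache name and its shape are hypothetical, purely for illustration, and not part of the demo repository:

```typescript
// Minimal time-based cache sketch (hypothetical, for illustration only)
type Entry = { value: number; expiresAt: number };

class TimedCache {
    private store = new Map<string, Entry>();

    // ttlMs: how long a cached value remains valid, in milliseconds
    constructor(private ttlMs: number) {}

    set(key: string, value: number): void {
        this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
    }

    // Returns the cached value, or null if missing/expired
    // (a null result is the caller's cue to re-query the database)
    get(key: string): number | null {
        const entry = this.store.get(key);
        if (!entry) return null;
        if (Date.now() > entry.expiresAt) {
            this.store.delete(key); // evict the stale entry
            return null;
        }
        return entry.value;
    }
}
```

In a real system, the null on expiry is the signal to run a fresh database query and call set again with the up-to-date result.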

It is important to distinguish between the different types of cache and to know what, when, and where to use each one.

There are popular services such as Redis and AWS ElastiCache which are used to store data as cache (we will explore AWS ElastiCache in a future article).

The LRU cache is one type of cache, and it closely resembles the workflow we have discussed so far.

Here is the link to the GitHub repository that we will be using in this article. The directory of concern is demos/Demo70_Cache_Variations.

Feel free to clone this repository as we touch on its various code segments in the coming sections.

Re-visiting the LRU Cache

The LRU (least recently used) cache operates with a fixed capacity. It comprises the following operations: evictions, insertions, updates, and deletions. Newly inserted, updated, or retrieved data is moved to the front of the cache to mark its recent usage.

The LRU cache can be implemented using a hash map and a doubly linked list. Together, these give O(1) average time complexity for all operations.

The architecture calls for each node in the cache to include a key (identity) and a value (as we have seen before).

The key is used to retrieve the reference of a particular node in the cache list via a hash map.

Rather than search through the entire list to find a particular node, we use the key in the hash map to access the reference to that node.

This allows for easy access, movement, updates, and deletes in O(1) time complexity.

The doubly linked list simply manages the recency order of the LRU cache.

We also have an additional attribute that keeps track of the size of the cache to ensure that whenever new additions are made, the oldest one is removed when the cache gets full.


Modified Doubly Linked List

The following code segments detail an implementation of the LRU cache. We start by highlighting a modified version of the doubly linked list we looked at before, found in Demo70_Cache_Variations/data_structures/cache/linked_lists/:

// Doubly linked node recursive data structure
class DoublyCacheLinkNode {
    prev: DoublyCacheLinkNode | null;
    next: DoublyCacheLinkNode | null;
    key: string;
    value: number;

    constructor(key: string, value: number) {
        this.prev = null;
        this.next = null;
        this.key = key;
        this.value = value;
    }
}

export default DoublyCacheLinkNode;
Doubly cache link node implementation in TypeScript
import DoublyCacheLinkNode from "./DoublyCacheLinkNode";

// Doubly linked list allows for traversals forwards/backwards
// Implements five functions: appendFirst, deleteNode, moveFirst, deleteLast, deleteFirst
class DoublyCacheLinkedList {
    head: DoublyCacheLinkNode | null;
    tail: DoublyCacheLinkNode | null;

    // Initialize the Doubly Linked List
    constructor() {
        this.head = null;
        this.tail = null;
    }

    // Append the node at the start of the cache list
    // Return the newly created node to be stored in the hash map
    // Insertions: put(key, value)
    appendFirst(key: string, value: number): DoublyCacheLinkNode {
        if (this.head === null) {
            let newNode = new DoublyCacheLinkNode(key, value);

            this.head = this.tail = newNode;
            return newNode;
        }
        else {
            let newNode = new DoublyCacheLinkNode(key, value);
            newNode.next = this.head;

            this.head.prev = newNode;
            this.head = newNode;

            return newNode;
        }
    }

    // Move the node to the beginning of the list, removing it from its current location
    // Retrieval/Updates: get(key), put(key, value)
    moveFirst(node: DoublyCacheLinkNode): void {
        if (this.head === node) {
            return;
        }
        else {
            this.deleteNode(node);

            if (this.head !== null) {
                node.next = this.head; // Re-link the detached node in front of the old head
                this.head.prev = node;
            }
            else {
                this.tail = node;
            }
        }

        this.head = node;
    }

    // Four cases: One, Head, Tail, Middle
    // Lone node can easily be removed by setting head/tail to null
    // Head can point to the next element in the sequence, setting old head to null
    // Tail can point to the second last element in the sequence, setting old tail to null
    // In between, we capture the node before and after the current and make them point to each other
    // Ensure the pointers of the current node are set to null
    // Delete: delete(key)
    deleteNode(node: DoublyCacheLinkNode): void {
        if (this.head === node && node === this.tail) {
            this.head = this.tail = null; // Lone node removed
        }
        else if (this.head === node) {
            let secondNode = this.head.next;
            secondNode!.prev = null;

            this.head = secondNode;
        }
        else if (this.tail === node) {
            let secondLastNode = this.tail.prev;

            secondLastNode!.next = null;
            this.tail = secondLastNode;
        }
        else {
            let prevNode = node.prev;
            let nextNode = node.next;

            prevNode!.next = nextNode;
            nextNode!.prev = prevNode;
        }

        // Remove all of its pointers
        node.prev = null;
        node.next = null;
    }

    // For evictions, remove the last node in the cache list and return it
    // If no nodes exist, return null
    // Capture the tail node
    // Capture the second last node
    // Remove reference to the last node
    // Set old tail to the new tail
    // Return old node
    deleteLast(): DoublyCacheLinkNode | null {
        if (this.head === null) {
            return null;
        }
        else {
            // If only one node exists, capture it and remove the head/tail
            if (this.head === this.tail) {
                let loneNode = this.head;

                this.head = this.tail = null;
                return loneNode;
            }
            else {
                let tailNode = this.tail;
                let secondLastNode = this.tail!.prev;

                secondLastNode!.next = null;
                this.tail = secondLastNode;

                tailNode!.prev = null; // Detach the removed node's dangling pointer
                return tailNode;
            }
        } 
    }

    // For evictions, remove the first node in the cache list and return it
    // Capture the first node
    // Set the old head to point to the next node in the list, make it the new head
    // Return the first node
    deleteFirst(): DoublyCacheLinkNode | null {
        if (this.head === null) {
            return null;
        }
        else {
            if (this.head === this.tail) {
                let firstNode = this.head;
                this.head = this.tail = null;

                return firstNode;
            }
            else {
                let firstNode = this.head;
                let secondNode = this.head.next;

                secondNode!.prev = null;
                this.head = secondNode;

                firstNode.prev = null;
                firstNode.next = null;

                return firstNode;
            }
        }
    }
}

export default DoublyCacheLinkedList;
Doubly cache linked list implementation in TypeScript

The underlying concept is the same as before, except now we add an additional property known as the key.

The key is used to pinpoint the location of the node in the doubly linked list. The hash map stores this key, giving us O(1) average time complexity for all operations.

For this solution, we have some modified functions: appendFirst, deleteLast, moveFirst, and deleteNode.

The appendFirst function takes in two parameters (a key and a value), creates a doubly linked node, and returns it.

The key-value pair comes from a function call in the LRU implementation (which we will look at shortly).

We take the returned doubly linked node and store it as a value inside the hash map.

The deleteNode function (as the name implies), deletes the requested node from the cache list. We retrieve the node from the hash map using the key (passed in as a parameter).

The retrieved node is then passed in as a parameter to the deleteNode function and is deleted accordingly. The delete operation considers four possible scenarios for where the node may be positioned: lone, head, tail, and middle.

Once this determination is made, the node is successfully deleted.


LRU Cache: Final Piece of the Puzzle

Now, we finally look at the implementation of the LRU cache. The following code segment details just that. It uses the modified doubly linked list we touched on earlier Demo70_Cache_Variations/data_structures/cache/LRUCache.ts:

import DoublyCacheLinkedList from "./linked_lists/DoublyCacheLinkedList";
import DoublyCacheLinkNode  from "./linked_lists/DoublyCacheLinkNode";

/* LRU Cache Data Structure
- get(key)[get], put(key, value)[update/insert], delete(key)[delete]
- DoublyCacheLinkNode ---> key, value, prev, next
- DoublyCacheLinkedList ---> deleteNode, moveFirst, appendFirst, deleteLast
*/
class LRUCache {
    cacheList: DoublyCacheLinkedList | null;
    nodeMap: Map<string, DoublyCacheLinkNode>;
    size: number;

    // Initialize LRU Cache properties
    constructor(sizeValue: number) {
        this.size = sizeValue < 0 ? 0 : sizeValue;
        this.nodeMap = new Map<string, DoublyCacheLinkNode>();
        this.cacheList = new DoublyCacheLinkedList();
    }

    // Insert a new key --> put(key, value)
    // Updating a key to a new value --> put(key, value)
    // First check hash Map, if the key exists, we are simply updating the value of the existing key
    // If it is a unique key, we are to insert a new key, check if we are at limit, if so evict/insert, and if not, insert
    // Remove node, insert at the beginning
    // Add the new key, node pair to the hash map
    // Evict a node and return its value
    // Insert the node at the front
    insertNode(key: string, value: number): void {
        if (this.nodeMap.has(key)){
            let node = this.nodeMap.get(key);

            this.cacheList!.moveFirst(node!);
            node!.value = value;
        }
        else if (this.nodeMap.size === this.size) {
            let node = this.cacheList!.deleteLast();
            this.nodeMap.delete(node!.key);

            let newNode = this.cacheList!.appendFirst(key, value);
            this.nodeMap.set(key, newNode);
        }
        else {
            let newNode = this.cacheList!.appendFirst(key, value);
            this.nodeMap.set(key, newNode!);
        }
    }

    // Delete a key ---> delete(key)
    // Check to see if key exists first
    // Delete its reference
    removeNode(key: string): void {
        if (this.nodeMap.has(key)) {
            this.cacheList!.deleteNode(this.nodeMap.get(key)!);
            this.nodeMap.delete(key); 
        } 
    }

    // Move node to the front of the list ---> get(key)
    // Retrieve the node from the hash map
    retrieveNode(key: string): number | null {
        if (this.nodeMap.has(key)) {
            let node = this.nodeMap.get(key)!;

            this.cacheList!.moveFirst(node);

            return node.value;
        }
        else {
            return null;
        }
    }
};

export { LRUCache };
LRU cache implementation in TypeScript

We have three main functions: retrieveNode, insertNode, and removeNode.

The logic is the same as before. For node retrieval, we check the hash map to see if the key exists. If it does, we retrieve the node from the hash map and capture its value.

After that, we call the moveFirst function to move the node to the front of the cache list.

For node insertions, we check to see if the key already exists in the hash map. If it does, we know we are updating the node value, not creating a new one.

In this case, we also need to move the node to the beginning of the cache and update its value to the one requested.

If the key is unique and does not exist in the hash map, we know it is a new entry. For this scenario, we need to check the size of the hash map and compare it to the size attribute of the LRU cache.

If the maximum size has been reached, we need to evict the least recently used item. To perform this operation, we call the deleteLast function and delete the node from the cache list and return it.

We capture the key of the returned node and delete it from the hash map.

Once that is done, we create the new node and set its key and value properties (passed in as parameters) and use the node along with the key and insert it into the hash map. All of these operations take O(1) time complexity.

For deletions, we simply check the hash map to see if the key exists. If it does, we retrieve the node from the hash map using the key and delete it from both the cache list and the hash map.

This solution is straightforward and easy to implement once you understand how the LRU cache operates.
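As a sanity check on the behavior above, the same put/get semantics can be sketched far more compactly by leaning on JavaScript's Map, which preserves insertion order. The TinyLRU name is hypothetical, and this version is illustrative only; it hides the linked-list mechanics this article is demonstrating:

```typescript
// Compact LRU sketch relying on Map's insertion-order guarantee (illustrative only)
class TinyLRU {
    private map = new Map<string, number>();

    constructor(private size: number) {}

    put(key: string, value: number): void {
        if (this.map.has(key)) {
            this.map.delete(key); // re-insert below to mark as most recently used
        }
        else if (this.map.size === this.size) {
            // The least recently used key is the first key in iteration order
            const oldest = this.map.keys().next().value as string;
            this.map.delete(oldest);
        }
        this.map.set(key, value);
    }

    get(key: string): number | null {
        if (!this.map.has(key)) return null;
        const value = this.map.get(key)!;
        this.map.delete(key);
        this.map.set(key, value); // move to the most-recent position
        return value;
    }
}
```

The delete-then-set dance plays the role of moveFirst; the explicit doubly linked list makes that ordering visible instead of delegating it to Map internals.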

Now, we can build on this clean LRU cache solution and explore its variants.

The first of these is the MRU cache, which operates exactly opposite to the LRU cache data structure.


MRU Cache

The MRU (most recently used) cache is a data structure that evicts the most recently accessed item to make room for a newly inserted one.

This operates exactly opposite to the LRU cache. The implementation largely remains the same except for the order in which evictions take place.

You can largely keep most of the implementation untouched in the case of the MRU cache except for the changes required in the eviction policy (delete nodes from the front instead of the back).

So, in this case, the deleteLast call is swapped for a deleteFirst function, which deletes and returns the item at the front of the list instead.

The following code segment details an implementation of the MRU cache Demo70_Cache_Variations/data_structures/cache/MRUCache.ts:

import DoublyCacheLinkNode from './linked_lists/DoublyCacheLinkNode';
import DoublyCacheLinkedList  from './linked_lists/DoublyCacheLinkedList';

// Implements an LRU cache, but backwards
// The most recently accessed item is the one evicted to make room for new entries
// Insert/Update ---> put(key, value)
// Get ---> get(key)
// Delete ---> delete(key)
class MRUCache {
    size: number;
    cacheList: DoublyCacheLinkedList | null;
    nodeMap: Map<string, DoublyCacheLinkNode>;

    // Initialize the MRU Cache
    constructor(size: number) {
        this.size = size < 0 ? 0 : size;
        this.cacheList = new DoublyCacheLinkedList();
        this.nodeMap = new Map<string, DoublyCacheLinkNode>();
    }

    // Insert a new key --> put(key, value)
    // Updating a key to a new value --> put(key, value)
    // First check hash Map, if the key exists, we are simply updating the value of the existing key
    // If it is a unique key, we are to insert a new key, check if we are at limit, if so evict/insert, and if not, insert
    // Remove node, insert at the beginning
    // Add the new key, node pair to the hash map
    // Evict a node and return its value
    // Insert the node at the front
    insertNode(key: string, value: number): void {
        if (this.nodeMap.has(key)){
            let node = this.nodeMap.get(key);

            this.cacheList!.moveFirst(node!);
            node!.value = value;
        }
        else if (this.nodeMap.size === this.size) {
            let node = this.cacheList!.deleteFirst();
            this.nodeMap.delete(node!.key);

            let newNode = this.cacheList!.appendFirst(key, value);
            this.nodeMap.set(key, newNode);
        }
        else {
            let newNode = this.cacheList!.appendFirst(key, value);
            this.nodeMap.set(key, newNode!);
        }
    }

    // Delete a key ---> delete(key)
    // Check to see if key exists first
    // Delete its reference
    removeNode(key: string): void {
        if (this.nodeMap.has(key)) {
            this.cacheList!.deleteNode(this.nodeMap.get(key)!);
            this.nodeMap.delete(key); 
        } 
    }

    // Move node to the front of the list ---> get(key)
    // Retrieve the node from the hash map
    retrieveNode(key: string): number | null {
        if (this.nodeMap.has(key)) {
            let node = this.nodeMap.get(key)!;

            this.cacheList!.moveFirst(node);

            return node.value;
        }
        else {
            return null;
        }
    }
}



export { MRUCache };
MRU cache implementation in TypeScript

To support this eviction policy, we defined an extra function called deleteFirst in the modified doubly linked list data structure (see implementation above).

The MRU cache is rarely used in real-life, practical scenarios, but it is worth covering while we are on the topic of caching.
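To make the reversed eviction order concrete, here is a compact MRU sketch using Map's insertion order. The TinyMRU name is hypothetical; it mirrors the behavior of MRUCache above, not its implementation:

```typescript
// Compact MRU sketch: evict the most recently used key when full (illustrative only)
class TinyMRU {
    private map = new Map<string, number>();

    constructor(private size: number) {}

    put(key: string, value: number): void {
        if (this.map.has(key)) {
            this.map.delete(key); // re-insert below to mark as most recently used
        }
        else if (this.map.size === this.size) {
            // The most recently used key is the last key in iteration order
            const newest = Array.from(this.map.keys()).pop() as string;
            this.map.delete(newest);
        }
        this.map.set(key, value);
    }

    get(key: string): number | null {
        if (!this.map.has(key)) return null;
        const value = this.map.get(key)!;
        this.map.delete(key);
        this.map.set(key, value); // move to the most-recent position
        return value;
    }
}
```

Note that the only line that differs from the compact LRU idea is which end of the iteration order the victim comes from.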

Next, we will look at the RR cache, which stands for random replacement. This type of cache randomly evicts items to make room for new ones.

Order does not matter in this type of cache.


RR Cache

Most of these cache variations are based on their eviction policies. That is really the only thing that is unique to each of them.

We can generate a random index in the range 0 to size - 1 of the cache. This works because eviction only ever takes place when the cache is full, and the size is fixed when the cache is first instantiated.

Since order does not matter here, we can simply use an array which keeps track of the different keys in the cache. The randomly generated index is what is used to randomly remove one of these keys.

The following code segment details this thought process and a complete implementation of the RR cache Demo70_Cache_Variations/data_structures/cache/RRCache.ts:

// Random replacement cache
// Randomly deletes items as its eviction policy
class RRCache {
    rrCacheArray: string[];
    rrKeyMap: Map<string, number>;
    size: number;

    constructor(size: number) {
        this.rrCacheArray = [];
        this.rrKeyMap = new Map<string, number>();
        this.size = size;
    }

    // Follow the random replacement policy for insertions of new keys
    // If no key exists, a new one is being inserted, check eviction policy
    // If at limit, evict a randomly chosen key and delete it from the hash map
    insertKey(key: string, value: number) {
        if (this.rrKeyMap.has(key)) {
            this.rrKeyMap.set(key, value); // If key exists, just simply update key value
        }
        else {
            if (this.rrKeyMap.size === this.size) {
                let removeKey = this.evictKey();
                this.rrKeyMap.delete(removeKey!);

                this.rrKeyMap.set(key, value);
                this.rrCacheArray.push(key);
            }
            else {
                // Very simple if no eviction is required
                this.rrKeyMap.set(key, value);
                this.rrCacheArray.push(key);
            }
        }
    }

    // Retrieve the value of the key
    retrieveKey(key: string): number | null {
        if (this.rrKeyMap.has(key)) {
            return this.rrKeyMap.get(key)!;
        }
        else {
            return null;
        }
    }

    // Simply filter out the key from the cache array
    // Capture key value and return it
    // Delete from hash map
    deleteKey(key: string): number | undefined {
        if (this.rrKeyMap.has(key)) {
            this.rrCacheArray = this.rrCacheArray.filter(i => i !== key); // Filter the key out of the array
            let value = this.rrKeyMap.get(key);

            this.rrKeyMap.delete(key);
            return value;
        }
        else {
            return undefined;
        }
    }

    // Randomly selects which item to remove once the cache is full
    evictKey(): string | undefined {
        const randomIndex = Math.floor(Math.random() * this.size);
        let value = this.rrCacheArray[randomIndex];
        
        this.rrCacheArray = this.rrCacheArray.filter(i => i !== value);
        return value;
    }
}

export { RRCache };
RR cache implementation in TypeScript

So again, like before, we use a hash map to store the key-value pairs, but instead of a doubly linked list, we opt to use an array to store the keys in the cache.

The insertKey, retrieveKey, and deleteKey functions are easy to follow, as is the eviction policy. We make use of the Math library to generate a valid random index, which is used to delete a random key from the cache.
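For a quick end-to-end check, a condensed version of this idea (an array of keys plus a hash map, with the hypothetical TinyRR name) behaves as follows:

```typescript
// Condensed random-replacement sketch (hypothetical name, mirrors RRCache's approach)
class TinyRR {
    private keys: string[] = [];
    private map = new Map<string, number>();

    constructor(private size: number) {}

    put(key: string, value: number): void {
        if (this.map.has(key)) {
            this.map.set(key, value); // order is irrelevant here, just update the value
            return;
        }
        if (this.map.size === this.size) {
            const idx = Math.floor(Math.random() * this.keys.length);
            const victim = this.keys.splice(idx, 1)[0]; // remove one key at random
            this.map.delete(victim);
        }
        this.keys.push(key);
        this.map.set(key, value);
    }

    get(key: string): number | null {
        return this.map.has(key) ? this.map.get(key)! : null;
    }
}
```

One design note: splice (like the filter calls in RRCache) is O(n) in the number of keys; swapping the victim with the last array element and popping, plus an index map, would make eviction O(1) at the cost of extra bookkeeping.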

We now look at one more type of cache variation known as FIFO.


FIFO Cache

The FIFO cache, as the name implies, bases its eviction policy on a first-come, first-served basis: the oldest inserted item is evicted first.

Like the RR cache, access order does not matter; what matters is the order of insertion.

We can implement this type of cache using a queue, keeping the items in insertion order so that evictions follow the FIFO principle.

We still have the other data structures in place, such as the hash map that keeps track of all the key-value pairs, but it is the queue and the FIFO principle that drive the cache eviction policy.

The following code segment details an implementation of the FIFO cache data structure Demo70_Cache_Variations/data_structures/cache/FIFOCache.ts:

// FIFO cache implements a cache using queue properties
class FIFOCache {
    queueCache: string[];
    keyMap: Map<string, number>;
    size: number;

    constructor(size: number) {
        this.queueCache = [];
        this.keyMap = new Map<string, number>();
        this.size = size < 0 ? 0 : size;
    }

    // Follow the FIFO principle for insertions and deletions of nodes
    // If no key exists, a new one is being inserted, check eviction policy
    // If at limit, evict the oldest key and delete from hash map
    insertKey(key: string, value: number) {
        if (this.keyMap.has(key)) {
            this.keyMap.set(key, value); // If key exists, just simply update key value
        }
        else {
            if (this.keyMap.size === this.size) {
                let removeKey = this.evictKey();
                this.keyMap.delete(removeKey!);

                this.keyMap.set(key, value);
                this.queueCache.push(key);
            }
            else {
                // Very simple if no eviction is required
                this.keyMap.set(key, value);
                this.queueCache.push(key);
            }
        }
    }

    // Retrieve the value of the key
    retrieveKey(key: string): number | null {
        if (this.keyMap.has(key)) {
            return this.keyMap.get(key)!;
        }
        else {
            return null;
        }
    }

    // Simply filter out the key from the queue
    // Capture key value and return it
    // Delete from hash map
    deleteKey(key: string): number | undefined {
        if (this.keyMap.has(key)) {
            this.queueCache = this.queueCache.filter(i => i !== key); // Filter the key out of the queue
            let value = this.keyMap.get(key);

            this.keyMap.delete(key);
            return value;
        }
        else {
            return undefined;
        }
    }

    // Delete and return the oldest key
    evictKey(): string | undefined {
        let key = this.queueCache.shift();
        return key;
    }
}

export { FIFOCache };
FIFO cache implementation in TypeScript

If you understand caching (up to now) and how queues operate according to the FIFO principle, this should be a breeze.
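A condensed sketch (hypothetical TinyFIFO name, mirroring the FIFOCache approach above) highlights the key behavioral difference from LRU: reading a key does not refresh it.

```typescript
// Condensed FIFO sketch: evict the oldest-inserted key (illustrative only)
class TinyFIFO {
    private queue: string[] = [];
    private map = new Map<string, number>();

    constructor(private size: number) {}

    put(key: string, value: number): void {
        if (this.map.has(key)) {
            this.map.set(key, value); // update in place; insertion order is unchanged
            return;
        }
        if (this.map.size === this.size) {
            const oldest = this.queue.shift()!; // first in, first out
            this.map.delete(oldest);
        }
        this.queue.push(key);
        this.map.set(key, value);
    }

    // Reads do NOT affect eviction order, unlike in an LRU cache
    get(key: string): number | null {
        return this.map.has(key) ? this.map.get(key)! : null;
    }
}
```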

We covered quite a bit as it relates to caching, and there are plenty more variations out there.

Here is a link to a comprehensive list of caching variations which you can look into and implement on your own as a thought exercise.

Conclusion

We re-visited the LRU cache data structure and looked at it in great detail.

We built on the foundational knowledge of linked lists and queues to explore cache variations such as the MRU, RR, and FIFO caches.

We implemented each of these cache variations in TypeScript and touched on what makes them unique in their own right.


I hope you found this article helpful and look forward to more in the future.

Thank you!
