Cache algorithms (also called cache replacement policies or cache replacement algorithms) are strategies that computers and hardware use to manage a cache of information. Their main purpose is to decide which cached element should be evicted (removed) when the cache is full. They are essential for high-performance software such as operating systems, databases, and web servers.
LRU (Least Recently Used) picks the item that has been used least recently and removes it to free space for new data when the cache is full.
A doubly linked list allows us to move nodes (representing cache entries) efficiently between the head and the tail.
A HashMap provides O(1) lookups, and the linked list allows O(1) insertions and deletions at both ends.
import java.util.HashMap;
import java.util.Map;

public class LRUCache<K, V> {
    private int size = 0;
    private int capacity;
    private Node head;
    private Node tail;
    private Map<K, Node> cache = new HashMap<>();

    public LRUCache(int capacity) {
        this.capacity = capacity;
    }

    public V get(K key) {
        // Check whether the key exists
        Node node = cache.get(key);
        if (node == null) return null;
        // Detach the node we found and move it to the front
        detach(node);
        prepend(node);
        return node.value; // Return the found value
    }

    public void put(K key, V value) {
        Node node = cache.get(key);
        if (node == null) {
            node = new Node(key, value); // Create a new node
            cache.put(key, node);        // Add it to the cache map
            size++;                      // Increase the cache size
            prepend(node);               // Add it to the front
            trimCache();                 // Make sure the cache does not overflow
        } else {
            // Update the value and move the node to the front
            node.value = value;
            detach(node);
            prepend(node);
        }
    }

    private void trimCache() {
        if (size <= capacity) return;
        cache.remove(tail.key);
        detach(tail);
        size--;
    }
}
public class LRUCache<K, V> {
    /*
     * Detaches a node from the doubly-linked list
     */
    private void detach(Node node) {
        if (node.next != null) node.next.prev = node.prev;
        if (node.prev != null) node.prev.next = node.next;
        if (node == head) head = head.next;
        if (node == tail) tail = tail.prev;
        node.next = null;
        node.prev = null;
    }

    /*
     * Inserts a node at the beginning of the doubly-linked list
     */
    private void prepend(Node node) {
        if (head == null) {
            head = tail = node;
            return;
        }
        node.next = head;
        head.prev = node;
        head = node;
    }
}
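The snippets above reference a Node type that is not shown here; it is part of the complete implementation linked below. As a rough sketch, such a node only needs to hold the key, the value, and the prev/next links:

    // Rough sketch of the list node assumed by the snippets above (the exact
    // definition lives in the complete implementation linked below).
    private class Node {
        final K key;   // Stored so trimCache() can remove the matching map entry
        V value;
        Node prev;
        Node next;

        Node(K key, V value) {
            this.key = key;
            this.value = value;
        }
    }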
👉 The complete LRUCache implementation and examples are available here.
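To make the behaviour concrete, here is a small hypothetical usage example; the capacity of 2 and the sample keys are arbitrary choices, and it assumes the LRUCache class assembled from the snippets above.

    public class LRUCacheDemo {
        public static void main(String[] args) {
            // Hypothetical demo; assumes the LRUCache class shown above
            LRUCache<String, Integer> cache = new LRUCache<>(2);

            cache.put("a", 1);
            cache.put("b", 2);
            cache.get("a");     // "a" is now the most recently used entry
            cache.put("c", 3);  // Capacity exceeded, so "b" (least recently used) is evicted

            System.out.println(cache.get("a")); // 1
            System.out.println(cache.get("b")); // null, because "b" was evicted
            System.out.println(cache.get("c")); // 3
        }
    }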
In Java, an LRU cache can also be implemented using a LinkedHashMap, which can be configured to keep its entries in access order (rather than the default insertion order) and is able to remove the eldest entry when a new entry is added beyond a predefined capacity.
class LRUCache<K, V> extends LinkedHashMap<K, V> {
    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // Remove the eldest entry if the cache size exceeds the capacity
    }
}
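Put together, a self-contained version of this idea could look like the sketch below. The three-argument constructor with accessOrder = true and the 0.75f load factor are assumptions added to make the snippet runnable, and the class is renamed to LinkedLRUCache only to avoid clashing with the class above.

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Minimal sketch of a LinkedHashMap-based LRU cache (constructor arguments are assumptions)
    class LinkedLRUCache<K, V> extends LinkedHashMap<K, V> {
        private final int capacity;

        LinkedLRUCache(int capacity) {
            super(capacity, 0.75f, true); // accessOrder = true orders entries by most recent access
            this.capacity = capacity;
        }

        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > capacity; // Evict the eldest entry once the capacity is exceeded
        }

        public static void main(String[] args) {
            LinkedLRUCache<String, Integer> cache = new LinkedLRUCache<>(2);
            cache.put("a", 1);
            cache.put("b", 2);
            cache.get("a");     // "a" becomes the most recently used entry
            cache.put("c", 3);  // "b" is evicted as the least recently used entry
            System.out.println(cache.keySet()); // [a, c]
        }
    }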