Caches

Cache algorithms (also called cache replacement policies or cache replacement algorithms) are strategies that computers and hardware use to manage a cache of information. Their main purpose is to decide which cached element should be evicted (removed) when the cache is full. Efficient cache replacement is essential for high-performance software such as operating systems, databases, and web servers.

The most common cache replacement algorithms:

  • Queue-based Policies
    • FIFO (First-In-First-Out)
    • FILO (First-In-Last-Out)
    • SIEVE
  • Recency-based Policies
    • LRU (Least Recently Used)
    • MRU (Most Recently Used)
    • SLRU (Segmented LRU)
    • CLOCK (Second Chance)
  • Frequency-based Policies
    • LFU (Least Frequently Used)
    • LFRU (Least Frequent Recently Used)
  • Other
    • RR (Random Replacement)
    • ARC (Adaptive Replacement Cache)
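As a concrete example of a queue-based policy, FIFO can be sketched with a map for lookups plus a queue recording insertion order; when the cache is full, the key that entered first is evicted. The class and method names below are illustrative, not part of any standard library.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Minimal FIFO cache sketch: evicts the key that was inserted earliest.
class FifoCache<K, V> {
  private final int capacity;
  private final Map<K, V> map = new HashMap<>();
  private final Deque<K> order = new ArrayDeque<>(); // keys in insertion order

  FifoCache(int capacity) { this.capacity = capacity; }

  V get(K key) { return map.get(key); }

  void put(K key, V value) {
    if (!map.containsKey(key) && map.size() >= capacity) {
      map.remove(order.pollFirst()); // evict the oldest inserted key
    }
    if (!map.containsKey(key)) {
      order.addLast(key);
    }
    map.put(key, value);
  }
}
```

Note that unlike LRU, a `get` does not change the eviction order: FIFO only tracks when a key was inserted, not when it was last used.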

Least Recently Used (LRU)

When the cache is full, LRU evicts the item that was least recently used in order to free space for new data.

  • A doubly linked list allows us to move nodes (each representing a cache entry) efficiently between the head and the tail.
  • A HashMap provides O(1) time complexity for lookups, and the linked list allows O(1) insertions and deletions at both ends.
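The HashMap-plus-doubly-linked-list structure described above can be sketched as follows; the class and method names are illustrative. The map points at list nodes, the node nearest the head is the most recently used entry, and the node nearest the tail is the next eviction candidate.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of an LRU cache built from a HashMap and a doubly linked list.
class DoublyLinkedLru<K, V> {
  private class Node {
    final K key;
    V value;
    Node prev, next;
    Node(K key, V value) { this.key = key; this.value = value; }
  }

  private final int capacity;
  private final Map<K, Node> map = new HashMap<>();
  private final Node head = new Node(null, null); // sentinel: most recent side
  private final Node tail = new Node(null, null); // sentinel: least recent side

  DoublyLinkedLru(int capacity) {
    this.capacity = capacity;
    head.next = tail;
    tail.prev = head;
  }

  V get(K key) {
    Node node = map.get(key);
    if (node == null) return null;
    moveToFront(node); // touching an entry makes it the most recently used
    return node.value;
  }

  void put(K key, V value) {
    Node node = map.get(key);
    if (node != null) {
      node.value = value;
      moveToFront(node);
      return;
    }
    if (map.size() >= capacity) {
      Node lru = tail.prev; // least recently used entry sits before the tail sentinel
      unlink(lru);
      map.remove(lru.key);
    }
    node = new Node(key, value);
    map.put(key, node);
    linkAfterHead(node);
  }

  private void moveToFront(Node node) { unlink(node); linkAfterHead(node); }

  private void unlink(Node node) {
    node.prev.next = node.next;
    node.next.prev = node.prev;
  }

  private void linkAfterHead(Node node) {
    node.next = head.next;
    node.prev = head;
    head.next.prev = node;
    head.next = node;
  }
}
```

The sentinel head and tail nodes keep `unlink` and `linkAfterHead` free of null checks, so every `get` and `put` runs in O(1) time.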

In Java, an LRU cache can be implemented using a LinkedHashMap constructed in access order, which orders entries from least to most recently accessed and can automatically remove the eldest entry when a new entry pushes the size beyond a predefined capacity.

import java.util.LinkedHashMap;
import java.util.Map;

class LRUCache<K, V> extends LinkedHashMap<K, V> {
  private final int capacity;

  LRUCache(int capacity) {
    super(capacity, 0.75f, true); // accessOrder = true: eldest entry is the least recently used
    this.capacity = capacity;
  }

  @Override
  protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
    return size() > capacity; // evict the eldest entry once the cache exceeds its capacity
  }
}
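To see the LinkedHashMap approach in action, the behavior can be demonstrated with an anonymous subclass; the capacity of 2 and the keys below are just illustrative values.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruDemo {
  public static void main(String[] args) {
    final int capacity = 2; // illustrative capacity
    // accessOrder = true orders entries from least to most recently accessed
    Map<String, Integer> cache = new LinkedHashMap<String, Integer>(capacity, 0.75f, true) {
      @Override
      protected boolean removeEldestEntry(Map.Entry<String, Integer> eldest) {
        return size() > capacity;
      }
    };
    cache.put("a", 1);
    cache.put("b", 2);
    cache.get("a");    // touch "a": "b" is now the least recently used entry
    cache.put("c", 3); // exceeds capacity, so "b" is evicted
    System.out.println(cache.keySet()); // prints [a, c]
  }
}
```

Because the map is in access order, the `get("a")` call reorders the entries, so the subsequent insertion evicts "b" rather than "a".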