Intermediate · 15 min read · Topic 3.2

Cache eviction policies

LRU, LFU, FIFO, Random, TTL-based, ARC — Adaptive Replacement Cache

🗑️ Key Takeaways

  1. LRU (Least Recently Used) is the most common and generally the best default choice
  2. LFU (Least Frequently Used) is better for workloads with stable popular items
  3. TTL (Time-To-Live) provides time-based automatic expiration independent of other policies
  4. The right policy depends on your access pattern — there's no universal best

When the Cache is Full

Caches have finite memory. When the cache is full and a new item needs to be stored, an existing item must be evicted. The eviction policy determines which item to remove. Choosing the wrong policy wastes cache space on items that won't be accessed again.

Eviction Policies Compared

| Policy | Evicts | Good For | Weakness |
| --- | --- | --- | --- |
| LRU | Least recently accessed item | General purpose, temporal locality | Doesn't account for frequency |
| LFU | Least frequently accessed item | Stable popularity distribution | New items may be evicted before getting popular |
| FIFO | Oldest item (first in, first out) | Simplest implementation | Ignores access patterns entirely |
| Random | Random item | When access is unpredictable | May evict popular items |
| TTL | Items past their expiration time | Time-sensitive data | Not space-based — cache can overflow |
| ARC | Adaptive — tracks both recency and frequency | Mixed workloads | More complex implementation |

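TTL is the only policy above driven by time rather than space. A minimal sketch of a lazy check-on-read design (the class name, the injectable clock parameter, and the API are illustrative, not from the article):

```python
import time

class TTLCache:
    """Sketch of a TTL cache: each entry stores an expiry timestamp,
    and a read past that deadline behaves as a miss. Note the weakness
    from the table: nothing here bounds the cache's size."""

    def __init__(self, ttl_seconds, now=time.monotonic):
        self.ttl = ttl_seconds
        self._now = now          # injectable clock, handy for testing
        self._data = {}          # key -> (value, expires_at)

    def put(self, key, value):
        self._data[key] = (value, self._now() + self.ttl)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if self._now() >= expires_at:
            del self._data[key]  # lazy expiration on read
            return None
        return value
```

Real caches typically pair this with a background sweep or a size-based policy, since lazy expiration alone never reclaims keys that are no longer read.
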
LRU Cache Implementation (Concept)
LRU Cache = Hash Map + Doubly Linked List

Operations:
  GET(key):
    1. Look up key in hash map → O(1)
    2. If found, move node to front of linked list (most recent) → O(1)
    3. Return value

  PUT(key, value):
    1. If key exists, update its value and move the node to front → O(1)
    2. Otherwise, if the cache is full, remove the last node (least recent)
       and delete its key from the hash map → O(1)
    3. Insert the new node at the front and add it to the hash map → O(1)

  All operations: O(1) time

  This is why LRU is the default — it's simple and fast.
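The operations above can be sketched in Python. `collections.OrderedDict` bundles the hash map and doubly linked list into a single structure (that is how CPython implements it internally), which keeps the sketch short; the class name and API are illustrative, not from the article:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache sketch: the OrderedDict's insertion order
    doubles as the recency list, front = oldest, back = most recent."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None                      # miss
        self._data.move_to_end(key)          # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)      # refresh recency on update
        elif len(self._data) >= self.capacity:
            self._data.popitem(last=False)   # evict least recently used
        self._data[key] = value
```

Every operation is a hash lookup plus a constant number of pointer moves, matching the O(1) bounds listed above.
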
Redis Default: Approximate LRU
Redis doesn't implement exact LRU — tracking precise recency for millions of keys is too expensive. Instead, when an eviction is needed, it samples 5 random keys (by default) and evicts the least recently used among them. This "approximated LRU" is more than 99% as effective as exact LRU with far less overhead. You can tune the sample size with the maxmemory-samples configuration directive.
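As a sketch, the relevant `redis.conf` settings might look like this (the values shown are illustrative, not recommendations):

```conf
maxmemory 256mb               # memory cap; evictions begin at this limit
maxmemory-policy allkeys-lru  # evict any key using approximated LRU
maxmemory-samples 10          # larger sample = closer to exact LRU, more CPU
```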

Advantages

  • LRU handles most workloads well
  • O(1) operations make caching fast
  • TTL provides automatic freshness

Disadvantages

  • Wrong policy wastes cache space
  • LFU penalizes new items
  • Exact LRU has memory overhead per key

🧪 Test Your Understanding

Knowledge Check (1/1)

What data structures does an LRU cache use for O(1) operations?