🗑️ Key Takeaways
- LRU (Least Recently Used) is the most common policy and generally the best default choice
- LFU (Least Frequently Used) is better for workloads with a stable set of popular items
- TTL (Time-To-Live) provides time-based automatic expiration, independent of other policies
- The right policy depends on your access pattern; there is no universal best
When the Cache is Full
Caches have finite memory. When the cache is full and a new item needs to be stored, an existing item must be evicted. The eviction policy determines which item to remove. Choosing the wrong policy wastes cache space on items that won't be accessed again.
Eviction Policies Compared
| Policy | Evicts | Good For | Weakness |
|---|---|---|---|
| LRU | Least recently accessed item | General purpose, temporal locality | Doesn't account for frequency |
| LFU | Least frequently accessed item | Stable popularity distribution | New items may be evicted before getting popular |
| FIFO | Oldest item (first in, first out) | Simplest implementation | Ignores access patterns entirely |
| Random | Random item | When access is unpredictable | May evict popular items |
| TTL | Items past their expiration time | Time-sensitive data | Not space-based; the cache can still overflow |
| ARC | Adaptive — tracks both recency and frequency | Mixed workloads | More complex implementation |
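As a concrete illustration of the TTL row, here is a minimal Python sketch (the class name `TTLCache` and the lazy expire-on-read behavior are assumptions for illustration, not from any particular library). Entries carry an expiry timestamp and are dropped when read after it, which also shows the table's caveat: TTL alone puts no bound on how many unexpired entries accumulate.

```python
import time

class TTLCache:
    """Minimal TTL sketch: each entry expires ttl_seconds after it is
    written. Expired entries are evicted lazily, on the next read.
    Note: TTL alone is not space-bounded; unexpired entries pile up."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.data = {}  # key -> (value, expiry timestamp)

    def put(self, key, value):
        self.data[key] = (value, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self.data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() > expires_at:
            del self.data[key]  # expired: evict lazily on read
            return None
        return value
```

In practice TTL is usually combined with a space-based policy such as LRU, so that stale items expire on time and the cache still respects its memory limit.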
LRU Cache Implementation (Concept)
LRU Cache = Hash Map + Doubly Linked List
Operations:
GET(key):
1. Look up key in hash map → O(1)
2. If found, move its node to the front of the linked list (most recent) → O(1)
3. Return the value
PUT(key, value):
1. If the key exists, update the value and move its node to the front → O(1)
2. If the cache is full, remove the last node (least recent) and delete its hash map entry → O(1)
3. Insert a new node at the front and add it to the hash map → O(1)
All operations: O(1) time
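The hash-map-plus-linked-list design above can be sketched in Python. Here `collections.OrderedDict` plays both roles at once: it is a hash map that also maintains recency order, standing in for the doubly linked list (a sketch of the concept, not a production implementation):

```python
from collections import OrderedDict

class LRUCache:
    """LRU cache sketch: OrderedDict gives O(1) lookup plus an
    order we treat as recency (front = least recent, end = most recent)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)  # refresh recency on update
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
```

For example, with `capacity=2`: after `put("a", 1)`, `put("b", 2)`, `get("a")`, a `put("c", 3)` evicts `"b"`, because reading `"a"` made `"b"` the least recently used key.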
This is why LRU is the default: it's simple and fast.
✅ Redis Default: Approximate LRU
Redis doesn't implement exact LRU (too expensive for millions of keys). Instead, it samples 5 random keys and evicts the least recently used among those. This 'approximated LRU' is >99% as effective as exact LRU with much less overhead. You can tune the sample size with maxmemory-samples.
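The sampling idea can be illustrated with a short Python sketch (the function name and the `last_access` dict mapping each key to its last-access time are hypothetical stand-ins, not Redis internals): instead of maintaining an exact global recency order, pick a few random keys and evict the stalest among them.

```python
import random

def evict_sampled_lru(last_access: dict, samples: int = 5):
    """Approximated LRU in the style Redis uses: sample up to
    `samples` random keys and evict the one accessed longest ago,
    avoiding the cost of tracking exact recency across all keys."""
    pool = random.sample(list(last_access), min(samples, len(last_access)))
    victim = min(pool, key=lambda k: last_access[k])  # stalest in sample
    del last_access[victim]
    return victim
```

With a larger sample size the chosen victim approaches the true least recently used key, which is exactly the trade-off `maxmemory-samples` tunes: more accuracy per eviction at the cost of more work.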
Advantages
- LRU handles most workloads well
- O(1) operations make caching fast
- TTL provides automatic freshness
Disadvantages
- The wrong policy wastes cache space
- LFU penalizes new items
- Exact LRU adds memory overhead per key
🧪 Test Your Understanding
Knowledge Check
What data structures does an LRU cache use for O(1) operations?