A generic, fixed-capacity Least Recently Used (LRU) Cache implemented from scratch using a HashMap + custom Doubly Linked List — achieving O(1) time complexity for both get and put.
```
LRUCache<K, V>
├── HashMap<K, CacheEntry<K,V>>   ← O(1) key lookup
├── CacheEntry<K,V> head          ← sentinel (dummy head)
└── CacheEntry<K,V> tail          ← sentinel (dummy tail)

head ↔ [MRU] ↔ ... ↔ [LRU] ↔ tail
```
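In Java, the structure above might look like the following minimal field layout (an illustrative sketch; the names mirror the diagram rather than the actual source files):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative skeleton: the two collaborating pieces from the diagram.
class CacheEntry<K, V> {
    K key;
    V value;
    CacheEntry<K, V> prev, next; // doubly linked list pointers
    CacheEntry(K key, V value) { this.key = key; this.value = value; }
}

class LRUCache<K, V> {
    private final int capacity;
    private final Map<K, CacheEntry<K, V>> map = new HashMap<>(); // O(1) key lookup
    private final CacheEntry<K, V> head = new CacheEntry<>(null, null); // dummy head sentinel
    private final CacheEntry<K, V> tail = new CacheEntry<>(null, null); // dummy tail sentinel

    LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail; // empty list: head ↔ tail
        tail.prev = head;
    }
}
```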
| Class | Role |
|---|---|
| LRUCache<K, V> | The cache. Generic over key and value types. Maintains a HashMap for O(1) lookup and a doubly linked list for O(1) access-order tracking. |
| CacheEntry<K, V> | A node in the doubly linked list. Holds key, value, prev, and next pointers. |
| Requirement | Solution |
|---|---|
| O(1) lookup by key | HashMap<K, CacheEntry> |
| O(1) track & update access order | Doubly Linked List (move node to front on access) |
| O(1) evict LRU entry | Always tail.prev — remove in O(1) |
A singly linked list would require O(n) to find a node's predecessor. Java's built-in LinkedHashMap (with accessOrder set to true) provides the same behavior, but implementing it from scratch demonstrates the underlying mechanism.
head and tail are dummy sentinel nodes (holding null key/value). They permanently anchor the list, eliminating all null checks in pointer manipulation:
```
head ↔ [entry1 (MRU)] ↔ [entry2] ↔ [entry3 (LRU)] ↔ tail
```
On every get or put, the accessed entry is moved to head.next (front = Most Recently Used). Eviction always removes tail.prev (back = Least Recently Used).
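Because the sentinels always exist, unlinking a node and inserting at the front never need null checks on the neighbors. A minimal standalone sketch (Node is a stand-in for CacheEntry, with an int key for brevity):

```java
class SentinelListDemo {
    static class Node {
        int key;
        Node prev, next;
        Node(int key) { this.key = key; }
    }

    // Unlink n from the list; sentinels guarantee prev and next are non-null.
    static void remove(Node n) {
        n.prev.next = n.next;
        n.next.prev = n.prev;
    }

    // Insert n right after head (the MRU position).
    static void addToFront(Node head, Node n) {
        n.next = head.next;
        n.prev = head;
        head.next.prev = n;
        head.next = n;
    }

    public static void main(String[] args) {
        Node head = new Node(-1), tail = new Node(-1); // sentinels
        head.next = tail;
        tail.prev = head;
        addToFront(head, new Node(1)); // head ↔ 1 ↔ tail
        addToFront(head, new Node(2)); // head ↔ 2 ↔ 1 ↔ tail
        remove(tail.prev);             // evict LRU (1): head ↔ 2 ↔ tail
        System.out.println(head.next.key + " " + tail.prev.key); // prints "2 2"
    }
}
```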
LRUCache<K, V> works for any key/value types:
```java
LRUCache<Integer, String> cache = new LRUCache<>(3);
LRUCache<String, User> cache = new LRUCache<>(100);
```

get flow:

1. cache.get(key) → CacheEntry (HashMap lookup)
2. moveToFront(entry) (removeNode + addToFront)
3. return entry.value
put flow:

1. cache.get(key) → if exists: update value, moveToFront, return
2. if cache.size() == capacity:
evictLeastRecentlyUsed() ← remove tail.prev from list + map
3. newEntry = new CacheEntry(key, value)
4. cache.put(key, newEntry)
5. addToFront(newEntry) ← insert after head
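Putting the two flows together, a compact sketch of get and put (a simplified version for illustration; the real implementation lives in LRUCache.java):

```java
import java.util.HashMap;
import java.util.Map;

// Simplified sketch of the get/put flows described above.
class LRUCache<K, V> {
    private final int capacity;
    private final Map<K, CacheEntry<K, V>> map = new HashMap<>();
    private final CacheEntry<K, V> head = new CacheEntry<>(null, null); // sentinel
    private final CacheEntry<K, V> tail = new CacheEntry<>(null, null); // sentinel

    LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        CacheEntry<K, V> e = map.get(key); // 1. O(1) HashMap lookup
        if (e == null) return null;
        moveToFront(e);                    // 2. mark as most recently used
        return e.value;                    // 3.
    }

    public void put(K key, V value) {
        CacheEntry<K, V> e = map.get(key);
        if (e != null) {                   // 1. key exists: update + promote
            e.value = value;
            moveToFront(e);
            return;
        }
        if (map.size() == capacity) {      // 2. full: evict LRU = tail.prev
            CacheEntry<K, V> lru = tail.prev;
            removeNode(lru);
            map.remove(lru.key);
        }
        CacheEntry<K, V> entry = new CacheEntry<>(key, value); // 3.
        map.put(key, entry);               // 4.
        addToFront(entry);                 // 5. insert right after head
    }

    private void moveToFront(CacheEntry<K, V> e) { removeNode(e); addToFront(e); }

    private void removeNode(CacheEntry<K, V> e) {
        e.prev.next = e.next;
        e.next.prev = e.prev;
    }

    private void addToFront(CacheEntry<K, V> e) {
        e.next = head.next;
        e.prev = head;
        head.next.prev = e;
        head.next = e;
    }
}

class CacheEntry<K, V> {
    final K key;
    V value;
    CacheEntry<K, V> prev, next;
    CacheEntry(K key, V value) { this.key = key; this.value = value; }
}
```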
```
removeNode(entry):
    entry.prev.next = entry.next
    entry.next.prev = entry.prev

addToFront(entry):
    entry.next = head.next
    entry.prev = head
    head.next.prev = entry
    head.next = entry
```

| Operation | Time | Space |
|---|---|---|
| get | O(1) | — |
| put | O(1) | — |
| Overall | — | O(capacity) |
| Pattern | Where |
|---|---|
| Doubly Linked List + HashMap | Core data structure combination |
| Sentinel Node | head and tail dummy nodes simplify edge cases |
| Generic Programming | LRUCache<K,V> works with any types |
```
src/main/java/org/example/
├── LRUCache.java      ← Core cache: HashMap + DLL logic
├── CacheEntry.java    ← DLL node: key, value, prev, next
└── Main.java          ← Demo with get/put scenarios
```
```bash
mvn compile exec:java -Dexec.mainClass="org.example.Main"
```