Understanding LRU Cache: Interview Thought Process and Implementation
This article explains why technical interviews focus on problem‑solving skills, outlines a four‑step interview approach, introduces the LRU cache eviction policy, analyzes required operations, selects HashMap plus a doubly linked list as the optimal data structures, and provides a complete Java implementation with detailed commentary.
Before diving into the classic LRU Cache problem, the author discusses what technical interviews really assess: the ability to think critically, analyze and solve problems, and communicate effectively with teammates.
The recommended interview workflow consists of four steps: clarify the problem, analyze possible solutions and their time/space complexities, write the code, and run test cases to verify correctness.
LRU (Least Recently Used) is a cache eviction policy that removes the least recently accessed items when the cache reaches its capacity, based on the assumption that recently used data is more likely to be needed again.
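The policy itself can be observed with the JDK's own `LinkedHashMap`, which (with `accessOrder=true`) keeps entries ordered from least to most recently used and can evict automatically via `removeEldestEntry`. The class and method names below are illustrative, not part of the article's solution:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruPolicyDemo {
    // Build a capacity-bounded map that evicts its least recently used entry.
    static <K, V> Map<K, V> lruMap(int capacity) {
        return new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity; // evict once capacity is exceeded
            }
        };
    }

    public static void main(String[] args) {
        Map<Integer, String> cache = lruMap(2);
        cache.put(1, "a");
        cache.put(2, "b");
        cache.get(1);                       // touch key 1; key 2 is now least recently used
        cache.put(3, "c");                  // over capacity: key 2 is evicted
        System.out.println(cache.keySet()); // prints [1, 3]
    }
}
```

This built-in shortcut is handy for sanity checks, but interviews expect the hand-rolled version the article develops next.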
To implement an LRU cache efficiently, the required operations are identified: get a value, insert a new key‑value pair, evict the oldest entry when full, and update the usage order when a key already exists. These operations demand O(1) time for both lookup and updates.
After evaluating common data structures, the author concludes that a combination of a HashMap (for O(1) key lookup) and a doubly linked list (for O(1) insertion, deletion, and movement of nodes) is optimal. The HashMap stores references to the linked‑list nodes, while the linked list maintains the usage order.
The article then presents a complete Java implementation of an LRU cache, including the Node class, the get and put methods, and the helper functions appendHead and remove. The code preserves the O(1) performance guarantees by updating both structures together.
import java.util.HashMap;
import java.util.Map;

class LRUCache {
    // HashMap: key -> node, for O(1) lookup of any cached entry.
    // Doubly linked list: head is the most recently used node, tail the least.
    public static class Node {
        int key;
        int val;
        Node next;
        Node prev;

        public Node(int key, int val) {
            this.key = key;
            this.val = val;
        }
    }

    private Map<Integer, Node> map = new HashMap<>();
    private Node head;
    private Node tail;
    private int cap;

    public LRUCache(int capacity) { cap = capacity; }

    public int get(int key) {
        Node node = map.get(key);
        if (node == null) return -1;
        int res = node.val;
        // Move the accessed node to the head to mark it most recently used.
        remove(node);
        appendHead(node);
        return res;
    }

    public void put(int key, int value) {
        Node node = map.get(key);
        if (node != null) {
            // Key already cached: update its value and move it to the head.
            node.val = value;
            remove(node);
            appendHead(node);
        } else {
            node = new Node(key, value);
            if (map.size() >= cap) {
                // Cache full: evict the least recently used entry (the tail).
                map.remove(tail.key);
                remove(tail);
            }
            appendHead(node);
            map.put(key, node);
        }
    }

    private void appendHead(Node node) {
        if (head == null) {
            head = tail = node;
        } else {
            node.next = head;
            head.prev = node;
            head = node;
        }
    }

    private void remove(Node node) {
        if (head == tail) {
            head = tail = null;
        } else if (head == node) {
            head = head.next;
            head.prev = null; // the new head must not point back to the removed node
            node.next = null;
        } else if (tail == node) {
            tail = tail.prev;
            tail.next = null;
            node.prev = null;
        } else {
            node.prev.next = node.next;
            node.next.prev = node.prev;
            node.prev = null;
            node.next = null;
        }
    }
}
/**
* Your LRUCache object will be instantiated and called as such:
* LRUCache obj = new LRUCache(capacity);
* int param_1 = obj.get(key);
* obj.put(key,value);
*/

Finally, the author reflects on why algorithm‑focused interviews are prevalent, emphasizing that they evaluate a candidate’s problem‑solving mindset rather than rote knowledge, and that mastering such fundamentals equips engineers to tackle unknown challenges effectively.
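One way to check an implementation like the one above is a short driver that replays a typical get/put sequence for this problem. The snippet below embeds its own compact copy of the HashMap‑plus‑doubly‑linked‑list cache (names mirror the article's code) so that it compiles on its own:

```java
import java.util.HashMap;
import java.util.Map;

public class LRUCacheDriver {
    // Compact copy of the cache described in the article, repeated here
    // so this snippet is self-contained.
    static class LRUCache {
        static class Node {
            int key, val;
            Node prev, next;
            Node(int key, int val) { this.key = key; this.val = val; }
        }

        private final Map<Integer, Node> map = new HashMap<>();
        private Node head, tail;
        private final int cap;

        LRUCache(int capacity) { cap = capacity; }

        int get(int key) {
            Node node = map.get(key);
            if (node == null) return -1;
            remove(node);
            appendHead(node); // mark as most recently used
            return node.val;
        }

        void put(int key, int value) {
            Node node = map.get(key);
            if (node != null) {       // existing key: update and move to head
                node.val = value;
                remove(node);
                appendHead(node);
                return;
            }
            if (map.size() >= cap) {  // full: evict least recently used (tail)
                map.remove(tail.key);
                remove(tail);
            }
            node = new Node(key, value);
            appendHead(node);
            map.put(key, node);
        }

        private void appendHead(Node node) {
            if (head == null) { head = tail = node; return; }
            node.next = head;
            head.prev = node;
            head = node;
        }

        private void remove(Node node) {
            if (node.prev != null) node.prev.next = node.next; else head = node.next;
            if (node.next != null) node.next.prev = node.prev; else tail = node.prev;
            node.prev = node.next = null;
        }
    }

    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);
        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.get(1)); // 1
        cache.put(3, 3);                  // evicts key 2
        System.out.println(cache.get(2)); // -1
        cache.put(4, 4);                  // evicts key 1
        System.out.println(cache.get(1)); // -1
        System.out.println(cache.get(3)); // 3
        System.out.println(cache.get(4)); // 4
    }
}
```

Note the `remove` helper here uses null checks on `prev`/`next` instead of explicit head/tail branches; the behavior is the same, just expressed more compactly.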