Problem

Design and implement a data structure for a Least Recently Used (LRU) cache. It should support the following operations: get and put.

Implement the LRUCache class:

  • LRUCache(int capacity) Initialize the LRU cache with positive size capacity.
  • int get(int key) Return the value of the key if the key exists, otherwise return -1.
  • void put(int key, int value) Update the value of the key if the key exists. Otherwise, add the key-value pair to the cache. If the number of keys exceeds the capacity from this operation, evict the least recently used key.

The functions get and put must each run in O(1) average time complexity.

More on the algorithm: Least Recently Used (LRU) Cache Replacement Algorithm

Examples

Example 1:

Input
["LRUCache", "put", "put", "get", "put", "get", "put", "get", "get", "get"]
[ [2], [1, 1], [2, 2], [1], [3, 3], [2], [4, 4], [1], [3], [4] ]
Output
[null, null, null, 1, null, -1, null, -1, 3, 4]

Explanation
LRUCache lRUCache = new LRUCache(2);
lRUCache.put(1, 1); // cache is {1=1}
lRUCache.put(2, 2); // cache is {1=1, 2=2}
lRUCache.get(1);    // return 1
lRUCache.put(3, 3); // LRU key was 2, evicts key 2, cache is {1=1, 3=3}
lRUCache.get(2);    // returns -1 (not found)
lRUCache.put(4, 4); // LRU key was 1, evicts key 1, cache is {4=4, 3=3}
lRUCache.get(1);    // return -1 (not found)
lRUCache.get(3);    // return 3
lRUCache.get(4);    // return 4

Solutions

Since it is a cache, we need to guarantee O(1) reads and writes, which hints at using a hash table. We also need some form of ordering to track recency, which a queue-like structure provides. We use two data structures to implement an LRU cache:

  1. Queue, implemented using a doubly linked list. The maximum size of the queue equals the number of frames available (the cache capacity). The most recently used (MRU) node sits near the tail, and the least recently used (LRU) node sits near the head. (A plain queue only removes from the front, but a doubly linked list lets us remove from either end, and from the middle in O(1) given a node reference.)
  2. Hash table, with the cache key as the key and a reference to the corresponding queue node as the value.

There are a couple of ways to solve this problem. Here are the ones we will cover:

  • Deque (doubly linked list) + HashMap
  • LinkedHashMap

Method 1 - Using Deque (Doubly Linked List) and HashMap

Here is how we will use both data structures:

  • The hashmap maps each key to its node in the doubly linked list (DLL).
  • The DLL keeps entries in recency order: a node is moved to the tail whenever its key is added or accessed.
  • The head is therefore the least recently used entry, which is what we evict when the cache is full.

Define a doubly linked list node:

class DLLNode<T> {
    T value;
    DLLNode<T> prev;
    DLLNode<T> next;

    public DLLNode(T value) {
        this.value = value;
    }
}

Because eviction also needs to remove the entry from the hashmap, we modify our DLLNode to store the key as well as the value:

class DLLNode {
    int key;
    int value;
    DLLNode prev;
    DLLNode next;

    public DLLNode(int key, int value) {
        this.key = key;
        this.value = value;
    }
}

Here is the LRUCache basic structure:

public class LRUCache {
    int capacity;
    HashMap<Integer, DLLNode> map = new HashMap<>();
    DLLNode head = null; // least recently used end
    DLLNode tail = null; // most recently used end

    public LRUCache(int capacity) {
        this.capacity = capacity;
    }

    public int get(int key);              // to be implemented below
    public void put(int key, int value);  // to be implemented below
}

Let's start with get():

public int get(int key) {
    if (!map.containsKey(key)) {
        return -1;
    }

    DLLNode n = map.get(key);
    remove(n);       // unlink the node from its current position
    offerAtTail(n);  // re-insert it at the tail (most recently used)
    return n.value;
}

So we just remove the node, append it to the tail again (making it the most recently used), and return the value. How does remove work?

remove and offerAtTail:

private void remove(DLLNode n) {
    // if n is not the head node, bypass it; otherwise advance head
    if (n.prev != null) {
        n.prev.next = n.next;
    } else {
        head = n.next;
    }

    // if n is not the tail node, bypass it; otherwise move tail back
    if (n.next != null) {
        n.next.prev = n.prev;
    } else {
        tail = n.prev;
    }
}

private void offerAtTail(DLLNode n) {
    // link the node after the current tail (if any)
    if (tail != null) {
        tail.next = n;
    }

    n.prev = tail;
    n.next = null;
    tail = n;

    // if the list was empty, the new node is also the head
    if (head == null) {
        head = tail;
    }
}

Here is the put method:

public void put(int key, int value) {
    if (map.containsKey(key)) {
        DLLNode old = map.get(key);
        old.value = value;
        remove(old);
        offerAtTail(old);
    } else {
        if (map.size() >= capacity) {
            map.remove(head.key); // this is why the node stores the key
            remove(head);
        }
        DLLNode created = new DLLNode(key, value);
        offerAtTail(created);
        map.put(key, created);
    }
}

Here is the full LRUCache:

import java.util.HashMap;
import java.util.Map;

public class LRUCache {

    static class DLLNodeKV {
        int key;
        int value;
        DLLNodeKV prev;
        DLLNodeKV next;

        public DLLNodeKV(int key, int value) {
            this.key = key;
            this.value = value;
        }
    }

    private final int capacity;
    private final Map<Integer, DLLNodeKV> map;
    DLLNodeKV head = null; // least recently used end
    DLLNodeKV tail = null; // most recently used end

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.map = new HashMap<>();
    }

    public int get(int key) {
        if (!map.containsKey(key)) {
            return -1;
        }

        DLLNodeKV n = map.get(key);
        remove(n);
        offerAtTail(n);
        return n.value;
    }

    public void put(int key, int value) {
        if (map.containsKey(key)) {
            DLLNodeKV old = map.get(key);
            old.value = value;
            remove(old);
            offerAtTail(old);
        } else {
            if (map.size() >= capacity) {
                map.remove(head.key); // this is why we store the key in the DLL node
                remove(head);
            }
            DLLNodeKV created = new DLLNodeKV(key, value);
            offerAtTail(created);
            map.put(key, created);
        }
    }

    private void remove(DLLNodeKV n) {
        // if n is not the head node, bypass it; otherwise advance head
        if (n.prev != null) {
            n.prev.next = n.next;
        } else {
            head = n.next;
        }

        // if n is not the tail node, bypass it; otherwise move tail back
        if (n.next != null) {
            n.next.prev = n.prev;
        } else {
            tail = n.prev;
        }
    }

    private void offerAtTail(DLLNodeKV n) {
        if (tail != null) {
            tail.next = n;
        }

        n.prev = tail;
        n.next = null;
        tail = n;

        if (head == null) {
            head = tail;
        }
    }
}
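
As a quick sanity check, here is the example sequence from the problem statement run against the class above (the wrapper class name LRUCacheDemo is just for illustration):

public class LRUCacheDemo {
    public static void main(String[] args) {
        LRUCache lRUCache = new LRUCache(2);
        lRUCache.put(1, 1);                  // cache is {1=1}
        lRUCache.put(2, 2);                  // cache is {1=1, 2=2}
        System.out.println(lRUCache.get(1)); // 1
        lRUCache.put(3, 3);                  // evicts key 2, cache is {1=1, 3=3}
        System.out.println(lRUCache.get(2)); // -1 (not found)
        lRUCache.put(4, 4);                  // evicts key 1, cache is {3=3, 4=4}
        System.out.println(lRUCache.get(1)); // -1 (not found)
        System.out.println(lRUCache.get(3)); // 3
        System.out.println(lRUCache.get(4)); // 4
    }
}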

Of course, in practice you probably shouldn't implement your own cache or linked list; there are many efficient implementations already available.

In Java you can implement an LRU cache in a few lines by reusing the built-in data structures, but here we wrote our own for learning purposes, to understand what is going on.

Note also that this implementation is not thread-safe.
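
If thread safety were needed, a minimal sketch is to serialize all access through a single lock; the wrapper name ThreadSafeLRUCache below is just an illustrative assumption, and a production cache would typically use finer-grained locking or an existing concurrent cache library.

// Minimal sketch: coarse-grained locking around the LRUCache from Method 1.
public class ThreadSafeLRUCache {
    private final LRUCache delegate;

    public ThreadSafeLRUCache(int capacity) {
        this.delegate = new LRUCache(capacity);
    }

    // Every operation takes the same monitor, so reads and writes never interleave.
    public synchronized int get(int key) {
        return delegate.get(key);
    }

    public synchronized void put(int key, int value) {
        delegate.put(key, value);
    }
}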

Method 2 - Using LinkedHashMap

Here is the LRU cache using LinkedHashMap.

Code

Java
import java.util.Iterator;
import java.util.LinkedHashMap;

public class LRUCache {

    private int capacity;
    private LinkedHashMap<Integer, Integer> map;

    public LRUCache(int capacity) {
        this.capacity = capacity;
        this.map = new LinkedHashMap<>();
    }

    public int get(int key) {
        Integer value = this.map.get(key);
        if (value == null) {
            value = -1;
        } else {
            // re-insert the entry so it moves to the end (most recently used)
            this.put(key, value);
        }
        return value;
    }

    public void put(int key, int value) {
        if (this.map.containsKey(key)) {
            this.map.remove(key);
        } else if (this.map.size() == this.capacity) {
            // evict the eldest (least recently used) entry at the front
            Iterator<Integer> it = this.map.keySet().iterator();
            it.next();
            it.remove();
        }
        map.put(key, value);
    }
}
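
To see the re-insert trick in action, here is a short driver (the class name LinkedHashMapLRUDemo is just illustrative):

public class LinkedHashMapLRUDemo {
    public static void main(String[] args) {
        LRUCache cache = new LRUCache(2);
        cache.put(1, 1);
        cache.put(2, 2);
        cache.get(1);                     // re-inserts key 1, so key 2 becomes the eldest
        cache.put(3, 3);                  // evicts key 2
        System.out.println(cache.get(2)); // -1
        System.out.println(cache.get(1)); // 1
    }
}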

We can make code even simpler:

import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int cacheSize;

    public LRUCache(int cacheSize) {
        // initial capacity 16, load factor 0.75, accessOrder = true
        super(16, 0.75f, true);
        this.cacheSize = cacheSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // evict the eldest entry once the map grows beyond the configured size
        return size() > cacheSize;
    }
}

The only catch is that by default LinkedHashMap keeps entries in insertion order, not access order. However, one of its constructors exposes a flag (the third argument above) to use access order instead, which is exactly what an LRU cache needs.
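
For example, with access order enabled the subclass above behaves as an LRU cache out of the box (the demo class name is an illustrative assumption):

public class AccessOrderDemo {
    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(2);
        cache.put(1, "one");
        cache.put(2, "two");
        cache.get(1);                       // accessOrder = true marks key 1 as recently used
        cache.put(3, "three");              // evicts key 2, the least recently accessed
        System.out.println(cache.keySet()); // [1, 3]
    }
}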