How to implement an LRU Cache? I was asked this question in a phone interview a long time ago. If you have rarely used or even heard the term "cache", this question will come as quite a surprise. I am sure you have heard of caches in CPUs or HDDs, but implementing a cache yourself? I remember that when I first came across this question I was quite surprised myself and needed to think hard before I could even understand what was being asked.
Understanding a problem is the first step towards solving it. So the first thing you should do is ask yourself: what is an LRU cache? LRU stands for Least Recently Used. An LRU Cache, then, is a fixed-size component that speeds up future requests by keeping recently used items in order from most to least recently used, and discarding the least recently used item when a new one comes in and the cache is full.
Some design specifications worth mentioning for an LRU cache:
- Cache hits need to be as fast as possible (meaning O(1))
- We need to keep an updated list of items, so that we can easily move items to the top of the cache and easily discard items from the bottom of the cache
So while we think about these design specifications, we can further specify the data structures needed.
- To guarantee access in O(1) time on hits, we keep all recently used items in a HashMap data structure
- To guarantee a continuously updated usage order, we keep some kind of list in the form of a Doubly Linked List, where we can easily pinpoint the head and the tail of the list
Let’s take a look at what the whole picture with both data structures would look like:
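As a rough sketch in Java (all names here, such as Node, index, head and tail, are my own and not from any standard library), the whole picture with both data structures could look like this:

```java
import java.util.HashMap;

// Sketch of the two cooperating structures: a hash map for O(1) lookup
// and a doubly linked list ordered from most to least recently used.
// All names here are illustrative, not from any library.
public class LRUCacheSkeleton<K, V> {

    // Doubly linked list node holding one cache entry
    private static class Node<K, V> {
        K key;
        V value;
        Node<K, V> prev, next;
    }

    private final int capacity;                 // maximum number of entries
    private final HashMap<K, Node<K, V>> index; // key -> list node, O(1) hits
    private Node<K, V> head;                    // most recently used
    private Node<K, V> tail;                    // least recently used, evicted first

    public LRUCacheSkeleton(int capacity) {
        this.capacity = capacity;
        this.index = new HashMap<>(capacity);
    }
}
```

On a hit we would look the key up in the map, unlink its node, and relink it at the head; on a miss with a full cache we would unlink the tail node and remove its key from the map.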
Thankfully, Java already has a ton of data structures ready for use. For our current task, in fact, there is a hidden gem that makes the job fairly trivial. This simple approach is to use the already-defined java.util.LinkedHashMap (documentation found in the LinkedHashMap API), so let's start with this.
How to implement a LRU cache – Java simple implementation
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<Key, Value> extends LinkedHashMap<Key, Value> {

    private final int size; // maximum number of entries in the cache

    public LRUCache(final int size) {
        // 1.0f is the loadFactor, or in cheap words, how full each
        // bucket/slot of the hashtable may get before resizing;
        // true switches the map to access order, which is exactly LRU order
        super(size + 1, 1.0f, true);
        this.size = size;
    }

    /**
     * @see java.util.LinkedHashMap#removeEldestEntry(Map.Entry)
     */
    @Override
    protected boolean removeEldestEntry(final Map.Entry<Key, Value> eldest) {
        return super.size() > size;
    }
}
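To see the class in action, here is a small, hypothetical usage sketch (the demo class name is mine, and the cache class is inlined so the example is self-contained), showing that the least recently used entry gets evicted once the capacity is exceeded:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCacheDemo {

    // Same LinkedHashMap-based cache as above, inlined for self-containment
    static class LRUCache<Key, Value> extends LinkedHashMap<Key, Value> {
        private final int size;

        LRUCache(final int size) {
            super(size + 1, 1.0f, true); // access order = LRU order
            this.size = size;
        }

        @Override
        protected boolean removeEldestEntry(final Map.Entry<Key, Value> eldest) {
            return super.size() > size;
        }
    }

    public static void main(String[] args) {
        LRUCache<String, Integer> cache = new LRUCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");      // touch "a" so "b" becomes least recently used
        cache.put("c", 3);   // evicts "b", the least recently used entry
        System.out.println(cache.keySet()); // prints [a, c]
    }
}
```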
Good, but this answer is too specific to the language. Chances are that in most other languages we will not have such a near-perfect class for our LRU cache; instead, we would have to build the data structure ourselves. Your interviewer will most likely want to know more, too. For this reason, we will come up with a solution that efficiently uses both data structures mentioned at the beginning: some kind of Map and some kind of Linked List.
Not only that, but to follow up and make your answer crisper, you might want to think about how parallel computing applies to our current solution. The topic at hand is called Concurrency, and indeed, in real life you will need to worry about how your implementation fits into a multi-threaded application. The answer is astonishingly simple: Java provides all the help you need through the static method Collections.synchronizedMap.
(check Collections API for more info)
How to implement a LRU cache – Java synchronizedMap example
Map<Key, Value> example = Collections.synchronizedMap(new LRUCache<Key, Value>(CACHE_SIZE));
Some food for thought on how this is going to pan out:
- If an item exists in the cache – Cache Hit -> Remove the element from the Linked List and add it to the beginning
- If an item does not exist in the cache – Cache Miss -> If cache is full / out of capacity, remove the oldest entry in the Linked List. Also add the new one to both Linked List and Map
Fortunately, once again we get a lot of help when it comes to dealing with concurrency, because we can use some pre-defined data structures to implement our LRU Cache in Java. In fact, if instead of Java's LinkedHashMap we search the library a bit more, we will find ConcurrentHashMap and ConcurrentLinkedQueue, which support concurrency and provide pretty much everything we need, so that we do not need to synchronize actions ourselves.
How to implement a LRU cache – Java in-depth implementation
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

public class LRUCache<Key, Value> {

    private final int size;
    private final ConcurrentLinkedQueue<Key> linkedQueue;
    private final ConcurrentHashMap<Key, Value> hashMap;

    /**
     * Cache constructor
     * @param size - size of our cache
     */
    public LRUCache(final int size) {
        this.size = size;
        this.linkedQueue = new ConcurrentLinkedQueue<Key>();
        this.hashMap = new ConcurrentHashMap<Key, Value>(size);
    }

    /**
     * Return the Value corresponding to the given Key; return null if the
     * key is not found. On a hit, re-insert the key so that it moves to the
     * most recently used end of the queue.
     * @param key - Key
     * @return value - Value, or null on a miss
     */
    public Value get(final Key key) {
        Value value = hashMap.get(key);
        if (value != null) {
            put(key, value); // refresh this key's position in the queue
        }
        return value;
    }

    /**
     * Add a new Key, Value pair to our Map and Queue. If the Key already
     * exists, move it to the most recently used end of the queue.
     * Note: linkedQueue.remove(key) is O(N) on a ConcurrentLinkedQueue.
     * @param key - Key
     * @param value - Value
     */
    public synchronized void put(final Key key, final Value value) {
        if (hashMap.containsKey(key)) {
            linkedQueue.remove(key);
        }
        // evict least recently used entries while the cache is at capacity
        while (linkedQueue.size() >= size) {
            Key oldestKey = linkedQueue.poll();
            if (oldestKey != null) {
                hashMap.remove(oldestKey);
            }
        }
        linkedQueue.add(key);
        hashMap.put(key, value);
    }
}
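Here is a small, self-contained usage sketch of this map-plus-queue design (the demo class name is my own, and the cache is inlined in compact form so the example runs on its own):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

public class ConcurrentLRUDemo {

    // Compact version of the cache above: the queue keeps usage order,
    // the map holds the values. Inlined here for self-containment.
    static class LRUCache<Key, Value> {
        private final int size;
        private final ConcurrentLinkedQueue<Key> linkedQueue = new ConcurrentLinkedQueue<>();
        private final ConcurrentHashMap<Key, Value> hashMap;

        LRUCache(final int size) {
            this.size = size;
            this.hashMap = new ConcurrentHashMap<>(size);
        }

        public Value get(final Key key) {
            Value value = hashMap.get(key);
            if (value != null) {
                put(key, value); // a hit refreshes the key's queue position
            }
            return value;
        }

        public synchronized void put(final Key key, final Value value) {
            if (hashMap.containsKey(key)) {
                linkedQueue.remove(key); // drop the stale queue position
            }
            while (linkedQueue.size() >= size) {
                Key oldest = linkedQueue.poll(); // evict least recently used
                if (oldest != null) {
                    hashMap.remove(oldest);
                }
            }
            linkedQueue.add(key);
            hashMap.put(key, value);
        }

        public boolean containsKey(final Key key) {
            return hashMap.containsKey(key);
        }
    }

    public static void main(String[] args) {
        LRUCache<String, Integer> cache = new LRUCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");     // "a" is now the most recently used entry
        cache.put("c", 3);  // evicts "b"
        System.out.println(cache.containsKey("b")); // prints false
        System.out.println(cache.get("a"));         // prints 1
    }
}
```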
Looks pretty good, does it not? Now you can say you know how to implement an LRU cache in Java, and the next time the question comes up in an interview, you (and I) will ace it with quite a few different approaches and in-depth answers.
Hope you guys enjoyed it… And I’ll see you guys next time ;D
Written by Ajk
I like your code and it is a very good example of an LRU cache. I have one question though. In put(..) you check whether the hashMap contains the key, and if so you remove the key from the linkedQueue. You then check whether the linkedQueue size is greater than or equal to the fixed size; if it is smaller, the code that deletes the key and value from the hashMap is skipped, and right after that you insert both key and value into the hashMap again. Wouldn't this use case create a duplicate key/value entry in the hashMap? Please let me know if you think otherwise.
From Java’s HashMap API
So putting it again is simply going to replace the key.
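To illustrate that point with a minimal check (the demo class name is mine): put on an existing key replaces the value and returns the old one, so no duplicate entry is created.

```java
import java.util.concurrent.ConcurrentHashMap;

public class PutReplaceDemo {
    public static void main(String[] args) {
        ConcurrentHashMap<String, Integer> map = new ConcurrentHashMap<>();
        map.put("key", 1);
        Integer previous = map.put("key", 2); // returns the old value, 1
        System.out.println(previous);         // prints 1
        System.out.println(map.size());       // prints 1, still one entry
        System.out.println(map.get("key"));   // prints 2
    }
}
```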
Thanks, that clears the doubt.
Hi, in order to correctly implement the LRU list of items, the "get" method should update the queue by moving the accessed item to the most recently used position:

public Integer get(Integer key) {
    Integer value = hashMap.get(key);
    if (value != null) {
        linkedQueue.remove(key);
        linkedQueue.add(key);
    }
    return value;
}
There are major problems with the implementation given here.
1. In the get() method, you need to move the key's item to the head of the queue. The code is completely missing this action.
2. In the put() method, you call linkedQueue.remove(key). Because ConcurrentLinkedQueue is implemented as a linked list, this remove() call will take O(N) time to find the key, while the put() call should be O(1) time.