Problem
Implement an LRU cache.
Solution
A Least Recently Used (LRU) cache discards the least recently used items first when it needs to make room.
How do you design and implement such a cache class? The design requirements are as follows:
1) Find an item as fast as possible.
2) On a cache miss when the cache is full, replace the least recently used item as fast as possible.
We use two data structures to implement an LRU cache:
1. A Queue which is implemented using a doubly linked list. The maximum size of the queue will be equal to the total number of frames available (cache size).
The most recently used pages will be near the front end and the least recently used pages will be near the rear end.
2. A hash map with the page number as key and the address of the corresponding queue node as value.
When a page is referenced, it may already be in memory. If it is, we detach its node from the list and move it to the front of the queue.
If the required page is not in memory, we bring it into memory. In simple words, we add a new node to the front of the queue and record the node's address in the hash map. If the queue is full, i.e. all the frames are occupied, we remove the node at the rear of the queue and then add the new node to the front.
Note: Initially no page is in the memory.
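A minimal sketch of this design in Java, assuming integer page numbers; the class and method names here (PageCache, refer) are illustrative, not from the original:

```java
import java.util.HashMap;

// A doubly linked list acts as the queue (head = most recently used,
// tail = least recently used) and a HashMap maps a page number to its node.
class PageCache {
    private static class Node {
        int page;
        Node prev, next;
        Node(int page) { this.page = page; }
    }

    private final int capacity;                       // number of frames
    private final HashMap<Integer, Node> map = new HashMap<>();
    private Node head, tail;                          // front and rear of the queue

    PageCache(int capacity) { this.capacity = capacity; }

    // Reference a page: on a hit, detach the node and move it to the front;
    // on a miss, add a new node at the front, evicting from the rear if full.
    void refer(int page) {
        Node node = map.get(page);
        if (node != null) {              // hit: move to the front
            detach(node);
        } else {                         // miss: bring the page "into memory"
            if (map.size() == capacity) {
                map.remove(tail.page);   // evict least recently used (rear)
                detach(tail);
            }
            node = new Node(page);
            map.put(page, node);
        }
        addToFront(node);
    }

    private void detach(Node node) {
        if (node.prev != null) node.prev.next = node.next; else head = node.next;
        if (node.next != null) node.next.prev = node.prev; else tail = node.prev;
        node.prev = node.next = null;
    }

    private void addToFront(Node node) {
        node.next = head;
        if (head != null) head.prev = node;
        head = node;
        if (tail == null) tail = node;
    }
}
```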
A linked list plus a hash table of pointers to the linked-list nodes is the usual way to implement LRU caches. This gives O(1) operations (assuming a decent hash). An advantage of everything being O(1) is that you can build a multithreaded version by simply locking the whole structure; you don't have to worry about granular locking.
Briefly, the way it works:
On an access of a value, you move the corresponding node in the linked list to the head.
When you need to evict a value from the cache, you remove it from the tail end.
When you add a value to the cache, you place it at the head of the linked list.
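For example, tracing a few references through the PageCache sketch above with a capacity of 2 frames:

```java
PageCache cache = new PageCache(2);
cache.refer(1);   // miss: queue is [1]
cache.refer(2);   // miss: queue is [2, 1]
cache.refer(1);   // hit: 1 moves to the head, queue is [1, 2]
cache.refer(3);   // miss on a full cache: 2 is evicted from the rear, queue is [3, 1]
```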
Assuming the hash map has O(1) retrieval, we can do this using a doubly linked list and a hash map.
The doubly linked list stores the entries sorted by access date/timestamp.
Keep track of the first and last nodes.
No sorting effort is needed, since each new node has the latest timestamp and is simply placed at the front.
The hash map retrieves a node by key: each entry holds a pointer to its corresponding doubly-linked-list node.
Operations
Here is how we will handle each cache operation.
READ -
Look up the node by key in the hash map and retrieve its value and its node in the doubly linked list.
Detach the node from the doubly linked list and re-insert it, with an updated access time, at the start of the list.
Deleting a node from a doubly linked list is O(1), since the hash map lets us reach it without traversing the list.
INSERT -
Add the new node to the hash map and to the start of the doubly linked list.
If the cache is full, use the pointer to the last node in the doubly linked list and delete that node from both structures, updating the tail pointer appropriately.
Inserting at the start of the linked list is an O(1) operation.
UPDATE -
An update needs the same accesses as a READ, so the same O(1) time complexities apply: write the new value, then move the node to the front.
DELETE -
Deleting a node from both the doubly linked list and the hash map can be done in O(1).
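The four operations above map directly onto a small key/value cache class. A minimal sketch, assuming the same doubly-linked-list-plus-hash-map design (names such as LRUCache, get, put, and remove are illustrative):

```java
import java.util.HashMap;

// Recency is tracked purely by list position (head = most recent),
// so no explicit timestamps are needed.
class LRUCache<K, V> {
    private static class Node<K, V> {
        K key; V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final HashMap<K, Node<K, V>> map = new HashMap<>();
    private Node<K, V> head, tail;

    LRUCache(int capacity) { this.capacity = capacity; }

    // READ: find the node via the hash map, move it to the head, return its value.
    V get(K key) {
        Node<K, V> node = map.get(key);
        if (node == null) return null;
        moveToHead(node);
        return node.value;
    }

    // INSERT / UPDATE: update in place if present, else insert at the head,
    // evicting the tail (least recently used) if the cache is full.
    void put(K key, V value) {
        Node<K, V> node = map.get(key);
        if (node != null) {               // UPDATE: same accesses as READ
            node.value = value;
            moveToHead(node);
            return;
        }
        if (map.size() == capacity) {     // evict via the tail pointer
            map.remove(tail.key);
            unlink(tail);
        }
        node = new Node<>(key, value);
        map.put(key, node);
        addFirst(node);
    }

    // DELETE: O(1) removal from both the hash map and the list.
    void remove(K key) {
        Node<K, V> node = map.remove(key);
        if (node != null) unlink(node);
    }

    private void moveToHead(Node<K, V> node) { unlink(node); addFirst(node); }

    private void unlink(Node<K, V> node) {
        if (node.prev != null) node.prev.next = node.next; else head = node.next;
        if (node.next != null) node.next.prev = node.prev; else tail = node.prev;
        node.prev = node.next = null;
    }

    private void addFirst(Node<K, V> node) {
        node.next = head;
        if (head != null) head.prev = node;
        head = node;
        if (tail == null) tail = node;
    }
}
```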
Java's LinkedHashMap can also be used directly, explained well here: http://java-planet.blogspot.com/2005/08/how-to-set-up-simple-lru-cache-using.html
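A minimal sketch of that approach: constructing LinkedHashMap with accessOrder = true keeps entries in access order, and overriding removeEldestEntry handles eviction (the class name SimpleLruCache is illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

class SimpleLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    SimpleLruCache(int capacity) {
        super(capacity, 0.75f, true);   // true = iterate in access order, not insertion order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;       // evict the least recently used entry past capacity
    }
}
```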
But what if the environment is multi-threaded and we want a concurrent solution?
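One coarse-grained option, echoing the earlier remark that you can simply lock the whole structure, is to wrap the access-ordered LinkedHashMap in Collections.synchronizedMap. A sketch under that assumption (the class name ConcurrentLruCaches is illustrative); the links below discuss finer-grained alternatives:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class ConcurrentLruCaches {
    // Collections.synchronizedMap serializes every operation, including get()
    // (which mutates the access order), so this is simple and correct, though
    // the single lock can become a bottleneck under heavy contention.
    public static <K, V> Map<K, V> create(int capacity) {
        return Collections.synchronizedMap(new LinkedHashMap<K, V>(capacity, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity;
            }
        });
    }
}
```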
References
http://www.codewalk.com/2012/04/least-recently-used-lru-cache-implementation-java.html
http://javadecodedquestions.blogspot.in/2013/02/java-cache-static-data-loading.html
http://spsneo.com/blog/2011/03/15/a-simple-lru-cache-implementation/
http://stackoverflow.com/questions/15951935/the-best-way-to-implement-lru-cache
http://stackoverflow.com/questions/221525/how-would-you-implement-an-lru-cache-in-java-6
http://www.linkedin.com/groups/One-famous-java-interview-question-70526.S.5810112475999264771
http://www.careercup.com/question?id=17230678
http://www.careercup.com/question?id=13998662
http://www.cs.uml.edu/~jlu1/doc/codes/lruCache.html
http://timday.bitbucket.org/lru.html
http://www.geeksforgeeks.org/implement-lru-cache/
http://mcicpc.cs.atu.edu/archives/2012/mcpc2012/lru/lru.html
http://stackoverflow.com/questions/2504178/lru-cache-design
http://stackoverflow.com/questions/11623994/example-using-androids-lrucache
http://www.careercup.com/question?id=1726905
http://codereview.stackexchange.com/questions/32849/design-lru-cache-interview-questions
http://stackoverflow.com/questions/224868/easy-simple-to-use-lru-cache-in-java
http://www.java-blog.com/creating-simple-cache-java-linkedhashmap-anonymous-class
http://stackoverflow.com/questions/13158657/simple-java-caching-library-or-design-pattern