LRU and LFU cache implementation | Lxming's Weblog
I have been studying a couple of LRU/LFU cache implementations. In several of them, I saw a combination of a deque, a doubly-linked list, and a hashmap, along these lines:
in an LRU cache:
use a hashmap to achieve O(1) lookup.
use a deque (in practice, a doubly-linked list) to maintain order by access.
in an LFU cache:
use a hashmap to achieve O(1) lookup.
use a doubly-linked list to maintain order by frequency.
See: http://javarticles.com/2012/06/lfu-cache.html
http://www.lintcode.com/en/problem/lru-cache/#
It fascinated me why such data structures are chosen for this problem.
Here is a thought: for an LRU cache, the current and next "order" are easily maintainable by a deque. In the case of LRU, you essentially need to:
1) remove the head (when the cache is full and an entry must expire)
2) delete an element and append it to the end (when an item is accessed).
Note that a plain array-backed deque cannot delete an arbitrary element in O(1); that is why implementations back it with a doubly-linked list and have the hashmap point at the list nodes, so both operations stay O(1).
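The two operations above can be sketched in Python with `collections.OrderedDict`, which is itself a hashmap paired with a doubly-linked list (class and method names here are my own, not from the linked articles):

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU sketch: OrderedDict pairs a hashmap with a
    doubly-linked list, so lookup and reordering are both O(1)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # keys kept in access order, oldest first

    def get(self, key):
        if key not in self.data:
            return -1
        # Step 2: on access, delete the element and append it to the end.
        self.data.move_to_end(key)
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            # Step 1: remove the head, i.e. the least recently used item.
            self.data.popitem(last=False)
```

`move_to_end` and `popitem(last=False)` are exactly the "append to end" and "remove head" list operations, only hidden behind the dict interface.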
For an LFU cache, you need to:
1) remove the head (when the cache is full and an entry must expire)
2) move the item to the next frequency slot (when an item is accessed).
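The "frequency slot" idea can be sketched by keeping one ordered bucket of keys per access count; moving an item to the next slot is a delete from one bucket and an append to the next. This is a minimal sketch under my own naming, not the code from the linked articles:

```python
from collections import defaultdict, OrderedDict

class LFUCache:
    """Minimal LFU sketch: each frequency count owns an ordered bucket
    of keys; ties within a count are broken least-recently-used first."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.values = {}                          # key -> value
        self.counts = {}                          # key -> access count
        self.buckets = defaultdict(OrderedDict)   # count -> keys, oldest first
        self.min_count = 0                        # lowest count present

    def _touch(self, key):
        # Step 2: move the key from its current frequency slot to the next.
        c = self.counts[key]
        del self.buckets[c][key]
        if not self.buckets[c] and self.min_count == c:
            self.min_count = c + 1
        self.counts[key] = c + 1
        self.buckets[c + 1][key] = None

    def get(self, key):
        if key not in self.values:
            return -1
        self._touch(key)
        return self.values[key]

    def put(self, key, value):
        if self.capacity <= 0:
            return
        if key in self.values:
            self.values[key] = value
            self._touch(key)
            return
        if len(self.values) >= self.capacity:
            # Step 1: remove the head of the lowest-frequency bucket.
            evict, _ = self.buckets[self.min_count].popitem(last=False)
            del self.values[evict]
            del self.counts[evict]
        self.values[key] = value
        self.counts[key] = 1
        self.buckets[1][key] = None
        self.min_count = 1
```

In a pointer-based implementation the buckets would be nodes of the doubly-linked frequency list, so "next slot" is literally the next list node; the dict-of-buckets form above trades that for shorter code.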