Caching Algorithms
- LRU (Least Recently Used) cache: an LRU cache evicts the least-recently-used object whenever the cache exceeds its size limit.
A simple implementation keeps two data structures: a doubly-linked list and a hash-map.
The linked list holds the oldest entries towards the tail, while new entries are added at the head, so insertion and eviction take constant time.
The hash-map provides the actual cache behavior: looking up the stored object from its key.
Note: whenever an object that is already in the list and the hash-map is accessed, it must be moved from its current position in the list to the head. Finding it in the list takes O(N); this can be improved to constant time by maintaining another hash-map that allows a direct jump from the key to the object's linked-list node. A minimal sketch follows.
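Below is a minimal Python sketch of this design. For brevity a single dict maps each key directly to its list node (and the node holds the value), which merges the lookup map and the key-to-node map described above into one; the class and method names here are illustrative, not from the original text, and capacity is assumed to be at least 1.

```python
class Node:
    """Doubly-linked-list node holding one cache entry."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity          # assumed >= 1 for this sketch
        self.map = {}                     # key -> Node: direct jump into the list
        # Sentinel head/tail nodes keep insertion and removal branch-free.
        self.head, self.tail = Node(), Node()
        self.head.next, self.tail.prev = self.tail, self.head

    def _unlink(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _push_front(self, node):
        node.prev, node.next = self.head, self.head.next
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        node = self.map.get(key)
        if node is None:
            return None                   # cache miss
        self._unlink(node)                # move to head: most recently used
        self._push_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            node = self.map[key]
            node.value = value
            self._unlink(node)            # refresh recency on update
            self._push_front(node)
            return
        if len(self.map) >= self.capacity:
            lru = self.tail.prev          # least recently used sits at the tail
            self._unlink(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._push_front(node)
```

Every hit moves the touched node to the head, so the tail always holds the eviction candidate and both get and put stay O(1).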
- LFU (Least Frequently Used) cache: an LFU cache evicts the least-frequently-used objects from the cache.
Implementation-wise this is quite similar to LRU. Besides the normal lookup map required for the cache, a second hash-map stores a count for every key, incremented on each cache hit. The list in this case keeps the highest-frequency objects at the head and the lowest-frequency objects towards the tail.
Note: the hash-map storing the frequency of access has to grow much larger than the normal lookup map and the list, because it can never delete a key; dropping a key would lose its usage history over time (and hence its frequency of access).
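Here is a minimal sketch of that scheme, under a few assumptions: the names LFUCache, store, and freq are illustrative; the victim is found with a linear scan of the cached keys rather than the frequency-ordered list described above; and, matching the note, counts are never removed from the frequency map.

```python
class LFUCache:
    """Minimal LFU sketch: frequency counts survive eviction."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}   # lookup map: key -> cached object
        self.freq = {}    # frequency map: key -> hit count, never deleted

    def get(self, key):
        if key not in self.store:
            return None                      # miss: only hits bump the count
        self.freq[key] += 1
        return self.store[key]

    def put(self, key, value):
        if self.capacity <= 0:
            return
        if key not in self.store and len(self.store) >= self.capacity:
            # Evict the cached key with the fewest recorded hits
            # (O(N) scan here; the list-based design makes this cheaper).
            victim = min(self.store, key=lambda k: self.freq[k])
            del self.store[victim]           # freq[victim] is kept on purpose
        self.store[key] = value
        self.freq.setdefault(key, 0)         # re-admitted keys keep old counts
```

Because freq is never pruned, a key that is evicted and later re-inserted resumes with its accumulated count, which is exactly why this map outgrows the lookup map and the list.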