The goal of any cache is to maximize the hit ratio while minimizing resource usage.
Each strategy attempts to guess which elements are the most likely to be needed again.
The main difference between strategies is how cached elements are selected
for eviction when the cache becomes full.
First In First Out
The FIFO policy is a simple strategy: elements are evicted in the same
order they were added to the cache. The simplicity of this replacement algorithm
typically makes this cache the fastest. This does not mean that the cache will
improve application performance the most, only that accesses to the cache will take
the least amount of time.
This replacement policy is recommended when cache gets are truly random - when the
likelihood of an object being requested does not have a bearing on the likelihood
of the same object being requested again.
This strategy has a bias towards fresher data.
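A FIFO cache can be sketched in a few lines. This is a minimal illustration, not a production implementation; the FIFOCache name and its capacity parameter are my own, chosen for the example.

```python
from collections import OrderedDict

class FIFOCache:
    """Minimal FIFO cache sketch: evicts the key that was inserted first."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # preserves insertion order

    def get(self, key):
        # In FIFO, a get does NOT change the eviction order.
        return self.store.get(key)

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            self.store.popitem(last=False)  # evict the oldest insertion
        self.store[key] = value
```

Because `get` never reorders entries, an element's lifetime depends only on when it was inserted, which is exactly the "bias towards fresher data" described above.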
Least Recently Used
The LRU policy is a variation of the FIFO policy. The only real difference between the
two is that LRU "boosts" objects when they are accessed: when an object is accessed, the
cache re-prioritizes that object, causing it to remain in the cache longer. Note that
usage and priority do not affect the object's expiration.
This policy is generally recommended when cache access is non-random. This is commonly the
case for applications that need to cache data that users request, and some data is more
"popular" than others. For example, on a news website, headlines are going to get more
hits, and should therefore remain in cache longer.
This strategy has a bias towards more recently used data.
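The "boost on access" behavior can be sketched by extending the FIFO idea: a get moves the object to the back of the eviction queue. Again, the class and parameter names here are illustrative only.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache sketch: accesses re-prioritize ("boost") an entry."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)  # boost: mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        elif len(self.store) >= self.capacity:
            self.store.popitem(last=False)  # evict the least recently used
        self.store[key] = value
```

The only change from the FIFO sketch is the `move_to_end` call in `get`, which is what keeps "popular" objects in the cache longer.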
Least Frequently Used
LFU is a policy that evicts the objects that are least frequently used.
In this strategy, elements are evicted based on when they were added, when they were last used, and how
many times they have been used.
This policy should be used if an object that has been used many times is likely to be
used again, even if other objects have been used more recently. This algorithm has the effect
of making some objects, those that were used several times, much more durable in the cache.
This strategy has a bias towards data that has been hit more times.
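An LFU cache tracks a use count per entry and evicts the entry with the lowest count. The sketch below breaks ties by least recent use; that tie-breaking rule, like the names, is an assumption made for the example rather than something the text prescribes.

```python
class LFUCache:
    """Minimal LFU cache sketch: evicts the least-used key,
    breaking ties by least recent use."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.values = {}
        self.counts = {}   # key -> number of uses
        self.last_use = {}  # key -> logical timestamp of last use
        self.clock = 0

    def get(self, key):
        if key not in self.values:
            return None
        self.clock += 1
        self.counts[key] += 1
        self.last_use[key] = self.clock
        return self.values[key]

    def put(self, key, value):
        self.clock += 1
        if key in self.values:
            self.values[key] = value
            self.counts[key] += 1
            self.last_use[key] = self.clock
            return
        if len(self.values) >= self.capacity:
            # Evict the least frequently used key; ties go to the
            # least recently used one (an assumed tie-breaker).
            victim = min(self.values,
                         key=lambda k: (self.counts[k], self.last_use[k]))
            del self.values[victim]
            del self.counts[victim]
            del self.last_use[victim]
        self.values[key] = value
        self.counts[key] = 1
        self.last_use[key] = self.clock
```

Because eviction is driven by the use count, an object that was hit many times survives even when other objects were used more recently, which is the durability effect described above.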