Least Frequently Used Cache Implementation

Adavidoaiei Dumitru-Cornel



The idea behind the least frequently used (LFU) cache policy is that the cache keeps a use count for each item, incremented every time the item is accessed. When the cache exceeds its size limit, it evicts (removes) the element with the minimum use count, freeing memory for a new element.


Install it via the NuGet package manager:

Install-Package LfuCache -Version 1.0.0

Usage example

ICache<string, string> cache = new LfuCache<string, string>(1000);
cache.Add("name", "Helene");
cache.Add("surname", "Stuart");
var name = cache.Get("name");


LfuCache implements the ICache interface.

The cache needs a data structure that keeps elements sorted by use count. The current implementation uses a SortedList whose key is the use count and whose value is a LinkedList of the cache elements sharing that use count; SortedList keeps the use counts ordered and locates a count by binary search.
This data structure allows Add/Get operations to run in O(log n) time.
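To make the structure concrete, here is a minimal sketch of this design: a SortedList of use-count buckets plus a dictionary from key to its value, count, and list node. This is an illustration of the approach described above, not the library's actual source, and the class and member names are my own.

```csharp
using System;
using System.Collections.Generic;

// Illustrative LFU cache sketch (not the LfuCache package's source).
public class SimpleLfuCache<TKey, TValue>
{
    private readonly int _capacity;
    // use count -> keys that currently have that use count
    private readonly SortedList<int, LinkedList<TKey>> _byCount = new SortedList<int, LinkedList<TKey>>();
    // key -> (value, use count, node inside its count bucket)
    private readonly Dictionary<TKey, (TValue Value, int Count, LinkedListNode<TKey> Node)> _items =
        new Dictionary<TKey, (TValue, int, LinkedListNode<TKey>)>();

    public SimpleLfuCache(int capacity) { _capacity = capacity; }

    public void Add(TKey key, TValue value)
    {
        if (_items.ContainsKey(key)) { Touch(key, value); return; }
        if (_items.Count >= _capacity) Evict();
        var node = Bucket(1).AddLast(key);
        _items[key] = (value, 1, node);
    }

    public TValue Get(TKey key)
    {
        var entry = _items[key]; // throws KeyNotFoundException on a miss
        Touch(key, entry.Value);
        return entry.Value;
    }

    // Move the key from its current count bucket to the next one.
    private void Touch(TKey key, TValue value)
    {
        var (_, count, node) = _items[key];
        RemoveFromBucket(count, node);
        var newNode = Bucket(count + 1).AddLast(key);
        _items[key] = (value, count + 1, newNode);
    }

    // Remove one key from the lowest-count bucket: the least frequently used.
    private void Evict()
    {
        int minCount = _byCount.Keys[0];
        var victimNode = _byCount[minCount].First;
        TKey victim = victimNode.Value;
        RemoveFromBucket(minCount, victimNode);
        _items.Remove(victim);
    }

    private LinkedList<TKey> Bucket(int count)
    {
        if (!_byCount.TryGetValue(count, out var list))
            _byCount[count] = list = new LinkedList<TKey>();
        return list;
    }

    private void RemoveFromBucket(int count, LinkedListNode<TKey> node)
    {
        var list = _byCount[count];
        list.Remove(node);
        if (list.Count == 0) _byCount.Remove(count); // drop empty buckets
    }
}
```

Keeping the keys of each use count in a LinkedList makes moving a key between buckets O(1) once the node is known, so the SortedList lookup dominates the cost.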

Performance benchmark

1,000,000 add/get operations on the implemented Least Frequently Used cache of size 90,000, using elements drawn from a list of 100,000, take 466 ms.

On the same benchmark, this cache runs faster than MemoryCache from the .NET Framework and consumes less memory.

The Add/Get operation sequence is generated randomly into an operations array of size OperationsCount; these operations process elements from a list of size ElementsCount using the selected cache.
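The benchmark driver can be imagined along these lines. Only OperationsCount, ElementsCount, and the random Add/Get mix come from the text; the class, method names, and the use of a plain Dictionary as a stand-in cache are my assumptions, and the actual PerformanceDotNet harness is not reproduced here.

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

// Hypothetical benchmark sketch: pre-generate a random Add/Get sequence,
// then replay it against a cache and time the replay.
public static class BenchmarkSketch
{
    const int OperationsCount = 1_000_000;
    const int ElementsCount = 100_000;

    public static long Run()
    {
        var random = new Random(42); // fixed seed for repeatability
        var elements = new List<string>(ElementsCount);
        for (int i = 0; i < ElementsCount; i++) elements.Add("key" + i);

        // Operation sequence: true = Add, false = Get, plus a target element.
        var isAdd = new bool[OperationsCount];
        var target = new int[OperationsCount];
        for (int i = 0; i < OperationsCount; i++)
        {
            isAdd[i] = random.Next(2) == 0;
            target[i] = random.Next(ElementsCount);
        }

        // Stand-in cache; the real benchmark would use an ICache implementation.
        var cache = new Dictionary<string, string>();

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < OperationsCount; i++)
        {
            string key = elements[target[i]];
            if (isAdd[i]) cache[key] = key;
            else cache.TryGetValue(key, out _);
        }
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }

    public static void Main()
    {
        Console.WriteLine($"{OperationsCount} operations took {Run()} ms");
    }
}
```

Pre-generating the operation array keeps random-number generation out of the timed loop, so the measurement reflects only the cache operations.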

Performance benchmarks were run with the PerformanceDotNet library.

Unit Tests Code Coverage

The unit tests are written using the MSTest framework, and the code coverage report is generated with Azure Pipelines.

You can read an article on my blog, in Romanian, about the implementation of this Least Frequently Used cache.
