
Ben Witt

Originally published at Medium

HybridCache in a console application with Redis

In modern applications, managing data efficiently and avoiding unnecessary database queries are essential. An effective way to achieve this is a HybridCache, which utilises both a MemoryCache and a distributed cache such as Redis. In this article, I present an implementation of a HybridCache in a console-only application and explain which cache is queried when, in order to achieve optimal performance. In addition, I discuss the importance of cache invalidation and show how it is handled in the implementation.

What is a HybridCache?

A HybridCache combines several cache layers in order to increase performance. In our case, we first access the fast, memory-based MemoryCache; if no data is available there, we fall back on a distributed cache, in this case Redis. Only if neither cache returns a hit is a query sent to the database. This approach helps to minimise database queries and significantly reduce latency.

Implementation of the HybridCache

I have created a console application that implements the HybridCache with Microsoft.Extensions.Caching.Memory and Redis. The following code shows the basic implementation, in which cache invalidation also plays a role:

using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;

public class HybridCache : IHybridCache
{
    private readonly IMemoryCache _memoryCache;
    private readonly IDistributedCache _distributedCache;
    private static readonly object _lock = new object();  // Lock for thread-safety

    public HybridCache(IMemoryCache memoryCache, IDistributedCache distributedCache)
    {
        _memoryCache = memoryCache;
        _distributedCache = distributedCache;
    }

    public async Task<T> GetOrSetAsync<T>(string key, Func<Task<T>> factory, TimeSpan duration)
    {
        T value;

        // Check MemoryCache first
        lock (_lock)
        {
            if (_memoryCache.TryGetValue(key, out value))
            {
                string valueString = value.ToString();
                valueString = ReplaceSourceInfo(valueString, "MemoryCache");
                value = (T)Convert.ChangeType(valueString, typeof(T));
                return value;
            }
        }

        // Check Distributed Cache (Redis)
        var cachedData = await _distributedCache.GetStringAsync(key);
        if (cachedData != null)
        {
            value = System.Text.Json.JsonSerializer.Deserialize<T>(cachedData);

            string valueString = value.ToString();
            valueString = ReplaceSourceInfo(valueString, "Distributed Cache");
            value = (T)Convert.ChangeType(valueString, typeof(T));

            // Set the value in MemoryCache
            lock (_lock)
            {
                _memoryCache.Set(key, value, absoluteExpirationRelativeToNow: duration);
            }

            return value;
        }

        // Query the database if not found in any cache
        value = await factory();

        // Save the data to both MemoryCache and Redis
        string newValue = ReplaceSourceInfo(value.ToString(), "Database");
        value = (T)Convert.ChangeType(newValue, typeof(T));

        var serializedValue = System.Text.Json.JsonSerializer.Serialize(value);
        await _distributedCache.SetStringAsync(key, serializedValue, new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = duration  // Set expiration time for Redis
        });

        lock (_lock)
        {
            _memoryCache.Set(key, value, duration);  // Set expiration time for MemoryCache
        }

        return value;
    }

    public void InvalidateCache(string key)
    {
        // Remove key from MemoryCache
        lock (_lock)
        {
            _memoryCache.Remove(key);
        }

        // Remove key from Distributed Cache (Redis)
        _distributedCache.Remove(key);
    }

    // Removes any existing "(Source: ...)" marker and appends the current source.
    // This is purely for the demo output, so it is visible where a value came from.
    private string ReplaceSourceInfo(string originalValue, string source)
    {
        string patternToRemove = @"\(Source:.*?\)";
        string updatedValue = System.Text.RegularExpressions.Regex.Replace(originalValue, patternToRemove, "");
        return $"{updatedValue} (Source: {source})";
    }

    public async Task<T> GetAsync<T>(string key)
    {
        // Try getting the value from MemoryCache first
        lock (_lock)
        {
            if (_memoryCache.TryGetValue(key, out T value))
            {
                return value;
            }
        }

        // If not found, get the value from Distributed Cache
        var cachedData = await _distributedCache.GetStringAsync(key);
        return cachedData != null ? System.Text.Json.JsonSerializer.Deserialize<T>(cachedData) : default;
    }
}
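For completeness, here is one way to wire everything up in a console application. This is a minimal sketch, assuming the Microsoft.Extensions.DependencyInjection and Microsoft.Extensions.Caching.StackExchangeRedis packages and a Redis instance on localhost:6379; IHybridCache and HybridCache are the types from above.

using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();

// First level: fast in-memory cache
services.AddMemoryCache();

// Second level: Redis as distributed cache (assumed to run on localhost:6379)
services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379";
});

// The HybridCache from this article sits on top of both caches
services.AddSingleton<IHybridCache, HybridCache>();

using var provider = services.BuildServiceProvider();
var cache = provider.GetRequiredService<IHybridCache>();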

Cache invalidation

In the implementation above, a simple time-based expiration ensures that cache entries in both the MemoryCache and Redis become invalid after a certain time. This is controlled by the AbsoluteExpirationRelativeToNow option, which determines when an entry expires automatically.
In addition, I have added an InvalidateCache method that allows cache entries to be invalidated manually. With this method, an entry can be explicitly removed from both the MemoryCache and Redis if, for example, the underlying data in the database has changed.
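A typical call site could look like this; UpdateProductAsync and the product:{id} key format are hypothetical placeholders for your own data access code:

// After the underlying data has changed, drop the stale entry from both caches.
// UpdateProductAsync and the "product:{id}" key are hypothetical examples.
await database.UpdateProductAsync(product);
hybridCache.InvalidateCache($"product:{product.Id}");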

When is which cache used?

In this implementation, the HybridCache follows a clear query hierarchy. The MemoryCache is queried first, as it is the fastest. If no data is found there, the cache checks Redis. Redis is favoured in distributed systems because it can serve multiple instances of the application and therefore provides a central source for the cached data. Only if both caches miss is the database queried.
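To illustrate the hierarchy, a call to GetOrSetAsync could look like the following sketch; GetProductNameFromDbAsync and the cache key are hypothetical examples:

// First call: both caches miss, the factory queries the database,
// and the result is written to Redis and the MemoryCache.
string productName = await cache.GetOrSetAsync(
    "product-name:42",
    () => GetProductNameFromDbAsync(42),
    TimeSpan.FromMinutes(10));

// Second call within the 10 minutes: answered directly from the MemoryCache,
// without touching Redis or the database.
string cachedName = await cache.GetOrSetAsync(
    "product-name:42",
    () => GetProductNameFromDbAsync(42),
    TimeSpan.FromMinutes(10));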

Extended strategies for cache invalidation

  1. Manual/explicit invalidation: As shown in the code, the InvalidateCache method can be used to remove cache entries manually. This is useful when data in the database changes and the cache still contains outdated entries.
  2. Event-based invalidation: In distributed systems, an event-based model can be used to invalidate cache entries. When the data changes, a message is sent (e.g. via a message queue or Redis pub/sub) that invalidates the cache entry in several instances simultaneously; a sketch of this follows after the sliding-expiration example below.
  3. Sliding expiration: Another possible strategy is sliding expiration, in which the expiration time of a cache entry is extended with each access. This ensures that frequently used data remains in the cache for longer, while unused data expires after a certain time. An example of sliding expiration could look like this:
var cacheEntryOptions = new DistributedCacheEntryOptions
{
    SlidingExpiration = TimeSpan.FromMinutes(5)  // Expiration is reset after each access
};

await _distributedCache.SetStringAsync(key, serializedValue, cacheEntryOptions);
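For the event-based invalidation mentioned in point 2, Redis pub/sub is one possible transport. The following is a minimal sketch using the StackExchange.Redis client; the channel name cache-invalidation and the way the connection is obtained are assumptions, not part of the article's demo:

using StackExchange.Redis;

// Sketch of event-based invalidation via Redis pub/sub.
// hybridCache is the IHybridCache instance from above; the channel name is an assumption.
var redis = await ConnectionMultiplexer.ConnectAsync("localhost:6379");
var subscriber = redis.GetSubscriber();

// Every application instance subscribes and invalidates the affected entry locally.
await subscriber.SubscribeAsync(RedisChannel.Literal("cache-invalidation"), (channel, message) =>
{
    hybridCache.InvalidateCache(message.ToString());
});

// The instance that changes the data publishes the affected key.
await subscriber.PublishAsync(RedisChannel.Literal("cache-invalidation"), "product-name:42");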

Install Redis - locally or via Docker

Redis is used as the distributed cache for this application and can be installed in various ways. An easy way to install Redis locally is to download the Redis binaries from the official website (https://redis.io/download). Once installed, Redis listens on localhost on the standard port 6379.
Alternatively, Redis can also be run via Docker, for example with docker run -p 6379:6379 -d redis.

Other options - e.g. with MS SQL Server

In addition to Redis, there are other options for implementing a distributed cache. One widespread option is an SQL-based cache, e.g. with Microsoft SQL Server. This solution is often used in environments that already integrate heavily with SQL Server-based systems.
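A registration sketch, assuming the Microsoft.Extensions.Caching.SqlServer package and that the cache table has already been created (for example with the dotnet sql-cache create tool); connection string, schema and table name are placeholders:

using Microsoft.Extensions.DependencyInjection;

// SQL Server as the distributed cache instead of Redis
// (connection string, schema and table name are placeholders).
services.AddDistributedSqlServerCache(options =>
{
    options.ConnectionString = "Server=.;Database=CacheDb;Trusted_Connection=True;";
    options.SchemaName = "dbo";
    options.TableName = "CacheEntries";
});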

Conclusion

The implementation of a HybridCache in a console application offers numerous advantages, especially when it comes to reducing latency and database queries. By combining a fast memory-based cache such as MemoryCache with a distributed cache such as Redis, application performance can be increased significantly. Thread safety and proper synchronisation help to ensure that parallel requests do not cause inconsistencies. Cache invalidation is another important aspect: it ensures that only up-to-date data is used and should be tailored to the needs of the application.
With Redis as a locally installed cache, or even as a Docker container, and the right cache hierarchy, scalable and high-performance applications can be implemented.

DEMO:

Github: HybridCache
