Imagine you have a SettingsService that makes REST requests with an HttpClient. This service calls a microservice for the configuration of a property. But it takes a couple of seconds to respond, and it's accessed frequently. It would be great to store this value somewhere else for faster response times. Let's see how to add a caching layer to the SettingsService using ASP.NET Core!
A cache is a storage layer used to speed up future requests. Reading from a cache is faster than computing data or retrieving it from an external source every time it's requested. ASP.NET Core has built-in abstractions to implement a caching layer using memory and Redis. You can use the Decorator pattern to separate the caching layer from your business logic.
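Throughout the post, assume the service looks roughly like this. Only the interface matters for the decorator; the HTTP call and the route are placeholders, since the real endpoint isn't shown here:
public interface ISettingsService
{
    Task<Settings> GetSettingsAsync(int propertyId);
}

public class SettingsService : ISettingsService
{
    private readonly HttpClient _httpClient;

    public SettingsService(HttpClient httpClient)
    {
        _httpClient = httpClient;
    }

    public async Task<Settings> GetSettingsAsync(int propertyId)
    {
        // Slow call to the configuration microservice (placeholder route)
        var response = await _httpClient.GetAsync($"properties/{propertyId}/settings");
        response.EnsureSuccessStatusCode();
        var json = await response.Content.ReadAsStringAsync();
        return JsonConvert.DeserializeObject<Settings>(json);
    }
}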
In-Memory approach
Let's start with an ASP.NET Core 3.1 API project with a controller that uses your SettingsService class. First, install the Microsoft.Extensions.Caching.Memory NuGet package. Then, register the in-memory cache in the ConfigureServices method of the Startup class with the AddMemoryCache method.
// Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddMemoryCache(options =>
    {
        options.SizeLimit = 1024;
    });

    // ...
}
Since memory isn't infinite, you want to limit the number of items stored in the cache. Make use of SizeLimit. It's the maximum number of "slots" or "places" the cache can hold. Also, you need to specify how many "places" each cache entry takes when it's stored. More on that later!
Decorator pattern
Next, let's use the Decorator pattern to add caching on top of the existing SettingsService without modifying it. To do that, create a new CachedSettingsService that implements the same interface as SettingsService. That's the trick! Also, it needs a constructor receiving an IMemoryCache and an ISettingsService. This last parameter holds a reference to the existing SettingsService. Then, in the GetSettingsAsync method, call the decorated service only if the value isn't already cached.
public class CachedSettingsService : ISettingsService
{
    private readonly IMemoryCache _cache;
    private readonly ISettingsService _settingsService;

    public CachedSettingsService(IMemoryCache cache, ISettingsService settingsService)
    {
        _cache = cache;
        _settingsService = settingsService;
    }

    public async Task<Settings> GetSettingsAsync(int propertyId)
    {
        var key = $"{nameof(propertyId)}:{propertyId}";
        return await _cache.GetOrSetValueAsync(
            key,
            async () => await _settingsService.GetSettingsAsync(propertyId));
    }
}
Size, Limits and Expiration Time
Now, let's create the GetOrSetValueAsync extension method. It first checks if a key is in the cache; otherwise, it uses a factory method to compute the value and stores it in the cache. The method also receives a custom MemoryCacheEntryOptions to override the default values.
Make sure to use expiration times when storing items. You can choose between sliding and absolute expiration times:

- SlidingExpiration resets the expiration time every time an entry is used before it expires.
- AbsoluteExpirationRelativeToNow expires the entry after the given time, no matter how many times it's been used.
- If you use both, the entry expires when the first of the two times expires.
Don't forget to include a size for each cache entry if you use SizeLimit when registering the in-memory cache in the dependency container. This Size tells how many "places" from SizeLimit an entry takes. When that limit is reached, the cache won't store any more entries until some of them expire. For more details, see Use SetSize, Size, and SizeLimit to limit cache size.
public static class MemoryCacheExtensions
{
    // Make sure to adjust these values to suit your own defaults...
    public static readonly MemoryCacheEntryOptions DefaultMemoryCacheEntryOptions
        = new MemoryCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(60),
            SlidingExpiration = TimeSpan.FromSeconds(10),
            Size = 1
        };

    public static async Task<TObject> GetOrSetValueAsync<TObject>(this IMemoryCache cache, string key, Func<Task<TObject>> factory, MemoryCacheEntryOptions options = null)
        where TObject : class
    {
        if (cache.TryGetValue(key, out object value))
        {
            return value as TObject;
        }

        var result = await factory();

        options ??= DefaultMemoryCacheEntryOptions;
        cache.Set(key, result, options);

        return result;
    }
}
Registration
To start using the new CachedSettingsService, you need to register it in the dependency container. Back to the Startup class! Register the existing SettingsService and the new decorated service. You can also use Scrutor to register your decorators, as shown in the sketch after the next snippet.
// Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddTransient<SettingsService>();
    services.AddTransient<ISettingsService>(provider =>
    {
        var cache = provider.GetRequiredService<IMemoryCache>();
        var settingsService = provider.GetRequiredService<SettingsService>();
        return new CachedSettingsService(cache, settingsService);
    });

    // The same as before...
    services.AddMemoryCache(options =>
    {
        options.SizeLimit = 1024;
    });

    // ...
}
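If you prefer Scrutor, the same wiring can be done with its Decorate method instead of the factory registration above. A minimal sketch, assuming the decorator stays as defined earlier:
// Startup.cs, using Scrutor instead of the factory registration
public void ConfigureServices(IServiceCollection services)
{
    // Register the plain implementation first...
    services.AddTransient<ISettingsService, SettingsService>();

    // ...then let Scrutor wrap every ISettingsService with the decorator.
    // CachedSettingsService gets the original service and IMemoryCache injected.
    services.Decorate<ISettingsService, CachedSettingsService>();

    services.AddMemoryCache(options =>
    {
        options.SizeLimit = 1024;
    });
}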
Remember to remove cached entries if you update or delete entities in your own code. You don't want to read an old value or, even worse, a deleted value from your cache. In that case, use the Remove method.
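For example, if your service also exposed an update method (it isn't part of the SettingsService shown in this post, so treat it as a hypothetical extension), the decorator could evict the stale entry after writing through:
// Hypothetical method inside CachedSettingsService; ISettingsService would need UpdateSettingsAsync too
public async Task UpdateSettingsAsync(int propertyId, Settings settings)
{
    // Write through to the decorated service first...
    await _settingsService.UpdateSettingsAsync(propertyId, settings);

    // ...then remove the stale entry so the next read repopulates the cache
    var key = $"{nameof(propertyId)}:{propertyId}";
    _cache.Remove(key);
}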
There are only two hard things in Computer Science: cache invalidation and naming things.
-- Phil Karlton
From TwoHardThings
Unit Tests
Let's see how you can create a test for this decorator. You will need to create a fake for the decorated service. Then, assert it's called only once after two consecutive calls to the cached method. Let's use Moq.
[TestClass]
public class CachedSettingsServiceTests
{
    [TestMethod]
    public async Task GetSettingsAsync_ByDefault_UsesCachedValues()
    {
        var memoryCache = new MemoryCache(Options.Create(new MemoryCacheOptions()));
        var fakeSettingsService = new Mock<ISettingsService>();
        fakeSettingsService.Setup(t => t.GetSettingsAsync(It.IsAny<int>()))
            .ReturnsAsync(new Settings());
        var service = new CachedSettingsService(memoryCache, fakeSettingsService.Object);
        var propertyId = 1;

        var settings = await service.GetSettingsAsync(propertyId);
        fakeSettingsService.Verify(t => t.GetSettingsAsync(propertyId), Times.Once);

        settings = await service.GetSettingsAsync(propertyId);

        // Still only one call: the second read comes from the cache
        fakeSettingsService.Verify(t => t.GetSettingsAsync(propertyId), Times.Once);
    }
}
Distributed approach
Now, let's move to the distributed cache. A distributed cache layer lives on a separate server, so you aren't limited to the memory of the server running your API site.
A distributed cache makes sense when you want to share your cache server among multiple applications, or when your site runs behind a load balancer alongside many instances of the same server. For more advantages of distributed caching, see Distributed caching in ASP.NET Core.
There is an implementation of the distributed cache using Redis for ASP.NET Core. Redis is "an open source (BSD licensed), in-memory data structure store, used as a database, cache and message broker".
Using a distributed cache is similar to the in-memory approach. This time, install the Microsoft.Extensions.Caching.StackExchangeRedis NuGet package and use the AddStackExchangeRedisCache method in your ConfigureServices method. Also, you need a Redis connection string and an optional InstanceName. The InstanceName groups entries under a prefix. It's helpful when using a single Redis server for different sites.
Notice that there are two similar NuGet packages to use Redis with ASP.NET Core: Microsoft.Extensions.Caching.Redis and Microsoft.Extensions.Caching.StackExchangeRedis. They use different versions of the StackExchange.Redis client.
// Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddTransient<SettingsService>();
    services.AddTransient<ISettingsService>(provider =>
    {
        var cache = provider.GetRequiredService<IDistributedCache>();
        var settingsService = provider.GetRequiredService<SettingsService>();
        return new CachedSettingsService(cache, settingsService);
    });

    services.AddStackExchangeRedisCache(options =>
    {
        var redisConnectionString = Configuration.GetConnectionString("Redis");
        options.Configuration = redisConnectionString;

        var assemblyName = Assembly.GetExecutingAssembly().GetName();
        options.InstanceName = assemblyName.Name;
    });
}
Redecorate
Make sure to change the cache interface from IMemoryCache to IDistributedCache. Update your CachedSettingsService class and the ConfigureServices method.
public class CachedSettingsService : ISettingsService
{
    private readonly IDistributedCache _cache;
    private readonly ISettingsService _settingsService;

    public CachedSettingsService(IDistributedCache cache, ISettingsService settingsService)
    {
        _cache = cache;
        _settingsService = settingsService;
    }

    public async Task<Settings> GetSettingsAsync(int propertyId)
    {
        var key = $"{nameof(propertyId)}:{propertyId}";
        return await _cache.GetOrSetValueAsync(
            key,
            async () => await _settingsService.GetSettingsAsync(propertyId));
    }
}
Now, let's create a new GetOrSetValueAsync extension method to use the distributed cache. This time, you need asynchronous methods to retrieve and store items: GetStringAsync and SetStringAsync. Also, you need a serializer to cache objects. We are using Newtonsoft.Json.
Notice that this time you don't need sizes for the cache entries.
public static class DistributedCacheExtensions
{
    public static readonly DistributedCacheEntryOptions DefaultDistributedCacheEntryOptions
        = new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(60),
            SlidingExpiration = TimeSpan.FromSeconds(10)
        };

    public static async Task<TObject> GetOrSetValueAsync<TObject>(this IDistributedCache cache, string key, Func<Task<TObject>> factory, DistributedCacheEntryOptions options = null)
        where TObject : class
    {
        var result = await cache.GetValueAsync<TObject>(key);
        if (result != null)
        {
            return result;
        }

        result = await factory();
        await cache.SetValueAsync(key, result, options);

        return result;
    }

    private static async Task<TObject> GetValueAsync<TObject>(this IDistributedCache cache, string key)
        where TObject : class
    {
        var data = await cache.GetStringAsync(key);
        if (data == null)
        {
            return default;
        }

        return JsonConvert.DeserializeObject<TObject>(data);
    }

    private static async Task SetValueAsync<TObject>(this IDistributedCache cache, string key, TObject value, DistributedCacheEntryOptions options = null)
        where TObject : class
    {
        var data = JsonConvert.SerializeObject(value);
        await cache.SetStringAsync(key, data, options ?? DefaultDistributedCacheEntryOptions);
    }
}
Unit Tests
For unit testing, you can use MemoryDistributedCache, an in-memory implementation of IDistributedCache, so you don't need to spin up a Redis server to test your code. From the previous unit test, you only need to replace the IMemoryCache dependency with a MemoryDistributedCache, as shown below.
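With that change, the arrange step of the previous test would look roughly like this:
// Same test as before; only the cache dependency changes
var distributedCache = new MemoryDistributedCache(Options.Create(new MemoryDistributedCacheOptions()));
var fakeSettingsService = new Mock<ISettingsService>();
fakeSettingsService.Setup(t => t.GetSettingsAsync(It.IsAny<int>()))
    .ReturnsAsync(new Settings());
var service = new CachedSettingsService(distributedCache, fakeSettingsService.Object);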
Conclusion
Voilà! Now you know how to cache the results of a slow service using in-memory and distributed approaches, implementing the Decorator pattern in your ASP.NET Core API sites. Additionally, you can turn the cache layer on or off with a toggle in your appsettings file to create either a decorated or a plain service, as sketched below. If you need caching outside of an ASP.NET Core site, you can use libraries like CacheManager, Foundatio, and Cashew.
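That toggle isn't part of the code above, so here is just one way it could look, assuming a boolean UseCache key in appsettings.json:
// Startup.cs: choose the decorated or the plain service based on a configuration toggle.
// "UseCache" is an assumed appsettings key, not something defined earlier in this post.
public void ConfigureServices(IServiceCollection services)
{
    if (Configuration.GetValue<bool>("UseCache"))
    {
        services.AddTransient<SettingsService>();
        services.AddTransient<ISettingsService>(provider =>
        {
            var cache = provider.GetRequiredService<IMemoryCache>();
            var settingsService = provider.GetRequiredService<SettingsService>();
            return new CachedSettingsService(cache, settingsService);
        });
    }
    else
    {
        services.AddTransient<ISettingsService, SettingsService>();
    }

    services.AddMemoryCache(options =>
    {
        options.SizeLimit = 1024;
    });
}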
To learn more about configuration in ASP.NET Core, read my post on how to read configuration values in ASP.NET Core.
Happy caching time!