Erik Hoffman

Cache for performance, not for offloading

Often when caching comes up, the focus is on offloading the origin by letting most requests hit a CDN or get cached data delivered, so the origin isn't processing too often. Though in my mind there is more to it.

I recently had a project where the caching was to be implemented purely with a performance focus, and not in any way based on offloading needs.
The processing of each request was far too heavy to get the response times down to an acceptable level.

Of course, with a classic CDN cache you will get low response times and incredible performance. Until the cache expires. Every X minutes there will be a single request that takes the performance hit of generating new data to cache.

How should we solve this?

There are of course a few solutions out there for this, keeping the cache up to date and hot without needing an end user request to go through to the origin and generate the data.

One solution could be to have a separate worker that generates data for the cache at a set interval, never letting the cache expire but instead continuously updating it. This is, I would say, an optimal solution, since you update the cache without ever letting anyone through, and without taking the performance hit of generating the new data too often on the server.
One thing with this solution, though, is that it can be quite architecturally heavy for a small application, since you need an external worker as well as a CDN integration where you can actively update your data.
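
In its simplest in-process form (leaving the CDN part aside), that worker idea could look something like the sketch below. This is only an illustration: the storage setup mirrors the node-cache example later in this post, while generateData and the 60-second interval are hypothetical placeholders for your own heavy processing.

const NodeCache = require("node-cache");
// stdTTL: 0 means entries never expire; the worker refreshes them instead
const storage = new NodeCache({ stdTTL: 0, useClones: false });

// Hypothetical placeholder for the heavy data generation
async function generateData() {
  return "Some freshly generated data";
}

// Refresh the cache on a fixed interval, independent of incoming requests
setInterval(async () => {
  storage.set("/data", await generateData());
}, 60 * 1000);

// Warm the cache once at startup so the first request never misses
generateData().then((data) => storage.set("/data", data));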

My take

My take on solving this for smaller applications is not as light on the origin, but nearly as good as the solution above performance-wise, and you never let the cache expire.

In my examples I'm using Fastify, but it could be any route handler. I'm also using node-cache for the in-memory cache, but that can just as well be switched for any cache solution you might want to use, like Redis or Memcached.

The main idea is to always respond with the cached data, but still let the application keep working after the response has been sent, fetching new data and putting it into the cache, ready for the next request to come in.

const NodeCache = require("node-cache");
const storage = new NodeCache({
  stdTTL: 3600,
  checkperiod: 120,
  useClones: false
});

fastify.get("/data", async (req, res) => {
  // Use the request url as cache key, so differing query params etc. get their own entries
  const key = req.url;
  // If already in cache, send the cached data as the response
  const data = storage.get(key);
  if (data) {
    res
      .header("Cache-Control", "public, max-age=300")
      .code(200)
      .send(data);
  }
  // Please notice that we do not trigger a return, but let the execution go on
  // Here we would typically generate some data
  const newData = "Some new data";
  // Updating the cache
  storage.set(key, newData);
  // Only respond here the first time, when nothing was cached;
  // otherwise the cached response above has already been sent
  if (!data) {
    res
      .header("Cache-Control", "public, max-age=300")
      .code(200)
      .send(newData);
  }
});

So the first time, when nothing is cached, the request will go through to the end, setting new data in the cache, but also responding with that data. Every time after that it will respond with the cached data, then keep executing and update the cache with new data.

This way we always get the performance of having data cached, while still exposing fresh data all the time, since we constantly update the cache.
The only way we offload is to have a CDN in front of this, which will not be updated in this scenario.
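
As mentioned earlier, the in-memory cache can also be swapped for something like Redis if the cache needs to be shared between instances. Below is a minimal sketch of the same pattern with the node-redis v4 client; generateData is again a hypothetical placeholder, and the client is assumed to be connected before the server starts listening.

const { createClient } = require("redis");
const client = createClient();
// Connect once at startup, before fastify.listen
client.connect().catch(console.error);

// Hypothetical placeholder for the heavy data generation
async function generateData() {
  return "Some new data";
}

fastify.get("/data", async (req, res) => {
  const key = req.url;
  const cached = await client.get(key);
  if (cached) {
    res.header("Cache-Control", "public, max-age=300").code(200).send(cached);
  }
  // As before, execution continues after the cached response has been sent
  const newData = await generateData();
  await client.set(key, newData); // no EX/TTL option, so the key never expires
  if (!cached) {
    res.header("Cache-Control", "public, max-age=300").code(200).send(newData);
  }
});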

Conclusion

You can either cache for the performance of your server or for the performance of the end user. Far too often the focus is on the former and not the latter.
My example is a great way to combine the two: let the CDN be the offload for your server, but when requests get through you still have an in-memory cache in place for the performance enjoyment of your users.

How do you work with performance in your applications and APIs?
