Ukagha Nzubechukwu

How To Boost Your Node.js Performance by Implementing Caching With Redis.

Introduction

As your application grows and attracts more traffic, it becomes increasingly important to ensure faster response times. This helps prevent users from experiencing delays and frustration while using your app, as slower response times can make you lose users and revenue. By implementing caching, you can optimize and scale your application, providing users with faster response times and a great user experience.

Storing frequently accessed data in a location (cache) that is quicker to access is an effective way to improve the performance of your Node.js application. Caching reduces the time and resources required to query the primary data store (your database), which in turn reduces server load and speeds up content delivery to users.

In this guide, you will learn the fundamentals of Caching and Redis. You will also learn how to implement caching in a Node.js application using Redis. This will be done with a simple blog example built with Node.js, MongoDB, Express, and Express Handlebars.

Prerequisites

To follow this guide, ensure you have the following:

  1. A basic understanding of Node.js, Express, and Express Handlebars.
  2. An Upstash account.
  3. A MongoDB Database (Atlas). If you're unsure how to retrieve your MongoDB Atlas connection string, check out the tutorial — A step-by-step guide: How to use MongoDB with Node.js to set up MongoDB Atlas for guidance.

TL;DR: you can find the entire code used in this guide on GitHub.

An Introduction to Caching

What is Caching?
Caching is the process of storing copies of data in a cache, a fast-access storage layer. When we query data from its original source (typically a database on disk), we have to wait for a response. If that data is instead kept in fast-access memory like Redis, retrieval time drops dramatically, and we avoid the bottleneck caused by a large number of users requesting the same piece of data at once.

Use case
Have you ever noticed that when you visit your favorite website, it always seems to know what you're interested in? Social media platforms like Facebook also use your cached information to make friend suggestions and show you posts that are relevant to your interests. By using your cached data, these websites aim to improve your user experience and make your online activities more enjoyable.

Cache internals and mechanism
Caches are designed to be small, fast, and lightweight, which limits how much data they can hold and for how long. To stay accurate, caches work in tandem with the database, which remains the source of truth. Because space is limited, old or outdated data must regularly be evicted to make room for fresher information. One technique for this is Time-To-Live (TTL), which expires entries after a set duration and removes them from the cache. Cache eviction algorithms (such as Least Recently Used) are also used to decide which entries have the lowest priority and should be removed first.
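
To make TTL concrete, here is a tiny illustrative in-memory cache in plain JavaScript (not Redis) that evicts an entry once its time-to-live has passed:

    // Illustrative only: a tiny in-memory cache with per-entry TTL.
    // Real caches like Redis handle expiry and eviction for you.
    const store = new Map();

    function cacheSet(key, value, ttlMs) {
      store.set(key, { value, expiresAt: Date.now() + ttlMs });
    }

    function cacheGet(key) {
      const entry = store.get(key);
      if (!entry) return null;            // cache miss
      if (Date.now() > entry.expiresAt) { // expired: evict and treat as a miss
        store.delete(key);
        return null;
      }
      return entry.value;                 // cache hit
    }

    cacheSet("greeting", "hello", 5000); // expires after 5 seconds
    console.log(cacheGet("greeting"));   // "hello" (hit)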

Setting up a Cache network
When setting up a cache network, it's essential to consider the data that users frequently access. This is because you don't want to cache unnecessary information that will occupy valuable space. To achieve a high cache hit rate, a cache policy that works best for your application must be carefully implemented. For instance, the "write-through cache" policy ensures that the cache and database are updated simultaneously.

write-through cache illustration

However, this policy has a cost: every write must go to both the cache and the database, which adds latency to each write. For write-intensive applications, that overhead can become a performance problem.

On the other hand, if your application is write-intensive, the "write-back cache" may be more suitable: data is written to the cache first and persisted to the database later, so individual writes stay fast. The trade-off is a short window in which the database lags behind the cache.

write-back cache

It's crucial to consider the type of application you're building before selecting a caching policy and algorithm to prevent further problems in your application.
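
The two policies can be sketched roughly like this, assuming hypothetical cache and db clients with set and save methods (an illustration only, not the code we will write later):

    // Hypothetical cache/db clients, used purely for illustration.

    // Write-through: the cache and the database are updated together,
    // so reads always find fresh data in the cache.
    async function writeThrough(cache, db, key, value) {
      await db.save(key, value);   // write to the database...
      await cache.set(key, value); // ...and to the cache as part of the same operation
    }

    // Write-back: write to the cache first and persist to the database later.
    // (A real write-back cache batches and flushes writes far more carefully.)
    async function writeBack(cache, db, key, value) {
      await cache.set(key, value);                 // fast write to the cache
      setTimeout(() => db.save(key, value), 5000); // flush to the database later
    }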

An Introduction to Redis

What is Redis?
Redis is an open-source, in-memory data store that can be used as a database, cache, and message broker. Redis is well-known for its high performance, flexibility, and scalability, making it a popular choice for modern web applications.

Redis's in-memory data model means data is stored and served directly from the server's RAM rather than written to disk like traditional databases. As a result, Redis offers faster data access and retrieval times and lower latency, since disk I/O operations are minimized.

Redis data storage
Redis stores data as key-value pairs, where each key is a string and each value can be any of the supported data structures.


Redis is capable of supporting a variety of data structures, including strings, hashes, lists, sets, and sorted sets, making it well-suited for real-time analytics, caching, and message passing.
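
For example, with the ioredis client (which we install later in this guide), a few of these data structures look like the sketch below. This is just an illustration and assumes a locally running Redis instance:

    const Redis = require("ioredis");
    const redis = new Redis(); // connects to localhost:6379 by default

    async function demo() {
      // String
      await redis.set("page:home:views", 100);
      await redis.incr("page:home:views"); // 101

      // Hash
      await redis.hset("user:1", "name", "Ada", "role", "admin");

      // List
      await redis.rpush("recent:posts", "post:1", "post:2");

      // Sorted set (e.g. a leaderboard)
      await redis.zadd("leaderboard", 50, "player:1", 75, "player:2");
      console.log(await redis.zrange("leaderboard", 0, -1, "WITHSCORES"));
    }

    demo().finally(() => redis.disconnect());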

Use case
Redis is a powerful tool for caching data. Its high throughput and availability make it an excellent choice for implementing caching, helping you sustain a high cache hit rate. This means frequently accessed data is served faster, reducing round trips to the database. Redis is also used for many other purposes, such as chat and messaging, queues, gaming leaderboards, session stores, and much more.

Overall, Redis is a powerful and versatile tool that can help developers build faster, more scalable, and more reliable applications.

Setting up Redis

In this guide, we will use a managed Redis service named Upstash. Other cloud providers also offer managed Redis (for example, AWS ElastiCache, Azure Cache for Redis, and Google Cloud Memorystore), but we will use Upstash throughout this tutorial.

  1. To get started, visit Upstash to create an account. If you already have one, you can log in.

  2. After setting up your account, click the "Create database" button to create a Redis database.
    Upstash-create-database-image

  3. Fill out the form in the pop-up and click "Create" to proceed. Refer to the screenshot below for help:
    Upstash-pop-up

  4. Voilà! We have successfully set up a Redis cache.
    Upstash-dashboard

  5. To retrieve your connection string, scroll down, click the “Node” tab, and select "ioredis". Refer to the screenshot below for help.
    Node-tab

    Hover over the password tab to copy your password. Then, replace the star symbols in your connection string with your actual password.

  6. Make sure to keep your Redis connection string safe and ready, as it will be needed in the next section when we implement application caching.

Caching in Action

To illustrate how caching improves the performance of a Node.js application, we will use a basic blog as an example. This blog has three main functionalities: creating a post, listing all posts, and viewing a post. Let's begin!

Project setup

  1. To proceed, go to GitHub and clone this boilerplate.

  2. After cloning the boilerplate, run the following commands in your terminal one after the other to install all the dependencies required:

    
    npm i 
    npm i ioredis   
    
    

    Quick note:

    ioredis is a popular, high-performance Redis client for Node.js that makes it easy to interact with Redis servers.

  3. Create a .env file in the root directory.

    Process.env

    Next, enter the following environment variables into the newly created file.

    PORT=8000
    mongodb_uri=yourMongoDBConnectionString
    redis_url=yourRedisConnectionString
    


  4. To ensure that the application runs smoothly, execute the following command:

    npm run dev
    

    npm-run-dev

  5. To access the application, open your web browser and type in the following URL — localhost:8000/dashboard. This will take you directly to the dashboard.
    boilerplate

Writing code

In the section "An Introduction to Caching", we discussed caching policies and how selecting the right one can boost an application's performance. For this example blog, we will implement the "write-through cache" policy, which stores each post in both the cache and the database at the time it is written. Since blogs are generally read-intensive, this policy is the most appropriate.

  1. Create a folder named redis-store in the root directory. Inside the redis-store folder, create a file named redis.js for the Redis connection.
    redis-store

  2. Open the redis.js file and add the following code to connect to Redis.

        const Redis = require("ioredis")
        const dotenv = require("dotenv")

        dotenv.config()

        // Create a Redis client using the connection string from the .env file
        const redisClient = () => {
          const redis = new Redis(process.env.redis_url)

          redis.on("error", (err) => {
            console.log(`Redis crashed: ${err.message}`)
          })

          redis.on("connect", () => {
            console.log(`Redis started on ${redis.options.host}`)
          })

          return redis
        }

        // Export a connected client so every require() shares the same connection
        module.exports = redisClient()
    


  3. Navigate to the controller/Post.js file, and make the following changes to incorporate caching into the code.

    a. First, import the Redis connection.

    const Redis = require('../redis-store/redis')
    

    b. To cache a newly created post, update the addPost function with the following code snippet:

    const addPost = async (req, res) => {
      try {
        const newPost = new Post({
          title: req.body.title,
          body: req.body.body
        });

        const post = await Post.create(newPost);

        // Cache the newly created post for 7 days (604800 seconds)
        await Redis.set(post.id, JSON.stringify(post), 'EX', 604800)

        // The cached list of posts is now stale, so remove it
        await Redis.del("allPosts")

        res.redirect('/dashboard');
      } catch (error) {
        console.log(error);
      }
    }
    

    In the code snippet above:

    • We cache each post as soon as it is created.
    • An expiration time of 604800 seconds (7 days) is set so the post eventually expires from the cache.
    • We also delete the "allPosts" entry from the cache, because it becomes stale once a new post is added to the database.
    • Finally, we redirect to the dashboard to view the newly created post.
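
    If you want to confirm the expiry while developing, you can ask Redis how long the key has left to live. For example, this optional check (not part of the boilerplate) could go right after the Redis.set call inside addPost:

    // Optional: seconds until the cached post expires.
    // Redis returns -2 if the key does not exist and -1 if it has no expiry.
    const secondsLeft = await Redis.ttl(post.id);
    console.log(`Cached post expires in ${secondsLeft} seconds`);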

    c. Modify the viewPost function to display a specific post from either the cache or the database:

    const viewPost = async (req, res) => {
      try {
        const slug = req.params.id;

        // Check the cache first
        const redisData = await Redis.get(slug)
        let data = JSON.parse(redisData)

        if (data === null) {
          // Cache miss: fall back to the database and cache the result
          data = await Post.findById(slug);
          await Redis.set(slug, JSON.stringify(data), 'EX', 604800)
        }

        const locals = {
          title: data.title,
          description: "Simple Blog created with NodeJs, Express & MongoDb.",
        }

        res.render('post', {
          locals,
          data,
          currentRoute: `/post/${slug}`
        });
      } catch (error) {
        console.log(error);
      }
    }
    

    In the code snippet provided:

    • We first check the Redis cache to see if a post matching the provided ID exists.
    • If it does, we return it.
    • However, if there is no match in the cache, we check the database for the data, cache it, and then return the post to the user.
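
    One edge case worth noting: if the ID matches nothing in either the cache or the database, data will be null and the render call will fail. A small optional guard (my addition, not part of the boilerplate) placed inside viewPost, right after the cache/database lookup, avoids that:

    // Optional guard: neither the cache nor the database has this post.
    if (!data) {
      return res.status(404).send("Post not found");
    }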

    d. Lastly, update the getPosts function to retrieve all posts from the database or cache:

    const getPosts = async (req, res) => {
      try {
        const locals = {
          title: 'Dashboard',
          description: 'Simple Blog created with NodeJs, Express & MongoDb.'
        }

        // Check the cache for the full list of posts
        const redisData = await Redis.get("allPosts")
        let data = JSON.parse(redisData)

        if (data !== null) {
          res.render('Post-crud/dashboard', {
            locals,
            data,
            layout: mainLayout
          });
        } else {
          // Cache miss: load from MongoDB and cache the result
          data = await Post.find();
          await Redis.set("allPosts", JSON.stringify(data))

          res.render('Post-crud/dashboard', {
            locals,
            data,
            layout: mainLayout
          });
        }
      } catch (error) {
        console.log(error);
      }
    }
    

    In the code snippet above:

    • We start by checking the Redis cache to see if there is an entry called “allPosts”.
    • If such an entry exists, we directly return “allPosts” from the cache.
    • If there is no cached data found, we retrieve “allPosts” from the database, and then proceed to cache it.
    • Finally, we display all the posts to the user.
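
    You may notice that viewPost and getPosts repeat the same check-the-cache-then-fall-back-to-the-database pattern. If you like, you can factor it into a small helper; the sketch below is optional, and the name getOrSetCache is mine, not part of the boilerplate:

    // Optional helper: return the cached JSON for `key` if present; otherwise
    // run `loader`, cache its result for `ttlSeconds`, and return it.
    const getOrSetCache = async (key, ttlSeconds, loader) => {
      const cached = await Redis.get(key);
      if (cached !== null) return JSON.parse(cached);

      const fresh = await loader();
      await Redis.set(key, JSON.stringify(fresh), "EX", ttlSeconds);
      return fresh;
    };

    // Example usage inside getPosts:
    // const data = await getOrSetCache("allPosts", 604800, () => Post.find());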

Testing

Let's test the cache implementation to ensure that it works as expected.

  1. To run your application, open your terminal and run the command:

    npm run dev
    


  2. Open your browser and access localhost:8000/dashboard to begin testing.

  3. To create a post, click on the "Create Post" button.
    Create-post

  4. Provide a title and content for your post, then click “Add”.
    Title

  5. Upon reaching the dashboard, click on your newly created post to view it.
    New-post

  6. Yay! The application is working as intended, and we have successfully created a new post.
    New-stuff

  7. To complete the testing phase, log in to your Upstash account to check if both the newly created post and “allPosts” are added to the cache.

  8. After logging into your Upstash account, go to your dashboard and click on the "Data Browser" tab to access the newly added data.
    New-stuff

  9. Click on each entry to confirm the contents of the newly added data.

    • Single post: Single-post
    • All posts: All-post
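
    If you prefer to check the cache from your terminal instead of the Upstash Data Browser, a small throwaway script with ioredis (my addition, reusing the redis_url from your .env) works too. Run it with node check-cache.js:

    // check-cache.js — a quick, optional helper script, not part of the boilerplate.
    require("dotenv").config();
    const Redis = require("ioredis");
    const redis = new Redis(process.env.redis_url);

    (async () => {
      const keys = await redis.keys("*"); // fine for a tiny development cache
      console.log("Cached keys:", keys);
      console.log("allPosts:", await redis.get("allPosts"));
      redis.disconnect();
    })();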

Benchmarking

The moment of truth is here. Let's see if caching improves the performance of an application. We will use the browser dev tools to compare the data retrieval time from the database vs the cache.
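
Browser dev tools are enough for this comparison, but if you also want timings logged on the server, a small Express middleware (an optional addition, assuming your Express instance is called app) can print each request's duration:

    // Optional timing middleware — register it before your routes.
    app.use((req, res, next) => {
      const start = process.hrtime.bigint();
      res.on("finish", () => {
        const ms = Number(process.hrtime.bigint() - start) / 1e6;
        console.log(`${req.method} ${req.originalUrl} took ${ms.toFixed(1)} ms`);
      });
      next();
    });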

Without Caching:

In this section, we will comment out all the code blocks where caching was implemented. Then, we will proceed to create a post, retrieve all posts, and view a single post.

  1. All posts: It took 635 milliseconds to retrieve all the posts from the database and return them to us.
    Allpost-benchmark

  2. Single post: It took the database 267 milliseconds to return a single post.
    Single-post-benchmark

With caching:

Let's uncomment the caching logic to compare data retrieval time.

Make sure to delete your cached data first.

How-to-delete-cached-data
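
You can clear the cache from the Upstash console as shown above, or from code with ioredis. Be careful: this wipes every key in the Redis database (an optional one-off script, not part of the boilerplate):

    // clear-cache.js — one-off cleanup, reusing the redis_url from .env.
    require("dotenv").config();
    const Redis = require("ioredis");
    const redis = new Redis(process.env.redis_url);

    redis.flushdb()
      .then(() => console.log("Cache cleared"))
      .finally(() => redis.disconnect());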

  1. All posts: Redis returned all cached posts in 193 milliseconds.
    Allpost-with-caching

  2. Single post: It took Redis 193 milliseconds to return a single cached post.
    Single-post-with-caching

Conclusion

In this guide, you have learned what caching is, its benefits and core concepts, and how to integrate Redis as a caching solution in a Node.js application. We used a small application to illustrate how caching works, so the improvement in data retrieval time was modest; in larger applications, caching can make a considerable difference in performance.

This guide also discussed the importance of selecting a caching policy that fits your application in order to minimize the load on your backend systems, and of adding cache expiration and invalidation mechanisms to keep data accurate and fresh.

By leveraging Redis, developers can significantly reduce response times and minimize database loads. Caching not only improves the speed of data retrieval but also reduces the workload on the server, leading to enhanced scalability and efficient resource utilization.

As your application grows, understanding and implementing caching with Redis in Node.js becomes a valuable skill. By following the steps outlined in this guide, you can optimize your application, deliver faster response times, and ultimately provide a more satisfying and seamless user experience.

I hope this guide was easy for you to follow. Happy Caching! 😊

Further Reading

Resources

  • The complete code can be found here
