GETTING STARTED WITH CACHING: USING REDIS AND TYPESCRIPT

INTRODUCTION:

In this tutorial, we are going to learn the fundamentals of caching and how to implement it using Redis and TypeScript/Node.js. But before we begin, let's start with the basics.

What is Caching?
Caching is a technique used in computer systems and software applications to store and retrieve data quickly. It involves storing a copy of frequently accessed or expensive-to-compute data in a temporary storage location, called a cache, so that future requests for the same data can be served faster.

The purpose of caching is to improve the performance and efficiency of a system by reducing the time and resources required to fetch data from its original source. Instead of retrieving the data from the original location, which may involve time-consuming operations like disk access or network communication, the data is retrieved from the cache, which is typically located closer to the requester and provides faster access.
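To make the idea concrete, here is a tiny, purely in-memory sketch of that pattern in TypeScript (the fetchUser function below is a hypothetical stand-in for a slow database or remote API):

// A toy in-memory cache illustrating the "check the cache first, otherwise fetch and store" idea
const userCache = new Map<string, unknown>();

async function getUser(id: string) {
    if (userCache.has(id)) {
        return userCache.get(id);          // cache hit: served from memory, no slow lookup
    }
    const user = await fetchUser(id);      // cache miss: go to the original (slow) source
    userCache.set(id, user);               // keep a copy for the next request
    return user;
}

// Hypothetical slow data source, used only for illustration
async function fetchUser(id: string) {
    return { id, name: "example user" };
}

Later in this tutorial, Redis plays the role of that Map, but as a separate, shared, in-memory store.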

PREREQUISITES
To follow along with this tutorial, you will need a basic understanding of Node.js and TypeScript. I will explain every required step thoroughly to assist you throughout the process.

Why Redis?
When it comes to caching, Redis stands out as an exceptional choice for several reasons. Its features and capabilities make it an ideal solution for optimizing performance and improving overall system efficiency.

Redis is renowned for its incredible speed and performance. By storing data in memory, Redis enables lightning-fast data access and retrieval, making it suitable for applications that require real-time data processing and high throughput. With the ability to handle millions of requests per second and provide low-latency response times, Redis excels in scenarios where speed is of utmost importance.

By leveraging Redis as a cache, applications can store frequently accessed data in memory, eliminating the need for repetitive and resource-intensive operations. This results in improved performance, reduced response times, and a more seamless user experience.

In this tutorial, we will employ Redis for caching data retrieved from an external API with a noticeably slow response time. We will store this data in our cache and utilize it for subsequent requests made to the API. However, we will also address the scenario when the data from the original API undergoes changes. We will implement a mechanism to ensure that our cache consistently provides up-to-date data from the API, ensuring that our system remains synchronized with the latest information.

Let's get started.

First, let's initialize our Node.js project and install the required dependencies.

npm init -y

npm i express axios cors dotenv redis

Now install TypeScript and the type definitions as dev dependencies (axios, dotenv, and redis already ship their own type definitions, so they do not need separate @types packages):

npm i -D typescript @types/express @types/cors @types/node

Having a tsconfig.json file ensures consistency in the compilation process across different environments and allows for easy project configuration and maintenance.

So go ahead and create one using :

tsc --init

Then configure your outDir and rootDir options to match the following structure.

(Image: project file structure)
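For reference, here is a minimal sketch of the relevant tsconfig.json options, assuming your TypeScript source lives in src and the compiled JavaScript is emitted to dist (adjust the paths to match your own structure):

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "rootDir": "./src",
    "outDir": "./dist",
    "esModuleInterop": true,
    "strict": true
  }
}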

Don't forget to watch for changes in your TypeScript code by running tsc -w in a dedicated terminal.

Next, let us connect to Redis. To do this, you have two options.

  1. Connect to a local Redis server running on localhost.
  2. Connect to a cloud-hosted instance created via Redis Cloud.

In this tutorial, we will go with option 2. (If you don't have an account already, kindly create one and then create a database instance.)

Now let's connect to our database. Our ./src/config/connect.ts file should now have the following lines of code:

import { createClient } from 'redis';
import dotenv from 'dotenv';

dotenv.config();

export const client = createClient({
    password: process.env.REDIS_PASSWORD,
    socket: {
        host: process.env.REDIS_HOST,
        port: parseInt(process.env.REDIS_PORT || '6379', 10)
    }
});


The code above sets up a Redis client using the redis package, loads environment variables from a .env file using dotenv, and exports the Redis client for use in other parts of the codebase. The environment variables configure the Redis connection, including the host, port, and password. (These values are shown on your database instance page once you create an account on Redis Cloud.)
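For reference, a matching .env file would look something like this (the values below are placeholders; copy the actual host, port, and password from your own Redis Cloud database page):

REDIS_HOST=your-redis-cloud-host.example.com
REDIS_PORT=12345
REDIS_PASSWORD=your-database-password
PORT=7000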

Next, let's import the client variable into our app.ts file and connect to the database.

Configure your app.ts file to look like this :

import dotenv from "dotenv"
import express, { Request, Response } from "express"
import axios from "axios"
import { client } from "./config/connect"
import cors from "cors"

dotenv.config()

const app : express.Application = express()

const PORT = process.env.PORT || 7000

app.use(cors())
app.use(express.json())

const start = async () => {
    try {
        await client.connect()
        app.listen(PORT, () => {
            console.log(`Server is connected to redis and is listening on port ${PORT}`)
        })
    } catch (error) {
        console.log(error)
    }
}

start()

The code above imports all the installed dependencies we will need for this project, sets up an Express server, configures it to handle JSON requests, enables CORS, connects to the Redis client, and starts the server to listen for incoming requests if the connection is successful.

Now, for this tutorial, I have created and deployed an API whose response is intentionally delayed by five seconds to demonstrate how caching can be useful in a real-world scenario.

So in our app.ts file, we will create two functions that make API calls, like this:

async function isDataModified () {
    const response = await axios.get("https://pleasant-newt-girdle.cyclic.app/api/modified")
    return response.data.modified
}

async function getAllUsers () {
    const response = await axios.get("https://pleasant-newt-girdle.cyclic.app/api/users")
    return response.data
}


Now, let's examine each of these functions separately. The first function sends a request to an endpoint that interacts with the API's database. It retrieves a boolean result, which evaluates to true if there have been any recent changes. These changes encompass scenarios such as the addition of a new item (POST), modification of an existing item (PUT), or deletion of an item (DELETE). In the absence of any changes, the result will be false.

The second function, on the other hand, straightforwardly retrieves a list of all items (in this case, users) stored in the database of that particular API.

Now, let's understand the rationale behind this approach. Why are we making two requests to an API? Remember when we mentioned the need to cache only recent information? Exactly! So, what happens when a user decides to update specific details on their profile? In such cases, we also need to update the cache, right? Absolutely correct. That's precisely what the first function accomplishes. Before serving our response, we verify if any changes have been made in the API's database to ensure that our cache is up-to-date.

Now let's create an endpoint in our app to store and retrieve information from our cache. Update the app.ts file with this endpoint:

app.use("/get-users", async (req : Request, res : Response) => {

    let result;
    let isCahed;

    try {

        const data = await isDataModified()

        if (data === true) {
            result = await getAllUsers()
            isCahed = false
            await client.set("all_users", JSON.stringify(result))
        } 

        else {

            const isCahedInRedis = await client.get("all_users");

            if (isCahedInRedis) {

                isCahed = true
                result = JSON.parse(isCahedInRedis)
            }

           else {
                result = await getAllUsers()
                isCahed = false

                await client.set("all_users", JSON.stringify(result))
           }

        }

        return res.status(200).json({
            isCahed,
            result : result
        })
    } catch (error) {
        console.log(error)
        return res.status(500).json({error})
    }
})


Now, let's explain what is happening here step by step:

Inside the route handler function:

a. Two variables, result and isCached, are created to store the API request result and caching status.

b. The isDataModified() function is called to check if there have been any modifications in the database. The result is stored in the data variable.

c. If modifications are detected (when data is true), it means the cache needs to be updated. The getAllUsers() function is called to retrieve all user data from the API. The result is assigned to the result variable, and the isCached variable is set to false. The retrieved data is then stored in the Redis cache using the client.set() method.

d. If no modifications are detected, it means the cached data is still valid. The code checks if the data is already cached in Redis using the client.get() method. If cached data exists, it is assigned to the result variable, and the isCached variable is set to true.

e. If no cached data exists, the getAllUsers() function is called to retrieve the user data from the API. The result is assigned to the result variable, and the isCached variable is set to false. The retrieved data is then stored in the Redis cache using the client.set() method.

f. Finally, the code sends a JSON response with a status of 200. The response includes the isCached status and the result data.

If any errors occur during the process, they are caught in the catch block. A JSON response with a status of 500 and the error message is returned.

To summarize, this code sets up an endpoint that fetches user data from an API. It checks if the data has been modified, updates the cache if needed, and returns the cached data or retrieves fresh data from the API.

And that pretty much does the job. Our app.ts file should now look like this :

import dotenv from "dotenv"
import express, { Request, Response } from "express"
import axios from "axios"
import { client } from "./config/connect"
import cors from "cors"

dotenv.config()

const app : express.Application = express()

const PORT = process.env.PORT || 7000

app.use(cors())
app.use(express.json())

async function isDataModified () {
    const response = await axios.get("https://pleasant-newt-girdle.cyclic.app/api/modified")
    return response.data.modified
}

async function getAllUsers () {
    const response = await axios.get("https://pleasant-newt-girdle.cyclic.app/api/users")
    return response.data
}

app.use("/get-users", async (req : Request, res : Response) => {

    let result;
    let isCached;

    try {

        const data = await isDataModified()

        if (data === true) {
            result = await getAllUsers()
            isCached = false
            await client.set("all_users", JSON.stringify(result))
        } 

        else {

            const isCachedInRedis = await client.get("all_users");

            if (isCachedInRedis) {

                isCached = true
                result = JSON.parse(isCachedInRedis)
            }

           else {
                result = await getAllUsers()
                isCached = false

                await client.set("all_users", JSON.stringify(result))
           }

        }

        return res.status(200).json({
            isCached,
            result : result
        })
    } catch (error) {
        console.log(error)
        return res.status(500).json({error})
    }
})

const start = async () => {
    try {
        await client.connect()
        app.listen(PORT, () => {
            console.log(`Server is connected to redis and is listening on port ${PORT}`)
        })
    } catch (error) {
        console.log(error)
    }
}

start()

Now run the server and let's test our endpoint:
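If you followed the src/dist layout assumed earlier, compiling and starting the server looks something like this (adjust the path if your outDir differs), and you can hit the endpoint with curl or a browser:

tsc
node dist/app.js
curl http://localhost:7000/get-users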

(Screenshot: GET users response)

The first thing you will notice when you hit the endpoint http://localhost:7000/get-users is how long it takes to get a response. As I mentioned before, that is intentional; it is meant to simulate how real-world, CPU-intensive applications behave when caching is not implemented.

Next, you will notice that the first property in the response, isCached, reads false. This means the data did not exist in our cache but has just been added. How can we confirm that? Well, let's make the same request again by simply refreshing the browser. This is what we get:

(Screenshot: response with isCached set to true)

Notice how the response time is reduced as you keep refreshing the page? Also notice that the isCached property now reads true.
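For reference, the response body has roughly this shape (the user records themselves are omitted here):

{
  "isCached": true,
  "result": [ ...user objects returned by the API... ]
}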

To test what happens when we alter the database, I invite you to modify the data by creating, editing, or deleting users using any of the endpoints below:

Create user (POST): https://pleasant-newt-girdle.cyclic.app/api/user
Update user (PUT)/Delete user (DELETE): https://pleasant-newt-girdle.cyclic.app/api/user/:id

Every time you make one of these requests successfully (POST, PUT, or DELETE), the isCached value becomes false on the next read, prompting a delayed response while fresh data is fetched and the cache is updated. Remember, you can always refresh your endpoint at http://localhost:7000/get-users to see the changes happen live.

Here is an example of how to make these requests in Postman (curl equivalents follow below).

(Screenshot: creating a new user in Postman)

(Screenshot: updating a user in Postman)

(Screenshot: deleting a user in Postman)
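If you prefer the command line, here are rough curl equivalents. The request body fields below are only placeholders, since the exact schema the API expects isn't covered here, so adjust them as needed (and replace <id> with a real user id):

# Create a user (POST) - body fields are illustrative
curl -X POST https://pleasant-newt-girdle.cyclic.app/api/user \
  -H "Content-Type: application/json" \
  -d '{"name": "Jane Doe"}'

# Update a user (PUT)
curl -X PUT https://pleasant-newt-girdle.cyclic.app/api/user/<id> \
  -H "Content-Type: application/json" \
  -d '{"name": "Jane D."}'

# Delete a user (DELETE)
curl -X DELETE https://pleasant-newt-girdle.cyclic.app/api/user/<id>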

Now, if for some reason the API at https://pleasant-newt-girdle.cyclic.app/api/user ceases to exist in the future, you can also try a similar public API: https://reqres.in/api/users?delay=3. Here is how it works: the response time is controlled by the delay query parameter, so if you set delay=3, the API takes three seconds to return a response. That means you can practice the same caching mechanism with this API.

But wait, how do we actually visualize the data in our cache? Well, you can download a tool like RedisInsight, connect your database instance, and voila, you can now visualize and query your cache.

Here is a link to the complete code on GitHub. Thank you for sticking around till the end.

FINAL NOTES:

Congratulations if you got this far. By now, you should have a solid understanding of how Redis can be leveraged as a powerful caching solution. However, caching with Redis extends beyond just speeding up data retrieval. In this final note, let's explore some additional use cases and discuss just how useful caching can be.

Session Caching: Redis is an excellent choice for storing session data. By caching session information, you can achieve high-performance session management, improve user experience, and reduce database load. Redis's ability to set expiration times on keys makes it perfect for managing session timeouts and automatically cleaning up expired sessions.
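As a minimal sketch of what that looks like with the client we set up earlier (the key prefix and the one-hour TTL are arbitrary choices):

import { client } from "./config/connect";

// Store a session object and let Redis delete it automatically after one hour
async function saveSession(sessionId: string, data: object) {
    await client.set(`session:${sessionId}`, JSON.stringify(data), { EX: 3600 });
}

// Returns the session data, or null if it expired or never existed
async function getSession(sessionId: string) {
    const raw = await client.get(`session:${sessionId}`);
    return raw ? JSON.parse(raw) : null;
}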

Full-page Caching: Redis can be used to cache entire HTML pages, eliminating the need to regenerate them on each request. By serving cached pages directly from Redis, you can dramatically reduce the response time and alleviate the load on your application servers.
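Here is one possible sketch of an Express middleware for this, reusing the same client (the key prefix and TTL are arbitrary, and error handling is omitted for brevity):

import { Request, Response, NextFunction } from "express";
import { client } from "./config/connect";

export function cachePage(ttlSeconds: number) {
    return async (req: Request, res: Response, next: NextFunction) => {
        const key = `page:${req.originalUrl}`;
        const cached = await client.get(key);
        if (cached) {
            return res.send(cached);                       // serve the cached HTML directly
        }
        const originalSend = res.send.bind(res);
        res.send = ((body: unknown) => {
            if (typeof body === "string") {
                // Cache the rendered page before sending it to the client
                client.set(key, body, { EX: ttlSeconds }).catch(() => {});
            }
            return originalSend(body);
        }) as typeof res.send;
        next();
    };
}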

Result Caching: Redis enables you to cache the results of complex or time-consuming computations. For example, if your application involves heavy calculations or data processing, you can store the computed results in the Redis cache and retrieve them when needed, avoiding redundant computations.
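A minimal "get or compute" sketch, where expensiveReport is a hypothetical stand-in for any heavy computation:

import { client } from "./config/connect";

async function getReport(reportId: string) {
    const key = `report:${reportId}`;
    const cached = await client.get(key);
    if (cached) return JSON.parse(cached);                        // cache hit: skip the computation

    const report = await expensiveReport(reportId);               // cache miss: do the heavy work once
    await client.set(key, JSON.stringify(report), { EX: 600 });   // keep the result for 10 minutes
    return report;
}

// Hypothetical heavy computation, used only for illustration
async function expensiveReport(reportId: string) {
    return { reportId, generatedAt: new Date().toISOString() };
}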

Leaderboards and Counters: Redis's sorted sets and atomic increment operations make it an excellent choice for implementing leaderboards, vote counters, or popularity rankings. By caching these frequently changing metrics, you can efficiently update and display them in real-time.
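A minimal sketch using a sorted set (the key name is arbitrary; the REV option for reading highest scores first requires Redis 6.2 or later):

import { client } from "./config/connect";

// Atomically add points to a player's score, creating the entry if it doesn't exist yet
async function addScore(player: string, points: number) {
    await client.zIncrBy("leaderboard", points, player);
}

// Return the top ten players with their scores, highest first
async function topTen() {
    return client.zRangeWithScores("leaderboard", 0, 9, { REV: true });
}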

Pub/Sub Messaging: Redis supports Publish/Subscribe (Pub/Sub) messaging, allowing you to build real-time communication channels, notifications, and event-driven architectures. By caching messages or maintaining subscription lists, Redis facilitates the implementation of scalable, high-performance messaging systems.
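A minimal sketch of this with node-redis, which requires a dedicated connection for subscribing (the channel name and message are arbitrary):

import { client } from "./config/connect";

async function setupMessaging() {
    // duplicate() creates a second connection with the same configuration, used only for subscriptions
    const subscriber = client.duplicate();
    await subscriber.connect();

    await subscriber.subscribe("notifications", (message) => {
        console.log(`Received: ${message}`);
    });

    // Any connected client can publish; every subscriber to the channel receives the message
    await client.publish("notifications", "cache refreshed");
}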

The usefulness of caching with Redis cannot be overstated. By intelligently caching data, you can achieve significant performance improvements, reduce latency, and enhance the overall scalability of your applications. However, it's crucial to consider cache invalidation and maintain data consistency when working with caching systems.
