Paul Babatuyi

How to Optimize Your Node.js Backend Performance: The Simplest Way to Add Redis Caching

Hey there, fellow Node.js developers! If you've ever built a Node.js backend—maybe a REST API with Express or Fastify—and noticed it slowing down under load because of repeated database queries, you're not alone. Fetching the same data over and over from PostgreSQL or MongoDB can become a bottleneck fast.

Enter Redis: an in-memory key-value store that's blazingly fast and perfect for caching. In this post, we'll cover the simplest way to integrate Redis caching into your TypeScript Node.js backend to dramatically boost performance. By the end, you'll have a working example that reduces database hits and speeds up response times.

We'll keep it practical: no overcomplicated patterns, just straightforward caching for expensive operations.

Why Redis for Caching in Node.js?

  • Speed: Redis lives in memory—operations are sub-millisecond.
  • Simplicity: Node.js has excellent Redis clients like ioredis.
  • Features: Supports expiration (TTL), which prevents stale data.
  • Real Impact: In a typical benchmark with 1000 requests, caching can reduce average response time from 205ms to 18ms—over 10x improvement.

Compared to in-process caches (like node-cache), Redis shines when you scale horizontally (multiple instances) or need persistence across server restarts.

Prerequisites

  • Node.js 18+ installed
  • A running Redis instance (local or Docker)
  • Basic knowledge of TypeScript and Express

Quickly spin up Redis with Docker:

docker run -d --name redis-cache -p 6379:6379 redis:7

Step 1: Project Setup

Create a new TypeScript project:

mkdir nodejs-redis-cache && cd nodejs-redis-cache
npm init -y
npm install express ioredis
npm install --save-dev typescript @types/node @types/express ts-node nodemon

Create tsconfig.json:

{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "lib": ["ES2020"],
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules"]
}

Update package.json scripts:

{
  "scripts": {
    "dev": "nodemon --exec ts-node src/server.ts",
    "build": "tsc",
    "start": "node dist/server.js"
  }
}

Create the src directory:

mkdir src

We'll use Express for the web framework and ioredis as our Redis client—it has excellent TypeScript support out of the box.

Step 2: Connect to Redis

Create src/redis.ts:

import Redis from 'ioredis';

// Create Redis client
const redis = new Redis({
  host: process.env.REDIS_HOST || 'localhost',
  port: parseInt(process.env.REDIS_PORT || '6379'),
  password: process.env.REDIS_PASSWORD || undefined, // undefined skips AUTH for local dev
  // Automatic reconnection on connection loss
  retryStrategy: (times: number): number | null => {
    const delay = Math.min(times * 50, 2000);
    return delay;
  }
});

// Test connection
redis.on('connect', () => {
  console.log('Connected to Redis!');
});

redis.on('error', (err: Error) => {
  console.error('Redis connection error:', err);
});

export default redis;

Security note: We're using environment variables for configuration—never hardcode credentials in production.

Step 3: The Cache-Aside Pattern (The Simplest Way)

The most straightforward caching strategy is cache-aside:

  1. Check Redis for data.
  2. On hit → return cached value.
  3. On miss → query database → write to Redis with TTL → return data.

Let's build a simple user endpoint that fetches from a mock DB.

First, define our types and mock database (src/mockdb.ts):

export interface User {
  id: number;
  name: string;
}

const users: Record<number, User> = {
  1: { id: 1, name: 'Paul' },
  2: { id: 2, name: 'Ada' }
};

// Simulate a slow database query
export async function getUserFromDB(id: number): Promise<User> {
  // Simulate ~200ms of database latency
  await new Promise(resolve => setTimeout(resolve, 200));

  const user = users[id];
  if (!user) {
    throw new Error('User not found');
  }
  return user;
}

Now, the main application with caching (src/server.ts):

import express, { Request, Response } from 'express';
import redis from './redis';
import { getUserFromDB, User } from './mockdb';

const app = express();
const PORT = process.env.PORT || 3000;

// Middleware
app.use(express.json());

app.get('/user/:id', async (req: Request, res: Response): Promise<void> => {
  try {
    const { id } = req.params;

    // Validate ID
    const userId = parseInt(id);
    if (isNaN(userId)) {
      res.status(400).json({ error: 'Invalid ID' });
      return;
    }

    const key = `user:${id}`;

    // Step 1: Check cache
    const cached = await redis.get(key);

    if (cached) {
      // Cache hit!
      try {
        const user: User = JSON.parse(cached);
        res.json(user);
        return;
      } catch (parseError) {
        // If parse fails, log and fall through to DB query
        console.error('Cache parse error:', parseError);
      }
    }

    // Step 2: Cache miss → query DB
    const user = await getUserFromDB(userId);

    // Step 3: Serialize and cache with 5-minute TTL
    // TTL of 300 seconds (5 minutes) is reasonable for user data that doesn't change frequently.
    // Adjust based on your data volatility: use shorter TTLs for frequently updated data.
    try {
      await redis.setex(key, 300, JSON.stringify(user));
    } catch (cacheError) {
      // Log cache error but still return the data
      console.error('Cache set error:', cacheError);
    }

    res.json(user);

  } catch (error) {
    if (error instanceof Error && error.message === 'User not found') {
      res.status(404).json({ error: 'User not found' });
      return;
    }
    console.error('Server error:', error);
    res.status(500).json({ error: 'Internal server error' });
  }
});

app.listen(PORT, () => {
  console.log(`Server running on http://localhost:${PORT}`);
  console.log(`Test with: curl http://localhost:${PORT}/user/1`);
});

// Graceful shutdown
process.on('SIGTERM', async () => {
  console.log('SIGTERM received, closing Redis connection...');
  await redis.quit();
  process.exit(0);
});

That's it! Run with npm run dev and hit the endpoint multiple times—the first request hits the "DB" (with the 200ms delay), subsequent ones return instantly from Redis.

Testing the Performance Difference

Benchmark without cache (comment out the Redis check):

# Install autocannon: npm install -g autocannon
autocannon -c 10 -d 10 http://localhost:3000/user/1

# Results: ~45 requests/sec, ~220ms average latency

Benchmark with cache enabled:

autocannon -c 10 -d 10 http://localhost:3000/user/1

# Results: ~450 requests/sec, ~20ms average latency
# That's a 10x improvement!

Bonus: Cache Invalidation

To avoid stale data, invalidate the cache when you update a user:

interface UpdateUserBody {
  name?: string;
}

app.put('/user/:id', async (req: Request<{ id: string }, {}, UpdateUserBody>, res: Response): Promise<void> => {
  try {
    const { id } = req.params;
    const updates = req.body;

    const userId = parseInt(id);
    if (isNaN(userId)) {
      res.status(400).json({ error: 'Invalid ID' });
      return;
    }

    // ... update database ...

    // Invalidate cache
    const key = `user:${id}`;
    await redis.del(key);

    res.json({ message: 'User updated successfully' });
  } catch (error) {
    console.error('Update error:', error);
    res.status(500).json({ error: 'Failed to update user' });
  }
});
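Another option, instead of deleting the key, is a write-through update: persist to the database first, then write the fresh value into the cache so the very next read is a guaranteed hit. A rough sketch (the `writeThrough` helper and `CacheWriter` type are names invented here; the type is structural, so the ioredis client satisfies it):

```typescript
// Structural type matching the subset of ioredis we need; any object with a
// compatible setex() works, which also makes the helper easy to unit test.
type CacheWriter = { setex(key: string, ttl: number, value: string): Promise<unknown> };

// Write-through: update the source of truth first, then refresh (rather than
// delete) the cached copy so the next read hits the cache with current data.
async function writeThrough<T>(
  cache: CacheWriter,
  key: string,
  ttl: number,
  persist: () => Promise<T>
): Promise<T> {
  const saved = await persist(); // database update happens first
  await cache.setex(key, ttl, JSON.stringify(saved));
  return saved;
}
```

In the PUT handler above, you would call `writeThrough(redis, key, 300, () => updateUserInDB(userId, updates))` in place of the `redis.del(key)` line (where `updateUserInDB` is a stand-in for your real database update).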

Alternatively, use shorter TTLs for data that changes frequently.
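If you lean on TTLs, one refinement worth adding is jitter: when many keys are cached in the same burst with identical TTLs, they also expire together and can stampede the database all at once. A minimal sketch (the `jitteredTtl` helper is a name invented here):

```typescript
// Spread expirations out by adding up to jitterFraction of random extra TTL,
// so keys written at the same moment don't all expire at the same moment.
function jitteredTtl(baseSeconds: number, jitterFraction: number = 0.1): number {
  const maxJitter = Math.floor(baseSeconds * jitterFraction);
  return baseSeconds + Math.floor(Math.random() * maxJitter);
}

// e.g. await redis.setex(key, jitteredTtl(300), JSON.stringify(user));
```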

Advanced: Type-Safe Cache Wrapper

For cleaner, reusable code, create a generic cache wrapper with full TypeScript support (src/cacheWrapper.ts):

import redis from './redis';

interface CacheOptions {
  ttl: number; // Time to live in seconds
}

export async function cacheWrapper<T>(
  key: string,
  options: CacheOptions,
  fetchFunction: () => Promise<T>
): Promise<T> {
  // Try cache first
  const cached = await redis.get(key);

  if (cached) {
    try {
      return JSON.parse(cached) as T;
    } catch (error) {
      console.error('Cache parse error:', error);
      // Fall through to fetch
    }
  }

  // Cache miss - fetch data
  const data = await fetchFunction();

  // Cache the result
  try {
    await redis.setex(key, options.ttl, JSON.stringify(data));
  } catch (error) {
    console.error('Cache set error:', error);
    // Don't throw - we have the data
  }

  return data;
}

Then use it in your routes with full type safety:

import { cacheWrapper } from './cacheWrapper';
import { User } from './mockdb';

app.get('/user/:id', async (req: Request, res: Response): Promise<void> => {
  try {
    const { id } = req.params;
    const userId = parseInt(id);

    if (isNaN(userId)) {
      res.status(400).json({ error: 'Invalid ID' });
      return;
    }

    const user = await cacheWrapper<User>(
      `user:${id}`,
      { ttl: 300 }, // 5 minutes
      () => getUserFromDB(userId)
    );

    res.json(user);
  } catch (error) {
    if (error instanceof Error && error.message === 'User not found') {
      res.status(404).json({ error: 'User not found' });
      return;
    }
    res.status(500).json({ error: 'Internal server error' });
  }
});

The generic <T> ensures your cached data maintains its type throughout the application!

Performance Tips for Production

  • Connection pooling: ioredis handles connection pooling automatically. For high-concurrency scenarios, consider using a Redis cluster.
  • Monitoring: Use Redis Commander or RedisInsight for visual monitoring in production.
  • Serialization: We use JSON here for simplicity, but for high-throughput APIs, consider msgpack for faster serialization.
  • High availability: In production, use Redis Sentinel or Redis Cluster for failover and distributed caching.
  • Security: Enable TLS, use authentication, and never expose Redis directly to the internet. Always use environment variables for sensitive configuration.
  • Error handling: Always handle Redis errors gracefully—your app should continue working even if Redis is down.
  • Type safety: Leverage TypeScript's type system to catch cache-related bugs at compile time.
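The graceful-degradation point above can be made concrete with a small guard around cache reads. A sketch (the `safeCacheGet` helper and `CacheReader` type are invented here; the type is structural, so the ioredis client satisfies it):

```typescript
// Structural type for the read side of the cache; ioredis's get() matches it.
type CacheReader = { get(key: string): Promise<string | null> };

// A cache read that never throws: any Redis failure is logged and treated as
// a miss, so the route falls back to the database instead of returning a 500.
async function safeCacheGet(cache: CacheReader, key: string): Promise<string | null> {
  try {
    return await cache.get(key);
  } catch (err) {
    console.error('Redis read failed, treating as cache miss:', err);
    return null;
  }
}
```

Swapping `await redis.get(key)` for `await safeCacheGet(redis, key)` in the route above keeps the endpoint serving (slowly, from the DB) through a Redis outage.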

Environment Variables Setup

Create a .env file for local development:

PORT=3000
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=

Install dotenv for environment variable management:

npm install dotenv

Load it at the top of src/server.ts:

import 'dotenv/config';
import express, { Request, Response } from 'express';
// ... rest of imports

Conclusion

Adding Redis caching to your TypeScript Node.js backend is straightforward with ioredis and can deliver massive performance improvements with minimal code changes. Start with the cache-aside pattern for read-heavy endpoints, benchmark the difference using tools like autocannon or artillery, and iterate from there.

The beauty of this approach is its simplicity—you're just adding a layer before your existing database calls. With TypeScript, you get the added benefit of type safety throughout your caching logic, catching potential bugs before they reach production.

Ready to see the difference? Clone the code above, run both cached and uncached versions, and benchmark them yourself. The performance gains speak for themselves.

What caching challenges have you faced in Node.js? Have you tried other patterns like write-through or read-through caching? Drop a comment below—I'd love to hear your experiences!

Happy caching!


Complete Project Structure

nodejs-redis-cache/
├── package.json
├── tsconfig.json
├── .env
├── .gitignore
└── src/
    ├── server.ts
    ├── redis.ts
    ├── mockdb.ts
    └── cacheWrapper.ts

Don't forget to add .gitignore:

node_modules/
dist/
.env

Run the complete example:

npm run dev  # Development with hot reload
npm run build && npm start  # Production build
