Juan Castillo

Implementing Rate Limiting in NestJS with Redis for Scalable Applications πŸš€

Introduction

Rate limiting is essential to prevent abuse and ensure fair usage of your API. When running multiple instances of your NestJS application, using Redis as a centralized store allows rate limits to be shared across all instances. In this guide, we'll implement rate limiting in NestJS using Redis.

Why Use Redis for Rate Limiting?

When deploying your NestJS application in a distributed environment (multiple instances behind a load balancer), storing rate limit counters in memory won't work consistently. Instead, using Redis provides:

  • Shared state across all instances 🏒
  • High performance for quick lookups ⚑
  • Persistence even after app restarts πŸ”„
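
To see what centralizing in Redis buys, here is a minimal in-memory fixed-window counter (a hypothetical sketch, not production code): it enforces the limit correctly within one process, but each instance keeps its own counts, so N instances behind a load balancer effectively allow N times the intended limit.

```typescript
// Hypothetical in-memory fixed-window limiter -- per-process only.
// With N app instances behind a load balancer, a client can make
// points * N requests per window, which is why a shared Redis store is needed.
class InMemoryRateLimiter {
  private counters = new Map<string, { count: number; windowStart: number }>();

  constructor(private points: number, private durationMs: number) {}

  consume(key: string, now: number = Date.now()): boolean {
    const entry = this.counters.get(key);
    if (!entry || now - entry.windowStart >= this.durationMs) {
      // Start a fresh window for this key.
      this.counters.set(key, { count: 1, windowStart: now });
      return true;
    }
    if (entry.count < this.points) {
      entry.count++;
      return true;
    }
    return false; // limit exceeded in the current window
  }
}

// Usage: 10 requests per 60 seconds, the same shape we'll configure in Redis.
const limiter = new InMemoryRateLimiter(10, 60_000);
```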

Setting Up Redis and Dependencies

First, install the required dependencies (the code below uses ioredis as the Redis client, so the separate redis package isn't needed):

npm install ioredis rate-limiter-flexible

If you haven't installed Redis yet, you can run it using Docker:

docker run -d --name redis -p 6379:6379 redis

Implementing Rate Limiting in NestJS

1. Create a RateLimitMiddleware

Create a new middleware file:

mkdir src/middleware && touch src/middleware/rate-limit.middleware.ts

Now, implement rate limiting logic using rate-limiter-flexible:

import { Injectable, NestMiddleware, HttpException, HttpStatus } from '@nestjs/common';
import { Request, Response, NextFunction } from 'express';
import { RateLimiterRedis } from 'rate-limiter-flexible';
import Redis from 'ioredis';

const redisClient = new Redis({
  host: 'localhost', // Change if running Redis in a different host
  port: 6379,
  enableOfflineQueue: false,
});

const rateLimiter = new RateLimiterRedis({
  storeClient: redisClient,
  keyPrefix: 'middleware',
  points: 10, // 10 requests
  duration: 60, // per 60 seconds
});

@Injectable()
export class RateLimitMiddleware implements NestMiddleware {
  async use(req: Request, res: Response, next: NextFunction) {
    try {
      await rateLimiter.consume(req.ip ?? 'unknown'); // Identify clients by IP (req.ip can be undefined behind some proxies)
      next();
    } catch (rejRes) {
      throw new HttpException('Too Many Requests', HttpStatus.TOO_MANY_REQUESTS);
    }
  }
}
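
In the middleware's catch block, the rejection value (rejRes) is a RateLimiterRes exposing msBeforeNext and remainingPoints. If you want clients to know when they can retry, a small helper (hypothetical, not part of the library) can map those fields to response headers; the X-RateLimit-* names are a common convention, not a standard:

```typescript
// Hypothetical helper: map rate-limiter-flexible's rejection value
// (fields msBeforeNext and remainingPoints come from RateLimiterRes)
// to conventional rate-limit response headers.
interface LimiterResult {
  msBeforeNext: number;
  remainingPoints: number;
}

function rateLimitHeaders(rej: LimiterResult, limit: number): Record<string, string> {
  return {
    // Retry-After is expressed in whole seconds, rounded up.
    'Retry-After': String(Math.ceil(rej.msBeforeNext / 1000)),
    'X-RateLimit-Limit': String(limit),
    'X-RateLimit-Remaining': String(Math.max(0, rej.remainingPoints)),
  };
}

// In the catch block, before throwing the HttpException, you could do:
// res.set(rateLimitHeaders(rejRes, 10));
```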

2. Apply Middleware Globally

Modify AppModule to register the middleware globally:

import { Module, MiddlewareConsumer, NestModule } from '@nestjs/common';
import { RateLimitMiddleware } from './middleware/rate-limit.middleware';

@Module({})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    consumer.apply(RateLimitMiddleware).forRoutes('*');
  }
}
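
forRoutes('*') applies the limiter to every route. If you'd rather scope it, MiddlewareConsumer also supports exclude; here's a hedged variant of the wiring above (the 'health' and 'api' route names are hypothetical, and wildcard syntax differs across NestJS versions):

```typescript
// Hedged variant of the AppModule registration above.
// Route names 'health' and 'api' are hypothetical examples.
import { Module, MiddlewareConsumer, NestModule, RequestMethod } from '@nestjs/common';
import { RateLimitMiddleware } from './middleware/rate-limit.middleware';

@Module({})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    consumer
      .apply(RateLimitMiddleware)
      .exclude({ path: 'health', method: RequestMethod.GET }) // don't limit health checks
      .forRoutes('api'); // limit only routes under /api
  }
}
```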

3. Running and Testing πŸ› οΈ

Start Redis and your NestJS app:

docker start redis  # If using Docker
npm run start

Then, send multiple requests to test rate limiting:

curl -X GET http://localhost:3000

Once you exceed 10 requests within a 60-second window, subsequent requests should return:

{"statusCode":429,"message":"Too Many Requests"}
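
On the client side, a 429 usually means back off and retry. A hedged sketch (not from the article) of choosing a wait time, preferring the server's Retry-After header when it's present:

```typescript
// Hypothetical client-side helper: pick a delay after receiving a 429.
// Uses the server's Retry-After header (whole seconds) when available,
// otherwise falls back to capped exponential backoff.
function backoffMs(
  attempt: number,
  retryAfterHeader?: string,
  baseMs = 500,
  capMs = 30_000,
): number {
  const retryAfter = Number(retryAfterHeader);
  if (retryAfterHeader !== undefined && Number.isFinite(retryAfter) && retryAfter >= 0) {
    return retryAfter * 1000; // honor the server's hint
  }
  return Math.min(capMs, baseMs * 2 ** attempt); // exponential, capped
}
```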

Conclusion

With Redis and rate-limiter-flexible, we implemented a scalable rate-limiting solution for a distributed NestJS application. This setup ensures consistent request limits across multiple instances, preventing abuse while maintaining high performance. πŸš€πŸ”₯

Happy coding! πŸŽ‰
