How to Add Rate Limiting to Your Next.js App Router
Rate limiting is an essential technique to protect your application from abuse by controlling the number of requests a user can make in a given time frame. In this tutorial, we'll walk you through adding rate limiting to your Next.js application using the App Router and middleware. We'll cover both TypeScript and JavaScript implementations.
Why Rate Limiting?
Rate limiting helps to:
- Prevent denial-of-service (DoS) attacks
- Control API usage and prevent abuse
- Ensure fair usage among all users
- Protect server resources
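Before reaching for a library, it helps to see how simple the core idea is. Below is a minimal fixed-window limiter sketch in plain TypeScript — the class and names are illustrative only, not part of any library used later in this post:

```typescript
// Illustrative fixed-window rate limiter: allow `limit` requests per
// `windowMs` milliseconds, tracked per key (e.g. per client IP).
class FixedWindowLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the request is allowed, false if the limit is exceeded.
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // First request, or a new window has started: reset the count.
      this.counts.set(key, { windowStart: now, count: 1 });
      return true;
    }
    if (entry.count >= this.limit) return false;
    entry.count += 1;
    return true;
  }
}

const limiter = new FixedWindowLimiter(3, 1000); // 3 requests per second
const results = [1, 2, 3, 4].map(() => limiter.allow('203.0.113.7', 1000));
console.log(results); // first three allowed, fourth rejected
```

Real libraries refine this idea (sliding windows, token buckets, shared stores), but the accounting above is the essence of what follows.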
Setting Up Middleware for Rate Limiting
We'll use the `rate-limiter-flexible` package to implement rate limiting. It supports various backends, including in-memory and Redis.
Step 1: Install the Required Package
First, install the `rate-limiter-flexible` package:

```bash
npm install rate-limiter-flexible
```
Step 2: Create Middleware
Next, create a middleware file in your project root. We'll provide both TypeScript and JavaScript examples.
TypeScript Implementation
Create a `middleware.ts` file in your project root:
```typescript
// middleware.ts
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';
import { RateLimiterMemory } from 'rate-limiter-flexible';

// Allow 10 requests per second per client
const rateLimiter = new RateLimiterMemory({
  points: 10, // Number of requests allowed
  duration: 1, // Per second
});

export async function middleware(request: NextRequest) {
  // request.ip is only populated on some platforms (and was removed in
  // Next.js 15), so fall back to the x-forwarded-for header.
  const ip =
    request.headers.get('x-forwarded-for')?.split(',')[0].trim() ?? '127.0.0.1';
  try {
    // Consume one point per request
    await rateLimiter.consume(ip);
    // If successful, proceed with the request
    return NextResponse.next();
  } catch {
    // Rate limit exceeded: reject with 429
    return new NextResponse('Too many requests', { status: 429 });
  }
}

// Apply the middleware to API routes only
export const config = {
  matcher: '/api/:path*',
};
```
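A note on identifying the client: `request.ip` is only populated on certain hosting platforms and was removed from `NextRequest` in Next.js 15, so deriving the IP from proxy headers is more portable. Here is a hypothetical helper for that lookup — the header names are common proxy conventions, not guarantees, so check what your own proxy forwards:

```typescript
// Hypothetical helper: derive the client IP from proxy headers.
function getClientIp(headers: Headers): string {
  const forwarded = headers.get('x-forwarded-for');
  if (forwarded) {
    // x-forwarded-for may hold a comma-separated chain of proxies;
    // the first entry is conventionally the original client.
    return forwarded.split(',')[0].trim();
  }
  // Some reverse proxies set x-real-ip instead.
  return headers.get('x-real-ip') ?? '127.0.0.1';
}

// Usage inside the middleware above:
// await rateLimiter.consume(getClientIp(request.headers));
```

Be aware that these headers can be spoofed unless your proxy strips and re-sets them, as one of the comments below also points out.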
JavaScript Implementation
Create a `middleware.js` file:
```javascript
// middleware.js
// Note: Next.js middleware must use ES module syntax; CommonJS
// require/module.exports will not be picked up.
import { NextResponse } from 'next/server';
import { RateLimiterMemory } from 'rate-limiter-flexible';

// Allow 10 requests per second per client
const rateLimiter = new RateLimiterMemory({
  points: 10, // Number of requests allowed
  duration: 1, // Per second
});

export async function middleware(request) {
  // request.ip is not available everywhere; fall back to the
  // x-forwarded-for header.
  const ip =
    request.headers.get('x-forwarded-for')?.split(',')[0].trim() ?? '127.0.0.1';
  try {
    // Consume one point per request
    await rateLimiter.consume(ip);
    // If successful, proceed with the request
    return NextResponse.next();
  } catch {
    // Rate limit exceeded: reject with 429
    return new NextResponse('Too many requests', { status: 429 });
  }
}

// Apply the middleware to API routes only
export const config = {
  matcher: '/api/:path*',
};
```
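The `matcher` also accepts an array if you want to rate-limit several path groups at once. For example (the paths here are illustrative):

```typescript
// middleware config covering several path groups
export const config = {
  matcher: ['/api/:path*', '/auth/:path*'],
};
```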
Step 3: Testing Your Middleware
Start your Next.js development server and make repeated requests to any API route under `/api/` to verify that rate limiting is enforced. Once you exceed the allowed number of requests within one second, additional requests should receive a `429 Too Many Requests` response.
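As a refinement, the value that `consume()` rejects with exposes `msBeforeNext`, the number of milliseconds until a point becomes available again. You can convert that into a `Retry-After` header so well-behaved clients know when to retry; the conversion is a pure function (the response wiring in the trailing comment is illustrative):

```typescript
// Convert the limiter's msBeforeNext into whole seconds for Retry-After,
// with a minimum of 1 second so clients never retry immediately.
function retryAfterSeconds(msBeforeNext: number): number {
  return Math.max(1, Math.ceil(msBeforeNext / 1000));
}

// Illustrative use in the middleware's catch block:
// return new NextResponse('Too many requests', {
//   status: 429,
//   headers: { 'Retry-After': String(retryAfterSeconds(rej.msBeforeNext)) },
// });
```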
Conclusion
By following these steps, you can implement rate limiting in your Next.js application using the App Router and middleware, controlling the rate of requests to your API routes and protecting your server from being overwhelmed. Keep in mind that `RateLimiterMemory` tracks counts per server instance, so in serverless or multi-instance deployments the limits are not shared; for a scalable solution, use a distributed store like Redis with `RateLimiterRedis`.
Happy coding!
Top comments (4)
Thanks for sharing. A few notes: most websites sit behind Web Application Firewalls (WAFs), so the preferred method is to implement rate limiting at the network level. If you decide to implement it at the code level, make sure you have the real user's IP address, as it is often not forwarded directly.
Should I use this on the frontend side? This example seems to be for the backend API.
No, don't use it there; this article is just for fun purposes.
useful!