Rate limiting is a fundamental mechanism for controlling the number of requests a client can make to a server in a given time frame.
In a world where more than 30% of web traffic comes from malicious bots, this proactive strategy is critical to protect servers from abuse.
In this tutorial, you'll delve into the concept of rate limiting and understand why you need it in your Node.js backend.
Then, you'll learn how to implement it in Express to block or slow down incoming excess requests.
It's time to become a Node.js rate limiting expert!
What Is Rate Limiting?
Rate limiting is a strategy for limiting network traffic by placing a cap on how often an actor can call the same API in a given time frame.
That's essential for controlling the volume of requests a client can make to a server in a certain amount of time.
To better understand how this mechanism works, consider a scenario where you have a Node.js backend that exposes some public endpoints.
Suppose a malicious user targets your server and writes a simple script to overload it with automated requests.
The performance of your server will degrade as a result, hindering the experience of all other users.
In an extreme situation, your server may even go offline. By limiting the number of requests the same IP can make in a given time frame, you can avoid all that!
Specifically, there are two approaches to rate limiting:
- Blocking incoming requests: When a client exceeds the defined limits, deny its additional requests.
- Slowing down requests: Introduce a delay for requests beyond the limits, making the caller wait longer and longer for a response.
In this guide, you'll see how to implement both approaches!
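Before reaching for a library, it can help to see the core idea in code. Below is a minimal, illustrative sketch of the blocking approach as a hand-rolled Express middleware; the window size, request cap, and in-memory Map are arbitrary choices for demonstration, not a production-ready implementation:
const WINDOW_MS = 60 * 1000; // 1-minute window (arbitrary for this sketch)
const MAX_REQUESTS = 10; // cap per window (arbitrary for this sketch)
const hits = new Map(); // IP -> { count, windowStart }; lost on restart, not shared across processes

function naiveRateLimiter(req, res, next) {
  const now = Date.now();
  const entry = hits.get(req.ip);
  // start a fresh window for this IP if none exists or the old one expired
  if (!entry || now - entry.windowStart > WINDOW_MS) {
    hits.set(req.ip, { count: 1, windowStart: now });
    return next();
  }
  // block once the cap for the current window is reached
  if (entry.count >= MAX_REQUESTS) {
    return res.status(429).send("Too many requests, please try again later.");
  }
  entry.count += 1;
  next();
}

// app.use(naiveRateLimiter);
The libraries covered in this guide implement the same idea, plus the slowing-down variant, with more options and better defaults.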
Why You Need API Rate Limiting in Node.js
Implementing API rate limiting is crucial for maintaining stability, security, and fair usage of your Node.js application.
In particular, controlling the rate at which requests are processed helps enforce usage limits, prevents server overloads, and safeguards against malicious attacks.
Here's a closer look at scenarios where limiting requests in Node.js is especially useful:
- Anti-bot and anti-scraping measures: Limiting the frequency of requests from a single source helps mitigate bot activity. Automated scripts might overload your public endpoints with a high volume of traffic, but rate limiting the incoming requests is an effective defense against that malicious behavior.
- DoS attack prevention: Rate limiting is a key strategy to protect your backend against DoS (Denial of Service) attacks. By restricting the rate at which requests are accepted, you can reduce the impact of large-scale, coordinated attacks that attempt to overwhelm your server infrastructure.
- Implementing limitations for paid plans: For applications offering different subscription tiers, rate limiting enables you to enforce usage limits based on a user's plan. This ensures that users can access resources only as frequently as their plan allows (a sketch of this follows the list).
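As a taste of that last scenario, here is a hedged sketch of plan-based limits built with express-rate-limit (covered in detail below). The req.user object, its plan property, and the quota numbers are assumptions standing in for whatever authentication and pricing logic your application already has:
const { rateLimit } = require("express-rate-limit");

// hypothetical quotas: requests allowed per 15-minute window for each plan
const PLAN_LIMITS = { free: 100, pro: 1000 };

const planLimiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  // `limit` also accepts a function, so the cap can depend on the caller's plan
  limit: (req) => PLAN_LIMITS[req.user?.plan] ?? PLAN_LIMITS.free,
  standardHeaders: true,
  legacyHeaders: false,
});

// register it after your authentication middleware so `req.user` is populated
// app.use(planLimiter);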
Prerequisites
To follow this tutorial, you need a Node.js 18+ application. For example, the following basic Express server will be enough:
const express = require("express");
const port = 3000;

// initialize an Express server
const app = express();

// define a sample endpoint
app.get("/hello-world", (req, res) => {
  res.send("Hello, World!");
});

// start the server
app.listen(port, () => {
  console.log(`Server listening at http://localhost:${port}`);
});
This exposes a single /hello-world endpoint that returns “Hello, World!”
For a faster setup, clone the GitHub repository that supports this guide:
git clone https://github.com/Tonel/nodejs-rate-limiting-demo
You'll find the Express server above and further implementations in dedicated branches.
In the following two sections, you'll learn how to integrate rate limiting behavior into a Node.js application using the following libraries:
- express-rate-limit: To block requests that exceed specified limits.
- express-slow-down: To slow down repeated requests coming from the same actor.
Let's dive in!
Blocking Requests With express-rate-limit in Node.js
Follow the steps below to block excess requests in your Node.js backend using express-rate-limit.
Getting Started
express-rate-limit is an npm library that provides a rate limiting middleware for Express, making it easier to limit repeated requests to all APIs or only to specific endpoints.
The middleware allows you to control how many requests the same user can make to the same endpoints before the application starts returning 429 Too Many Requests errors.
Add the express-rate-limit npm package to your project's dependencies with:
npm install express-rate-limit
Implement the Rate Limiting Blocking Logic in Node.js
First, import the rateLimit function exposed by express-rate-limit:
const { rateLimit } = require("express-rate-limit");
// for ESM users -> import { rateLimit } from "express-rate-limit";
Use it to define your rate limiting middleware as below:
const limiter = rateLimit({
  windowMs: 5 * 60 * 1000, // 5 minutes
  limit: 10, // each IP can make up to 10 requests per `windowMs` (5 minutes)
  standardHeaders: true, // add the `RateLimit-*` headers to the response
  legacyHeaders: false, // remove the `X-RateLimit-*` headers from the response
});
Note that rateLimit accepts an options object and returns the rate limiting middleware. The options used in the example above are:
- windowMs: The time frame in which requests are checked for rate limiting. The default value is 60000 (1 minute).
- limit: The maximum number of connections to allow during the windowMs time span. By default, it's 5.
- standardHeaders: To enable support for the RateLimit headers recommended by the IETF. The default value is false.
- legacyHeaders: To send the legacy X-RateLimit-* rate limit headers on responses. The value is true by default.
Other useful parameters are:
- message: The response body to return when a request is rate limited. The default message is “Too many requests, please try again later.”
- statusCode: The HTTP status code to set in the rate limiting error responses. The default value is 429.
- skipFailedRequests: To avoid counting failed requests in limit. Set to false by default.
- skipSuccessfulRequests: To avoid counting successful requests in limit. The value is false by default.
- keyGenerator: The function that contains the logic used to uniquely identify users. By default, it checks the IP address of incoming requests (a custom example follows this list).
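For instance, here is an illustrative sketch of a custom keyGenerator that identifies callers by an API key instead of by IP; the x-api-key header name and the custom message are assumptions for demonstration purposes:
const { rateLimit } = require("express-rate-limit");

const apiKeyLimiter = rateLimit({
  windowMs: 5 * 60 * 1000, // 5 minutes
  limit: 10,
  standardHeaders: true,
  legacyHeaders: false,
  message: "API quota exceeded, please slow down.",
  // group requests by the (hypothetical) API key header, falling back to the IP
  keyGenerator: (req) => req.get("x-api-key") || req.ip,
});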
Check out the express-rate-limit documentation for a complete list of all available options.
You now know what express-rate-limit has to offer. What's left is to add its rate limiting capabilities to your application. Register the limiter middleware to all endpoints with:
app.use(limiter);
If you instead want to rate limit the APIs under a specific path, use the limiter middleware as below:
app.use("/public", limiter);
To protect only a certain endpoint, pass limiter as a parameter in the endpoint definition:
app.get("/hello-world", limiter, (req, res) => {
  // ...
});
Great! You just implemented rate limiting in Node.js!
Testing Rate Limiting
Launch your Node.js application and verify that the rate limiting logic works as expected.
Otherwise, check out the express-rate-limit branch from the repository supporting this article:
git checkout express-rate-limit
Enter the project folder, install the dependencies, and start the Express development server:
cd nodejs-rate-limiting-demo
npm install
npm run start
Your rate limiting Node.js application should be listening locally on port 3000. Open your favorite HTTP client and call the /hello-world endpoint.
The first ten times, the API will return the message “Hello, World!”
Inspect the response headers: note the RateLimit-* headers added to the responses by express-rate-limit.
These are useful for the caller to understand how many more requests it can make before getting blocked.
On the eleventh API call within a 5-minute time frame, the request will fail with a 429 HTTP error: “Too many requests, please try again later.”
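If you'd rather script this check than click through an HTTP client, a small Node.js 18+ snippet like the one below (an illustrative sketch, not part of the demo repository) calls the endpoint twelve times and prints each status code together with the RateLimit-Remaining header:
// test-rate-limit.mjs -- assumes the server from this guide is running on port 3000
for (let i = 1; i <= 12; i++) {
  const res = await fetch("http://localhost:3000/hello-world");
  console.log(
    `request ${i}: status ${res.status},`,
    `remaining: ${res.headers.get("ratelimit-remaining")}`
  );
}
With the configuration above, the first ten calls should report status 200 and the last two should report 429.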
Well done! The Express rate limiting functionality has been implemented correctly!
Slowing Down Requests With express-slow-down
This step-by-step section will teach you how to use express-slow-down to slow down requests and avoid overloading your Node.js server.
Getting Started
express-slow-down provides a rate limiting middleware built on top of express-rate-limit.
Instead of blocking requests outright, express-slow-down slows down the processing of incoming requests that exceed the defined limits.
Install it with:
npm install express-slow-down
Implement the Node.js Rate Limiting Slowing Down Logic
Import the slowDown function from express-slow-down in your Express application:
const { slowDown } = require("express-slow-down");
// for ESM users -> import { slowDown } from "express-slow-down";
Use it to set up your slowing down rate limiting middleware:
const limiter = slowDown({
  windowMs: 5 * 60 * 1000, // 5 minutes
  delayAfter: 10, // allow 10 requests per `windowMs` (5 minutes) without slowing them down
  delayMs: (hits) => hits * 200, // add 200 ms of delay to every request after the 10th
  maxDelayMs: 5000, // max global delay of 5 seconds
});
The slowDown function provided by express-slow-down works just like rateLimit from express-rate-limit. The main difference is that it supports the following additional options:
- delayAfter: The maximum number of incoming requests allowed during windowMs before the middleware starts delaying their processing. The default value is 1.
- delayMs: The delay to apply to requests over the limit. It can be the delay itself in milliseconds or a function that defines custom behavior. By default, it increases the delay by 1 second for every request over the limit.
- maxDelayMs: The absolute maximum value that delayMs can reach. It defaults to Infinity.
Again, you can apply the limiter middleware to all endpoints or only to some APIs. To add it to all requests, use:
app.use(limiter);
Fantastic! You just implemented a rate limiter that slows down excess incoming requests!
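The two libraries can also be combined: requests get progressively slower first, and only a much higher hard cap blocks them outright. Here is a hedged sketch of that pattern; the specific thresholds are arbitrary choices for illustration:
const { rateLimit } = require("express-rate-limit");
const { slowDown } = require("express-slow-down");

// gently slow callers down after 10 requests per window...
const speedLimiter = slowDown({
  windowMs: 5 * 60 * 1000,
  delayAfter: 10,
  delayMs: (hits) => hits * 200,
  maxDelayMs: 5000,
});

// ...and block outright only after 50 requests per window (arbitrary cap)
const hardLimiter = rateLimit({
  windowMs: 5 * 60 * 1000,
  limit: 50,
  standardHeaders: true,
  legacyHeaders: false,
});

// the order matters: delay first, then enforce the hard cap
app.use(speedLimiter, hardLimiter);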
Testing express-slow-down
You now have to ensure that the Node.js server slows down incoming requests as desired.
Start your local server by following the steps above, or check out the express-slow-down branch of the GitHub repository that backs this tutorial:
git checkout express-slow-down
Access the project directory, install the dependencies, and launch the development server:
cd nodejs-rate-limiting-demo
npm install
npm run start
The local Express application will listen on port 3000. Open your HTTP client and call the /hello-world endpoint.
For the first ten calls, the server will return the response “Hello, World!” immediately.
Then, the responses will start to become slower and slower.
Take a look at the response time: it went from 33 ms to 2.23 seconds. This makes sense considering the slow-down logic (hits * 200 ms): hits is 11 on the eleventh request, so 11 * 200 ms + ~33 ms brings us to ~2.2 seconds.
Et voilà! Now you understand rate limiting!
Wrapping Up: Protect Your Node.js Server From Malicious Behavior With Rate Limiting
In this blog post, we took a look at what rate limiting is and what benefits it introduces when applied to a Node.js application using Express.
You now know:
- The definition of rate limiting
- Why you should limit the number of requests users can make to your application
- How to implement the two most popular approaches to rate limiting in Node.js:
express-rate-limit
andexpress-slow-down
Thanks for reading!
P.S. If you liked this post, subscribe to our JavaScript Sorcery list for a monthly deep dive into more magical JavaScript tips and tricks.
P.P.S. If you need an APM for your Node.js app, go and check out the AppSignal APM for Node.js.