Rate limiting plays a critical role in preventing abuse by controlling how many requests a single user is allowed to make within a given time frame. To keep track of requests and their corresponding IP addresses, this information is typically stored in memory for fast access and low response times.
Deploying your Next.js app on Vercel's serverless or edge environments allows for great speed and scalability. These infrastructures handle requests individually, spinning up lightweight instances as needed to deliver very fast responses. This architectural design does come with one major limitation, though: data stored in the memory of a single instance isn't accessible to other instances, since each instance runs in a stateless way.
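To make the limitation concrete, here is a minimal sketch of the naive in-memory approach (a fixed window tracked in a `Map`). It works fine on a single long-lived server, but on serverless the `Map` lives inside one instance's memory, so another instance serving the same IP starts from an empty map:

```typescript
// Naive in-memory fixed-window limiter (illustration only).
// On serverless/edge, this Map is NOT shared across instances,
// which is exactly the limitation described above.
const hits = new Map<string, { count: number; windowStart: number }>();

function allowRequest(ip: string, limit = 1, windowMs = 60_000): boolean {
  const now = Date.now();
  const entry = hits.get(ip);
  // New IP, or the previous window has expired: start a fresh window.
  if (!entry || now - entry.windowStart >= windowMs) {
    hits.set(ip, { count: 1, windowStart: now });
    return true;
  }
  // Still inside the current window: allow only up to `limit` requests.
  if (entry.count < limit) {
    entry.count += 1;
    return true;
  }
  return false;
}
```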
To solve this, we reach out in most cases to external solutions like Redis, an in-memory high-performance database. One of the services that complement Vercel perfectly is Upstash Redis, enabling shared, low-latency data access across instances.
With that foundation set, let's look at the implementation!
Method 1: Upstash Redis 🙌
In your Vercel project page, go to the Storage tab.
Click the Create Database button, then select Upstash for Redis:
After creating the database, you'll see a list of secrets.
Keep in mind that these secret variables are already bound to your Vercel project, so you don't need to redefine them; for local development, however, you will need to copy them into your .env.local file.
Don't forget to add this line to your .gitignore:
.env*.local
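For reference, the copied file typically ends up looking like this (the values below are placeholders; `Redis.fromEnv()` reads exactly these two variable names):

```
# .env.local
UPSTASH_REDIS_REST_URL="https://<your-db-endpoint>.upstash.io"
UPSTASH_REDIS_REST_TOKEN="<your-rest-token>"
```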
Install the following modules:
$ npm i @upstash/redis @upstash/ratelimit
Create a Ratelimit instance and use it:
// route.ts
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";
const rateLimiter = new Ratelimit({
limiter: Ratelimit.fixedWindow(1, "60 s"),
redis: Redis.fromEnv(),
analytics: true,
prefix: "@upstash/ratelimit",
});
const getIP = (req: Request) => {
  // Prefer Cloudflare's header, then fall back to x-forwarded-for,
  // which may hold a comma-separated list; take the first (client) entry.
  const ip =
    req.headers.get("cf-connecting-ip") ??
    req.headers.get("x-forwarded-for") ??
    "";
  return ip.split(",")[0].trim() || undefined;
};
export async function POST(req: Request) {
  // ...
  const ip = getIP(req);
  // Here we check if the limit is hit! 👇
  const { success } = await rateLimiter.limit(ip ?? "unknown");
  if (!success) {
    return Response.json({ error: "Too many requests" }, { status: 429 });
  }
  // ...
}
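Beyond `success`, `rateLimiter.limit()` also resolves with `limit`, `remaining`, and `reset` (the window's reset time as a unix timestamp in milliseconds). A small helper like the hypothetical one below can turn those fields into standard response headers so clients know when to retry:

```typescript
// Hypothetical helper (not part of @upstash/ratelimit): build
// rate-limit response headers from the fields limit() returns.
interface LimitResult {
  success: boolean;
  limit: number;     // max requests per window
  remaining: number; // requests left in the current window
  reset: number;     // window reset time, unix epoch in ms
}

function ratelimitHeaders(res: LimitResult, now = Date.now()): Record<string, string> {
  // Seconds until the window resets, rounded up, never negative.
  const retryAfterSec = Math.max(0, Math.ceil((res.reset - now) / 1000));
  return {
    "X-RateLimit-Limit": String(res.limit),
    "X-RateLimit-Remaining": String(res.remaining),
    "Retry-After": String(retryAfterSec),
  };
}
```

These headers can then be spread into the 429 response, e.g. `Response.json({ error: "Too many requests" }, { status: 429, headers: ratelimitHeaders(result) })`.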
And done!
Method 2: Tile38 self-hosted
This method does not rely on third-party services and their limitations, but it requires you to have your own server.
Install Docker on your server if it isn't installed already.
Run the Tile38 Docker container:
$ docker run --name tile38 -d -p 9851:9851 tile38/tile38
Configure ufw-docker to open the related port:
$ sudo ufw-docker allow tile38
Or you can use a reverse proxy on nginx if you want to keep it behind a domain.
Now that everything is set, it's time for usage.
Since @upstash/ratelimit expects a Redis-compatible client, we can bypass that limitation by writing our own adapter, as below:
// route.ts
import { Tile38 } from "@iwpnd/tile38-ts";
import { Ratelimit } from "@upstash/ratelimit";
// Tile38 HTTP API URL
const tile38 = new Tile38("http://your_server_ip:9851");
// Custom adapter for Tile38
const customTile38Adapter = {
// Custom function to simulate Lua script execution via HTTP API
evalsha: async (script: any, keys: any, args: any) => {
try {
const key = keys[0]; // Assuming we're using a single key for rate-limiting
const value = args[0]; // The value used for rate-limiting
const { ok } = await tile38.jSet("ratelimit", key, "root", value);
return { success: ok };
} catch (error) {
return { success: false };
}
},
get: async () => null,
set: async () => null,
};
// Initialize the rate limiter with Tile38 standing in for Redis
const ratelimit = new Ratelimit({
  redis: customTile38Adapter as any, // Using the custom adapter
  limiter: Ratelimit.fixedWindow(1, "10 s"), // 1 request per 10 seconds
});
...
Congratulations! 🎉 You’ve unlocked the power of combining Vercel’s serverless capabilities with a shared rate limiter. This setup not only protects your app from abuse but also opens the door to building highly scalable, fast, and reliable applications. With this foundation, your Next.js app is ready to handle whatever comes next, smoothly and efficiently. 🚀