Adding Rate Limiting to Your .NET Web API (with Settings from appsettings.json)
APIs are awesome, but they can also be fragile. If a client starts firing off requests too quickly, your API could slow down—or worse, go down entirely. That’s where rate limiting comes in: it helps you control traffic and keep things stable.
For this walkthrough, we’ll use the popular AspNetCoreRateLimit library, which makes it easy to set up IP-based rate limiting in a .NET Web API. The best part? We’ll configure everything in appsettings.json, so we can change rules without touching the code.
Step 1: Add NuGet Packages
First, install the package:
dotnet add package AspNetCoreRateLimit
Step 2: Configure appsettings.json
Here’s an example configuration with different rules based on HTTP verbs:
"IpRateLimiting": {
"EnableEndpointRateLimiting": true,
"StackBlockedRequests": false,
"RealIpHeader": "X-FORWARDED-FOR", // Use this header when your app is behind a proxy/load balancer to capture the actual client IP
"ClientIdHeader": "X-ClientId",
"HttpStatusCode": 429,
"GeneralRules": [
{
"Endpoint": "GET:/*",
"Period": "1s",
"Limit": 10
},
{
"Endpoint": "GET:/*",
"Period": "1m",
"Limit": 100
},
{
"Endpoint": "POST:/*",
"Period": "1s",
"Limit": 5
},
{
"Endpoint": "POST:/*",
"Period": "1m",
"Limit": 25
},
{
"Endpoint": "PUT:/*",
"Period": "1s",
"Limit": 5
},
{
"Endpoint": "PUT:/*",
"Period": "1m",
"Limit": 30
},
{
"Endpoint": "DELETE:/*",
"Period": "1m",
"Limit": 5
}
]
},
"IpRateLimitPolicies": {
"IpRules": []
}
This configuration enforces multiple layers of throttling (and because EnableEndpointRateLimiting is true, each rule counts requests per endpoint pattern rather than against one global counter):
- GET → 10 requests per second, 100 per minute
- POST → 5 per second, 25 per minute
- PUT → 5 per second, 30 per minute
- DELETE → 5 per minute
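Since the whole point is to keep things in configuration, it’s worth knowing that the blocked response itself can also be shaped from appsettings.json via the library’s QuotaExceededResponse option. Here’s a minimal sketch that would sit inside the IpRateLimiting section; the placeholder semantics ({0} limit, {1} period, {2} retry-after seconds) are as I understand them from the library’s docs, so double-check against the version you install:

"QuotaExceededResponse": {
  "Content": "{{ \"message\": \"Too many requests. Limit is {0} per {1}. Retry in {2} seconds.\" }}",
  "ContentType": "application/json",
  "StatusCode": 429
}

The doubled braces are there because the library runs the content through string.Format before writing it to the response.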
Step 3: Register Services in Program.cs
Now let’s wire things up:
using AspNetCoreRateLimit;
var builder = WebApplication.CreateBuilder(args);
// Rate limit counters are kept in memory, so we need the memory cache
builder.Services.AddOptions();
builder.Services.AddMemoryCache();

// Bind the rate limiting sections from appsettings.json
builder.Services.Configure<IpRateLimitOptions>(
    builder.Configuration.GetSection("IpRateLimiting"));
builder.Services.Configure<IpRateLimitPolicies>(
    builder.Configuration.GetSection("IpRateLimitPolicies"));

// Register the in-memory rate limiting stores and the configuration resolver
builder.Services.AddInMemoryRateLimiting();
builder.Services.AddSingleton<IRateLimitConfiguration, RateLimitConfiguration>();
builder.Services.AddControllers();
var app = builder.Build();
// Enable IP rate limiting
app.UseIpRateLimiting();
app.MapControllers();
app.Run();
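One note for later: the IpRateLimitPolicies section we added is empty for now, but if you ever fill IpRules with per-IP overrides, the library expects the policy store to be seeded at startup so those rules are loaded from configuration. A minimal sketch of that extra step, placed right after builder.Build() (IIpPolicyStore and its SeedAsync method come from AspNetCoreRateLimit; you can skip this while IpRules stays empty):

var app = builder.Build();

// Load IP-specific policies from "IpRateLimitPolicies" into the policy store
using (var scope = app.Services.CreateScope())
{
    var ipPolicyStore = scope.ServiceProvider.GetRequiredService<IIpPolicyStore>();
    await ipPolicyStore.SeedAsync();
}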
Step 4: Add a Test Controller
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
public class HelloController : ControllerBase
{
    [HttpGet]
    public IActionResult Get()
    {
        return Ok(new { Message = "Hello, world!" });
    }

    [HttpPost]
    public IActionResult Post()
    {
        return Ok(new { Message = "You posted something!" });
    }
}
Now, if you hit GET /api/hello more than 10 times per second or 100 times per minute, you’ll start getting HTTP 429 Too Many Requests. The same applies to POST, PUT, and DELETE, each with its own limits.
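If you want to watch the limiter kick in without reaching for a load-testing tool, a quick client loop is enough. This little console sketch is just for illustration; it assumes the API is listening on https://localhost:5001, so swap in whatever URL your launch profile actually uses:

using System.Net;

// Fire 15 GETs in quick succession; with the rules above,
// the first 10 should return 200 and the rest 429.
var client = new HttpClient { BaseAddress = new Uri("https://localhost:5001") };

for (var i = 1; i <= 15; i++)
{
    var response = await client.GetAsync("/api/hello");
    Console.WriteLine($"Request {i}: {(int)response.StatusCode} {response.StatusCode}");

    // Blocked responses typically carry a Retry-After header
    if (response.StatusCode == HttpStatusCode.TooManyRequests && response.Headers.RetryAfter is not null)
    {
        Console.WriteLine($"  Retry-After: {response.Headers.RetryAfter}");
    }
}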
Wrapping Up
With just a few lines of setup, you’ve added IP-based rate limiting to your .NET Web API. Everything is controlled by configuration in appsettings.json, so tweaking rules is as simple as editing JSON.
This approach is great for protecting your API from abuse, buggy clients, or even DDoS-style traffic spikes. And since it’s based on IP addresses, you can easily extend it with policies to give specific clients different limits.
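For example, here’s roughly what a per-IP override could look like in the IpRateLimitPolicies section we left empty earlier (the address is just a documentation placeholder, and if you go this route, remember the policy-store seeding note from Step 3 so the rules actually get loaded):

"IpRateLimitPolicies": {
  "IpRules": [
    {
      "Ip": "203.0.113.42",
      "Rules": [
        {
          "Endpoint": "*",
          "Period": "1s",
          "Limit": 50
        }
      ]
    }
  ]
}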
Protect your API early—your infrastructure (and your users) will thank you. 🚀