You send a request to an API endpoint.
Milliseconds later, a response comes back.
Most of the time, we don’t think much about what happens in between. We write controllers, configure middleware, run the application, and everything works.
Until it doesn’t.
Maybe authentication suddenly stops working.
Maybe a middleware behaves differently than expected.
Maybe performance drops under load.
Or routing starts sending requests to the wrong endpoint.
When that happens, the question becomes unavoidable:
What actually happens inside ASP.NET Core when a request hits your API?
Understanding the request pipeline is what turns ASP.NET Core from a black box into something you can actually debug and optimize.
In this article, we'll walk through the lifecycle of a request in ASP.NET Core—from the moment it reaches your server to the moment the response is sent back.
The Big Picture: The ASP.NET Core Request Flow
If you trace a request from the network all the way to your controller or endpoint, it roughly goes through this path:
Client
↓
Reverse Proxy (optional)
↓
Kestrel Web Server
↓
ASP.NET Core Hosting Layer
↓
Middleware Pipeline
↓
Endpoint Routing
↓
Endpoint Execution (Controller / Minimal API)
↓
Middleware (Response Flow)
↓
Kestrel
↓
Client Response
Each stage gets a chance to process the request before it reaches your application logic.
Once you understand this flow, debugging strange behavior becomes much easier.
Step 1: The Request Reaches Kestrel
The first component inside your application that receives the request is Kestrel.
Kestrel is the default high-performance web server used by ASP.NET Core. Its job is to:
- Listen for incoming HTTP requests
- Parse HTTP messages
- Forward the request into the ASP.NET Core application pipeline
Kestrel is designed for high throughput and low latency. It uses asynchronous I/O and efficient networking primitives to handle thousands of concurrent connections.
In production environments, Kestrel usually sits behind a reverse proxy such as:
- Nginx
- Apache
- IIS
- Azure App Service infrastructure
The reverse proxy handles things like TLS termination, load balancing, and security filtering, while Kestrel still processes the request inside the application.
Once Kestrel receives the request, it passes it into the ASP.NET Core pipeline.
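In development you rarely need to touch Kestrel's defaults, but its listening endpoints can be configured explicitly in Program.cs. A minimal sketch (the port number here is just an illustrative choice):

```csharp
var builder = WebApplication.CreateBuilder(args);

// Tell Kestrel to listen on port 5000 on all network interfaces.
builder.WebHost.ConfigureKestrel(options =>
{
    options.ListenAnyIP(5000);
});

var app = builder.Build();

app.MapGet("/", () => "Hello from Kestrel");

app.Run();
```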
Step 2: The ASP.NET Core Hosting Layer
Before the request reaches middleware, ASP.NET Core’s hosting layer has already done some important work.
When the application starts, the hosting layer:
- Builds the dependency injection container
- Configures logging
- Loads configuration
- Constructs the middleware pipeline
This setup happens during application startup in Program.cs.
By the time a request arrives, the middleware pipeline has already been assembled and is ready to process incoming requests.
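These startup steps map directly onto a minimal Program.cs. A sketch (the extra JSON file name is hypothetical; `CreateBuilder` already wires up sensible defaults for logging and configuration):

```csharp
var builder = WebApplication.CreateBuilder(args);

// 1. Dependency injection container: register your services here.
builder.Services.AddControllers();

// 2. Logging: console logging is on by default; add providers as needed.
builder.Logging.AddConsole();

// 3. Configuration: appsettings.json and environment variables are loaded
//    by default; extra sources can be layered on top.
builder.Configuration.AddJsonFile("extra-settings.json", optional: true);

var app = builder.Build();

// 4. The middleware pipeline is assembled here, once, before any request.
app.MapControllers();

app.Run();
```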
Step 3: The Request Enters the Middleware Pipeline
Most of the interesting work in ASP.NET Core happens inside the middleware pipeline.
Middleware are small components that can:
- Inspect the request
- Modify the request
- Stop the request from continuing
- Pass the request to the next component
- Modify the response before it leaves
Middleware are configured in Program.cs.
Example:
```csharp
app.Use(async (context, next) =>
{
    var logger = context.RequestServices
        .GetRequiredService<ILoggerFactory>()
        .CreateLogger("RequestLogger");

    logger.LogInformation("Request started: {Path}", context.Request.Path);

    await next();

    logger.LogInformation("Response finished with status {StatusCode}",
        context.Response.StatusCode);
});
```
Here’s what happens during execution:
- The request enters the middleware
- Code before `await next()` runs
- The request moves to the next middleware
- Eventually an endpoint executes
- The response travels back through middleware
- Code after `await next()` runs
This creates a two-way pipeline:
Request → Middleware → Endpoint
Response ← Middleware ← Endpoint
One thing that surprises many developers when debugging middleware is that responses travel back through the pipeline in reverse order.
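The reverse order is easy to see with two trivial logging middleware registered back to back (a minimal sketch using Console.WriteLine for clarity):

```csharp
app.Use(async (context, next) =>
{
    Console.WriteLine("A: before next()");
    await next();
    Console.WriteLine("A: after next()"); // runs last
});

app.Use(async (context, next) =>
{
    Console.WriteLine("B: before next()");
    await next();
    Console.WriteLine("B: after next()"); // runs before A's "after"
});

// For a single request the output is:
// A: before next()
// B: before next()
// B: after next()
// A: after next()
```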
Middleware Can Short-Circuit the Pipeline
Middleware can also stop the pipeline entirely.
For example:
```csharp
app.Use(async (context, next) =>
{
    // A null identity is treated the same as an unauthenticated one.
    if (context.User.Identity?.IsAuthenticated != true)
    {
        context.Response.StatusCode = StatusCodes.Status401Unauthorized;
        return;
    }

    await next();
});
```
In this case, the request never reaches later middleware or the endpoint.
This behavior is commonly used for:
- authentication checks
- rate limiting
- request filtering
Step 4: Built-in Middleware Components
ASP.NET Core provides several built-in middleware components that most applications rely on.
Common examples include:
Routing Middleware
Determines which endpoint matches the request.
app.UseRouting();
Authentication Middleware
Validates the user identity.
app.UseAuthentication();
Authorization Middleware
Checks whether the authenticated user has permission.
app.UseAuthorization();
Exception Handling Middleware
Handles unhandled exceptions globally.
app.UseExceptionHandler();
Other Common Production Middleware
Real-world APIs often include additional middleware such as:
- CORS (`UseCors`)
- Response compression (`UseResponseCompression`)
- HTTPS redirection (`UseHttpsRedirection`)
- Rate limiting (`UseRateLimiter`)
Each middleware adds a delegate to the request pipeline. Individually they’re lightweight, but extremely long middleware chains can introduce small overhead in very high-throughput systems.
Middleware Order Matters
One of the most common sources of bugs in ASP.NET Core applications is incorrect middleware ordering.
Consider this configuration:
```csharp
app.UseAuthorization();
app.UseAuthentication();
```
This breaks authentication because authorization runs before the user identity is established.
The correct order is:
```csharp
app.UseAuthentication();
app.UseAuthorization();
```
When debugging strange authentication behavior, middleware order is often the first thing worth checking.
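As a reference point, here is a commonly recommended ordering for a typical web API (include only the middleware your application actually uses; the "/error" path is an illustrative choice):

```csharp
app.UseExceptionHandler("/error"); // outermost: catches exceptions from everything below
app.UseHttpsRedirection();
app.UseRouting();
app.UseCors();                     // after routing, before authentication
app.UseAuthentication();           // establishes the user identity
app.UseAuthorization();            // checks permissions for that identity
app.MapControllers();              // endpoints execute last
```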
Step 5: Endpoint Routing
After middleware processing, ASP.NET Core needs to determine which endpoint should handle the request.
This is handled by endpoint routing.
Routing examines:
- HTTP method (GET, POST, etc.)
- request path
- route parameters
Example Minimal API:
```csharp
app.MapGet("/products/{id}", (int id) =>
{
    return Results.Ok($"Product {id}");
});
```
If the request is:
GET /products/10
Routing selects this endpoint and prepares it for execution.
UseRouting() identifies the matching endpoint, while the endpoint delegate itself executes later in the pipeline.
ASP.NET Core’s routing system is highly optimized and capable of efficiently matching large numbers of routes.
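Route templates can also carry constraints, which narrow what a parameter segment matches. A small sketch:

```csharp
// "{id:int}" only matches when the segment parses as an integer,
// so GET /products/abc will not hit this endpoint.
app.MapGet("/products/{id:int}", (int id) => Results.Ok($"Product {id}"));

// Literal segments are more specific than parameter segments, so
// GET /products/featured matches this endpoint, not the one above.
app.MapGet("/products/featured", () => Results.Ok("Featured products"));
```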
Step 6: Endpoint Execution
Once routing selects the correct endpoint, ASP.NET Core executes the endpoint logic.
This could be:
- a controller action
- a minimal API handler
- a Razor page
- a gRPC service
For controller-based APIs, ASP.NET Core performs several additional steps automatically.
Model Binding
ASP.NET Core maps incoming request data into method parameters.
Example:
```csharp
[HttpPost]
public IActionResult CreateOrder(OrderDto order)
{
    // ... create the order and return a result
}
```
Data can be bound from multiple sources:
- request body
- route values
- query parameters
- headers
- form data
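The source can be stated explicitly with binding attributes. An illustrative handler (the parameter names and header are hypothetical):

```csharp
[HttpPut("orders/{id}")]
public IActionResult UpdateOrder(
    [FromRoute] int id,                                    // from the URL path
    [FromQuery] bool notify,                               // from ?notify=true
    [FromHeader(Name = "X-Request-Id")] string? requestId, // from a request header
    [FromBody] OrderDto order)                             // deserialized from the JSON body
{
    return Ok();
}
```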
Validation
If validation attributes are used, ASP.NET Core validates the model automatically.
Example:
```csharp
public class OrderDto
{
    [Required]
    public string CustomerEmail { get; set; }
}
```
Invalid models typically produce a 400 Bad Request response.
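Whether that 400 is returned automatically depends on the [ApiController] attribute; without it, the action still runs and you check ModelState yourself. A sketch:

```csharp
[ApiController] // enables automatic 400 responses for invalid models
[Route("api/[controller]")]
public class OrdersController : ControllerBase
{
    [HttpPost]
    public IActionResult CreateOrder(OrderDto order)
    {
        // With [ApiController], execution only reaches this point
        // if OrderDto passed validation.
        return Ok(order);
    }
}
```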
Business Logic
This is where your application code runs.
Typical tasks include:
- database queries
- calling services
- performing calculations
- invoking external APIs
Returning a Result
The endpoint returns a result such as:
return Ok(order);
ASP.NET Core then converts this result into an HTTP response.
For example:
- objects → JSON
- status codes → HTTP response codes
- headers → HTTP headers
Step 7: The Response Travels Back Through Middleware
Once the endpoint finishes execution, the response begins its return journey.
The response flows back through the middleware pipeline in reverse order.
This allows middleware to:
- modify response headers
- compress responses
- log execution time
- transform output
Finally, the response reaches Kestrel, which sends it back to the client.
A Simple Performance Debugging Trick
When diagnosing slow requests, a small timing middleware can quickly identify bottlenecks.
Example:
```csharp
// Requires: using System.Diagnostics;
app.Use(async (context, next) =>
{
    var stopwatch = Stopwatch.StartNew();

    await next();

    stopwatch.Stop();

    var logger = context.RequestServices
        .GetRequiredService<ILoggerFactory>()
        .CreateLogger("Performance");

    logger.LogInformation("Request completed in {Elapsed} ms",
        stopwatch.ElapsedMilliseconds);
});
```
This simple middleware can reveal slow endpoints or middleware components almost instantly.
Visual Summary of the Request Flow
Client
↓
Kestrel
↓
Middleware Pipeline
↓
Routing
↓
Endpoint Execution
↓
Middleware (response)
↓
Client
Key Takeaways
- ASP.NET Core processes requests through a middleware pipeline
- Kestrel is the web server that receives HTTP requests
- Middleware can inspect, modify, or terminate requests
- Middleware order directly affects application behavior
- Endpoint routing determines which API logic executes
- Responses travel back through the same middleware pipeline
Once you understand this flow, ASP.NET Core stops feeling like a black box. Debugging becomes easier, middleware behavior makes more sense, and performance issues are much easier to track down.
Have you ever spent hours debugging an ASP.NET Core API only to realize the issue was caused by middleware order?