alinabi19
10 ASP.NET Core API Performance Mistakes That Hurt Scalability

ASP.NET Core is one of the fastest web frameworks available today.

Benchmarks regularly show it outperforming many other platforms. Yet in real production systems, I’ve seen ASP.NET Core APIs struggle under load — even when the infrastructure was solid.

Response times that were 50–100ms during development suddenly climb to 800ms or more in production. CPU usage spikes, database calls slow down, and suddenly everyone is asking the same question:

“Why is our API so slow?”

In most cases, the problem isn’t ASP.NET Core.

It’s small design decisions made during development that quietly add overhead over time.

Things like:

  • Returning too much data
  • Inefficient database queries
  • Blocking threads
  • Missing caching
  • Large payloads

Individually these issues may seem harmless. Combined, they can dramatically reduce API performance and scalability.

After working on several production APIs, I’ve noticed the same performance issues appear again and again.

Let’s walk through 10 of the most common mistakes developers make when building ASP.NET Core APIs - and how to fix them.

Why API Performance Matters

API performance affects far more than just response time.

Slow APIs lead to:

  • Higher CPU and memory consumption
  • Increased cloud infrastructure costs
  • Poor scalability under load
  • Frustrated users waiting for responses

A well-designed ASP.NET Core API can handle thousands of requests per second with minimal infrastructure.

But that only happens when performance is considered early in the design.

1. Returning Too Much Data

One of the most common API performance issues is returning entire database entities instead of only the fields needed by the client.

Large payloads increase:

  • serialization time
  • network transfer time
  • client parsing time

Bad Example

app.MapGet("/users", async (AppDbContext db) =>
{
    return await db.Users.ToListAsync();
});

If your User table has 20 columns but the frontend only needs four, you're wasting bandwidth and compute.

Better Approach: Use DTO Projection

app.MapGet("/users", async (AppDbContext db, CancellationToken ct) =>
{
    return await db.Users
        .AsNoTracking()
        .Select(u => new UserDto
        {
            Id = u.Id,
            Name = u.Name,
            Email = u.Email
        })
        .ToListAsync(ct);
});

Benefits:

  • smaller SQL queries
  • less memory usage
  • faster serialization
  • smaller responses

This simple change can reduce payload sizes dramatically.
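The projection above references a UserDto type that isn’t shown. A minimal sketch of such a DTO (the name and fields are assumptions — match them to your actual client contract) could be a simple record with init-only properties, which works with the object-initializer syntax used in the Select:

```csharp
// Minimal DTO carrying only the fields the client actually needs.
// Shape is an assumption for illustration; keep it in sync with the frontend.
public record UserDto
{
    public int Id { get; init; }
    public string Name { get; init; } = "";
    public string Email { get; init; } = "";
}
```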

2. Blocking Threads Instead of Using Async

ASP.NET Core is designed for asynchronous I/O.

When database calls or external requests are made synchronously, threads become blocked while waiting for results.

Blocking Example

var users = db.Users.ToList();

Under load, blocked threads reduce throughput and can cause thread pool starvation.

Correct Approach

var users = await db.Users.ToListAsync(ct);

Async operations allow ASP.NET Core to:

  • free threads while waiting for I/O
  • handle more concurrent requests
  • scale efficiently

A common mistake I still see in production code is calling .Result or .Wait(). These should almost always be avoided in ASP.NET Core APIs.
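The sync-over-async pattern and its fix look like this (httpClient and url are illustrative placeholders, not from the original):

```csharp
// Anti-pattern: .Result blocks a thread-pool thread until the task
// completes, and can deadlock under some synchronization contexts.
var blocked = httpClient.GetStringAsync(url).Result;

// Fix: make the calling method async and await end-to-end,
// so the thread is returned to the pool while the I/O is in flight.
var result = await httpClient.GetStringAsync(url);
```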

3. Inefficient Database Queries

In most real-world APIs, the database is the primary performance bottleneck.

Common issues include:

  • N+1 queries
  • missing indexes
  • unnecessary joins
  • over-fetching data

N+1 Query Problem

var orders = await db.Orders.ToListAsync();

foreach (var order in orders)
{
    var items = await db.OrderItems
        .Where(i => i.OrderId == order.Id)
        .ToListAsync();
}

If there are 100 orders, this produces 101 database queries.

Better Approach

var orders = await db.Orders
    .Include(o => o.Items)
    .AsNoTracking()
    .ToListAsync(ct);

Always review your generated SQL queries. Small EF query mistakes can cause massive database load.
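One easy way to see the SQL EF Core actually generates is the built-in LogTo hook. A sketch of the DbContext registration (the SQL Server provider and "Default" connection-string name are assumptions):

```csharp
builder.Services.AddDbContext<AppDbContext>(options =>
    options
        .UseSqlServer(builder.Configuration.GetConnectionString("Default"))
        // Print generated SQL; needs using Microsoft.Extensions.Logging for LogLevel.
        .LogTo(Console.WriteLine, LogLevel.Information)
        // Parameter values in logs are useful locally but risky in production.
        .EnableSensitiveDataLogging(builder.Environment.IsDevelopment()));
```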

4. Ignoring Caching

If the same data is requested repeatedly, hitting the database every time is wasteful.

Caching can reduce response times dramatically.

In-Memory Cache Example

app.MapGet("/products", async (
    AppDbContext db,
    IMemoryCache cache,
    CancellationToken ct) =>
{
    if (!cache.TryGetValue("products", out List<Product> products))
    {
        products = await db.Products
            .AsNoTracking()
            .ToListAsync(ct);

        cache.Set("products", products,
            new MemoryCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5),
                SlidingExpiration = TimeSpan.FromMinutes(2)
            });
    }

    return products;
});

Caching is particularly effective for:

  • configuration data
  • product catalogs
  • reference tables

For multi-instance deployments, a distributed cache like Redis is usually a better choice.
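Registering a Redis-backed IDistributedCache is a small change (this sketch assumes the Microsoft.Extensions.Caching.StackExchangeRedis package; the connection string and instance name are placeholders):

```csharp
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // hypothetical Redis endpoint
    options.InstanceName = "myapi:";          // key prefix, hypothetical
});
```

Note that IDistributedCache stores byte arrays and strings, so cached objects need to be serialized (typically to JSON) before storage, unlike IMemoryCache which holds object references.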

5. Missing Pagination on Large Endpoints

Returning thousands of rows in a single API response can create serious performance issues.

Problems include:

  • large payload sizes
  • high memory consumption
  • slow serialization

Proper Pagination

app.MapGet("/orders", async (
    AppDbContext db,
    int page = 1,
    int pageSize = 20,
    CancellationToken ct = default) =>
{
    page = Math.Max(page, 1);
    pageSize = Math.Clamp(pageSize, 1, 100);

    return await db.Orders
        .AsNoTracking()
        .OrderBy(o => o.Id)
        .Skip((page - 1) * pageSize)
        .Take(pageSize)
        .ToListAsync(ct);
});

Two details matter here: the CancellationToken needs a default value because required parameters can't follow optional ones, and Skip/Take without an OrderBy produces nondeterministic pages.

Always enforce a maximum page size to prevent abuse.
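Clients usually also need the total count to render a pager. One common pattern is a small wrapper type; this is a sketch (the endpoint path, record name, and ordering column are assumptions):

```csharp
public record PagedResult<T>(
    IReadOnlyList<T> Items, int Page, int PageSize, int TotalCount);

app.MapGet("/orders/paged", async (
    AppDbContext db,
    int page = 1,
    int pageSize = 20,
    CancellationToken ct = default) =>
{
    page = Math.Max(page, 1);
    pageSize = Math.Clamp(pageSize, 1, 100);

    // Stable ordering is required for Skip/Take to return consistent pages.
    var query = db.Orders.AsNoTracking().OrderBy(o => o.Id);

    var totalCount = await query.CountAsync(ct);
    var items = await query
        .Skip((page - 1) * pageSize)
        .Take(pageSize)
        .ToListAsync(ct);

    return new PagedResult<Order>(items, page, pageSize, totalCount);
});
```

The extra CountAsync is a second round trip; for very large tables, consider caching the count or switching to keyset (cursor) pagination.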

6. Overloading the Middleware Pipeline

Middleware runs on every request, so unnecessary middleware can add latency.

Common mistakes include:

  • heavy logging middleware
  • redundant request parsing
  • complex logic inside middleware

A clean pipeline might look like:

app.UseRouting();
app.UseAuthentication();
app.UseAuthorization();


Each additional middleware adds overhead, so keep the pipeline intentional and minimal.
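When per-request logic is genuinely needed, keep it thin and cheap on the hot path. A sketch of minimal inline timing middleware that only logs outliers (the 500 ms threshold is an arbitrary assumption):

```csharp
// Requires: using System.Diagnostics;
app.Use(async (context, next) =>
{
    var sw = Stopwatch.StartNew();
    await next(context);
    sw.Stop();

    // Log only slow requests so the common case pays almost nothing.
    if (sw.ElapsedMilliseconds > 500)
    {
        app.Logger.LogWarning(
            "Slow request {Method} {Path} took {Elapsed} ms",
            context.Request.Method, context.Request.Path, sw.ElapsedMilliseconds);
    }
});
```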

7. Not Using Response Compression

Large JSON responses can significantly increase network latency.

ASP.NET Core provides built-in response compression.

Enable Compression

builder.Services.AddResponseCompression();

app.UseResponseCompression();

Compression is especially helpful for:

  • large JSON responses
  • mobile networks
  • APIs returning lists or datasets
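By default compression is not applied to HTTPS responses; enabling it and choosing providers takes a little configuration. A sketch (compression level choice is a judgment call, not from the original):

```csharp
// Requires: using Microsoft.AspNetCore.ResponseCompression;
//           using System.IO.Compression;
builder.Services.AddResponseCompression(options =>
{
    // Opt in for HTTPS; be aware of BREACH-style risks if responses
    // mix secrets with attacker-controlled input.
    options.EnableForHttps = true;
    options.Providers.Add<BrotliCompressionProvider>();
    options.Providers.Add<GzipCompressionProvider>();
});

builder.Services.Configure<BrotliCompressionProviderOptions>(o =>
    o.Level = CompressionLevel.Fastest); // favor CPU over maximum ratio
```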

8. Excessive Logging in Production

Logging is essential, but logging too much can hurt performance.

Common mistakes include:

  • logging entire request bodies
  • debug-level logs in production
  • logging inside loops

Better Logging Approach

logger.LogInformation(
    "Order created for user {UserId}",
    userId
);

Structured logging keeps logs useful while minimizing overhead.
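Log volume can also be controlled centrally with level filters, so debug noise never ships to production. A sketch (the "MyApi" category name is a placeholder for your root namespace):

```csharp
// Raise the global floor in production, but keep your own
// application categories at Information.
builder.Logging.SetMinimumLevel(LogLevel.Warning);
builder.Logging.AddFilter("MyApi", LogLevel.Information);
```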

9. Mismanaging Database Connections

Creating database connections is expensive, which is why connection pooling exists.

However, performance issues still occur when:

  • DbContext lifetimes are misconfigured
  • long-running queries hold connections
  • transactions stay open too long

Best practices:

  • use DbContext with a scoped lifetime
  • avoid long-running transactions
  • keep queries efficient
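For high-throughput APIs, DbContext pooling can also help: context instances are reused across requests instead of being constructed each time, while DI still hands them out with a scoped lifetime. A sketch (provider, connection-string name, and timeout are assumptions):

```csharp
builder.Services.AddDbContextPool<AppDbContext>(options =>
    options.UseSqlServer(
        builder.Configuration.GetConnectionString("Default"),
        sql => sql.CommandTimeout(30))); // fail fast rather than hold connections
```

Pooling only pays off if the context carries no per-request state, since instances are reset and reused.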

10. Skipping Load Testing

Many APIs perform well during development but fail under real traffic.

Performance problems often appear only when:

  • hundreds of requests run concurrently
  • database contention increases
  • thread pools become saturated

Good load testing tools include:

  • k6
  • Apache JMeter
  • NBomber
  • Azure Load Testing

Testing under realistic traffic conditions helps reveal bottlenecks before production users experience them.

Pro Tip: Measure Before Optimizing

Optimization without measurement is mostly guesswork.

Useful performance tools include:

  • MiniProfiler
  • Application Insights
  • OpenTelemetry

These tools help identify the real bottleneck instead of optimizing blindly.
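Wiring up basic OpenTelemetry tracing in ASP.NET Core is only a few lines; this sketch assumes the OpenTelemetry.Extensions.Hosting, OpenTelemetry.Instrumentation.AspNetCore, and OpenTelemetry.Instrumentation.Http packages:

```csharp
builder.Services.AddOpenTelemetry()
    .WithTracing(tracing => tracing
        .AddAspNetCoreInstrumentation()   // incoming request spans
        .AddHttpClientInstrumentation()); // outgoing HttpClient spans
```

An exporter (console, OTLP, or a vendor backend such as Application Insights) still needs to be added to actually ship the traces somewhere.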

A Common Performance Trap

One mistake I frequently see is developers trying to optimize application code first while ignoring the database.

In many real systems, database queries account for 70–90% of total API response time.

Start your investigation there.

Key Takeaways

To keep ASP.NET Core APIs fast and scalable:

  • Return only the data clients actually need
  • Use asynchronous I/O consistently
  • Optimize database queries early
  • Cache frequently requested data
  • Implement pagination on large endpoints
  • Keep middleware pipelines lean
  • Enable response compression
  • Avoid excessive logging in production
  • Use proper DbContext lifetimes
  • Load test before production traffic

Performance rarely comes from one big optimization.

It usually comes from avoiding dozens of small mistakes that accumulate over time.

If you're building high-traffic APIs, these small decisions can make the difference between an API that struggles under load and one that scales effortlessly.
