.NET 9 Minimal APIs with Native AOT: The Performance Combo You're Missing
If you're building APIs in .NET 9, you're leaving performance on the table if you're not combining Minimal APIs with Native AOT compilation. Together, they deliver roughly 5x faster startup and 55% less memory with competitive throughput, all without a JIT compiler.
Let me break down the specific patterns that make this combination work.
Why Minimal APIs + Native AOT?
Minimal APIs were designed to be "pay for play" — you only pay for the features you use. Native AOT compiles your app to machine code ahead of time, eliminating the JIT compiler entirely. Together, they create the smallest, fastest-starting .NET APIs possible.
The numbers from Microsoft's own benchmarks:
| Metric | JIT (Stage2) | Native AOT SpeedOpt | Improvement |
|---|---|---|---|
| Startup Time | 528ms | 100ms | 5.3x faster |
| Memory (Linux) | 126MB | 56MB | 55% less |
| RPS (x64) | 235,008 | 215,637 | -8% |
| RPS (ARM) | 844,659 | 929,524 | +10% |
Source: aspnet/Benchmarks
On x64, JIT still wins on raw throughput. But on ARM (think cloud instances, Apple Silicon), Native AOT actually outperforms JIT. And startup time? Native AOT is in a different league.
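As a starting point, the `dotnet new webapiaot` template wires the app up with `WebApplication.CreateSlimBuilder`, which omits the reflection-heavy defaults of the regular builder. A minimal sketch of an AOT-ready Program.cs:

```csharp
// Program.cs: a minimal AOT-ready API.
// CreateSlimBuilder leaves out components that depend on reflection or
// dynamic code generation, keeping the trimmed native binary small.
var builder = WebApplication.CreateSlimBuilder(args);

var app = builder.Build();

app.MapGet("/", () => "Hello, Native AOT!");

app.Run();
```

If you start from `CreateBuilder` instead, the trimmer has to carry more of the framework along, so the slim builder is the usual baseline for AOT projects.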
Pattern 1: TypedResults for Type-Safe Responses
TypedResults isn't just about cleaner code — it provides compile-time type safety and automatic OpenAPI metadata with zero runtime overhead.
```csharp
// Before: generic IResult, no compile-time safety
app.MapGet("/products/{id}", (int id, AppDb db) =>
{
    var product = db.Products.Find(id);
    return product != null ? Results.Ok(product) : Results.NotFound();
});

// After: TypedResults with compile-time enforcement,
// using a lambda with an explicit return type
app.MapGet("/products/{id}", Results<Ok<Product>, NotFound> (int id, AppDb db) =>
{
    var product = db.Products.Find(id);
    return product != null ? TypedResults.Ok(product) : TypedResults.NotFound();
});
```
The `Results<T1, ..., TN>` union type (up to six result types) enforces that you only return the declared response types. Try to return a `BadRequest`? Compile error. This prevents the documentation drift that plagues traditional APIs.
Performance-wise, TypedResults and Results are identical. The benefit is architectural.
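To see the enforcement in action, here's a hypothetical endpoint that tries to return a result type outside the declared union; uncommenting the `BadRequest` line fails at compile time:

```csharp
// Declared union: only Ok<Product> or NotFound are legal returns.
app.MapGet("/products/{id}", Results<Ok<Product>, NotFound> (int id, AppDb db) =>
{
    if (id <= 0)
    {
        // Compile error: BadRequest is not convertible to
        // Results<Ok<Product>, NotFound>.
        // return TypedResults.BadRequest();
    }

    var product = db.Products.Find(id);
    return product != null ? TypedResults.Ok(product) : TypedResults.NotFound();
});
```

The compiler, not a code review, catches the mismatch between the endpoint's declared contract and what it actually returns.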
Pattern 2: Output Caching for Instant Responses
Output caching short-circuits the entire pipeline for cached requests. No endpoint logic, no database queries, no middleware — just a cached response.
```csharp
// Program.cs
var builder = WebApplication.CreateBuilder();
builder.Services.AddOutputCache();

var app = builder.Build();
app.UseOutputCache(); // After UseRouting and UseCors

// Per-endpoint caching
app.MapGet("/products", async (AppDb db, CancellationToken ct) =>
    await db.Products.AsNoTracking()
        .Select(p => new ProductDto(p.Id, p.Name, p.Price))
        .ToListAsync(ct))
    .CacheOutput(policy => policy
        .Expire(TimeSpan.FromMinutes(5))
        .SetVaryByQuery("page", "pageSize"));
```
Default caching is 1 minute for 200 OK GET/HEAD responses, keyed by the full URL including query parameters. For read-heavy APIs, this is the single biggest performance win available.
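Those defaults can also be changed globally with a base policy, or bundled into named policies that endpoints opt into. A sketch (the policy name "Expensive" is illustrative):

```csharp
builder.Services.AddOutputCache(options =>
{
    // Applies to every endpoint that enables output caching
    options.AddBasePolicy(policy => policy.Expire(TimeSpan.FromSeconds(30)));

    // Named policy, applied per endpoint with .CacheOutput("Expensive")
    options.AddPolicy("Expensive", policy => policy
        .Expire(TimeSpan.FromMinutes(10))
        .SetVaryByQuery("page"));
});
```

Named policies keep cache durations in one place instead of scattering `TimeSpan`s across every endpoint.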
Pattern 3: JsonSerializerContext for AOT-Compatible Serialization
Native AOT strips reflection-heavy code during trimming. The default JsonSerializer uses reflection to discover types at runtime — which breaks under trimming. The fix: source-generated serialization via JsonSerializerContext.
```csharp
// Define your serializable types
[JsonSerializable(typeof(ProductDto))]
[JsonSerializable(typeof(List<ProductDto>))]
[JsonSerializable(typeof(ProductCreateDto))]
internal partial class AppJsonContext : JsonSerializerContext { }

// Register in Program.cs
builder.Services.ConfigureHttpJsonOptions(options =>
{
    options.SerializerOptions.TypeInfoResolverChain.Insert(0, AppJsonContext.Default);
});
```
This generates serialization code at compile time — no reflection, no runtime type discovery. It's required for Native AOT and recommended for all trimmed applications.
The catch: You must declare every type you serialize. Miss one, and you get a runtime error. This is actually a feature — it forces you to think about your API contract.
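Outside of ASP.NET Core's own pipeline, the same context can be passed to `JsonSerializer` directly, which makes the no-reflection path explicit. A small sketch, assuming the `ProductDto` shape from the earlier examples:

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

public record ProductDto(int Id, string Name, decimal Price);

[JsonSerializable(typeof(ProductDto))]
internal partial class AppJsonContext : JsonSerializerContext { }

public static class SerializationDemo
{
    public static string Serialize(ProductDto dto) =>
        // Uses compile-time generated metadata for ProductDto;
        // no runtime reflection, safe under trimming.
        JsonSerializer.Serialize(dto, AppJsonContext.Default.ProductDto);
}
```

Passing the generated `JsonTypeInfo<T>` (here `AppJsonContext.Default.ProductDto`) instead of a bare `Type` is what lets the trimmer prove reflection is never needed.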
Pattern 4: The AOT Project Configuration
```xml
<!-- .csproj -->
<PropertyGroup>
  <TargetFramework>net9.0</TargetFramework>
  <PublishAot>true</PublishAot>
  <StripSymbols>true</StripSymbols>
  <OptimizationPreference>Speed</OptimizationPreference>
</PropertyGroup>
```
Key settings:
- `<PublishAot>true</PublishAot>`: enables Native AOT compilation
- `<StripSymbols>true</StripSymbols>`: removes debug symbols for smaller binaries
- `<OptimizationPreference>Speed</OptimizationPreference>`: optimizes for speed over size

Publish with: `dotnet publish -c Release`
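On a Linux x64 box the full publish step might look like this (the output path assumes the default layout; `-r` pins the runtime identifier explicitly):

```shell
# AOT-compile a self-contained native binary for linux-x64
dotnet publish -c Release -r linux-x64

# The publish folder contains a single native executable
# (plus a separate .dbg file unless StripSymbols is set)
ls bin/Release/net9.0/linux-x64/publish/
```

Expect this to take noticeably longer than a JIT publish: the native compilation step is where the build-time cost mentioned below comes from.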
When Native AOT Doesn't Win
Native AOT has trade-offs:
- Longer build times — AOT compilation takes significantly longer than JIT
- No dynamic features — Reflection, dynamic types, and RuntimeCompilation don't work
- Steady-state throughput — JIT can optimize hot paths at runtime via PGO; AOT can't
For high-traffic, long-running services on x64, JIT may still deliver better throughput. But for:
- Serverless functions (cold start matters)
- Containerized microservices (memory density matters)
- ARM deployments (AOT outperforms JIT)
- Edge scenarios (latency matters)
Native AOT is the clear winner.
The Verdict
.NET 9 Minimal APIs with Native AOT deliver:
- 5x faster startup (100ms vs 528ms)
- 55% less memory (56MB vs 126MB)
- Competitive throughput (within 10% of JIT, better on ARM)
- Type-safe responses via TypedResults
- Instant cached responses via Output Caching
- AOT-safe serialization via JsonSerializerContext
If you're starting a new .NET 9 API project, Minimal APIs with Native AOT should be your default. The performance benefits are real, and the constraints (explicit types, no reflection) lead to cleaner, more maintainable code.