.NET NativeAOT on AWS Lambda: The Performance Revolution You've Been Waiting For
If you're running .NET on AWS Lambda and haven't explored NativeAOT yet, you're leaving serious performance (and money) on the table. Let me show you some numbers that might change your mind:
- 🚀 7× faster cold starts (6680ms → 940ms)
- ⚡ 6× faster warm runs (91ms → 14ms)
- 💾 50% less memory (93MB → 42MB)
- 💰 73% lower costs for high-volume workloads
And here's the kicker: these aren't theoretical numbers. They're from real Lambda functions running in production.
.NET NativeAOT: Performance Revolution
A Deep Dive into AOT vs ReadyToRun vs Regular .NET
📋 Table of Contents
- Introduction & Objectives
- Understanding the Execution Models
- Why NativeAOT Matters
- Performance Results: The Numbers Don't Lie
- Build & Packaging Deep Dive
- Coding for AOT: Patterns & Best Practices
- Real-World Examples from This Repo
- Deployment Strategies
- Migration Guide
- When to Choose Which Approach
- FAQ & Troubleshooting
- Key Takeaways
- Additional Resources
- Getting Started
- Contributing
- License
- Acknowledgments
🎯 Introduction & Objectives
Who This Presentation Is For
.NET engineers familiar with Lambda, microservices, or performance-critical applications looking to understand modern compilation strategies.
What You'll Learn
- NativeAOT fundamentals and how it compares to ReadyToRun and Regular .NET
- Measured performance improvements from real-world Lambda functions
- Practical build and deployment strategies
- Code patterns required for successful AOT adoption
- Migration strategies for existing applications
Repository Context
This presentation uses a multi-mode Lambda demo repository with:
- 3 AOT Lambda functions (.NET…
Why Should You Care?
Every second your Lambda function takes to cold start is a second your users wait. And in the world of web applications, every millisecond counts:
- Amazon found that 100ms delay = 1% revenue loss
- Google reports 53% of mobile users abandon sites taking >3 seconds to load
- A 6.7-second cold start (Regular .NET) is simply unacceptable for modern APIs
But it's not just about user experience - it's about your bottom line:
| Scale | Regular .NET | NativeAOT | Savings |
|---|---|---|---|
| 1M requests/mo | $0.98 | $0.26 | $0.72/mo |
| 10M requests/mo | $9.80 | $2.60 | $7.20/mo |
| 100M requests/mo | $98.00 | $26.00 | $72.00/mo |
For an enterprise with 50 Lambda functions handling 100M requests each: $43,200/year saved.
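The table above follows from standard Lambda on-demand pricing. Here is a minimal sketch of that arithmetic, assuming the public rates (~$0.20 per 1M requests plus ~$0.0000166667 per GB-second) and assumed memory settings of 512 MB for the Regular function and 256 MB for AOT (the article does not state the configured sizes), together with the measured warm durations:

```python
# Hedged sketch: rough Lambda cost model with ASSUMED pricing and memory sizes.
REQUEST_PRICE = 0.20 / 1_000_000   # $ per request (assumed on-demand rate)
GB_SECOND_PRICE = 0.0000166667     # $ per GB-second (assumed on-demand rate)

def monthly_cost(requests, memory_mb, avg_duration_ms):
    """Approximate monthly Lambda cost for one function."""
    gb_seconds = requests * (memory_mb / 1024) * (avg_duration_ms / 1000)
    return requests * REQUEST_PRICE + gb_seconds * GB_SECOND_PRICE

# Regular .NET: ~91 ms warm at an assumed 512 MB; AOT: ~14 ms at an assumed 256 MB
regular = monthly_cost(1_000_000, 512, 91)   # ~$0.96
aot     = monthly_cost(1_000_000, 256, 14)   # ~$0.26
print(f"Regular: ${regular:.2f}/mo, AOT: ${aot:.2f}/mo")

# The enterprise claim: 50 functions saving ~$72/mo each, over 12 months
annual_savings = 50 * 72 * 12
print(f"Enterprise savings: ${annual_savings:,}/yr")  # $43,200/yr
```

With these assumptions the results land within a few cents of the article's table, so the headline savings are plausible back-of-envelope arithmetic rather than exotic pricing.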
The Three Compilation Modes: A Quick Primer
Before we dive into the numbers, let's understand what we're comparing:
1. Regular .NET (JIT)
Traditional approach: ships IL bytecode, compiles to native at runtime.
App Start → Load IL → Init Runtime → JIT Compile → Execute
              ⏱️ SLOW                  ⏱️ SLOW
Pros: Maximum flexibility, smallest package
Cons: Slowest startup, unpredictable performance
2. ReadyToRun (R2R)
Hybrid: ships both IL and precompiled native images.
App Start → Load R2R+IL → Init Runtime → Execute (mostly native)
                            ⏱️ SLOW
Pros: Faster startup than Regular, minimal code changes
Cons: Still requires full runtime, larger packages
3. NativeAOT
Pure native: everything compiled ahead-of-time.
App Start → Execute Native Binary
⚡ FAST
Pros: Fastest startup, lowest memory, no runtime overhead
Cons: Reflection limitations (requires source generators)
The Real Performance Data
I built identical Lambda functions in all three modes and ran them through hundreds of test cycles. Here's what I found:
Cold Start Performance
Regular .NET 8:    ████████████████████████████████████████ 6680ms
ReadyToRun .NET 8: ████████████████████████ 4389ms
AOT .NET 8:        ██████ 1082ms ⚡
AOT .NET 9:        █████ 971ms ⚡
AOT .NET 10:       ████ 940ms ⚡ (FASTEST)
Improvement: 7.1× faster (Regular → AOT .NET 10)
Warm Run Performance
ReadyToRun .NET 8: ████████████ 99ms
Regular .NET 8:    ███████████ 91ms
AOT .NET 8:        ██ 18ms ⚡
AOT .NET 9:        █ 14ms ⚡ (FASTEST)
AOT .NET 10:       ██ 17ms ⚡
Improvement: 6.5× faster (Regular → AOT .NET 9)
Memory Usage
ReadyToRun .NET 8: ████████████ 89-96 MB
Regular .NET 8:    ███████████ 88-93 MB
AOT .NET 8:        █████ 46-52 MB ⚡
AOT .NET 9:        ████ 43-49 MB ⚡
AOT .NET 10:       ████ 42-48 MB ⚡ (LOWEST)
Improvement: 52% less memory (Regular → AOT .NET 10)
The Code: What Changes?
The good news? Your business logic doesn't change. The differences are in how you bootstrap and serialize.
Regular .NET Lambda
using Amazon.Lambda.Annotations;
using Amazon.Lambda.Core;
using Amazon.Lambda.Serialization.SystemTextJson;

[assembly: LambdaSerializer(typeof(DefaultLambdaJsonSerializer))]
namespace LambdaRegularDemo;
public class Function
{
private readonly IDynamoDBRepository _repository;
public Function(IDynamoDBRepository repository)
{
_repository = repository;
}
[LambdaFunction]
public async Task<Guid> FunctionHandler(
Dictionary<string, string> input,
ILambdaContext context)
{
using var cts = new CancellationTokenSource(context.RemainingTime);
return await _repository.CreateAsync(cts.Token);
}
}
NativeAOT Lambda
Main difference: Source-generated JSON serialization
// AOTJsonContext.cs
using System.Text.Json.Serialization;

[JsonSerializable(typeof(Guid))]
[JsonSerializable(typeof(Dictionary<string, string>))]
public partial class AOTJsonContext : JsonSerializerContext { }
// Function.cs
public class Function
{
private readonly IDynamoDBRepository _repository;
public Function(IDynamoDBRepository repository)
{
_repository = repository;
}
public async Task<Guid> FunctionHandler(
Dictionary<string, string> input,
ILambdaContext context)
{
// Same business logic!
using var cts = new CancellationTokenSource(context.RemainingTime);
return await _repository.CreateAsync(cts.Token);
}
}
// Program.cs - Custom runtime bootstrap
using Amazon.Lambda.Core;
using Amazon.Lambda.RuntimeSupport;
using Amazon.Lambda.Serialization.SystemTextJson;
using Microsoft.Extensions.DependencyInjection;
var handler = async (Dictionary<string, string> input, ILambdaContext context) =>
{
await using var serviceProvider = CreateServiceProvider();
var function = serviceProvider.GetRequiredService<Function>();
return await function.FunctionHandler(input, context);
};
await LambdaBootstrapBuilder
.Create(handler, new SourceGeneratorLambdaJsonSerializer<AOTJsonContext>())
.Build()
.RunAsync();
Project Configuration
The key difference is in your .csproj:
<PropertyGroup>
<OutputType>Exe</OutputType> <!-- Must be Exe for AOT -->
<TargetFramework>net9.0</TargetFramework>
<RuntimeIdentifier>linux-x64</RuntimeIdentifier>
<PublishAot>true</PublishAot> <!-- Enable AOT -->
<SelfContained>true</SelfContained>
<StripSymbols>true</StripSymbols>
<InvariantGlobalization>true</InvariantGlobalization>
</PropertyGroup>
The AOT Challenge: Reflection
NativeAOT's biggest limitation? No dynamic reflection. But this is actually a blessing in disguise - it forces you to write better code.
❌ What Doesn't Work
// Dynamic type loading
Type t = Type.GetType("MyNamespace.MyClass");
var instance = Activator.CreateInstance(t);
// Runtime assembly loading
Assembly asm = Assembly.Load("PluginAssembly");
// Reflection-based serialization
JsonSerializer.Serialize(obj); // Uses reflection!
✅ AOT-Friendly Alternatives
1. JSON Source Generation
[JsonSerializable(typeof(MyType))]
public partial class MyJsonContext : JsonSerializerContext { }
// Usage
var json = JsonSerializer.Serialize(
data,
MyJsonContext.Default.MyType
);
2. Constructor Injection (Always!)
// ✅ DO: Types known at compile time
public class MyService
{
private readonly IRepository _repo;
public MyService(IRepository repo)
{
_repo = repo;
}
}
// ❌ DON'T: Service locator pattern
var repo = serviceProvider.GetService<IRepository>();
3. Explicit Type Registration
// ✅ DO: Compile-time registration
services.AddSingleton<IPlugin, ConcretePlugin1>();
services.AddSingleton<IPlugin, ConcretePlugin2>();
// ❌ DON'T: Runtime discovery
var plugins = Directory.GetFiles("plugins", "*.dll")
.Select(Assembly.LoadFrom);
Building for Production
Use Docker to ensure consistent Linux binaries:
FROM mcr.microsoft.com/dotnet/sdk:10.0 AS build
# Install native toolchain for AOT
RUN apt-get update && \
apt-get install -y clang zlib1g-dev zip
WORKDIR /src
COPY . .
# Build AOT Lambda
RUN dotnet restore ./LambdaAOTDemo9/LambdaAOTDemo9.csproj && \
dotnet publish ./LambdaAOTDemo9/LambdaAOTDemo9.csproj \
-c Release -o /artifacts/publish && \
mv /artifacts/publish/LambdaAOTDemo9 /artifacts/publish/bootstrap && \
cd /artifacts/publish && \
zip -r /artifacts/LambdaAOTDemo9-lambda.zip .
Why Docker?
- Ensures Linux build on any dev OS (Windows, Mac, Linux)
- Includes required native dependencies (clang, zlib1g-dev)
- Reproducible builds across team
Deploying to AWS Lambda
Option 1: Custom Runtime (Recommended)
LambdaAOTFunction:
Type: AWS::Lambda::Function
Properties:
Runtime: provided.al2023
Handler: bootstrap
Code:
S3Bucket: !Ref DeploymentBucket
S3Key: LambdaAOTDemo9-lambda.zip
MemorySize: 256 # Can use less memory with AOT!
Timeout: 30
Option 2: Container Image
FROM public.ecr.aws/lambda/provided:al2023
COPY --from=build /artifacts/publish/bootstrap ${LAMBDA_RUNTIME_DIR}/bootstrap
CMD ["bootstrap"]
Benefits:
- Deploy .NET 9, .NET 10, or future versions today
- No waiting for AWS managed runtime updates
- Smaller image (no 200MB+ .NET runtime layer)
When Should You Use AOT?
✅ Choose NativeAOT When:
- Cold start time is critical (APIs, webhooks, user-facing functions)
- Memory costs matter (high-volume serverless)
- Running .NET 9/10 on Lambda (managed runtime only supports .NET 8)
- Predictable performance required (SLA-driven workloads)
Perfect for:
- REST API backends
- Event processors (S3, SQS, EventBridge)
- GraphQL resolvers
- Scheduled tasks
⚠️ Choose Regular .NET When:
- Maximum flexibility required (plugins, dynamic loading)
- Heavy reflection/dynamic code (complex ORMs, frameworks)
- Long-running processes (cold start amortized over lifetime)
- Fastest dev iteration
🔄 Choose ReadyToRun When:
- You want 34% faster cold starts with minimal code changes
- Testing AOT compatibility as a migration step
- Some dynamic features required
But remember: R2R cold starts still take roughly 4× longer (300% more) than AOT!
The Business Case: Dual Value Proposition
Quick responses aren't just a technical metric - they drive business value:
1. Lower Operating Costs
At 100M requests/month:
- Regular: $98/month per function
- AOT: $26/month per function
- Savings: $72/month
With 50 functions: $43,200/year saved 💰
2. Better User Experience
At 100M requests/month, switching to AOT eliminates:
- 2,138 hours of cumulative user waiting time
- Calculation: (91ms - 14ms) × 100M requests = ~2,138 hours
Translation:
- Higher conversion rates
- Lower bounce rates
- Better SLA compliance
- Competitive advantage
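The 2,138-hour figure is easy to verify; here is the arithmetic as a quick sketch, using nothing beyond the measured 91 ms and 14 ms warm averages:

```python
# Cumulative user wait time saved per month by the faster warm path.
requests = 100_000_000
saved_ms_per_request = 91 - 14          # Regular warm avg minus AOT warm avg
saved_hours = requests * saved_ms_per_request / 1000 / 3600  # ms -> s -> h
print(f"{saved_hours:,.0f} hours saved per month")  # ≈2,139 h (truncates to 2,138)
```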
Real-World Results
Here's the complete performance table from my tests:
| Function | .NET | Runtime | Cold Start | Warm Avg | Memory |
|---|---|---|---|---|---|
| Regular | 8 | dotnet8 | 6680 ms | 91 ms | 88-93 MB |
| ReadyToRun | 8 | dotnet8 | 4389 ms | 99 ms | 89-96 MB |
| AOT | 8 | provided.al2023 | 1082 ms | 18 ms | 49-52 MB |
| AOT | 9 | provided.al2023 | 971 ms | 14 ms ⚡ | 47-49 MB |
| AOT | 10 | provided.al2023 | 940 ms ⚡ | 17 ms | 42-45 MB ⚡ |
.NET 10 AOT wins on cold start and memory
.NET 9 AOT wins on warm performance
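The headline multipliers fall straight out of this table; a tiny sketch that derives them from the measured values:

```python
# Derive the headline speedups from the measured results above.
cold_ms = {"regular": 6680, "r2r": 4389, "aot10": 940}
warm_ms = {"regular": 91, "aot9": 14}

cold_speedup = cold_ms["regular"] / cold_ms["aot10"]   # ~7.1x
warm_speedup = warm_ms["regular"] / warm_ms["aot9"]    # ~6.5x
print(f"Cold start: {cold_speedup:.1f}x, warm: {warm_speedup:.1f}x")
```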
Getting Started
Ready to try it yourself? Here's the quickest path:
# Clone the demo repo
git clone https://github.com/whitewAw/dotnet-lambda-aot-performance-comparison.git
cd dotnet-lambda-aot-performance-comparison
# Build all Lambda packages via Docker
docker build -f src/Dockerfile -t aot-demo .
# Extract artifacts
docker create --name temp aot-demo
docker cp temp:/artifacts ./build-output
docker rm temp
# Deploy to AWS
aws s3 cp ./build-output/LambdaAOTDemo9-lambda.zip s3://your-bucket/
aws lambda update-function-code \
--function-name my-function \
--s3-bucket your-bucket \
--s3-key LambdaAOTDemo9-lambda.zip
Key Takeaways
- NativeAOT is production-ready for Lambda workloads
- 7× faster cold starts and 73% cost savings are real
- Code changes are minimal (mostly serialization)
- .NET 9/10 can run on Lambda today via custom runtime
- User experience AND cost both improve dramatically
Start preparing now even if you're not migrating yet:
- Adopt JSON source generation
- Use constructor injection
- Avoid reflection where possible
The Future is AOT
.NET's investment in NativeAOT is clear:
- .NET 8: Stable for console apps, minimal APIs
- .NET 9: Improved trimming, best warm performance (14ms avg)
- .NET 10: Smallest binaries (5.56 MB), fastest cold start (940ms)
Each version shows measurable improvements in size and performance.
Performance Variance Note
β οΈ Important: Your results may vary based on:
- AWS region and AZ
- Time of day
- Lambda execution environment reuse
- Your specific workload
The results shown represent averaged measurements from multiple test runs. Individual runs may vary by ±5-15%.
Always benchmark your specific workload!
Try It Yourself
The complete source code, benchmarks, and detailed documentation are available:
What's included:
- ✅ Working Lambda functions (.NET 8, 9, 10 AOT + R2R + Regular)
- ✅ Containerized build pipeline
- ✅ Performance testing invoker
- ✅ CloudFormation templates
- ✅ Detailed migration guide
Discussion
Have you tried NativeAOT on Lambda? What were your results? Drop a comment below!
Questions I'd love to hear:
- What's holding you back from trying AOT?
- What performance improvements have you seen?
- What challenges did you face during migration?
If this helped you, consider ⭐ starring the repo and following me for more .NET performance deep-dives!
Connect:
- GitHub: @whitewAw
- Repository: dotnet-lambda-aot-performance-comparison
Related Resources: