tl;dr
Add this line to your .csproj file:

<ServerGarbageCollection>false</ServerGarbageCollection>

as part of the property group holding your target framework, like so:

<PropertyGroup>
  <TargetFramework>netcoreapp3.1</TargetFramework>
  <ServerGarbageCollection>false</ServerGarbageCollection>
</PropertyGroup>
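If editing the .csproj isn't convenient, the same switch can also be flipped through runtime configuration. The two variants below are sketches of options I'm aware of, so double-check them against Microsoft's GC configuration docs for your runtime version.

A runtimeconfig.template.json next to the project file:

{
  "configProperties": {
    "System.GC.Server": false
  }
}

Or an environment variable, for example in a Dockerfile or Kubernetes manifest:

COMPlus_gcServer=0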
Can you tell me a bit more?
So, one of the constants of running microservices, or any service at all, is memory usage.
Somewhere out there you might be running a service or two, and depending on the environment and the programming language behind it, that service needs more or less memory just to stay alive. Running an ASP.NET Core service is no different.
Just spinning up a small service in ASP.NET Core can take up to 150 MB of memory, so a microservice setup containing just a couple of services will quickly require at least 1 GB.
As you might already be thinking, this adds up quickly, so how can we fix it?
Well, Microsoft has a nice post about Garbage Collection in .NET, where they explain that there are two different types of garbage collection in .NET: Workstation GC and Server GC.
In short, Workstation GC is tuned to keep the heap small: it uses smaller heap segments and collects more often, whereas Server GC trades memory for throughput.
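If you want to confirm which mode a process actually ended up in, GCSettings.IsServerGC tells you at runtime. Below is a minimal console sketch; in a real service you would more likely log the value once at startup.

using System;
using System.Runtime; // GCSettings lives here

class GcModeCheck
{
    static void Main()
    {
        // True when the process runs the Server GC, false for Workstation GC.
        Console.WriteLine($"Server GC enabled: {GCSettings.IsServerGC}");

        // The latency mode hints at how the collector is currently tuned.
        Console.WriteLine($"Latency mode: {GCSettings.LatencyMode}");
    }
}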
To test Workstation GC, we took a simple ASP.NET Core CRUD API and moved it from Server to Workstation GC. This API serves around 500 requests a minute and was using around 230 MB of memory.
After moving to Workstation GC, the API fell to around 85 MB of memory, with no noticeable performance degradation or increased CPU usage on the machines.
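If you want to watch the same numbers for your own service, a rough starting point is to read the managed heap size and the process working set from inside the process. This is only a sketch; how you log or expose the values is up to you.

using System;
using System.Diagnostics;

class MemorySnapshot
{
    static void Main()
    {
        using var process = Process.GetCurrentProcess();

        // Managed heap size as the GC currently sees it. Passing false avoids
        // forcing a collection just to take the reading.
        long managedBytes = GC.GetTotalMemory(forceFullCollection: false);

        // The working set is closer to what container limits and dashboards report.
        long workingSetBytes = process.WorkingSet64;

        Console.WriteLine($"Managed heap: {managedBytes / 1024 / 1024} MB");
        Console.WriteLine($"Working set: {workingSetBytes / 1024 / 1024} MB");
    }
}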
But
This is not a miracle cure. You might have different needs, throughput and so on, which could mean that moving from Server to Workstation GC has a performance impact on your service.
Do you already run Workstation GC? If not, try it out, and please share your results.
Top comments (3)
I recently faced a similar issue when building an application that required a massive amount of throughput, but noticed it was using an unreasonably high amount of memory. The application was running in a Docker container in Kubernetes, and I spent a lot of time debugging and applied many techniques to dispose of unmanaged resources, but no luck: the application was still using a lot of memory and wasn't releasing it afterwards, so everybody thought it had a memory leak. Then I applied the above tip and it dramatically reduced the memory usage without affecting performance so far. We are still testing it carefully, but that was my observation and experience.
This is one of the core things I came across when I first started using ASP.NET Core. The memory consumption was too high for something that seems straightforward. Having NodeJS as my baseline for web APIs, I don't mind the throughput being fractionally less; I can still have an API that runs much faster than most others, and I'm happy with that.
Out of interest, what did you measure? Are you measuring 99th percentile request duration, for instance? I'm interested to see whether some requests now get delayed by the more frequent GC collections (this is of course highly dependent on how you use memory internally; if objects allocated during the requests never reach gen 2, you should see no to very minor impact).
On the face of it, it does look like a good find for applications that aren't super latency sensitive :)
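One way to answer the gen 2 question for a concrete service is to compare per-generation collection counts before and after a load test. A minimal sketch using GC.CollectionCount (the class name is just for illustration):

using System;

class GcCollectionCounts
{
    static void Main()
    {
        // Snapshot the collection count per generation. Taking one snapshot before
        // and one after a batch of requests shows whether allocations survive into
        // gen 2, where more frequent Workstation GC collections would hurt the most.
        for (int gen = 0; gen <= GC.MaxGeneration; gen++)
        {
            Console.WriteLine($"Gen {gen} collections so far: {GC.CollectionCount(gen)}");
        }
    }
}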