Lars Richter

Speeding Up Our Unit Tests From 15min To Less Than 2min

Our codebase at work is pretty well tested; code coverage is above 80 percent. I work on our backoffice systems, which are developed on the .NET platform and currently consist of about 100 projects (services and components), and growing.
Most of our services/components live in a separate solution, which also contains the corresponding unit and integration test projects.
As we wrote more code (and therefore more tests), the time needed to run our tests kept increasing. By the end of last year (2019), the unit tests alone took 15 minutes to run.

Time for a Hackday!

But everything worked, so why touch it? At my previous companies, it would have been difficult to get permission to "play" with the build and test scripts just to improve them. At my current company, however,
we have the concept of "hackdays": every month you can use one day to work on things outside of the normal schedule. Maybe there is some code you always wanted to refactor but never had the time to. Maybe you want to extend the UI tests, or take a look at a new framework or technology that might be relevant for your work.
Whatever it is, you can work on those things one day every month.

So I invested a hackday in speeding up our test suite. I found out that the tests themselves weren't terribly slow. It seemed like the test projects were simply not being run in parallel.

Our old way

Why wasn't it running in parallel? To understand that, you have to understand our build and test process.
We had a csproj file to build our entire codebase and run the tests. It used MSBuild targets and commands to gather all the relevant projects.
A simplified version of it looks something like this:

  <!-- ... doing the build somewhere here first ... -->

  <ItemGroup>
    <UnitTestProjects Include="**\*.Test.XUnit.csproj" />
  </ItemGroup>

  <ItemGroup>
    <IntegrationTestProjects Include="**\*.Test.XIntegration.csproj" />
  </ItemGroup>

  <Target Name="unittest">
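    <!-- Batching over %(UnitTestProjects.Identity) runs this Exec once per project, one after another -->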
    <Exec Command="dotnet.exe test %(UnitTestProjects.Identity) --no-build --no-restore" />
  </Target>

  <Target Name="integrationtest">
    <Exec Command="dotnet.exe test %(IntegrationTestProjects.Identity) --no-build --no-restore" />
  </Target>

If I run MSBuild on this project file with the target "unittest", it finds all the relevant projects and runs them using dotnet test. Finding the projects is pretty simple because we have a naming convention: unit test projects must be named [ProjectName].Test.XUnit,
and integration test projects [ProjectName].Test.XIntegration.
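
For illustration, kicking off the unit tests looks roughly like this (Build.proj is a placeholder for our actual project file name):

msbuild .\Build.proj /t:unittest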

So it is a two-step process:

  • finding the relevant projects
  • running each project using dotnet test

When I checked the build log, I saw that it tested one project at a time. If you are familiar with MSBuild, this might be obvious to you, but it took me some time to realize it.
Luckily, dotnet test with xUnit v2 runs the tests inside a project (or even inside a solution) in parallel by default. So at least the tests inside each project were started in parallel.
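
By the way, that default can be made explicit (or tuned) via xUnit v2's xunit.runner.json configuration file. A minimal sketch, assuming a test project following our naming convention (the project path here is made up):

# Write a minimal xunit.runner.json into a test project.
# xUnit v2 reads it from the output directory, so make sure it gets
# copied there (e.g. via CopyToOutputDirectory in the csproj).
@'
{
  "parallelizeTestCollections": true
}
'@ | Set-Content -Path .\MyService.Test.XUnit\xunit.runner.json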

So I started researching how I could run all the test projects in parallel. While reading through a lot of documentation, I kept coming back to the fact that all tests inside a solution run in parallel by default. So what about referencing all the test projects in one solution file?

Our new way

Despite my worries that maintaining such a solution file would be a lot of effort, I wanted to at least try it to see the performance improvement.
So I wrote a little PowerShell script that picks up all the unit and integration test projects (just like the MSBuild items above) and puts each type into its own solution. As you will see in the script, I needed some string "magic" to get relative paths to the test projects. That was necessary because the CI server might run the build in a different directory from time to time, so hard-coding the paths wasn't an option. Also, hard-coding paths is a bad idea anyhow.

# Length of the current directory path + 1 (for the separator); used to turn
# the absolute project paths into paths relative to the repository root.
$pathlength = (Get-Location).Path.Length + 1

# Start from a clean slate: remove previously generated solution files.
foreach ($sln in "_unittest.sln", "_integrationtest.sln") {
    if (Test-Path $sln -PathType Leaf) {
        Remove-Item $sln
    }
}

dotnet new sln --name _unittest
dotnet new sln --name _integrationtest

# Collect all unit test projects (naming convention!) and add them to the solution.
Get-ChildItem -Path . -Recurse -File -Include *.Test.XUnit.csproj |
    ForEach-Object { dotnet sln _unittest.sln add $_.FullName.Substring($pathlength) }

# Same for the integration test projects.
Get-ChildItem -Path . -Recurse -File -Include *.Test.XIntegration.csproj |
    ForEach-Object { dotnet sln _integrationtest.sln add $_.FullName.Substring($pathlength) }

After that, I ran the tests in that solution using dotnet test. The result was amazing: instead of taking 15 minutes, the run only took 1 minute and 20 seconds.
Now that's an improvement worthy of the name. :-)
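
For completeness, the whole unit test run now boils down to a single call (same flags as in our MSBuild targets; --no-build assumes the projects were compiled in an earlier step):

dotnet test _unittest.sln --no-build --no-restore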

But the effort of keeping such a solution file up to date still scared me. After some more thinking, I realized that I already had a script to generate the solution on demand.
I inserted this script into our build pipeline, where it runs in parallel to our normal build step. This way, the total time of the build pipeline stays the same.
And when the pipeline gets to running the tests, it simply runs dotnet test on the newly generated solution file, which includes all the unit tests.
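
If you want to mimic that ordering locally, here is a rough sketch (PowerShell 7+; the script and target names are placeholders, and on the CI server the two steps are simply separate, parallel pipeline jobs):

# Generate the test solutions in the background while the normal build runs.
$gen = Start-Job -WorkingDirectory $PWD -FilePath .\New-TestSolutions.ps1

msbuild .\Build.proj /t:build   # placeholder for the normal build step

Receive-Job -Job $gen -Wait    # solutions are ready; now run dotnet test as shown above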

Final thoughts

These changes shortened our feedback cycle quite a bit, so I think it was a hackday well spent.

If you have tips or tricks to shorten the feedback cycle, I would love to hear them. Let me know in the comments.
