DEV Community

Florin Vica


9 Things That Silently Kill Your .NET Build Time (and How to Fix Each One)

Every developer knows the pain: you hit Build, and then you wait. A 2013 Electric Cloud survey of 443 engineers found developers spend 3.5 hours per week waiting on builds alone. Google's engineering productivity research team confirmed in a 2023 IEEE Software paper that even moderate reductions in build latency produce measurable gains in developer velocity — 11% faster active coding time, 14% faster wall-clock time for small/medium changes — reinforcing that there is no "safe" threshold below which build time stops mattering.

After fifteen years of working with large-scale MSBuild solutions — some with hundreds of projects — I've found the same nine culprits responsible for most of the pain. None of them announce themselves. They all silently compound. Here's how to find and fix each one.

1. Roslyn analyzers are eating 70% of your compile time

The problem hides in plain sight. You add StyleCop.Analyzers for consistency, a security scanner for compliance, maybe AsyncFixer for good measure — and suddenly your build is four times slower. In dotnet/roslyn Discussion #45933, a developer reported a 160-project solution that built in 1 minute 42 seconds without analyzers but ballooned to 8 minutes with StyleCop and FxCop enabled. That's a 4.7× regression from code analysis alone.

Anthony Simmon documented this extensively in his blog post "Optimizing C# code analysis for quicker .NET compilation". In his real-world web solution, 70% of build time was analyzers, 30% was actual compilation. Disabling analysis dropped build time from ~2 minutes to under 50 seconds.

Diagnose it by setting ReportAnalyzer to see per-analyzer timing:

dotnet build /p:ReportAnalyzer=true /bl

Open the resulting .binlog in MSBuild Structured Log Viewer and expand the "Analyzer Summary" node. You'll see exactly which analyzers cost the most.

Fix it by running analyzers only in CI. Add this to your Directory.Build.props:

<Project>
  <PropertyGroup>
    <RunAnalyzersDuringBuild
      Condition="'$(CI)' != 'true' AND '$(TF_BUILD)' != 'true'">false</RunAnalyzersDuringBuild>
  </PropertyGroup>
</Project>

Three properties control analyzer behavior: RunAnalyzersDuringBuild disables analyzers during build only (IDE live analysis still works), RunAnalyzers kills both build and IDE analysis, and EnforceCodeStyleInBuild enables IDE-style rules (IDExxxx) during CLI builds. For local development speed, toggling RunAnalyzersDuringBuild is the surgical option. Expect 50–70% build time reduction on analyzer-heavy solutions.
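The inverse configuration is worth showing too: if analyzers are off locally, turn the full rule set back on for CI builds so violations still fail the pipeline, just not your inner loop. A sketch, assuming your CI sets the CI or TF_BUILD variable (GitHub Actions and Azure DevOps do, respectively):

```xml
<!-- Directory.Build.props: full analysis on CI agents only.
     CI / TF_BUILD are set by GitHub Actions and Azure DevOps;
     adjust the condition for your provider. -->
<PropertyGroup Condition="'$(CI)' == 'true' OR '$(TF_BUILD)' == 'true'">
  <RunAnalyzersDuringBuild>true</RunAnalyzersDuringBuild>
  <EnforceCodeStyleInBuild>true</EnforceCodeStyleInBuild>
</PropertyGroup>
```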

2. ResolveAssemblyReference scans thousands of directories silently

MSBuild's ResolveAssemblyReference (RAR) task maps every assembly reference in your project to an actual .dll path on disk. The critical detail: RAR runs unconditionally on every build, including incremental builds. As dotnet/msbuild Issue #2015 explains, the build system cannot know whether you've installed a new targeting pack since the last build, so it re-resolves every time.

With .NET's micro-assembly model, modern projects pass hundreds of references to RAR. In dotnet/msbuild Issue #6911, a developer with 7,000 directories in their packages folder reported RAR consuming 30 minutes of total build time. Even modest projects can suffer: one user in MSBuild Discussion #9382 saw RAR jump from 400ms to 3 seconds per project after upgrading to Windows 11, traced to Smart App Control intercepting filesystem calls.

Diagnose it by generating a binlog (dotnet build -bl) and searching for the ResolveAssemblyReference task. Normal RAR time is 100–500ms per project. Anything above 2–3 seconds signals a problem.

Fix it by trimming unnecessary search paths and ensuring your antivirus isn't amplifying the cost (see section 9). You can disable search paths you don't need in Directory.Build.props:

<PropertyGroup>
  <AssemblySearchPath_UseCandidateAssemblyFiles>false</AssemblySearchPath_UseCandidateAssemblyFiles>
  <AssemblySearchPath_UseTargetFrameworkDirectory>false</AssemblySearchPath_UseTargetFrameworkDirectory>
  <AssemblySearchPath_UseAssemblyFoldersConfigFileSearchPath>false</AssemblySearchPath_UseAssemblyFoldersConfigFileSearchPath>
</PropertyGroup>

Also migrate away from packages.config to PackageReference format — it provides precise hint paths to RAR, drastically reducing directory scanning. For verbose diagnostics, set MSBUILDLOGVERBOSERARSEARCHRESULTS=1 before building.
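For comparison, the PackageReference format declares packages directly in the .csproj instead of a separate packages.config; the package IDs and versions below are only illustrative:

```xml
<ItemGroup>
  <!-- NuGet resolves these to exact paths in the global package cache,
       so RAR gets precise hint paths instead of scanning directories -->
  <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
  <PackageReference Include="Serilog" Version="3.1.1" />
</ItemGroup>
```

Visual Studio includes a built-in migrator: right-click packages.config and choose "Migrate packages.config to PackageReference".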

3. Wildcard globs silently walk your entire disk

SDK-style projects use default globs — **/*.cs for Compile, **/* for None — that recursively enumerate your entire project directory tree. This works fine until it doesn't. In dotnet/msbuild Issue #2392, the MSBuild team documented that 30–50% of build time was spent searching the disk in large projects — particularly in design-time builds (IntelliSense, project-system updates), where evaluations taking 3 seconds should take 50–100ms. Full CLI builds are affected too, though typically less dramatically.

The worst offender is dotnet/msbuild Issue #8984, where glob expansion inside MSBuild targets was missing a critical exclude optimization — causing an internal project to spend over 10 minutes in a single task because MSBuild recursed into bin/ and obj/ directories before subtracting them. And dotnet/sdk Issue #49415 revealed that on a repo with 100K+ source files, 2.25 minutes of a build was pure globbing, with projects evaluated five separate times.

The node_modules folder is the classic trap. One npm install drops thousands of nested directories into your project tree, and every **/*.cs glob dutifully walks through all of them.

Diagnose it by generating a binlog and profiling evaluation. The MSBuild evaluation profiler requires a file path argument and must be passed through dotnet msbuild:

dotnet msbuild -bl -profileevaluation:eval-perf.md

Open the resulting binlog in the Structured Log Viewer and look at evaluation time, or review the generated Markdown report for per-project evaluation breakdowns.

Fix it by adding exclusions to DefaultItemExcludes in Directory.Build.props:

<PropertyGroup>
  <DefaultItemExcludes>$(DefaultItemExcludes);**/node_modules/**;**/bower_components/**</DefaultItemExcludes>
</PropertyGroup>

Always prepend $(DefaultItemExcludes) to preserve the default bin/ and obj/ exclusions. For projects where you control the file list explicitly, set EnableDefaultCompileItems to false and list files manually — this eliminates globbing entirely.
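Opting out of globbing entirely might look like this; the file names are placeholders for your own sources:

```xml
<PropertyGroup>
  <EnableDefaultCompileItems>false</EnableDefaultCompileItems>
</PropertyGroup>
<ItemGroup>
  <!-- With default items off, every source file must be listed explicitly -->
  <Compile Include="Program.cs" />
  <Compile Include="Services\OrderService.cs" />
</ItemGroup>
```

The tradeoff: adding a file now requires a project-file edit, so this fits generated or rarely-changing projects best.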

4. NuGet restore runs on every build when it shouldn't

Every dotnet build implicitly runs dotnet restore first. Even when restore is a no-op — when project.assets.json is already current — the evaluation overhead is real. NuGet must load and evaluate every project to determine whether restore is needed, and in large solutions this adds up fast.

The performance gap can be staggering. NuGet/Home Issue #11548 documented a case where dotnet restore took 5 minutes 39 seconds on Windows versus 16 seconds on Ubuntu for identical packages — because Windows was performing synchronous certificate revocation checks. Setting NUGET_CERT_REVOCATION_MODE=offline dropped it to 1 minute 22 seconds.

Fix it with three strategies. First, separate restore from build in CI:

dotnet restore
dotnet build --no-restore --configuration Release
dotnet test --no-build --configuration Release

Second, enable static graph evaluation for a 20–40% restore speedup (G-Research benchmarks):

<PropertyGroup>
  <RestoreUseStaticGraphEvaluation>true</RestoreUseStaticGraphEvaluation>
</PropertyGroup>

Third, if you're on .NET 9+, the new dependency resolver is enabled by default — Microsoft reported an internal 2,500-project repo going from over 30 minutes to 2 minutes for restore (the improvement came in two stages: optimizations to the legacy algorithm in .NET 8.0.300 halved restore time, then the rewritten resolver in .NET 9 brought it down to 2 minutes). For reproducibility, add RestorePackagesWithLockFile and use RestoreLockedMode in CI to fail on drift rather than silently re-resolving.
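A sketch of that lock-file setup (the property names are the real NuGet ones; the condition assumes a CI variable like the one GitHub Actions sets):

```xml
<PropertyGroup>
  <!-- Writes packages.lock.json next to each project -->
  <RestorePackagesWithLockFile>true</RestorePackagesWithLockFile>
  <!-- On CI, fail restore if the graph no longer matches the lock file -->
  <RestoreLockedMode Condition="'$(CI)' == 'true'">true</RestoreLockedMode>
</PropertyGroup>
```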

5. The CopyLocal avalanche copies hundreds of DLLs every build

In a solution with n projects, CopyLocal creates O(n²) file copy operations. Each project copies its transitive dependencies — including every NuGet package DLL — to its own output directory. One developer in dotnet/msbuild Issue #7014 reported their binaries directory ballooning to ~70 GB from redundant copies.

Fix it with a layered approach. For library projects that don't need runtime dependencies in their output:

<PropertyGroup>
  <CopyLocalLockFileAssemblies>false</CopyLocalLockFileAssemblies>
</PropertyGroup>

For solutions where all projects share an output directory, eliminate copies entirely:

<PropertyGroup>
  <UseCommonOutputDirectory>true</UseCommonOutputDirectory>
  <OutDir>$(SolutionDir)artifacts\bin\$(Configuration)\</OutDir>
</PropertyGroup>

When you can't avoid copies, replace them with near-instant NTFS hard links:

<PropertyGroup>
  <CreateHardLinksForCopyLocalIfPossible>true</CreateHardLinksForCopyLocalIfPossible>
  <CreateHardLinksForCopyFilesToOutputDirectoryIfPossible>true</CreateHardLinksForCopyFilesToOutputDirectoryIfPossible>
  <CreateHardLinksForPublishFilesIfPossible>true</CreateHardLinksForPublishFilesIfPossible>
</PropertyGroup>

Note: hard links are deliberately disabled inside Visual Studio because they share underlying file data — modifying one modifies all, which can corrupt the NuGet cache. Use them in CI and command-line builds. For solutions with 50+ projects, combining UseCommonOutputDirectory with CopyLocalLockFileAssemblies=false on libraries routinely saves minutes per build.

Tradeoff warning: UseCommonOutputDirectory can break test isolation if two test projects emit conflicting assembly versions into the same folder. And hard links combined with aggressive CI caching can cause subtle cache poisoning. Profile first, apply selectively.

6. You're building sequentially on a multi-core machine

Here's a default that trips up many teams: standalone msbuild.exe defaults to /maxcpucount:1 — one core, fully sequential. If your CI pipeline calls msbuild.exe directly, you're leaving all but one core idle. The dotnet build command, however, already passes -maxcpucount without a value to MSBuild — meaning it builds in parallel by default using all available logical processors. Visual Studio's IDE also builds in parallel by default.

Fix it — but only where it matters. If you invoke msbuild.exe directly (CI scripts, legacy pipelines), add the /m switch:

msbuild MySolution.sln /m

For dotnet build, parallel is already the default — adding -m is harmless but redundant. The MSBuild parallel build documentation explains the node architecture: each node is a separate worker process that builds one project at a time.

Your actual speedup depends entirely on your dependency graph — and this is where many developers are disappointed. A linear chain A→B→C→D gets zero benefit from parallelism — the critical path forces sequential execution. More commonly, solutions have a "hub" project (a shared core library, a data access layer) that 50+ projects depend on. That hub becomes the serialization bottleneck: nothing builds until it finishes, and -m:16 won't help. Wide, loosely-coupled solutions see the biggest gains: community reports show 40–58% reductions on solutions with 70+ projects and shallow dependency trees.

Before blindly adding -m, visualize your actual parallelism by opening a binlog in the Structured Log Viewer's Timeline tab. Look for idle nodes — they indicate dependency bottlenecks. If most of your build timeline shows a single active node followed by a burst, the real fix isn't more cores — it's restructuring your project graph. Split monolithic "Common" or "Core" projects into smaller, independent assemblies that can build concurrently. In solutions I've worked on, splitting a single 800-file hub project into 4 focused libraries improved parallel build time more than doubling the core count.

For even better scheduling, try static graph builds with /graph, which computes the full dependency DAG upfront and builds bottom-up for maximum parallelism.

7. Broken incremental builds force full recompilation every time

This is arguably the most impactful and least discussed build time killer. MSBuild's incremental build system relies on comparing Inputs and Outputs timestamps on every target. When this contract is broken — and it breaks silently — MSBuild rebuilds everything, every time. Developers notice that "clean build and normal build take the same time," shrug, and move on. Over weeks, the team just accepts 3-minute builds as normal when they should be 8-second no-ops.

Common culprits that break incrementality:

  • Custom targets without Inputs/Outputs attributes. A <Target Name="MyPreBuild" BeforeTargets="Build"> that runs a script or copies a file without declaring what it reads and writes forces MSBuild to re-run it unconditionally — and everything downstream.
  • Targets that touch output files unnecessarily. A code generator that rewrites a .g.cs file with identical content still updates the timestamp, which cascades into a full recompile of everything that depends on it.
  • BeforeBuild/AfterBuild targets that modify bin/ or obj/. Anything that writes into the output directory during build can confuse the up-to-date check.
  • AssemblyInfo auto-generation with changing values. If GenerateAssemblyInfo includes a build timestamp or incrementing version on every build, you've just guaranteed that every build is a full build.
  • File system timestamp issues. Git operations (checkout, rebase) can reset timestamps on source files, triggering unnecessary rebuilds. Build machines that clone fresh repos hit this on every CI run.
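To make the first culprit concrete, here is the shape of a custom target that stays incremental; the codegen command and file names are hypothetical:

```xml
<!-- Because Inputs/Outputs are declared, MSBuild skips this target
     whenever ApiClient.g.cs is newer than api-spec.json -->
<Target Name="GenerateApiClient"
        BeforeTargets="CoreCompile"
        Inputs="api-spec.json"
        Outputs="$(IntermediateOutputPath)ApiClient.g.cs">
  <!-- 'codegen' is a placeholder for your generator -->
  <Exec Command="codegen api-spec.json --out $(IntermediateOutputPath)ApiClient.g.cs" />
  <ItemGroup>
    <Compile Include="$(IntermediateOutputPath)ApiClient.g.cs" />
  </ItemGroup>
</Target>
```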

Diagnose it by building twice without changing anything. In a binlog, targets that executed on the second build instead of being skipped are your suspects; setting the MSBUILDTARGETOUTPUTLOGGING=1 environment variable adds target output details to the log. Inside Visual Studio, enable the project system's up-to-date check logging — Tools → Options → Projects and Solutions → .NET Core → Up to date checks logging → Verbose — to see why a project was deemed out of date. If the second build does real work, your incremental build is broken.

Fix it by auditing every custom target in your build. Ensure all targets declare Inputs and Outputs. For code generators, compare output content before writing — only write if the content actually changed. For assembly info, pin your version in CI and avoid timestamps:

<PropertyGroup>
  <Deterministic>true</Deterministic>
  <GenerateAssemblyInfo>true</GenerateAssemblyInfo>
  <!-- Don't embed build time — it breaks incrementality -->
</PropertyGroup>

A working incremental build should complete in under 2 seconds for a no-change rebuild of a 50-project solution. If yours takes more than 10 seconds with no changes, you have a broken target somewhere.

8. The compiler server isn't running (or keeps dying)

The Roslyn compiler ships with VBCSCompiler.exe, a long-running server process that keeps the compiler loaded in memory between builds. Without it, every project invocation pays the full JIT and assembly load cost of starting the C# compiler from scratch — estimated at several hundred milliseconds per project based on community profiling. On a 100-project solution, that overhead compounds quickly into a significant penalty.

SDK-style projects enable the compiler server by default via /shared. But several common scenarios silently disable or destabilize it:

  • Mixed solutions with legacy .csproj (non-SDK-style) and .vcxproj projects. The MSBuild nodes spawned for native C++ builds don't share the compiler server context, and in some configurations the server doesn't start at all for the managed projects in the same build.
  • CI environments that kill long-running processes. If your CI agent terminates VBCSCompiler.exe between build steps (some Docker-based agents do this), every step pays the cold-start penalty.
  • UseSharedCompilation=false set somewhere in your props chain. Sometimes added intentionally for deterministic builds, sometimes inherited from a NuGet package's .props file without anyone noticing.
  • Server timeout too aggressive. The default VBCSCompiler idle timeout is 10 minutes (/keepalive:600). In CI pipelines with gaps between builds, the server may shut down between steps.

Diagnose it by checking if the server is alive during builds:

# During or right after a build:
tasklist /fi "imagename eq VBCSCompiler.exe"

If the process isn't there, or if you see it spawning and dying repeatedly in Process Monitor, the server isn't providing its intended benefit.

Fix it by ensuring UseSharedCompilation isn't disabled anywhere in your import chain:

dotnet build /bl
# In binlog viewer, search for "UseSharedCompilation" in properties

If you find it set to false, trace which .props or .targets file sets it. For CI, explicitly keep the server alive across steps:

<PropertyGroup>
  <UseSharedCompilation>true</UseSharedCompilation>
</PropertyGroup>

And in CI scripts, avoid killing the process between build steps. If you must run in an isolated environment, at least keep the server alive within a single build invocation — the savings compound across projects.

9. Windows Defender intercepts every file your build writes

Real-time antivirus protection synchronously scans every file open, create, and write operation during your build. Steve Smith (Ardalis) documented that Defender's Antimalware Service Executable consumed nearly as much CPU as Visual Studio itself during builds. In extreme cases — particularly in I/O-heavy solutions with thousands of assembly references — Defender overhead can push build times into multiples of their unscanned baseline. Community reports consistently show 30–60% improvements after adding appropriate Defender exclusions for build paths and processes, though individual results vary significantly based on project size and I/O patterns.

The best modern fix is Developer Drive, introduced in Windows 11 23H2. Dev Drive uses the ReFS filesystem with optimizations for developer I/O patterns and enables Microsoft Defender's performance mode — an asynchronous scanning mode that defers security checks until after file operations complete instead of blocking them. Microsoft's engineering blog reported 14% faster builds from Dev Drive alone on a 500+ project C# codebase, and 28% total with the Microsoft.Build.CopyOnWrite package.

Caveats on Dev Drive adoption: Moving an established repo and its toolchain to a ReFS Dev Drive isn't trivial. You need to relocate your source tree, NuGet package cache (NUGET_PACKAGES), and potentially the .NET SDK itself to the Dev Drive to see full benefits. ReFS doesn't support NTFS compression (relevant if you're space-constrained), isn't available on Windows 10, and some third-party tools have quirks with ReFS paths. Profile on a test workstation before committing the team to a migration.

If you can't use Dev Drive, add exclusions for build-critical paths and processes:

# Process exclusions (highest impact)
Add-MpPreference -ExclusionProcess "dotnet.exe"
Add-MpPreference -ExclusionProcess "MSBuild.exe"
Add-MpPreference -ExclusionProcess "devenv.exe"
Add-MpPreference -ExclusionProcess "VBCSCompiler.exe"

# Path exclusions
Add-MpPreference -ExclusionPath "$env:USERPROFILE\.nuget\packages"
Add-MpPreference -ExclusionPath "C:\Program Files\dotnet"
Add-MpPreference -ExclusionPath "D:\Projects"  # your source root

Verify your Dev Drive trust status with fsutil devdrv query D:. Microsoft explicitly states that performance mode provides better security than folder exclusions, since files are still scanned — just asynchronously.

Quick reference: all fixes in one place

Drop this into your Directory.Build.props to apply the non-destructive fixes solution-wide:

<Project>
  <PropertyGroup>
    <!-- 1. Analyzers: CI only -->
    <RunAnalyzersDuringBuild
      Condition="'$(CI)' != 'true'">false</RunAnalyzersDuringBuild>

    <!-- 3. Globs: exclude heavy directories -->
    <DefaultItemExcludes>$(DefaultItemExcludes);**/node_modules/**</DefaultItemExcludes>

    <!-- 4. NuGet: static graph restore -->
    <RestoreUseStaticGraphEvaluation>true</RestoreUseStaticGraphEvaluation>

    <!-- 5. CopyLocal: hard links on CLI builds -->
    <CreateHardLinksForCopyLocalIfPossible
      Condition="'$(BuildingInsideVisualStudio)' != 'true'">true</CreateHardLinksForCopyLocalIfPossible>
    <CreateHardLinksForCopyFilesToOutputDirectoryIfPossible
      Condition="'$(BuildingInsideVisualStudio)' != 'true'">true</CreateHardLinksForCopyFilesToOutputDirectoryIfPossible>

    <!-- 7. Incremental builds: deterministic output -->
    <Deterministic>true</Deterministic>

    <!-- 8. Compiler server: ensure it's enabled -->
    <UseSharedCompilation>true</UseSharedCompilation>
  </PropertyGroup>
</Project>

And in your CI pipeline:

dotnet restore
dotnet build --no-restore --configuration Release
dotnet test --no-build --configuration Release
# If using msbuild.exe directly instead of dotnet build, add /m for parallel:
# msbuild MySolution.sln /m /p:Configuration=Release

Conclusion

These nine fixes target different layers of the build pipeline — compiler integration, assembly resolution, filesystem evaluation, package management, output copying, CPU utilization, incremental build hygiene, compiler server management, and OS-level I/O — but they share a pattern: they're all defaults or oversights that make sense for small projects and silently degrade at scale.

The actual improvement you'll see depends heavily on your specific solution's profile. On 100+ project solutions where several of these issues compound, I've seen total build times drop significantly — but the gains vary. A solution dominated by analyzer overhead will see dramatic improvement from fix #1 alone. A solution with a broken incremental build might see the biggest win from auditing custom targets. There is no universal "apply these five XML properties and get 50% faster builds."

The diagnostic approach matters as much as the fixes: generate a binlog with dotnet build -bl, open it in the Structured Log Viewer, and let the data tell you where your specific build bleeds time. Measure before and after every change. Every solution is different, but these nine culprits account for the vast majority of preventable build waste.
