
🔎📉 Identify and monitor technical debt with SonarQube

This post covers my attempts to use SonarQube as a stand-alone install to perform static code analysis on a regular basis. It will cover purely getting the tool working; maybe I will pick up how I can use the data in a later post?

I will be doing this with a very narrow focus: the project I am currently working on, which is a .NET stack with builds running in VSTS using MSBUILD.

SonarQube runs code analysis as solutions are being built and provides a web dashboard of code smells, security vulnerabilities, duplication and more. My aim is to use it to identify technical debt, as well as to track whether that debt is reducing over time.

Note that you can hook this into Azure DevOps fairly easily too, with a few clicks and less setup, but I wanted to host the tool on our own infrastructure at zero cost. I also believe you can use their cloud version for free if your project is open source.

My aims are:

  • To get the self-hosted version of SQ installed/setup
  • Get it running against a local solution
  • Work out how to hook this into our VSTS build process (If possible)

Getting started

First of all, I downloaded the free self-hosted version of SQ (Community Edition), extracted it and placed it on one of our build servers. The package is essentially a self-hosted application, and following the 2-min getting started guide here, it’s genuinely quite easy to get the dashboard running within that 2 minutes (providing the system requirements are met, which looks like you just need a recent Java JRE/JDK installed).

Following the above guide, and launching the shell/batch script of your choice, you can then navigate to http://localhost:9000 and see the SonarQube dashboard asking you to create a new project.
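For anyone curious what that looks like on Windows, here is a minimal sketch assuming the zip was extracted to C:\sonarqube (a path I picked purely for the example); the start script ships under the bin folder for your platform:

# Hypothetical extract location - adjust to wherever you unzipped SonarQube
cd C:\sonarqube\bin\windows-x86-64
# Starts the embedded web server; the dashboard then appears on http://localhost:9000
.\StartSonar.bat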

When creating a new project you are prompted for a project key and display name. The key will be used for the integration, and the display name will be the name displayed on the dashboard.

Next up is the token. The token is used for authentication purposes when uploading analysis files and can be changed and revoked later. I just used the word “sausages” as an example, but when you click “generate” it will provide your token.

Next it will tell you how to configure your project for SQ. I am doing this against a .NET project (C#, JS, etc) so will continue with this example.

For a C# project which will be built using MSBUILD, you first need the “SonarScanner for MSBUILD”. The SonarScanner is the tool that performs the analysis: you start it before MSBUILD kicks in, and when you end it, it collates the results and sends them to the server.

This tool can be placed anywhere, but the folder will need adding to the PATH of your Windows environment.
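If you want to do that from PowerShell rather than the System Properties dialog, something like the below works; C:\SonarScanner is just a made-up extract location for the example:

# Add the scanner folder to PATH for the current session only (C:\SonarScanner is hypothetical)
$env:Path += ";C:\SonarScanner"
# Or persist it machine-wide (run from an elevated prompt)
[Environment]::SetEnvironmentVariable("Path", [Environment]::GetEnvironmentVariable("Path", "Machine") + ";C:\SonarScanner", "Machine")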

At this point, we can use some PowerShell to test that it’s all hooked up correctly! (You can use pre/post build events too, but this is much simpler for testing.) In this example we will CD into the directory the SLN is in, start the scanner tool using our project key and name from earlier, build our solution file (as a rebuild) and finally end the scanning.

cd 'C:\path to your SLN'
SonarScanner.MSBuild.exe begin /k:"My-Project" /d:sonar.host.url="http://localhost:9000" /d:sonar.login="Your Key"
msbuild MySolution.sln /t:Rebuild
SonarScanner.MSBuild.exe end /d:sonar.login="Your Key"

Note that this assumes you have the scanner and MSBUILD in your PATH variable. If you do not, you can simply call the exe directly. The MSBUILD exe is located at C:\Program Files (x86)\Microsoft Visual Studio\<version>\<edition>\MSBuild\<version>\Bin\msbuild.exe. I believe that SQ requires MSBUILD 12 and above; I am currently using 15.0.

If we run this, it will take a few moments to start and a variable amount of time to complete (it depends heavily on the size of your solution). At the very end you will see a line saying “Execution Success”, and if you still have your dashboard open you may have seen it update.

If you navigate back to http://localhost:9000 you should now see your project. Note that if your SLN was particularly large you may just see a “background processing” message whilst it imports the analysis file.


This is good news! However, there is a bit of an ominous warning in the footer of the dashboard, which reads:

Embedded database should be used for evaluation purposes only. The embedded database will not scale, it will not support upgrading to newer versions of SonarQube, and there is no support for migrating your data out of it into a different database engine.

This is easily solved by giving SQ a backing database. You will, however, lose all your progress so far.

Moving beyond proof of concept.

So that we can use this in our build/deploy pipeline, I want to hook it into a database, install it onto one of our servers and finally have one of our CI builds run it!

The process is the same as above in terms of placing the extracted files onto the server, except that we also have to punch a hole in Windows Firewall for TCP port 9000 so it can be accessed remotely. Running the start batch script now will bring you to the same dashboard and warning as before, which is what we want to avoid, so we will need to hook up a database.
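For reference, punching that hole can be done with a single PowerShell command from an elevated prompt (the rule name is arbitrary):

# Allow inbound TCP 9000 so the SonarQube dashboard is reachable from other machines
New-NetFirewallRule -DisplayName "SonarQube 9000" -Direction Inbound -Protocol TCP -LocalPort 9000 -Action Allow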

For the database I will be using MS SQL Server 2016. You can find documentation for other database types (MySQL, Oracle etc.) in the SQ documentation.

First I created an empty database named “SonarQube” on the SQL Server 2016 instance, and also a new SQL user named “Sonar” who is dbo on the SonarQube database.
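As a rough sketch, the equivalent via sqlcmd would be something like the below (“myserver” and the password are placeholders; I believe the SQ docs also want a case-sensitive, accent-sensitive collation on the database):

# Create the database with a case- and accent-sensitive collation
sqlcmd -S "myserver" -Q "CREATE DATABASE SonarQube COLLATE SQL_Latin1_General_CP1_CS_AS;"
# Create the login and map it to dbo on the new database
sqlcmd -S "myserver" -Q "CREATE LOGIN Sonar WITH PASSWORD = 'thepassword';"
sqlcmd -S "myserver" -d SonarQube -Q "CREATE USER Sonar FOR LOGIN Sonar; ALTER ROLE db_owner ADD MEMBER Sonar;"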

Back in the SQ install, in the \sonarqube\conf folder is a sonar.properties file. In here we need to add the following:

sonar.jdbc.url=jdbc:sqlserver://myserver;databaseName=SonarQube
sonar.jdbc.username=sonar
sonar.jdbc.password=thepassword

Note: If, like me, your SQL instance is named like “server\instancename”, you will need to escape the backslash before the instance name in the properties file so that it reads “server\\instancename”. The error that is generated does not point to this as the cause of SQ failing to launch, which was a pain!
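In other words, for a named instance the url line in sonar.properties would look something like this (instance name invented for the example):

sonar.jdbc.url=jdbc:sqlserver://myserver\\instancename;databaseName=SonarQube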

Next up, I wanted to have my VSTS build automatically run the analysis and process it. I wanted this to be part of the department’s CI builds, but due to the size of the project it was taking upwards of 30 minutes to complete, so we moved it to the nightly builds.

Now you should be able to launch SQ again and not see the banner in the footer. A new project should be created like before.

As we use TFS/VSTS in house, this guide will show working with self-hosted TFS. There are far more in-depth (and more useful!) guides in the SQ docs here. The steps to take before progressing are:

  • Install the SonarQube extension from the Visual Studio Marketplace into your TFS/VSTS collection

Once the extension is installed, you should see two new build steps:

  • SonarQube Scanner for MSBUILD – Begin Analysis
  • SonarQube Scanner for MSBUILD – End Analysis

Before you can use these, you will need to configure the SQ endpoint. To do this, you can either go to the collection’s administration page and then the “Services” tab, or add the “SonarQube Scanner for MSBUILD – Begin Analysis” build step to your VSTS build and then click “Manage” next to the server panel. In here, you can click “New Service Endpoint” and then “SonarQube”.

When you add a new SQ endpoint, you configure it with a friendly name, the URL of the dashboard, and the token you set up for your project/user (much like the earlier PowerShell script used).

Now if you go back and edit your build definition, you will be able to use SonarQube with the Begin and End tasks either side of your build actions (again, much like the earlier PowerShell).

To set it up, select the SQ endpoint configured earlier, and then under Project Settings set the Project Key and Project Name appropriately (based on however the project in SQ was set up). Finally, under “Advanced”, check “Include full analysis report in the build summary”. There is no configuration required for the End Analysis build step.

Running it now will most likely result in a failure unless the machine which is hosting the build agent has the correct software installed.

Much like we did for the local version, we need to download and set up the SonarScanner and add it to the PATH. The server which hosts the agents will also need to be running MSBUILD v14 or v15 (at time of writing). You can get a standalone version of MSBUILD v15 direct from Microsoft’s download pages. It took me a while to find it, so below is the direct link to v14 (which I had to use due to a separate issue).

https://download.microsoft.com/download/E/E/D/EEDF18A8-4AED-4CE0-BEBE-70A83094FC5A/BuildTools_Full.exe (hopefully this is the correct one for 14.0.25420.1).

Now if you run your build it should (hopefully) produce some output. I did run into a few errors at this stage, which may just be me being a bit uninformed (and not really reading the docs…).

Some of the “gotchas” I ran into:

  • As noted earlier, MS SQL server names which contain slashes must be escaped, and the error that is thrown does not indicate that this is the cause!
  • Weird, ambiguous messages (An instance of analyzer SonarAnalyzer.Rules.CSharp.ThreadStaticWithInitializer cannot be created from SonarAnalyzer.CSharp.dll), which actually had nothing to do with that and were instead related to MSBUILD versions. I was pulling my hair out until I saw this post, which states you need MSBUILD v14+ from Update 3 (14.0.25420.1), whereas I was using MSBUILD v14 but only 14.0.23107.10.
  • MSBUILD version issues. The build would work but the SQ analysis would fail with an error about supported MSBUILD versions. To get around this I made sure that the MSBUILD step in the VSTS build was using an argument of /tv:14.0 to ensure it would use a specific version (see the example after this list).
  • Static code analysis could not be completed on CSS files due to the Node.js version (ERROR: Only Node.js v6 or later is supported, got <ver>. No CSS files will be analyzed.). Simply install the latest Node.js on the server hosting the agents.
  • Timeouts! This was only an issue for me as the project’s LOC is in the hundreds of thousands, so it generates a large log. (##[error]The analysis did not complete in the allotted time of 300 seconds. Consider setting the build variable SonarQubeAnalysisTimeoutInSeconds to a higher value.) To get around this, you add a build variable to the VSTS build named “SonarQubeAnalysisTimeoutInSeconds”. I tried setting it to zero (which is usually ‘infinite’) but then I got “##[error]The analysis did not complete in the allotted time of 0 seconds“. I couldn’t find any reliable info about max values so set mine to 20 minutes to be safe.
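For reference, the equivalent of that /tv trick in the local PowerShell test from earlier (solution name reused from that example) would be:

# Pin MSBUILD to toolset version 14.0 so the SQ scanner uses a supported version
msbuild MySolution.sln /t:Rebuild /tv:14.0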

Conclusion:

Finally (for me), after some light tinkering it all worked. When I was scanning Google/Stack Overflow it seemed a few people had the same issues as me, so I don’t feel too bad about it for a first try!

Our nightly builds now pump out some lovely code analysis to the dashboard, which I have shared with the team, and the results are feeding into our technical backlog to be resolved. I’m going to leave it running for at least a month and see if we get any real usage out of it.

More on results later, perhaps?

This wasn’t really written as a complete step-by-step, more of a stream of consciousness as I tried to learn about something new, but hopefully it helps someone, even if that is future me when I come back to re-remember how it works.


Update: I realised this morning that the project version on the SonarQube dash never increased or changed, which meant it was hard to work out when issues were introduced without cross-referencing the date and the check-in history. If you set the “Project Version” on the Begin Analysis step to the built-in variable $(Build.SourceVersion), you will get the changeset the build was run from as the version.
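If you are driving the raw scanner rather than the build step, I believe the equivalent is the /v: argument on the begin command, along the lines of:

# Stamp the analysis with the changeset/commit as the project version ("<changeset>" is a placeholder)
SonarScanner.MSBuild.exe begin /k:"My-Project" /v:"<changeset>" /d:sonar.host.url="http://localhost:9000" /d:sonar.login="Your Key"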

The post My attempt at using SonarQube for static code analysis appeared first on yer.ac | Adventures of a developer, and other things..
