Introduction
In this part of our SAST series, we focus on practical steps to incorporate SAST using SonarQube.
You will receive a comprehensive step-by-step guide on how to integrate SonarQube into your CI/CD workflow and automate the process.
We will explore:
- Integrating SonarQube scanning into the CI/CD pipeline using GitHub Actions
- Failing the pipeline based on SonarQube's default Quality Gates for a free account
- Adding unit tests and the JaCoCo plugin for code coverage
Tools:
- SonarQube Cloud
- GitHub Actions
- JaCoCo
- Spring Boot
- IntelliJ IDEA
- JUnit 5
By following this comprehensive guide, you will be able to seamlessly integrate SAST into your SSDLC process and improve your software’s security posture.
Integrating SonarQube scanning into the CI/CD pipeline using GitHub Actions
If you haven't created a Spring Boot project yet, then you can do so in this section. Create a Spring Boot project using IntelliJ IDEA or the Spring Boot Initializr at https://start.spring.io/. Click Next and generate the project without dependencies.
Update the “build.gradle” file to look like this. We will use Spring Boot 3 and Java 17, and will exclude tests for now.
plugins {
id 'java'
id "org.sonarqube" version "7.0.0.6105"
id 'org.springframework.boot' version '3.0.0'
id 'io.spring.dependency-management' version '1.1.7'
}
group = 'com.practice'
version = '0.0.1-SNAPSHOT'
description = 'sonarqube_actions_demo'
java {
toolchain {
languageVersion = JavaLanguageVersion.of(17)
}
}
configurations {
compileOnly {
extendsFrom annotationProcessor
}
}
repositories {
mavenCentral()
}
dependencies {
// Spring boot runtime
implementation 'org.springframework.boot:spring-boot-starter-web'
// Observability
implementation 'ch.qos.logback:logback-classic:1.5.13'
implementation 'ch.qos.logback:logback-core:1.5.19'
}
def localProps = new Properties()
def localFile = file("gradle-local.properties")
if (localFile.exists()) {
localProps.load(localFile.newDataInputStream())
}
sonar {
properties {
property "sonar.projectKey", System.getenv("SONAR_PROJECT_KEY") ?: localProps["sonar.projectKey"]
property "sonar.organization", System.getenv("SONAR_ORGANIZATION_KEY") ?: localProps["sonar.organization"]
property "sonar.projectName", System.getenv("SONAR_PROJECT_NAME") ?: localProps["sonar.projectName"]
property "sonar.token", System.getenv("SONAR_TOKEN") ?: localProps["sonar.token"]
property "sonar.host.url", System.getenv("SONAR_HOST_URL") ?: localProps["sonar.host.url"]
}
}
tasks.named('test') {
useJUnitPlatform()
}
Create the “gradle-local.properties” file to run SonarQube locally:
systemProp.sonar.qualitygate.wait=true
sonar.projectKey=sonarqube_actions_demo_key
sonar.organization=local-organization
sonar.projectName=sonarqube_actions_demo
sonar.token=sqp_b7dc8e023b58eb785b11c6c468cda2b79eb6090b
sonar.host.url=http://localhost:9000
sonar.coverage.jacoco.xmlReportPaths=build/reports/jacoco/test/jacocoTestReport.xml
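If you do not have a local SonarQube server yet, one simple way to start one on port 9000 is the official Docker image (a sketch, assuming Docker is installed; the container name is arbitrary):
docker run -d --name sonarqube -p 9000:9000 sonarqube:community
Once the server is up at http://localhost:9000, create the local project and token referenced in the properties above.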
Add the “gradle-local.properties” file, build, and .gradle folders to the “.gitignore” file.
### Build ###
/build
/.gradle
### Custom file ###
gradle-local.properties
Create a build.yml file for the CI/CD workflow using GitHub Actions in the .github/workflows/ folder at the root of your project.
name: SonarQube
on:
push:
branches:
- 'main'
pull_request:
branches:
- main
jobs:
branch-name-policy:
name: branch-name-policy
runs-on: ubuntu-latest
steps:
- name: Check PR source branch name
shell: bash
run: |
if [ "${{ github.event_name }}" = "pull_request" ]; then
BRANCH="${{ github.head_ref }}"
else
BRANCH="${{ github.ref_name }}"
fi
echo "PR head ref: $BRANCH"
if [[ "$BRANCH" =~ ^(release/|hotfix/|feature/|bugfix/|test|main).* ]]; then
echo "Allowed branch pattern: $BRANCH"
exit 0
else
echo "::error ::Branch name '$BRANCH' is not allowed to merge into main. Allowed patterns: release/*, hotfix/*, feature/*, bugfix/*, test*"
exit 1
fi
build:
name: Build
runs-on: ubuntu-latest
container:
image: eclipse-temurin:17-jdk
steps:
- name: Checkout source code to docker ubuntu container
uses: actions/checkout@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
fetch-depth: 0
- name: Build project
run: ./gradlew build -x test
sast:
needs:
- build
name: SonarQube Scan
runs-on: ubuntu-latest
steps:
- name: Checkout source code to docker ubuntu container
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Cache SonarQube packages
uses: actions/cache@v4
with:
path: ~/.sonar/cache
key: ${{ runner.os }}-sonar
restore-keys: ${{ runner.os }}-sonar
- name: SonarQube Scan
env:
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}
SONAR_ORGANIZATION_KEY: ${{ secrets.SONAR_ORGANIZATION_KEY }}
SONAR_PROJECT_KEY: ${{ secrets.SONAR_PROJECT_KEY }}
SONAR_PROJECT_NAME: ${{ secrets.SONAR_PROJECT_NAME }}
run: ./gradlew sonar --info
Workflow name
This is the name that appears on the GitHub Actions UI.
name: SonarQube
Workflow trigger
This workflow runs on:
- pushing changes to the main branch
- pull requests that target the main branch
on:
push:
branches:
- 'main'
pull_request:
branches:
- main
Job Section
All jobs must be nested inside the jobs section.
The workflow contains three jobs:
- branch-name-policy – validates branch naming
- build – builds the application
- sast (SonarQube scan) – performs the SonarQube scan
jobs:
branch-name-policy
The branch-name-policy job validates branch names. It runs first and ensures developers follow consistent branch naming patterns.
Advantages:
- Keeps the repository clean
- Improves automation
- Prevents merging from incorrectly named branches
branch-name-policy:
name: branch-name-policy
runs-on: ubuntu-latest
steps:
- name: Check PR source branch name
shell: bash
run: |
if [ "${{ github.event_name }}" = "pull_request" ]; then
BRANCH="${{ github.head_ref }}"
else
BRANCH="${{ github.ref_name }}"
fi
echo "PR head ref: $BRANCH"
if [[ "$BRANCH" =~ ^(release/|hotfix/|feature/|bugfix/|test|main).* ]]; then
echo "Allowed branch pattern: $BRANCH"
exit 0
else
echo "::error ::Branch name '$BRANCH' is not allowed to merge into main. Allowed patterns: release/*, hotfix/*, feature/*, bugfix/*, test*"
exit 1
fi
Steps inside the branch-name-policy job.
Step 1: Detect branch name.
GitHub uses different variables depending on the event. The script normalizes the variable so you always get the correct branch name.
| Event | Branch variable |
|---|---|
| pull_request | ${{ github.head_ref }} |
| push | ${{ github.ref_name }} |
if [ "${{ github.event_name }}" = "pull_request" ]; then
BRANCH="${{ github.head_ref }}"
else
BRANCH="${{ github.ref_name }}"
fi
echo "PR head ref: $BRANCH"
Step 2: Validate Naming Convention
Allows only:
- release/*
- hotfix/*
- feature/*
- bugfix/*
- test*
- main
Anything else fails with an error.
if [[ "$BRANCH" =~ ^(release/|hotfix/|feature/|bugfix/|test|main).* ]]; then
echo "Allowed branch pattern: $BRANCH"
exit 0
else
echo "::error ::Branch name '$BRANCH' is not allowed to merge into main. Allowed patterns: release/*, hotfix/*, feature/*, bugfix/*, test*"
exit 1
fi
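You can try the pattern locally before relying on it in CI (the branch name below is just an example):
BRANCH="feature/add-user-endpoint"
if [[ "$BRANCH" =~ ^(release/|hotfix/|feature/|bugfix/|test|main).* ]]; then
  echo "Allowed branch pattern: $BRANCH"   # matches, because it starts with feature/
else
  echo "Branch name '$BRANCH' is not allowed"
fi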
build
This job runs a build using Java 17 (Eclipse Temurin JDK).
build:
name: Build
runs-on: ubuntu-latest
container:
image: eclipse-temurin:17-jdk
steps:
- name: Checkout source code to docker ubuntu container
uses: actions/checkout@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
fetch-depth: 0
- name: Build project
run: ./gradlew build -x test
Build steps:
Step 1: Check out the code
Using fetch-depth: 0 ensures the full commit history is available; SonarQube needs the full history for accurate analysis.
- name: Checkout source code to docker ubuntu container
uses: actions/checkout@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
fetch-depth: 0
Step 2: Build using Gradle.
This:
- Compiles code
- Packages artifacts
- Skips tests (-x test) to keep the build fast.
- name: Build project
run: ./gradlew build -x test
sast
The sast job runs the SonarQube scan.
sast:
needs:
- build
name: SonarQube Scan
runs-on: ubuntu-latest
steps:
- name: Checkout source code to docker ubuntu container
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Cache SonarQube packages
uses: actions/cache@v4
with:
path: ~/.sonar/cache
key: ${{ runner.os }}-sonar
restore-keys: ${{ runner.os }}-sonar
- name: SonarQube Scan
env:
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}
SONAR_ORGANIZATION_KEY: ${{ secrets.SONAR_ORGANIZATION_KEY }}
SONAR_PROJECT_KEY: ${{ secrets.SONAR_PROJECT_KEY }}
SONAR_PROJECT_NAME: ${{ secrets.SONAR_PROJECT_NAME }}
run: ./gradlew sonar --info
sast job steps:
This job runs after the build. If the build fails, the sast job will not run.
sast:
needs:
- build
Step 1: Check out the code again. Each job runs on a fresh runner, so the repository must be checked out in every job that needs it.
- name: Checkout source code to docker ubuntu container
uses: actions/checkout@v4
with:
fetch-depth: 0
Step 2: Cache SonarQube packages. Caching speeds up your scans by reusing analyzer packages.
GitHub Actions reuses previously downloaded analyzers from a cache stored on your workflow runner.
Benefits:
- Faster Sonar scans
- Reduced network usage
- More stable pipeline
SonarQube’s examples recommend caching for performance.
- name: Cache SonarQube packages
uses: actions/cache@v4
with:
path: ~/.sonar/cache
key: ${{ runner.os }}-sonar
restore-keys: ${{ runner.os }}-sonar
Step 3: Runs the SonarQube scan on your code.
- name: SonarQube Scan
env:
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}
SONAR_ORGANIZATION_KEY: ${{ secrets.SONAR_ORGANIZATION_KEY }}
SONAR_PROJECT_KEY: ${{ secrets.SONAR_PROJECT_KEY }}
SONAR_PROJECT_NAME: ${{ secrets.SONAR_PROJECT_NAME }}
run: ./gradlew sonar --info
Authentication is done via GitHub secrets:
- SONAR_TOKEN for authentication
- SONAR_HOST_URL
- SONAR_ORGANIZATION_KEY
- SONAR_PROJECT_KEY
- SONAR_PROJECT_NAME
These are injected as environment variables:
env:
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}
SONAR_ORGANIZATION_KEY: ${{ secrets.SONAR_ORGANIZATION_KEY }}
SONAR_PROJECT_KEY: ${{ secrets.SONAR_PROJECT_KEY }}
SONAR_PROJECT_NAME: ${{ secrets.SONAR_PROJECT_NAME }}
Create the “application.properties” file under the “sonarqube_actions_demo/src/main/resources” folder. Set server.port to an available port on your machine.
spring.application.name=sonarqube_actions_demo
server.port=8008
Create the “logback-spring.xml” logging configuration file under the “sonarqube_actions_demo/src/main/resources” folder.
<configuration>
<appender name="Console" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<root level="INFO">
<appender-ref ref="Console"/>
</root>
</configuration>
Create a GitHub repository with the same name as the project.
Create a project in SonarQube Cloud at https://sonarcloud.io/projects/create
Select the project sonarqube_actions_demo and click “Set Up”
Select the new code definition as “Previous version” and click “Create project”.
Create repository secrets using GitHub Actions.
You can generate the SONAR_TOKEN value at https://sonarcloud.io/account/security
Click GitHub Actions.
SonarQube will generate the SONAR_TOKEN for you. Click the Gradle button to view the project key and organization values. Copy and save these values so you can add them to your GitHub Actions repository secrets.
Create the secrets at https://github.com/VardanMatevosyan/sonarqube_actions_demo/settings/secrets/actions
SONAR_HOST_URL = https://sonarcloud.io
SONAR_ORGANIZATION_KEY = the sonar.organization value copied from SonarCloud
SONAR_PROJECT_KEY = the sonar.projectKey value copied from SonarCloud
SONAR_TOKEN = generated token from the previous step
SONAR_PROJECT_NAME = sonarqube_actions_demo
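If you prefer the command line, the same secrets can also be created with the GitHub CLI from the repository root (a sketch with placeholder values; the web UI above works just as well):
gh secret set SONAR_HOST_URL --body "https://sonarcloud.io"
gh secret set SONAR_ORGANIZATION_KEY --body "<your sonar.organization value>"
gh secret set SONAR_PROJECT_KEY --body "<your sonar.projectKey value>"
gh secret set SONAR_PROJECT_NAME --body "sonarqube_actions_demo"
gh secret set SONAR_TOKEN --body "<token generated in SonarCloud>"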
Remove the test files auto-generated by the Spring Initializr from src/test/java/com/practice/sonarqube_actions_demo
Update the Gradle Wrapper to version 8.8 by running the following command:
gradle wrapper --gradle-version 8.8 --distribution-type bin
Build the project by running ./gradlew clean build, and then run Sonar locally using one of the previously mentioned approaches.
If you're using SonarCloud, use the project key from the sonar.projectKey property.
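From the project root, the local run boils down to two commands; the sonar task picks up gradle-local.properties automatically thanks to the build script above:
./gradlew clean build
./gradlew sonar --info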
We can see two issues, but both are acceptable in our case.
The first one is a false positive because of how we use the variable: we only read the data and compare it with the actual branch name; no execution is involved. This warning refers to a script in the branch-name-policy job located in the build.yml file.
The second issue is our gradle-local.properties file, where we manually added the Sonar token. Since this file is listed in .gitignore and never pushed, we can safely accept this finding when our code is analyzed in SonarCloud.
As you can see, we get the analysis results immediately without having to push code to GitHub every time, which also helps avoid creating unnecessary commits.
Push the changes to GitHub.
Git commands:
Replace git@github.com:USERNAME/REPOSITORY.git with your GitHub repository link.
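A typical first push looks like this (a sketch; adjust the remote URL and commit message to your setup):
git init
git add .
git commit -m "Initial commit"
git branch -M main
git remote add origin git@github.com:USERNAME/REPOSITORY.git
git push -u origin main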
Results
All jobs are successful on GitHub.
In SonarCloud, go to the Projects section and select the sonarqube_actions_demo project. There, you will see the scan results, but without a custom Quality Gate; free-plan users have access only to the default Quality Gate.
Let's protect our main branch on GitHub and then create a new Pull Request to see how it works.
Go to Rulesets at https://github.com/VardanMatevosyan/sonarqube_actions_demo/settings/rules and create a ruleset for the main branch.
Here are the configurations
Rules.
Enable “Require status checks to pass” and add, by name, the three jobs defined in the jobs section of the build.yml file.
Click Create.
Pull Request workflow jobs check
Check out a new branch from main and make a simple change, as shown in the screenshot below. Push the changes and create a PR with them.
All jobs have passed.
Merge the changes. Once the changes are merged, you’ll see the jobs running on the main branch.
To view the GitHub Actions details, click one of the Details links related to a specific job. Then select the job you want to inspect, and choose any individual execution step to see its detailed information.
Let’s check the scanning results on SonarQube. Go to your project at https://sonarcloud.io/project/overview?id=VardanMatevosyan_sonarqube_actions_demo.
The “id=VardanMatevosyan” part of the link will differ depending on your GitHub username. You can see the scanning results for each branch or pull request in the “Latest Activity” section, as well as the Main branch status and the Main Branch evolution chart for issues, code coverage, and duplications.
When you click on a specific branch, you will see the scanning results only for the new code changes included in the particular Pull Request.
Let’s mark the issue that is acceptable for our application as a false positive. Go to the main branch and click on the Issues tab.
Click on the Open status and select, for example, False Positive.
Write the comment.
Return to the project at https://sonarcloud.io/projects, and you will see that there are no remaining issues.
Failing the pipeline based on SonarQube's default Quality Gates for a free account
Let's implement an endpoint at /users/{id} to retrieve user information by ID.
Add dependencies to the “build.gradle” file in addition to Spring Boot Web and Logback. This is the complete list of dependencies:
- H2 Database is an in-memory database for testing purposes.
- Spring Boot JDBC is used in the persistence layer to interact with the database.
- Spring Boot Test is used for unit tests of a Spring application.
- JUnit 5 is used to write unit tests.
- Lombok reduces boilerplate code.
- MapStruct is used for easily mapping entities to DTO objects.
dependencies {
// Spring boot runtime
implementation 'org.springframework.boot:spring-boot-starter-web'
implementation 'org.springframework.boot:spring-boot-starter-jdbc'
// Persistence H2 for demo (optional)
runtimeOnly 'com.h2database:h2'
// Observability
implementation 'ch.qos.logback:logback-classic:1.5.13'
implementation 'ch.qos.logback:logback-core:1.5.19'
// Test
testImplementation 'org.springframework.boot:spring-boot-starter-test'
testImplementation 'org.junit.jupiter:junit-jupiter:5.10.2'
testImplementation 'com.h2database:h2'
// Annotation processing
compileOnly 'org.projectlombok:lombok:1.18.26'
annotationProcessor 'org.projectlombok:lombok:1.18.26'
implementation 'org.mapstruct:mapstruct:1.5.5.Final'
annotationProcessor 'org.mapstruct:mapstruct-processor:1.5.5.Final'
}
Source code structure
Controller
@RestController
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE, makeFinal = true)
public class UserController {
UserServiceImpl userService;
@GetMapping("/users/{id}")
public ResponseEntity<UserDto> getUser(@PathVariable Integer id) {
UserDto user = userService.getUserById(id);
return ResponseEntity.ok(user);
}
}
Exception
DaoException
public class DaoException extends RuntimeException {
public DaoException(String message, Throwable cause) {
super(message, cause);
}
}
GlobalExceptionHandler
@RestControllerAdvice
public class GlobalExceptionHandler {
@ExceptionHandler(NoSuchElementException.class)
public ResponseEntity<ErrorResponse> handleException(NoSuchElementException exception) {
HttpStatusCode httpStatusCode = HttpStatusCode.valueOf(HttpStatus.NOT_FOUND.value());
ErrorResponse errorResponse = ErrorResponse.create(exception, httpStatusCode, exception.getMessage());
return ResponseEntity.status(httpStatusCode).body(errorResponse);
}
@ExceptionHandler(value = {Exception.class, DaoException.class})
public ResponseEntity<ErrorResponse> handleException(DaoException exception) {
HttpStatusCode httpStatusCode = HttpStatusCode.valueOf(HttpStatus.INTERNAL_SERVER_ERROR.value());
ErrorResponse errorResponse = ErrorResponse.create(exception, httpStatusCode, exception.getMessage());
return ResponseEntity.status(httpStatusCode).body(errorResponse);
}
@ExceptionHandler(RuntimeException.class)
public ResponseEntity<ErrorResponse> handleException(RuntimeException exception) {
HttpStatusCode httpStatusCode = HttpStatusCode.valueOf(HttpStatus.BAD_REQUEST.value());
ErrorResponse errorResponse = ErrorResponse.create(exception, httpStatusCode, exception.getMessage());
return ResponseEntity.status(httpStatusCode).body(errorResponse);
}
}
Mapper
@Mapper(componentModel = "spring")
public interface UserMapper {
UserDto toDto(User user);
}
Model
- DTO
UserDto
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE)
public class UserDto {
Integer id;
String username;
String email;
}
- Entities
User
@Getter
@Setter
@AllArgsConstructor
@NoArgsConstructor
public class User {
Integer id;
String username;
String email;
}
Persistence
UserDao
public interface UserDao {
Optional<User> getUserById(Integer id);
}
UserDaoImpl
@Repository
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE)
public class UserDaoImpl implements UserDao {
final DataSource dataSource;
public Optional<User> getUserById(Integer id) {
String query = "SELECT id, username, email FROM users WHERE id = " + id;
try (
Connection connection = dataSource.getConnection();
Statement ps = connection.prepareStatement(query)) {
var rs = ps.executeQuery(query);
if (rs.next()) {
User user = buildUserEntity(rs);
return Optional.of(user);
}
return Optional.empty();
} catch (NullPointerException | SQLException e) {
throw new DaoException("SQL error", e);
}
}
private User buildUserEntity(ResultSet rs) throws SQLException {
return new User(
rs.getInt("id"),
rs.getString("username"),
rs.getString("email"));
}
}
Service
UserService
public interface UserService {
UserDto getUserById(Integer id);
}
UserServiceImpl
@Service
@RequiredArgsConstructor
public class UserServiceImpl implements UserService {
final UserDao userDao;
final UserMapper userMapper;
public UserDto getUserById(Integer id) {
requireNonNull(id, "id must not be null");
return userDao.getUserById(id)
.map(userMapper::toDto)
.orElseThrow(() -> new NoSuchElementException("User not found with id: " + id));
}
}
Database schema and data
Add “schema.sql” to create the users table and insert some testing data.
CREATE TABLE users (
id INT PRIMARY KEY,
username VARCHAR2(255),
email VARCHAR2(255)
);
INSERT INTO users(id, username, email)
VALUES (1, 'Alice', 'alice@example.com'),
(2, 'Bob', 'bob@example.com'),
(3, 'Admin', 'admin@example.com');
Application properties
Add these properties to the existing ones in the “application.properties” file.
# Database configuration
spring.datasource.url=jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=password
# Enable H2 web console via http://localhost:8008/h2-console
spring.h2.console.enabled=true
Add a “gradle.properties” file to the root of the project. This property makes the Gradle sonar task wait for SonarCloud’s Quality Gate result and fail the build when the gate fails. If you omit it, the pipeline will pass regardless of the scanning results.
systemProp.sonar.qualitygate.wait=true
Let’s run the application. As you can see, it returns the correct user data by the ID.
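For example, with the application running on port 8008 and the data from schema.sql loaded, requesting user 1 should return Alice (expected response shown as a comment):
curl http://localhost:8008/users/1
# {"id":1,"username":"Alice","email":"alice@example.com"}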
Now let’s run SonarQube locally from IntelliJ IDEA.
After scanning, SonarQube immediately shows one detected SQL Injection vulnerability. You can even see the suggested fix.
Let’s imagine that we forgot to run a SonarQube scan.
Pull the latest changes from the main branch and create a new branch that starts with feature/. Commit and push your changes. Then go to GitHub and create a Pull Request.
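For example (the branch name is illustrative; anything matching the feature/ pattern works):
git checkout main
git pull
git checkout -b feature/user-endpoint
git add .
git commit -m "Add /users/{id} endpoint"
git push -u origin feature/user-endpoint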
You can see that the Sonar job has failed.
For more details, you can click on the SonarQube Scan job.
Let’s check the SonarQube project dashboard at https://sonarcloud.io/project/overview?id=VardanMatevosyan_sonarqube_actions_demo
Then choose your branch
As you can see, it failed. Click to view the details. You will see that two conditions have failed.
Click on “Security Hotspot Reviewed”. As you can see, it shows the same result we saw locally.
The SQL injection vulnerability.
Fix the SQL Injection vulnerability by using the PreparedStatement class and setter methods to set the values.
UserDaoImpl class
@Repository
@RequiredArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE)
public class UserDaoImpl implements UserDao {
final DataSource dataSource;
public Optional<User> getUserById(Integer id) {
String query = "SELECT id, username, email FROM users WHERE id = ?";
try (
Connection connection = dataSource.getConnection();
PreparedStatement ps = connection.prepareStatement(query)) {
ps.setInt(1, id);
var rs = ps.executeQuery();
if (rs.next()) {
User user = buildUserEntity(rs);
return Optional.of(user);
}
return Optional.empty();
} catch (NullPointerException | SQLException e) {
throw new DaoException("SQL error", e);
}
}
private User buildUserEntity(ResultSet rs) throws SQLException {
return new User(
rs.getInt("id"),
rs.getString("username"),
rs.getString("email"));
}
}
Run Sonar locally, and you will not find any issues. Then push changes to a remote branch. The Sonar job still fails.
Looking at the SonarQube dashboard, you'll see that the SQL Injection vulnerability is gone, but the job now fails because tests and code coverage are missing. We'll explore this in the next section: “Adding unit tests and the JaCoCo plugin for code coverage.”
Even though we went through all of the steps from writing code up to creating a Pull Request, having the SonarQube plugin configured in your IDE is helpful in immediately detecting issues without running the application locally or pushing changes to your remote branch.
Adding unit tests and the JaCoCo plugin for code coverage
Add these changes to the “build.gradle” file:
JaCoCo plugin
id 'jacoco'
JaCoCo properties for SonarQube.
The config, mapper, model, and exception packages should be excluded from code coverage processing in SonarQube.
property "sonar.coverage.JaCoCoxml.import", "true"
property "sonar.java.coveragePlugin", "JaCoCo"
property "sonar.coverage.JaCoCo.xmlReportPaths",
System.getenv("SONAR_COVERAGE_JaCoCo_XML_REPORT_PATH")
?: localProps["sonar.coverage.JaCoCo.xmlReportPaths"]
property "sonar.coverage.exclusions", "**/config/**,**/mapper/**,**/model/**,**/exception/**"
JaCoCo configuration
jacoco {
toolVersion = "0.8.12"
}
jacocoTestReport {
dependsOn test
reports {
xml.required = true
html.required = true
csv.required = false
}
afterEvaluate {
classDirectories.setFrom(files(classDirectories.files.collect {
fileTree(dir: it, exclude: ['**/config/**', '**/model/**', '**/exception/**', '**/mapper/**'])
}))
}
}
jacocoTestCoverageVerification {
dependsOn jacocoTestReport
violationRules {
rule {
enabled = true
element = 'BUNDLE'
limit {
counter = 'LINE'
value = 'COVEREDRATIO'
minimum = 0.90
}
limit {
counter = 'BRANCH'
value = 'COVEREDRATIO'
minimum = 0.65
}
limit {
counter = 'METHOD'
value = 'COVEREDRATIO'
minimum = 0.90
}
limit {
counter = 'CLASS'
value = 'COVEREDRATIO'
minimum = 0.90
}
}
}
afterEvaluate {
classDirectories.setFrom(files(classDirectories.files.collect {
fileTree(dir: it, exclude: ['**/config/**', '**/model/**', '**/exception/**', '**/mapper/**'])
}))
}
}
Update the test task by adding this line at the end:
finalizedBy jacocoTestReport
The complete “build.gradle” file
plugins {
id 'java'
id 'jacoco'
id "org.sonarqube" version "7.0.0.6105"
id 'org.springframework.boot' version '3.0.0'
id 'io.spring.dependency-management' version '1.1.7'
}
group = 'com.practice'
version = '0.0.1-SNAPSHOT'
description = 'sonarqube_actions_demo'
java {
toolchain {
languageVersion = JavaLanguageVersion.of(17)
}
}
configurations {
compileOnly {
extendsFrom annotationProcessor
}
}
repositories {
mavenCentral()
}
dependencies {
// Spring boot runtime
implementation 'org.springframework.boot:spring-boot-starter-web'
implementation 'org.springframework.boot:spring-boot-starter-jdbc'
// Persistence H2 for demo (optional)
runtimeOnly 'com.h2database:h2'
// Observability
implementation 'ch.qos.logback:logback-classic:1.5.13'
implementation 'ch.qos.logback:logback-core:1.5.19'
// Test
testImplementation 'org.springframework.boot:spring-boot-starter-test'
testImplementation 'org.junit.jupiter:junit-jupiter:5.10.2'
testImplementation 'com.h2database:h2'
// Annotation processing
compileOnly 'org.projectlombok:lombok:1.18.26'
annotationProcessor 'org.projectlombok:lombok:1.18.26'
implementation 'org.mapstruct:mapstruct:1.5.5.Final'
annotationProcessor 'org.mapstruct:mapstruct-processor:1.5.5.Final'
}
def localProps = new Properties()
def localFile = file("gradle-local.properties")
if (localFile.exists()) {
localProps.load(localFile.newDataInputStream())
}
sonar {
properties {
property "sonar.coverage.JaCoCoxml.import", "true"
property "sonar.java.coveragePlugin", "JaCoCo"
property "sonar.coverage.JaCoCo.xmlReportPaths",
System.getenv("SONAR_COVERAGE_JaCoCo_XML_REPORT_PATH")
?: localProps["sonar.coverage.JaCoCo.xmlReportPaths"]
property "sonar.projectKey", System.getenv("SONAR_PROJECT_KEY") ?: localProps["sonar.projectKey"]
property "sonar.organization", System.getenv("SONAR_ORGANIZATION_KEY") ?: localProps["sonar.organization"]
property "sonar.projectName", System.getenv("SONAR_PROJECT_NAME") ?: localProps["sonar.projectName"]
property "sonar.token", System.getenv("SONAR_TOKEN") ?: localProps["sonar.token"]
property "sonar.host.url", System.getenv("SONAR_HOST_URL") ?: localProps["sonar.host.url"]
property "sonar.coverage.exclusions", "**/config/**,**/mapper/**,**/model/**,**/exception/**"
}
}
jacoco {
toolVersion = "0.8.12"
}
jacocoTestReport {
dependsOn test
reports {
xml.required = true
html.required = true
csv.required = false
}
afterEvaluate {
classDirectories.setFrom(files(classDirectories.files.collect {
fileTree(dir: it, exclude: ['**/config/**', '**/model/**', '**/exception/**', '**/mapper/**'])
}))
}
}
jacocoTestCoverageVerification {
dependsOn jacocoTestReport
violationRules {
rule {
enabled = true
element = 'BUNDLE'
limit {
counter = 'LINE'
value = 'COVEREDRATIO'
minimum = 0.90
}
limit {
counter = 'BRANCH'
value = 'COVEREDRATIO'
minimum = 0.65
}
limit {
counter = 'METHOD'
value = 'COVEREDRATIO'
minimum = 0.90
}
limit {
counter = 'CLASS'
value = 'COVEREDRATIO'
minimum = 0.90
}
}
}
afterEvaluate {
classDirectories.setFrom(files(classDirectories.files.collect {
fileTree(dir: it, exclude: ['**/config/**', '**/model/**', '**/exception/**', '**/mapper/**'])
}))
}
}
tasks.named('test') {
useJUnitPlatform()
finalizedBy jacocoTestReport
}
Test package structure
We need to add the “data.sql” file under the test resources. This script inserts test data into the database; the data is available while the tests run.
INSERT INTO users (id, username, email)
VALUES (10, 'alice', 'alice@example.com'),
(11, 'bob', 'bob@example.com');
UserControllerTest
@WebMvcTest(UserController.class)
class UserControllerTest {
@Autowired
private MockMvc mockMvc;
@MockBean
private UserServiceImpl userService;
@Test
void getUser_shouldReturnUserDto_whenUserExists() throws Exception {
// Arrange
Integer userId = 1;
UserDto userDto = new UserDto(userId, "jane_doe", "jane@example.com");
when(userService.getUserById(userId)).thenReturn(userDto);
// Act and Assert
mockMvc.perform(get("/users/{id}", userId)
.contentType(MediaType.APPLICATION_JSON))
.andExpect(status().isOk())
.andExpect(jsonPath("$.id").value(userDto.getId()))
.andExpect(jsonPath("$.username").value(userDto.getUsername()))
.andExpect(jsonPath("$.email").value(userDto.getEmail()));
verify(userService).getUserById(userId);
}
@Test
void getUser_shouldReturn500_whenUserServiceThrowsException() throws Exception {
// Arrange
Integer userId = 999;
when(userService.getUserById(userId))
.thenThrow(new NoSuchElementException("User not found with id: " + userId));
// Act and Assert
mockMvc.perform(get("/users/{id}", userId)
.contentType(MediaType.APPLICATION_JSON))
.andExpect(status().is4xxClientError());
verify(userService).getUserById(userId);
}
}
UserDaoImplTest
@JdbcTest
@TestPropertySource(properties = {
"spring.datasource.url=jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1",
"spring.datasource.driver-class-name=org.h2.Driver"
})
class UserDaoImplTest {
@Autowired
private DataSource dataSource;
private UserDaoImpl userDao;
@BeforeEach
void setUp() {
this.userDao = new UserDaoImpl(dataSource);
}
@Test
void getUserById_shouldReturnUser_whenUserExists() {
// Arrange
int userId = 10;
// Act
var userOpt = userDao.getUserById(userId);
// Assert
assertThat(userOpt).isPresent();
User user = userOpt.get();
assertThat(user.getId()).isEqualTo(userId);
assertThat(user.getUsername()).isEqualTo("alice");
assertThat(user.getEmail()).isEqualTo("alice@example.com");
}
@Test
void getUserById_shouldReturnEmpty_whenUserDoesNotExist() {
// Act
var userOpt = userDao.getUserById(999);
// Assert
assertThat(userOpt).isEmpty();
}
@Test
void getUserById_shouldThrowDaoException_whenIdIsNull() {
// Assert and Act
assertThatException()
.isThrownBy(() -> userDao.getUserById(null))
.isInstanceOf(DaoException.class)
.withMessage("SQL error");
}
}
UserServiceImplTest
@ExtendWith(MockitoExtension.class)
class UserServiceImplTest {
@Mock
private UserDao userDao;
@Mock
private UserMapper userMapper;
@InjectMocks
private UserServiceImpl userService;
@Test
void test_whenGetUserById_shouldReturnUserDto_whenUserExists() {
// Arrange
Integer userId = 1;
User user = new User(userId, "john_doe", "john@example.com");
UserDto userDto = new UserDto(userId, "john_doe", "john@example.com");
when(userDao.getUserById(userId)).thenReturn(Optional.of(user));
when(userMapper.toDto(user)).thenReturn(userDto);
// Act
UserDto result = userService.getUserById(userId);
// Assert
assertThat(result).isEqualTo(userDto);
verify(userDao).getUserById(userId);
verify(userMapper).toDto(user);
}
@Test
void test_whenGetUserById_shouldThrowNoSuchElementException_whenUserNotFound() {
// Arrange
Integer userId = 999;
when(userDao.getUserById(userId)).thenReturn(Optional.empty());
// Act and Assert
assertThatThrownBy(() -> userService.getUserById(userId))
.isInstanceOf(NoSuchElementException.class)
.hasMessage("User not found with id: " + userId);
verify(userDao).getUserById(userId);
verify(userMapper, never()).toDto(any());
}
@Test
void getUserById_shouldHandleNullId() {
// Act and Assert
assertThatThrownBy(() -> userService.getUserById(null))
.isInstanceOf(NullPointerException.class)
.hasMessage("id must not be null");
}
}
Update the “build.yml” file.
Add a test_and_coverage job next to the build job.
test_and_coverage:
name: Test and Coverage
runs-on: ubuntu-latest
container:
image: eclipse-temurin:17-jdk
steps:
- name: Checkout source code to docker ubuntu container
uses: actions/checkout@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
fetch-depth: 0
- name: Run tests with coverage
run: ./gradlew jacocoTestCoverageVerification
- name: Upload test results
uses: actions/upload-artifact@v4
with:
name: test-coverage-report
path: 'build'
overwrite: true
retention-days: 5
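You can run the same verification locally before pushing; the report locations below are the Gradle JaCoCo plugin defaults under the configuration above:
./gradlew test jacocoTestCoverageVerification
# HTML report: build/reports/jacoco/test/html/index.html
# XML report:  build/reports/jacoco/test/jacocoTestReport.xml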
Add test_and_coverage to the needs section of the sast job, alongside build.
This configuration ensures that the sast job waits until both the build and test_and_coverage jobs have completed.
sast:
needs:
- build
- test_and_coverage
Add these two steps to the sast job after the “Cache SonarQube packages” step. The first downloads the JaCoCo report saved by the test_and_coverage job; the second verifies that the report exists.
- name: Download JaCoCo report
uses: actions/download-artifact@v4
with:
name: test-coverage-report
path: .
- name: Verify report exists
run: |
ls -la ./reports/jacoco/test
Complete “build.yml” file.
name: SonarQube
on:
push:
branches:
- 'main'
pull_request:
branches:
- main
jobs:
branch-name-policy:
name: branch-name-policy
runs-on: ubuntu-latest
steps:
- name: Check PR source branch name
shell: bash
run: |
if [ "${{ github.event_name }}" = "pull_request" ]; then
BRANCH="${{ github.head_ref }}"
else
BRANCH="${{ github.ref_name }}"
fi
echo "PR head ref: $BRANCH"
if [[ "$BRANCH" =~ ^(release/|hotfix/|feature/|bugfix/|test|main).* ]]; then
echo "Allowed branch pattern: $BRANCH"
exit 0
else
echo "::error ::Branch name '$BRANCH' is not allowed to merge into main. Allowed patterns: release/*, hotfix/*, feature/*, bugfix/*, test*"
exit 1
fi
build:
name: Build
runs-on: ubuntu-latest
container:
image: eclipse-temurin:17-jdk
steps:
- name: Checkout source code to docker ubuntu container
uses: actions/checkout@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
fetch-depth: 0
- name: Build project
run: ./gradlew build -x test
test_and_coverage:
name: Test and Coverage
runs-on: ubuntu-latest
container:
image: eclipse-temurin:17-jdk
steps:
- name: Checkout source code to docker ubuntu container
uses: actions/checkout@v4
with:
token: ${{ secrets.GITHUB_TOKEN }}
fetch-depth: 0
- name: Run tests with coverage
run: ./gradlew jacocoTestCoverageVerification
- name: Upload test results
uses: actions/upload-artifact@v4
with:
name: test-coverage-report
path: 'build'
overwrite: true
retention-days: 5
sast:
needs:
- build
- test_and_coverage
name: SonarQube Scan
runs-on: ubuntu-latest
steps:
- name: Checkout source code to docker ubuntu container
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Cache SonarQube packages
uses: actions/cache@v4
with:
path: ~/.sonar/cache
key: ${{ runner.os }}-sonar
restore-keys: ${{ runner.os }}-sonar
- name: Download JaCoCo report
uses: actions/download-artifact@v4
with:
name: test-coverage-report
path: .
- name: Verify report exists
run: |
ls -la ./reports/jacoco/test
- name: SonarQube Scan
env:
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}
SONAR_ORGANIZATION_KEY: ${{ secrets.SONAR_ORGANIZATION_KEY }}
SONAR_PROJECT_KEY: ${{ secrets.SONAR_PROJECT_KEY }}
SONAR_PROJECT_NAME: ${{ secrets.SONAR_PROJECT_NAME }}
run: ./gradlew sonar --info
Push the changes to the remote branch and check the job results on the Pull Request. The pipeline fails because one last piece is missing.
We need to add the JaCoCo coverage report path to the SonarQube configuration for this project. Navigate to the SonarQube project, open Administration (bottom-left corner), and then General Settings. Paste reports/jacoco/test/jacocoTestReport.xml into the JaCoCo XML report paths field and click Save.
Go to GitHub and click on the SonarQube job to navigate to the GitHub Actions page. Click on the Sonar job and re-run it.
Now all jobs have passed.
Return to the Pull Request page, and now you can merge the changes to the main branch.
Merge the changes, and verify that all jobs pass after merging to the main branch.
Verify that test coverage appears on the SonarQube project. It shows the overall test coverage percentage. If you need charts on code coverage, click on the project and select the Coverage header at the top of the “Main Branch Evolution” section.
Conclusion
Effective SAST implementation has to be smoothly integrated into your development and deployment processes. This article demonstrates how to maximize the benefits of SonarCloud, including local scanning with IntelliJ IDEA and Docker, integrating scans with GitHub Actions pipelines, and enhancing code coverage through unit testing and the JaCoCo plugin.
You can keep your codebase secure and prevent vulnerabilities from reaching production by enforcing quality gates and managing false positives. By implementing these practices in your SSDLC, you are not only strengthening the security but also establishing an environment of quality and accountability within your development teams.
Start applying these steps today to improve software security with continuous and automated SAST.
Links
- SonarQube Cloud - https://sonarcloud.io/
- SonarQube documentation - https://docs.sonarsource.com/
- GitHub repository - https://github.com/VardanMatevosyan/sonarqube_actions_demo
Originally published on my personal blog: https://matevosian.tech/blog/post/sast-part3-automation-scanning