Srinivasaraju Tangella
Stop Just Running mvn test: Build a Jenkins Pipeline That Understands Test Results and Controls Build Promotion Automatically

For years, teams have run mvn test and then just looked at “BUILD SUCCESS”. That’s not enough: tests must be measured, recorded, and enforced by the pipeline. This post walks you through everything from scratch — installing and configuring Jenkins, preparing a sample Maven project, adding test and coverage reporting, and writing a full declarative Jenkins pipeline that:

Runs unit tests

Counts total / passed / failed / skipped tests

Calculates pass/fail percentage

Stores the summary on the build and prints it in the console

Automatically rejects or promotes the build to the next stage based on pass % threshold

Follow this end-to-end guide step by step.

1) What you’ll end up with (short)

A Jenkins declarative pipeline that:

1. Checks out code

2. Builds with Maven

3. Runs tests with Surefire

4. Collects JUnit XML results with the JUnit plugin

5. Calculates metrics (total, passed, failed, skipped, pass%, fail%)

6. Enforces an acceptance threshold (example: pass% >= 90)

7. Deploys to QA only if accepted

2) Prerequisites (exact order)

1. A server with Jenkins installed (LTS recommended). If not installed, install Jenkins first.

2. A Jenkins controller or agent with:

Java JDK (matching your project)

Maven (available in PATH) or configured as a Jenkins tool

Git client

3. Jenkins plugins to install (a scripted install sketch appears at the end of this section):

JUnit (for parsing test XML)

Pipeline Utility Steps (for script helpers)

Git Plugin

(Optional but recommended) JaCoCo plugin, or generate coverage reports in the build step

(Optional) Notification plugins (Slack, Email)

4. A sample Maven Java project with unit tests (JUnit) and a pom.xml configured to produce Surefire reports (the default) and, optionally, JaCoCo coverage.

5. A deploy script deploy-to-qa.sh (it can just echo for the demo) in the repo or accessible from the agent.

If anything above is missing, stop and install/configure it now. This post assumes Jenkins is reachable and you can create pipelines.
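If you run Jenkins from the official jenkins/jenkins Docker image, the bundled jenkins-plugin-cli can install the plugin list above non-interactively. A minimal sketch, assuming that image (on a classic install, use Manage Jenkins → Plugins instead):

# Sketch: non-interactive plugin install (assumes the official jenkins/jenkins Docker image)
jenkins-plugin-cli --plugins junit git pipeline-utility-steps jacoco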

3) Configure Jenkins (global tools and credentials — do this first)

1. In Jenkins, go to Manage Jenkins → Global Tool Configuration.

2. Add a JDK (name it jdk11 or similar) and Maven (name it maven3 or similar), or ensure your agent has them in PATH.

3. Add Git if needed.

4. Under Credentials, add any SCM or deployment credentials you'll need (SSH keys, tokens).

Note the exact tool names you used; you’ll reference them from the Jenkinsfile, or use shell mvn if the agent has Maven in PATH.
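If you registered tools instead of relying on PATH, the Jenkinsfile references them by those exact names. A minimal sketch, assuming the tool names jdk11 and maven3 from step 2:

pipeline {
    agent any
    // Names must match the entries in Manage Jenkins → Global Tool Configuration
    tools {
        jdk 'jdk11'      // assumed tool name from step 2
        maven 'maven3'   // assumed tool name from step 2
    }
    stages {
        stage('Verify tools') {
            steps {
                sh 'mvn -version'   // resolves to the configured Maven, not whatever is on the agent
            }
        }
    }
}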

4) Minimal sample Maven project structure (what to commit)

Create a simple Java project (or use your existing repo). Example layout:

sample-java-app/
├── src/
│ ├── main/
│ │ └── java/com/example/App.java
│ └── test/
│ └── java/com/example/AppTest.java
├── pom.xml
└── deploy-to-qa.sh

Create a basic test using JUnit so Surefire produces test reports. AppTest.java should contain one passing and one failing test so you can see both behaviors during development.
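A minimal AppTest.java sketch (JUnit 4, matching the junit 4.13.2 dependency in the pom below); the failing test is deliberate, so you can watch the threshold logic fire:

package com.example;

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class AppTest {

    @Test
    public void additionWorks() {
        assertEquals(4, 2 + 2);   // passes
    }

    @Test
    public void intentionallyFailing() {
        assertEquals(5, 2 + 2);   // fails on purpose while you develop the pipeline
    }
}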

deploy-to-qa.sh can be a placeholder that prints a message for now:

#!/bin/bash
echo "Deploy to QA: Simulated deploy"
# place your actual deploy commands here

Make the script executable: chmod +x deploy-to-qa.sh

5) pom.xml essentials (Surefire + JaCoCo snippets)

Your pom.xml must run tests and optionally produce code coverage. Minimal config:


<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>sample-java-app</artifactId>
  <version>1.0-SNAPSHOT</version>

  <properties>
    <maven.compiler.source>11</maven.compiler.source>
    <maven.compiler.target>11</maven.compiler.target>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <!-- Add JUnit dependency -->
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.13.2</version>
      <scope>test</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- Surefire plugin (Maven default test runner) -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>3.0.0-M8</version>
      </plugin>

      <!-- Optional: JaCoCo for coverage -->
      <plugin>
        <groupId>org.jacoco</groupId>
        <artifactId>jacoco-maven-plugin</artifactId>
        <version>0.8.8</version>
        <executions>
          <execution>
            <goals>
              <goal>prepare-agent</goal>
            </goals>
          </execution>
          <execution>
            <id>report</id>
            <phase>test</phase>
            <goals>
              <goal>report</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

Surefire writes XML to target/surefire-reports/ by default, which the Jenkins JUnit plugin reads.
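For orientation, a Surefire report file looks roughly like this (abbreviated, values illustrative); the tests, failures, errors, and skipped attributes are where the counts come from:

<testsuite name="com.example.AppTest" tests="2" failures="1" errors="0" skipped="0" time="0.021">
  <testcase name="additionWorks" classname="com.example.AppTest" time="0.003"/>
  <testcase name="intentionallyFailing" classname="com.example.AppTest" time="0.011">
    <failure message="expected:&lt;5&gt; but was:&lt;4&gt;" type="java.lang.AssertionError">...</failure>
  </testcase>
</testsuite>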

6) Add your repo to source control (GitHub/GitLab) — commit everything

Commit and push your project with pom.xml, src/, and deploy-to-qa.sh to a repository that Jenkins can access.
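For a fresh repository, the usual sequence looks something like this (the remote URL is a placeholder):

git init
git add pom.xml src deploy-to-qa.sh
git commit -m "Sample Maven app with tests and deploy script"
git remote add origin <your-repo-url>   # placeholder - use your GitHub/GitLab URL
git push -u origin main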

7) Create the Jenkins Pipeline job (Multibranch or Pipeline)

You can create a Multibranch Pipeline or a single Pipeline job. For simple demos, create a Pipeline job and paste the Jenkinsfile (below) into the pipeline script field, or store the Jenkinsfile in the repo and use Pipeline script from SCM.

8) The full Jenkinsfile

This is the full declarative pipeline: it runs tests, counts metrics, calculates percentages, enforces the threshold, and deploys to QA when accepted.

pipeline {
agent any

options {
    skipDefaultCheckout(false)
    timestamps()
    buildDiscarder(logRotator(numToKeepStr: '20'))
}

environment {
    // Threshold - change as needed
    PASS_PERCENT_THRESHOLD = "90"
    // Maven command - if using Jenkins tool, replace with mvn from tool
    MVN_CMD = "mvn"
}

stages {

    stage('Checkout') {
        steps {
            echo "1) Checkout - cloning repository"
            checkout scm
        }
    }

    stage('Prepare') {
        steps {
            echo "2) Prepare - show java and mvn version"
            sh 'java -version || true'
            sh "${env.MVN_CMD} -version || true"
            sh 'chmod +x ./deploy-to-qa.sh || true'
        }
    }

    stage('Build') {
        steps {
            echo "3) Build - running mvn clean compile"
            sh "${env.MVN_CMD} clean compile -DskipTests=true"
        }
    }

    stage('Run Tests') {
        steps {
            echo "4) Run Tests - running mvn test (generates target/surefire-reports)"
            sh "${env.MVN_CMD} test || true"
        }
    }

    stage('Calculate Test Results') {
        steps {
            script {
                echo "5) Calculate Test Results - parse JUnit data and compute metrics"

                // Publish JUnit results; the step returns a summary object with counts
                def results = junit allowEmptyResults: true, testResults: 'target/surefire-reports/*.xml'

                // Extract metrics
                def total = results.totalCount ?: 0
                def failed = results.failCount ?: 0
                def skipped = results.skipCount ?: 0
                def passed = total - failed - skipped

                // Safe integer math (intdiv avoids Groovy's decimal division on '/')
                def passPercent = 0
                def failPercent = 0
                if (total > 0) {
                    passPercent = (passed * 100).intdiv(total)
                    failPercent = (failed * 100).intdiv(total)
                }

                // Average time per test, if duration data is available.
                // Not every Jenkins/JUnit-plugin version exposes a duration
                // field on the summary object, hence the try/catch fallback.
                def avgTestTime = 0.0
                try {
                    if (results.totalDuration != null && total > 0) {
                        avgTestTime = results.totalDuration / total
                    }
                } catch (err) {
                    avgTestTime = 0.0
                }

                // Build a readable summary
                def summary = []
                summary << "==========================="
                summary << "🧪 Test Results Summary"
                summary << "---------------------------"
                summary << "Total Tests   : ${total}"
                summary << "Passed        : ${passed}"
                summary << "Failed        : ${failed}"
                summary << "Skipped       : ${skipped}"
                summary << "Pass %        : ${passPercent}%"
                summary << "Fail %        : ${failPercent}%"
                summary << "Avg test time : ${String.format('%.3f', avgTestTime)} sec"
                summary << "==========================="

                echo summary.join('\n')

                // Save summary to build description and display name
                currentBuild.description = "Pass:${passPercent}% Fail:${failPercent}%"
                currentBuild.displayName = "#${env.BUILD_NUMBER} - ${passed}/${total} tests"

                // Decide acceptance
                def threshold = env.PASS_PERCENT_THRESHOLD as Integer
                if (passPercent < threshold) {
                    // Mark the build failed and stop the pipeline here
                    error("❌ Build rejected — Pass percentage ${passPercent}% is below threshold ${threshold}%")
                } else {
                    echo "✅ Build accepted — pass percentage ${passPercent}% meets threshold ${threshold}%"
                }

                // Expose the numeric values for downstream use
                env.TEST_TOTAL = "${total}"
                env.TEST_PASSED = "${passed}"
                env.TEST_FAILED = "${failed}"
                env.TEST_SKIPPED = "${skipped}"
                env.TEST_PASS_PERCENT = "${passPercent}"
                env.TEST_AVG_TIME = "${avgTestTime}"
            }
        }
    }

    stage('Optional - Generate Coverage Report') {
        when {
            expression { fileExists('pom.xml') }
        }
        steps {
            echo "6) Coverage - run jacoco report if configured in pom.xml"
            // If you use jacoco plugin in pom, it already ran during test and created target/site/jacoco
            sh "${env.MVN_CMD} jacoco:report || true"
            // Optionally archive the generated coverage report
            archiveArtifacts artifacts: 'target/site/jacoco/**', allowEmptyArchive: true
        }
    }

    stage('Deploy to QA') {
        when {
            expression { currentBuild.resultIsBetterOrEqualTo('SUCCESS') }
        }
        steps {
            echo "7) Deploy to QA - running deploy script"
            sh './deploy-to-qa.sh'
        }
    }
}

post {
    always {
        echo "Pipeline finished - sending summary"
        echo "Build: ${currentBuild.fullDisplayName}"
        echo "Description: ${currentBuild.description}"
    }
    failure {
        echo "Pipeline failed - Please check test failures and logs."
    }
    success {
        echo "Pipeline succeeded - Build promoted to QA."
    }
}

}

Notes about the Jenkinsfile:

The junit step is called once, in the Calculate Test Results stage. Used inside a script block, it both publishes the parsed XML to Jenkins and returns a structured summary object with the counts we use to compute metrics. (Calling junit a second time on the same XML files would record those results twice.)

We compute passPercent with integer division (intdiv); switch to floating-point math if you want decimals.

error() halts the pipeline and marks the build as failed if pass% < threshold.

PASS_PERCENT_THRESHOLD is configurable in the environment block.

9) How the pipeline computes metrics (step by step)

1. After mvn test, Surefire writes XML files to target/surefire-reports/.

2. The Jenkins JUnit plugin reads those XML files via junit 'target/surefire-reports/*.xml'.

3. The junit pipeline step returns an object with fields such as totalCount, failCount, and skipCount; duration data is version-dependent, which is why the pipeline guards that read.

4. We compute (a standalone sketch of this math follows the list):

passed = total - failed - skipped

passPercent = (passed * 100) / total (0 if total is 0)

avgTestTime = totalDuration / total if duration available

5. We print a clean summary to the Jenkins console and set the build description and display name, so the numbers are visible in the Jenkins UI.

6. If passPercent < threshold, the pipeline uses error() to stop and mark the build failed; otherwise it continues to deployment.
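The same math as a standalone Groovy sketch you can run outside Jenkins (groovy metrics.groovy), with illustrative hard-coded counts:

// Illustrative counts; in the pipeline these come from the junit step's summary object
int total = 100, failed = 6, skipped = 0
int passed = total - failed - skipped                               // 94
int passPercent = total > 0 ? (passed * 100).intdiv(total) : 0     // 94
int failPercent = total > 0 ? (failed * 100).intdiv(total) : 0     // 6
println "Pass: ${passPercent}%  Fail: ${failPercent}%"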

10) Example console outputs (exact strings you’ll see)

When everything is fine (e.g., 94% pass):

===========================
🧪 Test Results Summary
---------------------------
Total Tests   : 100
Passed        : 94
Failed        : 6
Skipped       : 0
Pass %        : 94%
Fail %        : 6%
Avg test time : 0.012 sec
===========================

✅ Build accepted — pass percentage 94% meets threshold 90%
7) Deploy to QA - running deploy script
Deploy to QA: Simulated deploy

When below threshold (e.g., 87% pass), pipeline fails:

===========================
🧪 Test Results Summary
---------------------------
Total Tests   : 100
Passed        : 87
Failed        : 13
Skipped       : 0
Pass %        : 87%
Fail %        : 13%
Avg test time : 0.009 sec
===========================

❌ Build rejected — Pass percentage 87% is below threshold 90%

If no tests exist (total = 0), pass% is treated as 0 and the build is rejected unless you change the logic.

11) How to tune acceptance criteria and behavior

Change PASS_PERCENT_THRESHOLD in the environment block to any integer from 0 to 100.

If you want a manual approval step for marginal pass% (e.g., 85–90), replace error() with input() to require human approval.

To allow builds with 0 tests (e.g., for quick smoke builds), add special-case logic to skip the threshold check when total == 0, as in the example below.

Example alternative acceptance logic (replace the if block in the pipeline):

if (total == 0) {
    echo "No tests found. Skipping automatic acceptance - manual approval required."
    input message: "No tests found. Approve promotion to QA?", ok: "Approve"
} else if (passPercent < threshold) {
    // treat as failure
    error("Build rejected ...")
} else {
    echo "Accepted ..."
}

12) Troubleshooting common issues (do these in order)

1. Jenkins shows no test results:

Ensure tests create target/surefire-reports/*.xml (a quick-check sketch follows this list).

Check file paths and confirm that tests actually ran.

2. The junit step fails to parse:

Inspect the XML in the workspace under target/surefire-reports for malformed files.

3. Pipeline stops unexpectedly:

Check the console log for error() and the stack trace around that message.

4. mvn test exits non-zero but you want the pipeline to continue and parse results:

The Run Tests stage uses sh "mvn test || true" so the shell step won't abort before junit runs. Adjust this based on your policy.

5. Duration/avg test time missing:

Some Jenkins versions or configurations don't populate duration data on the summary object; it's optional.
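For items 1 and 2, a quick way to inspect the raw report files from the agent's workspace (a sketch; xmllint comes from libxml2-utils and may need installing):

# Do report files exist at the expected path?
ls -l target/surefire-reports/*.xml
# Are they well-formed XML?
xmllint --noout target/surefire-reports/*.xml && echo "XML is well-formed"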

13) Make it enterprise-grade (next improvements)

Integrate SonarQube and enforce its quality gate before promotion.

Use JaCoCo thresholds and fail the build if coverage < X% (a pom sketch follows this list).

Persist metrics in a database or push to Prometheus/Grafana for test trend graphs.

Add flaky test detection using test retry or quarantine strategy.

Publish test artifacts (HTML reports) and link from build page.
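For the JaCoCo threshold item, a hedged pom sketch: an extra check execution for the jacoco-maven-plugin shown earlier, failing the build below 80% line coverage (the 0.80 minimum is an example value; check runs during mvn verify by default):

<execution>
  <id>check-coverage</id>
  <goals>
    <goal>check</goal>
  </goals>
  <configuration>
    <rules>
      <rule>
        <element>BUNDLE</element>
        <limits>
          <limit>
            <counter>LINE</counter>
            <value>COVEREDRATIO</value>
            <minimum>0.80</minimum>
          </limit>
        </limits>
      </rule>
    </rules>
  </configuration>
</execution>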

14) Final checklist before you run the pipeline

1. Replace the git URL or use checkout scm, depending on the job type.

2. Confirm mvn is accessible on the Jenkins agent, or configure Maven in Global Tool Configuration.

3. Ensure deploy-to-qa.sh contains your actual deployment commands (for the demo it can just echo).

4. Adjust PASS_PERCENT_THRESHOLD if your org uses a different standard.

5. Optionally add SonarQube/JaCoCo quality gates.

Real DevOps Acceptance Mindset

Automation is not just about running commands —
it’s about building intelligence into pipelines that make decisions.

A professional DevOps engineer ensures:

Pipelines stop when quality gates fail.

No deployment happens without measurable quality.

Every stage reflects clear acceptance criteria.

That’s the true power of DevOps-driven testing.

🧭 Final Thought

It’s not about typing mvn test —
it’s about what the tests reveal and how Jenkins interprets them.

When your CI/CD pipeline understands metrics like pass rate, coverage, and failure percentage —
you stop being a “script executor” and become a pipeline engineer who drives quality and confidence.

✍️ Author’s Note — From My View

Over the years, I’ve seen countless builds pass with “BUILD SUCCESS” while hiding critical failures.
My intent with this article is to push every DevOps engineer to look beyond green builds and measure quality in every stage.

Let’s stop treating mvn test as a checkbox — and start making it a decision point in the CI/CD flow.
