<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Spencer Radcliff</title>
    <description>The latest articles on DEV Community by Spencer Radcliff (@spencer_radcliff_eae6cf90).</description>
    <link>https://dev.to/spencer_radcliff_eae6cf90</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3482502%2Faa4c779c-3a01-4b0d-a4ec-f3cce095be75.jpg</url>
      <title>DEV Community: Spencer Radcliff</title>
      <link>https://dev.to/spencer_radcliff_eae6cf90</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/spencer_radcliff_eae6cf90"/>
    <language>en</language>
    <item>
      <title>🚀 Automating API Load Testing with JMeter, Azure DevOps &amp; SLA Validation</title>
      <dc:creator>Spencer Radcliff</dc:creator>
      <pubDate>Fri, 05 Sep 2025 20:12:33 +0000</pubDate>
      <link>https://dev.to/spencer_radcliff_eae6cf90/automating-api-load-testing-with-jmeter-azure-devops-sla-validation-1lmc</link>
      <guid>https://dev.to/spencer_radcliff_eae6cf90/automating-api-load-testing-with-jmeter-azure-devops-sla-validation-1lmc</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;API performance testing is critical for ensuring reliability under load. Traditionally, engineers run JMeter locally, interpret results manually, and test only periodically. But in a DevOps world, performance testing should be continuous, automated, and part of your CI/CD pipeline.&lt;br&gt;
In this post, I'll share how I built a framework that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Runs JMeter tests inside Azure DevOps pipelines&lt;/li&gt;
&lt;li&gt;Supports progressive load testing (incrementally increasing users)&lt;/li&gt;
&lt;li&gt;Performs automatic SLA validation on latency, response time, and throughput&lt;/li&gt;
&lt;li&gt;Publishes JUnit XML &amp;amp; HTML reports directly into the pipeline&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;🏗️ Architecture&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;PerformanceTestFramework/&lt;br&gt;
├── JMeter/&lt;br&gt;
│   ├── {Module}/&lt;br&gt;
│   │   ├── testplan/    # JMeter test plans (.jmx)&lt;br&gt;
│   │   └── SLA/         # SLA configs (.json)&lt;br&gt;
├── Pipelines/&lt;br&gt;
│   └── loadtest.yaml    # Azure DevOps pipeline config&lt;br&gt;
└── scripts/             # PowerShell automation scripts&lt;/code&gt;&lt;/p&gt;

&lt;h2&gt;Tech Stack&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Apache JMeter (load testing engine)&lt;/li&gt;
&lt;li&gt;Azure DevOps Pipelines (orchestration &amp;amp; reporting)&lt;/li&gt;
&lt;li&gt;PowerShell (setup &amp;amp; execution)&lt;/li&gt;
&lt;li&gt;Python (JTL → JUnit conversion)&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;⚙️ Pipeline Configuration&lt;/h2&gt;

&lt;p&gt;The pipeline is parameterized, making it easy to select test plans, SLA files, and environments at runtime.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;parameters:
  - name: MAX_THREADS
    type: number
    default: 10
  - name: THREAD_START
    type: number
    default: 5
  - name: THREAD_STEP
    type: number
    default: 5
  - name: RAMPUP
    type: number
    default: 1
  - name: TEST_PLAN
    type: string
    values:
      - 'JMeter/HomePage/testplan/HomePageFeatures.jmx'
      - 'JMeter/DataExploration/testplan/DataExplorationAssetsMe.jmx'
  - name: SLA_FILE
    type: string
    values:
      - 'JMeter/HomePage/SLA/sla_HomePage.json'
      - 'JMeter/DataExploration/SLA/sla_DataExploration.json'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This way, testers can run different APIs under different loads without editing code.&lt;/p&gt;

&lt;h2&gt;📈 Progressive Load Testing&lt;/h2&gt;

&lt;p&gt;The test scales load gradually:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Start with THREAD_START users&lt;/li&gt;
&lt;li&gt;Increase by THREAD_STEP until MAX_THREADS is reached&lt;/li&gt;
&lt;li&gt;Use RAMPUP for smooth scaling&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Example:&lt;br&gt;
THREAD_START = 5&lt;br&gt;
THREAD_STEP  = 5&lt;br&gt;
MAX_THREADS  = 20&lt;br&gt;
👉 Runs 4 iterations: 5 → 10 → 15 → 20 users&lt;/p&gt;
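&lt;p&gt;The stepping logic above can be sketched in a few lines of Python (a minimal illustration of the iteration math, not the actual pipeline code):&lt;/p&gt;

```python
def thread_steps(start, step, max_threads):
    """Progressive load levels: start, start + step, ... capped at max_threads."""
    return list(range(start, max_threads + 1, step))

print(thread_steps(5, 5, 20))  # [5, 10, 15, 20]
```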



&lt;h2&gt;✅ SLA Validation&lt;/h2&gt;

&lt;p&gt;Each test has an SLA JSON file, e.g.:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "response_time_ms": 2000,
  "latency_ms": 1500,
  "throughput_rps": 50,
  "violation_threshold_pct": 30
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The pipeline validates:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Response Time ≤ response_time_ms&lt;/li&gt;
&lt;li&gt;Latency ≤ latency_ms&lt;/li&gt;
&lt;li&gt;Throughput ≥ throughput_rps&lt;/li&gt;
&lt;li&gt;SLA health classification → 🟢 Excellent / 🟡 Moderate / 🔴 Poor&lt;/li&gt;
&lt;/ul&gt;
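&lt;p&gt;A sketch of how the per-sample checks and the health label could be computed (field names come from the SLA JSON above; the label cut-offs are illustrative, not the framework's exact rules, and the aggregate throughput check is omitted for brevity):&lt;/p&gt;

```python
def sla_health(samples, sla):
    """Classify a run. `samples` is a list of dicts with per-request
    "elapsed_ms" and "latency_ms" values parsed from the JTL (assumed shape)."""
    violations = sum(
        1 for s in samples
        if s["elapsed_ms"] > sla["response_time_ms"]
        or s["latency_ms"] > sla["latency_ms"]
    )
    pct = 100.0 * violations / len(samples)
    if pct == 0:
        return "Excellent"  # 🟢 no SLA violations at all
    if sla["violation_threshold_pct"] >= pct:
        return "Moderate"   # 🟡 within the allowed violation budget
    return "Poor"           # 🔴 over the violation threshold
```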



&lt;h2&gt;🐍 JTL → JUnit Conversion&lt;/h2&gt;

&lt;p&gt;JMeter produces .jtl results, which aren't CI/CD friendly. We use a Python script to convert the JTL into JUnit XML, so Azure DevOps can show pass/fail status in the Tests tab.&lt;/p&gt;

&lt;p&gt;Key snippet from jtl_to_junit.py:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;if elapsed &amp;gt; sla_response_time:
    message += f"Response time {elapsed}ms exceeded SLA."
if latency &amp;gt; sla_latency:
    message += f"Latency {latency}ms exceeded SLA."
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;✔️ Generates JUnit results per request + SLA health checks&lt;br&gt;
✔️ Failures appear just like unit test failures&lt;/p&gt;



&lt;h2&gt;⚡ Automation with PowerShell&lt;/h2&gt;

&lt;p&gt;PowerShell scripts handle setup &amp;amp; execution:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;1_install_jdk.ps1&lt;/code&gt; → Install OpenJDK&lt;/li&gt;
&lt;li&gt;&lt;code&gt;2_install_jmeter.ps1&lt;/code&gt; → Install Apache JMeter&lt;/li&gt;
&lt;li&gt;&lt;code&gt;3_clean_results.ps1&lt;/code&gt; → Clean artifacts&lt;/li&gt;
&lt;li&gt;&lt;code&gt;4_install_python.ps1&lt;/code&gt; → Ensure Python is available&lt;/li&gt;
&lt;li&gt;&lt;code&gt;5_run_jmeter_tests.ps1&lt;/code&gt; → Run JMeter, collect results, call the Python converter&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This keeps the pipeline clean and modular.&lt;/p&gt;
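&lt;p&gt;At its core, the run script issues a non-GUI JMeter invocation. Sketched here in Python for illustration (the real script is PowerShell, and the property names threads/rampup are assumptions about what the .jmx reads via the standard __P function):&lt;/p&gt;

```python
def jmeter_command(test_plan, results_file, threads, rampup):
    """Build a non-GUI JMeter call.

    Standard JMeter CLI flags: -n non-GUI mode, -t test plan,
    -l results (JTL) file, -J user-defined property.
    """
    return [
        "jmeter", "-n",
        "-t", test_plan,
        "-l", results_file,
        f"-Jthreads={threads}",
        f"-Jrampup={rampup}",
    ]
```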


&lt;h2&gt;📊 Reporting&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;JUnit results → published to the pipeline's Tests tab&lt;/li&gt;
&lt;li&gt;HTML reports → JMeter's native HTML report uploaded as artifacts&lt;/li&gt;
&lt;li&gt;Raw JTL files → saved for debugging&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Example step that publishes the HTML reports as a pipeline artifact:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(RESULTS_DIR)\html_reports'
    artifact: jmeter-html-reports
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;🎯 Lessons Learned&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;✅ Make SLA validation automatic → no more manual log parsing&lt;/li&gt;
&lt;li&gt;🔑 Tokens &amp;amp; correlation IDs must be refreshed before each run&lt;/li&gt;
&lt;li&gt;📦 Always store artifacts (JTL + JUnit + HTML) for traceability&lt;/li&gt;
&lt;li&gt;📈 Progressive load testing exposes degradation early&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;🌍 Conclusion&lt;/h2&gt;

&lt;p&gt;With this setup, API performance testing became:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Repeatable → any tester can trigger tests with a few clicks&lt;/li&gt;
&lt;li&gt;Automated → runs in CI/CD with no manual effort&lt;/li&gt;
&lt;li&gt;Actionable → failures appear directly in pipeline results&lt;/li&gt;
&lt;li&gt;Scalable → easy to add new APIs &amp;amp; SLAs&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This framework turns performance testing from a one-time activity into a continuous quality gate for APIs.&lt;/p&gt;




&lt;p&gt;✍️ Have you tried integrating performance testing into CI/CD pipelines?&lt;br&gt;
I'd love to hear how you approached SLA validation and reporting!&lt;/p&gt;

</description>
      <category>devops</category>
      <category>loadtesting</category>
      <category>jmeter</category>
      <category>cicd</category>
    </item>
  </channel>
</rss>
