
Thesius Code

Posted on • Originally published at datanest-stores.pages.dev

API Testing Automation

Shipping untested APIs is shipping bugs on a schedule. This toolkit gives you a complete testing pipeline — from Postman collections that double as living documentation, to Newman CI integration that gates your deploys, to Pact contract tests that catch breaking changes before they reach consumers, to k6 load tests that prove your API handles production traffic. Every script is ready to plug into your CI/CD pipeline today.

Key Features

  • Postman Collection Generator — Python script that reads your OpenAPI spec and generates organized Postman collections with environment variables, pre-request scripts, and test assertions
  • Newman CI Pipeline — GitHub Actions and GitLab CI configs that run your Postman collections on every PR with HTML report generation
  • Pact Contract Testing — Consumer-driven contract tests in Python with a Pact broker publishing workflow and verification on the provider side
  • k6 Load Testing Scripts — Realistic load test scenarios with ramp-up profiles, thresholds, and custom metrics for latency percentiles
  • Test Data Factory — Deterministic test data generators that produce consistent, realistic payloads without external dependencies
  • Environment Manager — YAML-based environment configs (dev/staging/prod) with variable substitution and secret masking
  • Coverage Reporter — Analyzes your OpenAPI spec against your test collection and reports which endpoints lack test coverage
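The Test Data Factory idea above can be sketched in a few lines: seed a local `random.Random` instance so every run produces identical payloads with no global state and no external service. The `make_user` helper and its fields are illustrative assumptions, not the kit's actual API.

```python
# Deterministic test-data sketch: same seed -> same payload, every run.
import random
import string

def make_user(seed: int) -> dict:
    """Generate a realistic, reproducible user payload from a seed."""
    rng = random.Random(seed)  # local RNG: no global state, no external deps
    name = "".join(rng.choices(string.ascii_lowercase, k=8))
    return {
        "id": seed,
        "username": name,
        "email": f"{name}@example.com",
        "active": rng.random() < 0.9,
    }

# Reproducibility is the whole point: assertions stay stable across CI runs.
assert make_user(42) == make_user(42)
```

Because the generator is pure with respect to its seed, a failing assertion in CI reproduces byte-for-byte on your laptop.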

Quick Start

  1. Generate a Postman collection from your spec:
# scripts/generate_collection.py
from api_testing_automation import CollectionGenerator

gen = CollectionGenerator(
    spec_path="openapi.yaml",
    base_url="https://api.example.com/v1",
    auth_token="{{auth_token}}"  # Postman variable reference
)

collection = gen.generate()
collection.export("tests/postman/acme-api.postman_collection.json")
  2. Run tests locally with Newman:
newman run tests/postman/acme-api.postman_collection.json \
    --environment tests/environments/dev.json \
    --reporters cli,htmlextra \
    --reporter-htmlextra-export reports/test-report.html
  3. Run a k6 load test:
k6 run tests/load/spike-test.js --out json=results.json

Architecture

api-testing-automation/
├── scripts/                     # Collection generator, coverage reporter, API client
├── tests/
│   ├── postman/                 # Postman collections + pre-request scripts
│   ├── contract/                # Pact consumer/provider tests
│   ├── load/                    # k6 scripts: smoke, load, spike, soak
│   ├── environments/            # dev.json, staging.json, prod.json
│   ├── conftest.py              # Pytest fixtures
│   └── test_core.py             # Core integration tests
├── ci/                          # GitHub Actions + GitLab CI configs
└── config.example.yaml

The testing pyramid here is intentional: many contract tests (fast, isolated), moderate integration tests via Postman (medium speed), and few load tests (expensive, run on schedule).

Usage Examples

k6 Load Test with Custom Thresholds

// tests/load/load-test.js
import http from 'k6/http';
import { check, sleep } from 'k6';
import { Rate, Trend } from 'k6/metrics';

const errorRate = new Rate('errors');
const latency = new Trend('api_latency');

export const options = {
    stages: [
        { duration: '1m', target: 20 },   // Ramp up
        { duration: '3m', target: 50 },   // Sustained load
        { duration: '1m', target: 0 },    // Ramp down
    ],
    thresholds: {
        http_req_duration: ['p(95)<500', 'p(99)<1000'],
        errors: ['rate<0.05'],  // Less than 5% error rate
    },
};

export default function () {
    const res = http.get('https://api.example.com/v1/users', {
        headers: { Authorization: `Bearer ${__ENV.API_TOKEN}` },
    });

    check(res, {
        'status is 200': (r) => r.status === 200,
        'response time < 500ms': (r) => r.timings.duration < 500,
        'has valid JSON': (r) => {
            try { JSON.parse(r.body); return true; } catch { return false; }
        },
    });

    errorRate.add(res.status !== 200);
    latency.add(res.timings.duration);
    sleep(1);
}

CI Integration (GitHub Actions)

# ci/github-actions.yml
name: API Tests
on: [pull_request]
jobs:
  contract-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with: { python-version: '3.12' }
      - run: pip install -r requirements.txt  # pytest + pact deps (path assumed)
      - run: pytest tests/contract/ -v --junitxml=reports/contract.xml
  integration-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm install -g newman newman-reporter-htmlextra
      - run: newman run tests/postman/acme-api.postman_collection.json --environment tests/environments/staging.json --reporters cli,htmlextra

Configuration

| Key | Type | Default | Description |
| --- | --- | --- | --- |
| `spec_path` | string | required | Path to your OpenAPI spec file |
| `base_url` | string | required | Target API base URL |
| `auth.type` | string | `"bearer"` | Auth type: `bearer`, `apikey`, `basic` |
| `auth.token_var` | string | `"auth_token"` | Environment variable name for the auth token |
| `load_test.default_vus` | int | `10` | Default virtual users for load tests |
| `coverage.fail_threshold` | float | `0.8` | Minimum endpoint coverage (0.0-1.0) |
| `reports.output_dir` | string | `"./reports"` | Directory for generated reports |
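Put together, the table above suggests a `config.example.yaml` along these lines. The nesting and filename follow the repo tree; the exact key layout is an assumption, not the kit's documented schema:

```yaml
# config.example.yaml — hypothetical layout inferred from the options table
spec_path: openapi.yaml
base_url: https://api.example.com/v1
auth:
  type: bearer           # bearer | apikey | basic
  token_var: auth_token  # env var holding the token; never stored in this file
load_test:
  default_vus: 10
coverage:
  fail_threshold: 0.8    # fail CI if fewer than 80% of endpoints are tested
reports:
  output_dir: ./reports
```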

Best Practices

  • Run contract tests on every PR, load tests on a schedule. Contract tests are fast. Load tests are expensive — run them nightly or before releases.
  • Use environment variables for secrets, never hardcode. The {{auth_token}} pattern in Postman and __ENV.API_TOKEN in k6 keep credentials out of test files.
  • Test error paths, not just happy paths. Include 401, 403, 404, 422, and 429 scenarios in your collection.
  • Set realistic think times in load tests. Without sleep() between requests, k6 generates unrealistic traffic patterns.
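The first practice above (load tests on a schedule, not on every PR) can be wired up with a scheduled workflow. This is a hypothetical companion to the kit's `ci/` configs; the workflow filename and the `grafana/setup-k6-action` step are assumptions:

```yaml
# ci/load-tests.yml — sketch of a nightly load-test workflow (assumed filename)
name: Nightly Load Tests
on:
  schedule:
    - cron: '0 2 * * *'   # 02:00 UTC daily, off-peak for most teams
jobs:
  load-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: grafana/setup-k6-action@v1
      - run: k6 run tests/load/soak-test.js --out json=results.json
        env:
          API_TOKEN: ${{ secrets.API_TOKEN }}   # keeps the token out of the repo
```

The `secrets` context here is the same keep-credentials-out-of-test-files pattern as `__ENV.API_TOKEN` in the k6 script.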

Troubleshooting

Newman exits with code 1 but all tests pass
Check for unhandled errors in pre-request scripts. A JavaScript exception in a pre-request script causes Newman to exit non-zero even if all test assertions pass. Add try/catch blocks to your pre-request scripts.

k6 reports "request timeout" on all requests
Verify the target URL is accessible from the machine running k6. If testing against a staging environment behind a VPN, k6 needs network access. Also check that --insecure-skip-tls-verify is set if using self-signed certificates.

Pact verification fails with "missing interaction"
Ensure the Pact contract file (JSON) is shared between consumer and provider. If using a Pact broker, verify both sides reference the same consumer/provider names and the contract version matches.

Coverage report shows 0% despite having tests
The coverage reporter matches tests to spec endpoints by HTTP method + path pattern. Ensure your Postman request URLs match the paths defined in your OpenAPI spec exactly, including path parameters.
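The last issue is almost always a path-normalization mismatch. A minimal sketch of the method-plus-path matching the reporter's description implies, assuming `{id}`-style (OpenAPI) and `:id`-style (Postman/Express) parameters should compare equal — function names are hypothetical:

```python
# Normalize path parameters so spec and collection paths compare equal.
import re

def normalize(path: str) -> str:
    """Collapse {id}-style and :id-style parameters into one placeholder."""
    path = re.sub(r"\{[^}]+\}", "{param}", path)      # OpenAPI style
    path = re.sub(r":[A-Za-z_]\w*", "{param}", path)  # Postman/Express style
    return path.rstrip("/") or "/"

def uncovered(spec_endpoints, tested_requests):
    """Return the (method, path) spec endpoints with no matching test."""
    tested = {(m.upper(), normalize(p)) for m, p in tested_requests}
    return {(m, p) for m, p in spec_endpoints
            if (m.upper(), normalize(p)) not in tested}
```

If a normalization pass like this still reports 0%, diff one normalized spec path against its normalized Postman counterpart by hand; trailing slashes and casing are the usual culprits.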


This is 1 of 7 resources in the API Developer Pro toolkit. Get the complete API Testing Automation kit with all files, templates, and documentation for $29.

Get the Full Kit →

Or grab the entire API Developer Pro bundle (7 products) for $79 — save 30%.

Get the Complete Bundle →

