DEV Community

Nicu Micle

Meet parallelHTTP — A Simple Tool to Stress-Test Your APIs

In a world where backend systems and APIs are the backbone of many web services, testing their performance and robustness under load is essential. That's where parallelHTTP comes in — a lightweight, open-source tool designed to make it easy to send many HTTP requests in parallel, see how your server behaves, measure latency and success rate, and export results for further analysis.

What is parallelHTTP?

parallelHTTP is a minimalistic but functional tool that lets you:

  • Send multiple HTTP requests in parallel.
  • Configure method (GET, POST, etc.), endpoint URL, request body, and timeouts.
  • Track response times and status codes.
  • Get an aggregated summary: success/error counts, average latency, latency percentiles.
  • Export responses to CSV for later inspection.

It ships with multiple "interfaces", depending on how you want to use it:

  • Web UI: for interactive testing via browser.
  • CLI mode: quick testing from terminal — ideal for automation, scripts, or CI.
  • Docker image: to containerize usage, making integration with other tooling easier.
  • REST API endpoint: if you prefer to trigger load tests programmatically or integrate with other systems, parallelHTTP offers that too.

Why Parallel HTTP Requests — And How parallelHTTP Fits In

Making many HTTP requests in parallel is a common approach when you want to stress test, benchmark, or load-test APIs or web services. Rather than issuing requests serially (one after the other), parallel requests let you simulate concurrent clients, which is often closer to real-world usage, especially under load or when many clients access your service simultaneously.

There are many ways to implement parallelism: e.g. using threading, asynchronous programming, or language-specific libraries. For instance, in Python you might use threads or async libraries to parallelize HTTP calls.
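To make that concrete, here is a minimal Python sketch of the pattern: a thread pool firing several GET requests concurrently and timing each one. The local test server exists only to keep the snippet self-contained — point `url` at your own endpoint in practice.

```python
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass

# Throwaway local server so the example runs anywhere; port 0 = pick a free port.
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/test"

def call(_):
    """Issue one GET and return (status_code, latency_seconds)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=2) as resp:
        status = resp.status
    return status, time.perf_counter() - start

# Five concurrent clients, mirroring something like --parallel=5.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(call, range(5)))

server.shutdown()
for status, latency in results:
    print(status, f"{latency * 1000:.2f}ms")
```

This is roughly what any hand-rolled load script ends up doing; the next paragraph is why you may not want to keep rewriting it.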

But the appeal of parallelHTTP is that it bundles everything together — UI, CLI, REST API — so you don't need to write custom scripts or glue code. For developers or QA engineers who just want to spin up a quick load test, parallelHTTP provides an "out-of-the-box" solution.

How to install parallelHTTP

You can use parallelHTTP via binary, Docker, Web UI, or CLI.

1. Install Using Binaries (CLI mode)

Download the latest release from:

👉 GitHub Releases

Run:

./parallelhttp --help

This prints the available flags:

  -duration duration
        Max duration for all calls. Example: 0->no limit, 1ms, 1s, 10m
  -endpoint string
        Request endpoint to be called.
  -format string
        Response format. One of: text, yaml, json. Default json. (default "json")
  -method string
        Request Method. Default GET. (default "GET")
  -parallel int
        Number of parallel calls. Default 1. (default 1)
  -timeout duration
        Request timeout. Default 10s

Usage example:

./parallelhttp \
  --endpoint=http://localhost:8080/test \
  --parallel=5 \
  --method=GET \
  --timeout=2s \
  --duration=10s \
  --format=json 

This will output:

{
  "requests": [
    {
      "response": {
        "status_code": 200,
        "time": "2025-12-02T04:39:26.450811405+01:00",
        "duration": 176680135,
        "duration_h": "176.680135ms"
      },
      "error": null,
      "error_message": null
    },
    {
      "response": {
        "status_code": 200,
        "time": "2025-12-02T04:39:26.450838753+01:00",
        "duration": 177105875,
        "duration_h": "177.105875ms"
      },
      "error": null,
      "error_message": null
    },
    {
      "response": {
        "status_code": 200,
        "time": "2025-12-02T04:39:26.450989804+01:00",
        "duration": 176999320,
        "duration_h": "176.99932ms"
      },
      "error": null,
      "error_message": null
    },
    {
      "response": {
        "status_code": 200,
        "time": "2025-12-02T04:39:26.450761076+01:00",
        "duration": 177158817,
        "duration_h": "177.158817ms"
      },
      "error": null,
      "error_message": null
    },
    {
      "response": {
        "status_code": 200,
        "time": "2025-12-02T04:39:26.450879196+01:00",
        "duration": 179940733,
        "duration_h": "179.940733ms"
      },
      "error": null,
      "error_message": null
    }
  ],
  "stats": {
    "start_time": "2025-12-02T04:39:26.450727731+01:00",
    "end_time": "2025-12-02T04:39:26.630824982+01:00",
    "duration": "180.097251ms",
    "latency": {
      "p50": "177.105875ms",
      "p90": "179.940733ms",
      "p99": "179.940733ms"
    }
  }
}
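The JSON output above is easy to post-process. Here is a small sketch that computes a success rate and pulls out the p99 latency — the payload below is a trimmed, made-up sample following the structure shown above; in a real run you would read the tool's stdout instead.

```python
import json

# Trimmed, made-up sample of parallelHTTP's JSON output; in practice,
# read the tool's stdout, e.g. output = json.load(sys.stdin).
output = json.loads("""
{
  "requests": [
    {"response": {"status_code": 200, "duration_h": "176.680135ms"},
     "error": null, "error_message": null},
    {"response": {"status_code": 500, "duration_h": "179.940733ms"},
     "error": null, "error_message": null}
  ],
  "stats": {
    "latency": {"p50": "177.105875ms", "p90": "179.940733ms", "p99": "179.940733ms"}
  }
}
""")

# Count a request as successful when there was no transport error and
# the status code is below 400.
ok = sum(
    1 for r in output["requests"]
    if r["error"] is None and r["response"]["status_code"] < 400
)
total = len(output["requests"])
print(f"success rate: {ok}/{total}")
print("p99 latency:", output["stats"]["latency"]["p99"])
```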

2. Web UI

docker run --rm -p 8080:8080 -it nicumicle/parallelhttp

Open in browser:
👉 http://localhost:8080

You will see:

[Screenshot: parallelHTTP Web UI]

Once you select an endpoint and click "Run", you will get the results:

Use Cases — When parallelHTTP Shines

Here are a few scenarios where using parallelHTTP can be especially useful:

  • API performance testing: see how your endpoint behaves when hit with 100, 1,000, or 10,000 concurrent requests — parallelHTTP simulates the load and reports response times, errors, and timeouts.

  • Benchmarking new server versions: deploy a new version and compare latency or error rate with the previous one.

  • Stress testing/load testing before production: especially useful if you're about to launch a feature or expect a spike in traffic.

  • Regression testing for stability under load: integrate into CI/CD to automatically run parallel tests after changes.

  • Exporting detailed metrics for analysis: thanks to CSV export and aggregated stats, you can plug results into spreadsheets or graphs to track trends.

Final Thoughts

parallelHTTP serves a very real need: a straightforward tool that sends parallel HTTP requests, measures latency and status codes, and exports the results. For developers, QA engineers, or anyone needing quick API stress tests, it's a neat addition to the toolbox.

If you’re curious, check the repository and play with your APIs.
