Vishal Pandey
How I Took a Slow Node.js API from 5.7 s to 11 ms Using Real Load Tests (Beginner-Friendly)

Most tutorials show you “fast Node.js code,” but they never show you what happens when you actually measure performance under load.

I wanted to learn real backend optimization, so I built a small URL redirect API:

GET /:slug → returns original URL

Simple idea.
But the first time I load-tested it, the API took 5.73 seconds to respond.

This article is a beginner → mid-level friendly breakdown of:

  1. what slowed my API
  2. how I optimized it
  3. what actually improved performance
  4. real k6 load test results at every step
  5. and what beginners usually misunderstand about backend latency
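All the numbers below come from k6 runs. For context, the test at each stage looked roughly like this (a sketch: the target URL, slug, and load profile here are my assumptions, not the exact original script):

```javascript
// Minimal k6 load-test sketch. The URL, slug, and load profile are
// placeholders; adjust them to your own API.
import http from "k6/http";
import { check } from "k6";

export const options = {
  vus: 100,        // 100 concurrent virtual users
  duration: "30s", // sustained load for 30 seconds
};

export default function () {
  const res = http.get("http://localhost:3000/my-slug", {
    redirects: 0, // measure the API itself, not the redirect target
  });
  check(res, { "got a redirect": (r) => r.status === 302 });
}
```

Run it with `k6 run script.js` and compare the `http_req_duration` and RPS numbers between stages.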

Stage 1 — Baseline Test (No Index, Direct MySQL Connection)

Avg latency: 5.73 seconds
Requests: ~68 RPS

Here are the baseline numbers from the first k6 run:

[k6 screenshot: before-slug-index]

Why was it this slow?
MySQL was doing a full table scan for every slug lookup.
No index → MySQL compares the slug against every row → the worst possible query plan.
At high concurrency, MySQL was simply overwhelmed.
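You can feel the difference between a full scan and an indexed lookup with a tiny in-memory analogy (illustration only: the array stands in for the table, the `Map` stands in for the index — MySQL's B-tree index is O(log n) rather than O(1), but the idea is the same):

```javascript
// Illustration: full table scan vs. indexed lookup.
const table = Array.from({ length: 100_000 }, (_, i) => ({
  slug: `slug-${i}`,
  url: `https://example.com/${i}`,
}));

// Full table scan: O(n). Checks every row until it finds a match.
function fullScan(slug) {
  return table.find((row) => row.slug === slug) ?? null;
}

// "Index": built once, then each lookup is O(1). Roughly what a
// B-tree index gives MySQL (O(log n) there, but the same principle).
const index = new Map(table.map((row) => [row.slug, row]));
function indexedLookup(slug) {
  return index.get(slug) ?? null;
}

console.log(fullScan("slug-99999").url);      // https://example.com/99999
console.log(indexedLookup("slug-99999").url); // same result, without the scan
```

At 68 RPS, the baseline API was effectively running `fullScan` a hundred thousand rows deep on every request.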

Stage 2 — Added a Slug Index

Avg latency: 27.1 ms
Requests: ~18k RPS

Index command:

```sql
CREATE INDEX idx_slug ON urls(slug);
```

[k6 screenshot: after-slug-index]

After adding the index:
This alone brought latency down from 5.73 seconds to 27 ms.
For a read-heavy API, a proper index is usually the single biggest improvement you can make.
Beginners underestimate how critical indexing is.
No amount of Node.js code optimization can beat a proper DB index.

Stage 3 — Switched from mysql.createConnection to MySQL2 Connection Pool

Avg latency: ~18 ms
Before this, every request created a new MySQL connection.
That’s expensive under load.
So I switched to a pool:

```javascript
import mysql2 from "mysql2/promise";

// One shared pool for the whole app; connections are reused across requests.
export const db = mysql2.createPool({
  host,
  user,
  password,
  database,
  connectionLimit: 10, // max concurrent connections to MySQL
});
```

After switching to pooling:

[k6 screenshot: after-pooling]

Why this helps:

- Reuses open connections
- Avoids the TCP and authentication handshake overhead of a new connection per request
- Much better concurrency
- MySQL stays stable under load

If you're using MySQL with Node.js, connection pools are mandatory.
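The core idea of a pool fits in a few lines. This is a deliberately tiny sketch, not mysql2's actual implementation (a real pool also queues waiters when exhausted, validates idle connections, and so on):

```javascript
// Minimal connection-pool sketch: pay the expensive setup cost at most
// `limit` times, then reuse connections across requests.
class TinyPool {
  constructor(createConn, limit = 10) {
    this.createConn = createConn;
    this.limit = limit;
    this.idle = [];
    this.size = 0;
  }

  acquire() {
    if (this.idle.length > 0) return this.idle.pop(); // reuse: no setup cost
    if (this.size < this.limit) {
      this.size += 1;
      return this.createConn(); // expensive setup, done at most `limit` times
    }
    throw new Error("pool exhausted"); // a real pool queues the request instead
  }

  release(conn) {
    this.idle.push(conn); // hand the connection back for the next request
  }
}

// 1000 sequential "requests" end up sharing a single connection.
let created = 0;
const pool = new TinyPool(() => ({ id: ++created }), 10);
for (let i = 0; i < 1000; i++) {
  const conn = pool.acquire();
  // ... run a query with `conn` ...
  pool.release(conn);
}
console.log(created); // 1
```

Without the pool, that loop would have paid the connection setup cost 1000 times, which is exactly what `mysql.createConnection` per request was doing under load.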

Stage 4 — Added Redis Cache (Final Optimization)

Avg latency: 11.75 ms
Requests: ~42k RPS
Redis turns your slug lookup into an O(1) in-memory lookup.
Updated handler:

```javascript
app.get("/:slug", async (req, res) => {
  const slug = req.params.slug;

  // 1. Cache hit: answer straight from Redis, no MySQL round trip.
  const cached = await redis.get(slug);
  if (cached) return res.redirect(cached);

  // 2. Cache miss: fall back to the indexed MySQL lookup.
  const [rows] = await db.query("SELECT url FROM urls WHERE slug = ?", [slug]);
  if (!rows.length) return res.status(404).send("Not found");

  // 3. Populate the cache for the next request (a TTL is worth adding here).
  await redis.set(slug, rows[0].url);
  return res.redirect(rows[0].url);
});
```
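This is the classic cache-aside pattern. Stripped of Express, Redis, and MySQL, it boils down to the following (illustration only: a `Map` stands in for Redis, a plain object for MySQL):

```javascript
// Cache-aside pattern from the handler above, with in-memory stand-ins.
const cache = new Map(); // "Redis"
const database = { "my-slug": "https://example.com/target" }; // "MySQL"
let dbHits = 0;

async function resolveSlug(slug) {
  const cached = cache.get(slug); // 1. try the cache first
  if (cached) return cached;

  dbHits += 1; // 2. cache miss: hit the "database"
  const url = database[slug];
  if (!url) return null;

  cache.set(slug, url); // 3. populate the cache for next time
  return url;
}

(async () => {
  await resolveSlug("my-slug"); // miss: goes to the database
  await resolveSlug("my-slug"); // hit: served from the cache
  console.log(dbHits); // 1
})();
```

Under load, almost every request for a popular slug becomes a pure in-memory hit, which is why the database stops being the bottleneck.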

k6 results after Redis:

[k6 screenshot: after-redis]

From 5.7 seconds → 11 ms.
This is a ~520x improvement, and the API's public behavior never changed: only the internals did.

🔗 GitHub Repository (Full Source Code)

You can check out the entire project here:

👉 GitHub Link

🙏 Thanks for Reading
If you have questions, want to discuss backend engineering, or just want to say hi —
I’m always open to messages.
