⚡ Deep Dive: How `Promise.all` Works with API & DB Calls in Node.js

1. JavaScript vs Node.js Runtime

  • JavaScript (V8):

    • Single-threaded.
    • Executes code top-to-bottom.
    • Uses the microtask queue to resolve promises.
  • Node.js runtime:

    • Built on V8 + libuv.
    • libuv provides the event loop + thread pool.
    • Integrates with the OS networking stack for async I/O.

👉 Node.js achieves concurrency not by adding threads for JS, but by outsourcing work to the OS, DB engines, and libuv workers.


2. How Promise.all Works

const p = Promise.all([p1, p2, p3]);
  • Creates an aggregate promise p.
  • Subscribes to each input promise (p1, p2, p3).
  • Tracks results + pending count.
  • If all resolve → p resolves with an array of results.
  • If any reject → p rejects immediately.
  • Resolution runs in the microtask queue, before timers and I/O.
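
To make those bullets concrete, here is a minimal, simplified sketch of the counting logic (not the real engine implementation, which has more bookkeeping, but the same idea):

// Simplified sketch of Promise.all's aggregation logic
function promiseAll(inputs) {
  return new Promise((resolve, reject) => {
    const results = new Array(inputs.length);
    let pending = inputs.length;

    if (pending === 0) return resolve(results);

    inputs.forEach((input, i) => {
      // Wrap with Promise.resolve so plain values are accepted too
      Promise.resolve(input).then(
        (value) => {
          results[i] = value;              // keep input order, not completion order
          if (--pending === 0) resolve(results);
        },
        reject                             // first rejection settles the aggregate promise
      );
    });
  });
}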

3. API Calls with Promise.all

const fetch = require("node-fetch"); // on Node 18+ you can drop this and use the built-in global fetch

async function run() {
  const [r1, r2, r3] = await Promise.all([
    fetch("https://httpbin.org/delay/1"),
    fetch("https://httpbin.org/delay/1"),
    fetch("https://httpbin.org/delay/1")
  ]);
  console.log(await Promise.all([r1.json(), r2.json(), r3.json()]));
}
run();

What happens under the hood:

  1. fetch creates TCP sockets.
  2. OS handles DNS, TCP handshake, TLS.
  3. libuv registers sockets for readiness.
  4. When data arrives → libuv pushes I/O callbacks.
  5. Promises resolve via microtasks.

✅ Concurrency comes from the OS networking stack.


4. DB Queries with Promise.all

const { Pool } = require("pg");
const pool = new Pool();

async function run() {
  const [users, orders] = await Promise.all([
    pool.query("SELECT * FROM users LIMIT 5"),
    pool.query("SELECT * FROM orders LIMIT 5")
  ]);
  console.log(users.rows, orders.rows);
}
run();

What happens under the hood:

  1. Queries are written to DB sockets.
  2. PostgreSQL executes them concurrently, each query on its own backend process (one per pooled connection).
  3. Results are streamed back.
  4. libuv notifies event loop.
  5. Promises resolve, aggregated by Promise.all.

✅ Concurrency here is provided by the DB engine.


5. Event Loop & Microtasks

Node.js event loop phases:

  1. Timers → setTimeout, setInterval.
  2. Pending callbacks.
  3. Idle/prepare.
  4. Poll → new I/O events (e.g., socket readiness).
  5. Check → setImmediate.
  6. Close callbacks.

🔑 The microtask queue is drained between phases (and, since Node 11, after each individual callback) before the event loop moves on.
That’s why promise resolutions (including those from Promise.all) run before pending timers or I/O callbacks.
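
A tiny demo of that ordering — the promise callback (a microtask) always runs before the setTimeout callback (timers phase), even with a 0 ms delay:

setTimeout(() => console.log("timer (macrotask)"), 0);

Promise.resolve().then(() => console.log("promise (microtask)"));

console.log("synchronous");

// Output:
// synchronous
// promise (microtask)
// timer (macrotask)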


6. libuv: Networking vs Thread Pool

  • Networking I/O (fetch, DB sockets):

    • Uses OS mechanisms like epoll/kqueue/IOCP.
    • No extra threads involved → highly scalable.
  • Thread Pool:

    • Used when OS doesn’t provide async APIs.
    • Examples: fs.readFile, crypto.pbkdf2.
    • Default size = 4 (UV_THREADPOOL_SIZE).

👉 API calls & DB sockets rarely use threads — they rely on the OS and external servers for concurrency.


7. Benchmarks: Sequential vs Parallel

🔹 Sequential DB Queries

const { Pool } = require("pg");
const pool = new Pool();

async function sequential() {
  console.time("sequential");
  await pool.query("SELECT pg_sleep(1)");
  await pool.query("SELECT pg_sleep(1)");
  await pool.query("SELECT pg_sleep(1)");
  await pool.query("SELECT pg_sleep(1)");
  await pool.query("SELECT pg_sleep(1)");
  console.timeEnd("sequential");
}
sequential().then(() => pool.end());
  • Each query sleeps for 1 second.
  • Run sequentially → ~5 seconds total.

🔹 Parallel DB Queries

const { Pool } = require("pg");
const pool = new Pool();

async function parallel() {
  console.time("parallel");
  await Promise.all([
    pool.query("SELECT pg_sleep(1)"),
    pool.query("SELECT pg_sleep(1)"),
    pool.query("SELECT pg_sleep(1)"),
    pool.query("SELECT pg_sleep(1)"),
    pool.query("SELECT pg_sleep(1)")
  ]);
  console.timeEnd("parallel");
}
parallel().then(() => pool.end());
  • All queries fired at once.
  • Each sleeps for 1s → ~1 second total.

🔹 Results

Strategy     Queries   Time
Sequential   5 × 1s    ~5s
Parallel     5 × 1s    ~1s

🚀 Promise.all gives a 5× speedup by overlapping I/O work.


8. Practical Pitfalls

⚠️ API Rate Limits

  • Too many concurrent requests → HTTP 429 or throttling. ✅ Use concurrency control (p-limit, Bottleneck).
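
A minimal sketch using the p-limit package mentioned above (assuming it is installed via npm i p-limit; recent versions are ESM-only, hence the import syntax and Node 18+ global fetch; the URL list is illustrative):

// Cap concurrency at 5 requests in flight at any moment
import pLimit from "p-limit";

const limit = pLimit(5);

// Hypothetical list of 50 URLs to fetch
const urls = Array.from({ length: 50 }, (_, i) => `https://httpbin.org/delay/1?i=${i}`);

// Each fetch is wrapped by limit(), so only 5 run concurrently;
// Promise.all still collects every result in input order
const responses = await Promise.all(urls.map((url) => limit(() => fetch(url))));
console.log(responses.length); // 50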

⚠️ DB Pool Exhaustion

await Promise.all(bigArray.map(() => pool.query("...")));
  • Exceeds connection pool → queries queue → latency spikes. ✅ Tune pool size + run queries in batches.
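
One simple mitigation is to process the array in chunks no larger than the pool size. A rough sketch (queryInBatches, ids, and the users table are illustrative; tune batchSize to your pool’s max):

// Run at most `batchSize` queries concurrently
async function queryInBatches(pool, ids, batchSize = 10) {
  const results = [];
  for (let i = 0; i < ids.length; i += batchSize) {
    const batch = ids.slice(i, i + batchSize);
    // Only this batch runs in parallel; the next one starts after it finishes
    const batchResults = await Promise.all(
      batch.map((id) => pool.query("SELECT * FROM users WHERE id = $1", [id]))
    );
    results.push(...batchResults);
  }
  return results;
}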

⚠️ Error Handling

  • Promise.all fails fast on the first rejection. ✅ Use Promise.allSettled for resilience.
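
For example, with Promise.allSettled every outcome is reported instead of the whole batch failing on the first error (sketch assuming Node 18+ global fetch; the failing URL is just an example):

async function run() {
  const results = await Promise.allSettled([
    fetch("https://httpbin.org/delay/1"),
    fetch("https://does-not-exist.invalid") // rejects with a network error
  ]);

  for (const r of results) {
    if (r.status === "fulfilled") console.log("ok:", r.value.status);
    else console.log("failed:", r.reason.message);
  }
}
run();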

⚠️ Memory Usage

  • Large Promise.all → stores all intermediate results in memory. ✅ For very large workloads → stream results instead of batching everything.

9. Alternatives to Promise.all

  • Promise.allSettled → get all results, even if some fail.
  • Promise.any → resolve on the first success (useful for redundant services).
  • Promise.race → resolve/reject on the first settled (great for timeouts).
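
A common pattern built on Promise.race is a timeout guard — whichever settles first wins, the real request or the timeout rejection (sketch assuming Node 18+ global fetch):

function withTimeout(promise, ms) {
  const timeout = new Promise((_, reject) =>
    setTimeout(() => reject(new Error(`Timed out after ${ms} ms`)), ms)
  );
  // Race the real work against the timeout rejection
  return Promise.race([promise, timeout]);
}

withTimeout(fetch("https://httpbin.org/delay/3"), 2000)
  .then((res) => console.log("status:", res.status))
  .catch((err) => console.error(err.message));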

🔑 Final Takeaways

  • Promise.all coordinates async tasks; it doesn’t make JavaScript itself parallel.
  • API concurrency comes from OS networking.
  • DB concurrency comes from database engines.
  • Node.js event loop + libuv glue it all together.
  • Use with care: pool limits, API rate limits, memory usage matter.
  • Benchmarks show the real-world latency gains, but also highlight scalability risks.

💡 Closing Thought

Promise.all isn’t magic — it’s orchestration.
The “parallelism” you see comes from the OS kernel, libuv, and external systems.

An engineer’s job is not just to know that it “runs concurrently,” but to understand how much concurrency the system can handle before it breaks. That’s what separates toy demos from production-grade systems.
