Blocking vs Non-Blocking Code in Node.js

Pratham

Posted on

The difference between a server that freezes under load and one that handles thousands of users effortlessly.


Let me show you two versions of the same server. Both read a file and send it to the client. Both produce the correct result. But one can handle 10,000 users and the other chokes at 50.

// Version A — blocking
const data = fs.readFileSync("large-file.txt", "utf8");
res.send(data);

// Version B — non-blocking
fs.readFile("large-file.txt", "utf8", (err, data) => {
  res.send(data);
});

The only difference? readFileSync vs readFile. One suffix — Sync — and your server's ability to handle concurrent users goes from thousands to barely any.

Understanding why this matters is one of the most important things I've learned about Node.js in the ChaiCode Web Dev Cohort 2026. Let me break it down.


What Does Blocking Code Mean?

Blocking code means the program stops and waits for an operation to finish before moving to the next line. Nothing else can run during the wait.

The Waiting Analogy

Imagine you're at a coffee shop. You order a latte. In the blocking model, you stand at the counter, staring at the barista, doing absolutely nothing until your coffee is in your hand. The person behind you? They can't even order. They wait until you walk away.

YOU → Order → STAND AND WAIT → Get coffee → Leave
                    ↑
              3 minutes of doing NOTHING
              Next person CAN'T ORDER until you're done

Blocking in Node.js

const fs = require("fs");

console.log("1. Starting...");

// BLOCKING — Node.js STOPS here until the file is fully read
const data = fs.readFileSync("bigfile.txt", "utf8");

console.log("2. File read complete. Size:", data.length);
console.log("3. Now I can continue...");

Output:

1. Starting...
(... frozen for however long the file read takes ...)
2. File read complete. Size: 1048576
3. Now I can continue...

Between line 1 and line 2, the entire Node.js process is frozen. No other code runs. No other requests are processed. No events are handled. The single thread is stuck waiting for the hard drive.

Blocking Execution Timeline

Main Thread:
────────────────────────────────────────────────────

  0ms                                          200ms
  │                                             │
  ▼                                             ▼
  [log "Starting"]  [██████ BLOCKED ██████]  [log "File read"]  [log "Continue"]
                     ↑                   ↑
                     File read starts     File read ends

  NOTHING else can run during this entire block.
  If 100 requests arrive during these 200ms, they ALL wait.

What Does Non-Blocking Code Mean?

Non-blocking code means the program starts an operation, moves on immediately, and handles the result later when it's ready. The thread stays free for other work.

The Continuing Analogy

Same coffee shop, but this time: you order your latte, the barista gives you a buzzer, and you go sit down, check your phone, chat with a friend. When the buzzer goes off, you pick up your coffee. Meanwhile, the person behind you ordered right after you — no waiting.

YOU → Order → Get buzzer → SIT DOWN (do other things)
                              ↓
                         Buzzer goes off!
                              ↓
                         Pick up coffee

NEXT PERSON → Order → Get buzzer → sit down (immediately)

Non-Blocking in Node.js

const fs = require("fs");

console.log("1. Starting...");

// NON-BLOCKING — Node.js starts the read and MOVES ON immediately
fs.readFile("bigfile.txt", "utf8", (err, data) => {
  if (err) throw err;
  console.log("3. File read complete. Size:", data.length);
});

console.log("2. Not waiting — doing other work!");

Output:

1. Starting...
2. Not waiting — doing other work!
3. File read complete. Size: 1048576

Notice the order: 1, 2, 3 — not 1, 3, 2. Line 2 runs immediately after the file read is started. Node.js didn't wait. When the file is ready, the callback fires and handles the result.

Non-Blocking Execution Timeline

Main Thread:
────────────────────────────────────────────────────

  0ms    1ms                                  200ms
  │      │                                     │
  ▼      ▼                                     ▼
  [log]  [log "Not waiting"]  ...FREE...  [callback: "File read complete"]

  Background:
  [████████████████ File being read ████████████████]
  0ms                                           200ms

  Main thread was FREE from 1ms to 200ms.
  It could handle hundreds of other requests during that time!

Why Blocking Slows Servers

This is where the difference becomes critical. In a standalone script, blocking is annoying but survivable. In a server, blocking is catastrophic.

Scenario: 3 Users Request a File

Blocking Server

const http = require("http");
const fs = require("fs");

const server = http.createServer((req, res) => {
  // BLOCKING — each request freezes the server
  const data = fs.readFileSync("page.html", "utf8");
  res.end(data);
});

server.listen(3000);
User A requests → Server reads file (BLOCKS 100ms) → Response sent
                  (Users B and C are WAITING)

User B requests → Server reads file (BLOCKS 100ms) → Response sent
                  (User C is still WAITING)

User C requests → Server reads file (BLOCKS 100ms) → Response sent

Timeline:
─────────────────────────────────────────────────────────
  0ms        100ms       200ms       300ms
  │           │           │           │
  User A ████████████                           → response at 100ms
              User B ████████████                → response at 200ms
                          User C ████████████    → response at 300ms

Total: User C waited 300ms for a 100ms operation.
       Each user's wait = their position × file read time.

Non-Blocking Server

const http = require("http");
const fs = require("fs");

const server = http.createServer((req, res) => {
  // NON-BLOCKING — server stays free between reads
  fs.readFile("page.html", "utf8", (err, data) => {
    res.end(data);
  });
});

server.listen(3000);
User A requests → Start file read → move on
User B requests → Start file read → move on
User C requests → Start file read → move on

  ...all three file reads happen concurrently...

File A ready → Send response to User A
File B ready → Send response to User B
File C ready → Send response to User C

Timeline:
─────────────────────────────────────────────────────────
  0ms  1ms  2ms                                    ~100ms
  │    │    │                                        │
  A    B    C                                      A✓ B✓ C✓
  start all three                              all respond ~100ms

Total: ALL users got responses in ~100ms.
       No one waited for anyone else.

The Impact at Scale

Concurrent Users | Blocking Server          | Non-Blocking Server
-----------------|--------------------------|--------------------
1                | 100ms                    | 100ms
10               | 1,000ms (last user)      | ~100ms (all users)
100              | 10,000ms = 10 seconds 💀 | ~100ms (all users)
1,000            | 100 seconds 💀💀          | ~100ms (all users)

The blocking server gets linearly slower with each user. The non-blocking server stays consistently fast.
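The table above follows from simple arithmetic: on a single thread, blocked time stacks up, so the last of n users waits roughly n × t, while concurrent I/O keeps everyone near t. A toy model of that math (the 100ms per-request time is an assumed figure, matching the examples above):

```javascript
// Toy latency model: n = concurrent users, t = per-request I/O time in ms
function lastUserLatency(n, t, blocking) {
  // Blocking: requests are serialized, so the nth user waits for all n reads.
  // Non-blocking: all reads are in flight together, so everyone waits ~t.
  return blocking ? n * t : t;
}

for (const n of [1, 10, 100, 1000]) {
  console.log(
    `${n} users → blocking: ${lastUserLatency(n, 100, true)}ms, ` +
      `non-blocking: ~${lastUserLatency(n, 100, false)}ms`
  );
}
```

This is a deliberately simplified model — it ignores disk contention and CPU work — but it captures why blocking latency grows linearly with load.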


Async Operations in Node.js

Node.js provides non-blocking versions of most operations. Here are the most common patterns:

File System — Blocking vs Non-Blocking

const fs = require("fs");

// ❌ BLOCKING
const data = fs.readFileSync("config.json", "utf8");
const stats = fs.statSync("config.json");
fs.writeFileSync("output.txt", "Hello");
fs.mkdirSync("new-folder");

// ✅ NON-BLOCKING (callback)
fs.readFile("config.json", "utf8", (err, data) => { /* ... */ });
fs.stat("config.json", (err, stats) => { /* ... */ });
fs.writeFile("output.txt", "Hello", (err) => { /* ... */ });
fs.mkdir("new-folder", (err) => { /* ... */ });

// ✅ NON-BLOCKING (promises — modern; await requires an async function)
const fsPromises = require("fs").promises;
const data2 = await fsPromises.readFile("config.json", "utf8");
const stats2 = await fsPromises.stat("config.json");
await fsPromises.writeFile("output.txt", "Hello");
await fsPromises.mkdir("new-folder");

The Pattern

Every blocking fs method ends with Sync. The non-blocking versions either:

  • Accept a callback as the last argument
  • Return a Promise via fs.promises
Blocking:     fs.readFileSync()     returns data directly, blocks thread
Non-blocking: fs.readFile()         returns immediately, data comes in callback
Promise:      fs.promises.readFile()  returns Promise, use with await

Database Calls — Always Non-Blocking

Database drivers in Node.js are inherently non-blocking:

// MongoDB — non-blocking
const user = await db.collection("users").findOne({ name: "Pratham" });

// PostgreSQL — non-blocking
const result = await pool.query("SELECT * FROM users WHERE id = $1", [1]);

// These use await, but they DON'T block the thread.
// Other requests continue being processed while the DB query runs.

HTTP Requests — Always Non-Blocking

// fetch — non-blocking
const response = await fetch("https://api.example.com/data");
const data = await response.json();

// The thread is FREE while waiting for the API response.

Timers — Non-Blocking by Design

// setTimeout — non-blocking
setTimeout(() => {
  console.log("This runs later");
}, 2000);
console.log("This runs NOW");

// Output: "This runs NOW" → (2 seconds) → "This runs later"

Real-World Example: API Server

Let's build a realistic example that shows blocking vs non-blocking in a server handling multiple operations:

Blocking Version — Everything Sequential

const http = require("http");
const fs = require("fs");

const server = http.createServer((req, res) => {
  // Step 1: Read user data (BLOCKS 50ms)
  const userData = fs.readFileSync("users.json", "utf8");
  const users = JSON.parse(userData);

  // Step 2: Read config (BLOCKS 30ms)
  const configData = fs.readFileSync("config.json", "utf8");
  const config = JSON.parse(configData);

  // Step 3: Read template (BLOCKS 20ms)
  const template = fs.readFileSync("template.html", "utf8");

  // Total: 100ms BLOCKED per request
  res.end(template.replace("{{users}}", JSON.stringify(users)));
});

server.listen(3000);
Per-request timeline:
[██ users 50ms ██][██ config 30ms ██][██ template 20ms ██] → Response

Total: 100ms blocked. Thread frozen the entire time.
10 concurrent requests = 1 second for the last user.

Non-Blocking Version — Everything Concurrent

const http = require("http");
const fs = require("fs").promises;

const server = http.createServer(async (req, res) => {
  try {
    // All three reads start at the SAME TIME
    const [userData, configData, template] = await Promise.all([
      fs.readFile("users.json", "utf8"),
      fs.readFile("config.json", "utf8"),
      fs.readFile("template.html", "utf8"),
    ]);

    const users = JSON.parse(userData);
    const config = JSON.parse(configData);

    res.end(template.replace("{{users}}", JSON.stringify(users)));
  } catch (error) {
    res.writeHead(500);
    res.end("Server error");
  }
});

server.listen(3000);
Per-request timeline:
[██ users 50ms ████████████████████]
[██ config 30ms ██████████]          ← all three run concurrently
[██ template 20ms ████]
                                     → Response at ~50ms (slowest read)

Total: ~50ms instead of 100ms. Thread free during I/O.
10 concurrent requests ≈ still ~50ms for everyone.

By using Promise.all(), all three file reads happen concurrently. The total time is the slowest operation (50ms), not the sum of all operations (100ms).


When Is Blocking Code Acceptable?

Blocking isn't always bad. There are specific situations where it's fine:

✅ OK to Block

// 1. Application startup — before the server starts listening
const config = fs.readFileSync("config.json", "utf8");
const settings = JSON.parse(config);
// No users are waiting yet — blocking is fine here.

app.listen(3000); // NOW the server starts accepting requests

// 2. CLI scripts that run once and exit
const data = fs.readFileSync(process.argv[2], "utf8");
console.log(data.length, "characters");
// Single-user, single-run — no concurrency needed

❌ Never Block

// Inside request handlers — NEVER use Sync methods
app.get("/data", (req, res) => {
  const data = fs.readFileSync("data.json", "utf8"); // ❌ BLOCKS ALL USERS
  res.json(JSON.parse(data));
});

// Inside event handlers
socket.on("message", (msg) => {
  const log = fs.readFileSync("log.txt", "utf8"); // ❌ BLOCKS EVENT LOOP
  // ...
});

The Simple Rule

During startup → Blocking is fine (no users yet)
During runtime → NEVER block (users are waiting)

Let's Practice: Hands-On Assignment

Part 1: Measure the Difference

const fs = require("fs");

// Create a test file first
fs.writeFileSync("testfile.txt", "x".repeat(10_000_000));

// BLOCKING
console.time("Blocking");
for (let i = 0; i < 5; i++) {
  fs.readFileSync("testfile.txt", "utf8");
}
console.timeEnd("Blocking");

// NON-BLOCKING
console.time("Non-blocking");
let completed = 0;
for (let i = 0; i < 5; i++) {
  fs.readFile("testfile.txt", "utf8", () => {
    completed++;
    if (completed === 5) console.timeEnd("Non-blocking");
  });
}

Run this and compare the times. The non-blocking run will typically finish sooner because the five reads happen concurrently on libuv's thread pool (four threads by default) instead of one after another.

Part 2: See Server Impact

const http = require("http");
const fs = require("fs");

// Create a large file
fs.writeFileSync("large.txt", "data\n".repeat(1_000_000));

const server = http.createServer((req, res) => {
  if (req.url === "/blocking") {
    const data = fs.readFileSync("large.txt", "utf8");
    res.end(`Blocking: ${data.length} chars\n`);
  }

  if (req.url === "/non-blocking") {
    fs.readFile("large.txt", "utf8", (err, data) => {
      if (err) throw err;
      res.end(`Non-blocking: ${data.length} chars\n`);
    });
  }
});

server.listen(3000, () => {
  console.log("Test: open /blocking in one tab, then /non-blocking quickly");
  console.log("Notice how /blocking makes /non-blocking wait!");
});

Part 3: Convert Blocking to Non-Blocking

Take this blocking code and convert it:

// ❌ Blocking version
const users = JSON.parse(fs.readFileSync("users.json", "utf8"));
const posts = JSON.parse(fs.readFileSync("posts.json", "utf8"));
console.log(`${users.length} users, ${posts.length} posts`);

// ✅ Your task: rewrite using async/await and Promise.all()
// Hint: use fs.promises.readFile

Key Takeaways

  1. Blocking code stops the thread and waits for the operation to finish. Nothing else can run during the wait — including handling other user requests.
  2. Non-blocking code starts the operation, moves on immediately, and handles the result via callbacks, Promises, or async/await when it's ready.
  3. In a server context, blocking is catastrophic — one slow operation freezes every user. Non-blocking lets the server handle thousands of concurrent requests by staying free during I/O.
  4. Every Sync method in Node.js is blocking. Use the callback or Promise version inside request handlers. Only use Sync during application startup.
  5. Use Promise.all() for independent operations that can run concurrently — the total time is the slowest operation, not the sum of all.

Wrapping Up

The difference between blocking and non-blocking is the difference between a server that handles 50 users and one that handles 50,000. It's not about writing different logic — it's about choosing the right version of the same operation. readFileSync vs readFile. Waiting vs continuing. Frozen vs free.

I'm learning all of this through the ChaiCode Web Dev Cohort 2026 under Hitesh Chaudhary and Piyush Garg. Once you internalize this pattern — start operation, move on, handle result later — you'll write non-blocking code naturally. And your servers will thank you.

Connect with me on LinkedIn or visit PrathamDEV.in. More articles on the way as the backend journey continues.

Happy coding! 🚀


Written by Pratham Bhardwaj | Web Dev Cohort 2026, ChaiCode
