
Deveshwar Jaiswal

Posted on • Originally published at beyondcodekarma.in

The JS Event Loop Has a Model Gap: Here's What Most Tutorials Don't Show You

The standard explanation of the JavaScript Event Loop goes like this: there's a call stack,
there's a queue, and when the stack is empty the loop pulls the next item from the queue.

That model isn't wrong. It's incomplete in ways that will eventually burn you.


What the standard explanation misses

1. The microtask queue is not part of the loop cycle in the way the task queue is

Most diagrams show two queues with different priorities. The actual model is different.

After every task completes, the runtime runs the microtask checkpoint. This checkpoint
drains the microtask queue completely — including any microtasks queued by microtasks — before
the event loop selects the next task.

setTimeout(() => console.log('task'), 0);

Promise.resolve()
  .then(() => {
    console.log('microtask 1');
    return Promise.resolve();
  })
  .then(() => console.log('microtask 2'));

console.log('sync');

Output:

sync
microtask 1
microtask 2
task

The setTimeout callback doesn't run until every microtask — including chains — has settled.
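You can observe the drain directly with `queueMicrotask`. A minimal sketch (the `order` array is just an illustrative way to capture the sequence): a microtask scheduled *during* the checkpoint still runs in that same checkpoint, ahead of any task.

```javascript
const order = [];

setTimeout(() => order.push('task'), 0);

queueMicrotask(() => {
  order.push('microtask 1');
  // Queued while the checkpoint is already draining —
  // it still runs before the event loop picks the timer task.
  queueMicrotask(() => order.push('microtask 2'));
});

order.push('sync');

// A second timer, queued after the first, fires last and reports the order.
setTimeout(() => console.log(order.join(' → ')), 0);
// sync → microtask 1 → microtask 2 → task
```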

2. The render step sits between tasks, not between microtasks

The browser's render pipeline (style recalc, layout, paint) runs between tasks. Not between
microtasks.

This means: a long Promise chain is as harmful to your frame rate as a long synchronous
function. No heavy computation required. If you're queueing work in Promises and wondering
why your animations stutter, this is likely the reason.
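One common workaround is to break the work into chunks and yield between them via a *task*, not a microtask, so the render step gets a window between chunks. This is a sketch, not a definitive pattern; the function names are mine:

```javascript
// Yield by scheduling a task. A resolved Promise would NOT work here:
// the whole chain would drain before the loop advances to the render step.
function yieldToEventLoop() {
  return new Promise(resolve => setTimeout(resolve, 0));
}

async function processInChunks(items, handle, chunkSize = 500) {
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) handle(item);
    await yieldToEventLoop(); // render pipeline gets a chance to run here
  }
}
```

In browsers that ship the Scheduling API, `scheduler.yield()` is the purpose-built version of this same idea.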

3. setTimeout(fn, 0) and Promise.resolve().then(fn) are different lanes

Not different speeds. Different lanes with different rules.

  • setTimeout → task queue. One per loop iteration. Render can happen before the next one.
  • Promise.resolve().then → microtask queue. Entire chain runs before the loop advances.

Swapping one for the other is not a performance micro-optimisation. It changes execution
semantics.
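The two lanes are easy to demonstrate side by side. A small sketch (the `lanes` array is illustrative): schedule one callback in each lane and watch which arrives first.

```javascript
const lanes = [];

setTimeout(() => lanes.push('task lane'), 0);
Promise.resolve().then(() => lanes.push('microtask lane'));

// When the current script (itself a task) finishes, the microtask
// checkpoint drains first; only then does the loop take the timer task.
setTimeout(() => console.log(lanes.join(', ')), 0);
// microtask lane, task lane
```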


The mental model I use

I mapped this to Vedic karma and dharma — unconventional framing, but it makes the model stick.

Every async operation you schedule is karma: a consequence set in motion, deferred but
not cancelled. The event loop is dharma: the rule governing when consequences are processed.

  • Call stack = the present moment. One thing at a time.
  • Microtask queue = urgent karma. Drains immediately after the present resolves.
  • Task queue = future karma. Waits for a full loop cycle.
  • Render step = the breath between cycles. Can be held by urgent karma flooding.

It's a mnemonic layer, not a replacement for reading the WHATWG HTML spec. But it's why
the model stuck for me in a way no diagram had managed.


Interactive visualizer

I built a step-by-step visualizer: write any snippet, step through execution, and watch
every queue animate in real time. Microtask drain, render step timing, task queue behaviour
— all visible.

👉 Try the interactive visualizer →


If you're building production frontend and want to dig into performance or architecture →
I'm available for hire.
