Sometimes, code feels like a hike. You're blazing through logic, climbing hills of complexity, and then—bam—you hit a dense, tangled forest. That's where I was the first time I met JavaScript generators.
I was building a data loader, pulling results in batches from an API that couldn't hand me everything at once. I didn't want to buffer it all in memory, but I didn't want to write some mess of manual state and index-tracking either.
A colleague glanced over and said, “Why not just use a generator?”
What followed was a rabbit hole, but a good one.
## What Are Iterators, Really?
Before we talk about generators, let’s meet their simpler cousin: iterators.
An iterator is just an object that knows how to step through a sequence, one value at a time.
It has a `.next()` method that returns objects like:

```javascript
{ value: 'next thing', done: false }
```

And when it's out of things to give you:

```javascript
{ value: undefined, done: true }
```
In short, it’s a protocol. Anything that follows this shape is an iterator.
You’ve probably used one without realizing it:
```javascript
const arr = [1, 2, 3];
const iterator = arr[Symbol.iterator]();

console.log(iterator.next()); // { value: 1, done: false }
console.log(iterator.next()); // { value: 2, done: false }
```
But why would you write one?
Because sometimes you want to control how iteration happens.
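For instance, you can make any plain object iterable by giving it a `[Symbol.iterator]()` method. Here's a sketch (the `range` object is made up for the demo) that `for...of` and spread will happily walk:

```javascript
// A plain object made iterable by implementing the protocol by hand.
const range = {
  from: 1,
  to: 3,
  [Symbol.iterator]() {
    let current = this.from;
    const last = this.to;
    return {
      next() {
        return current <= last
          ? { value: current++, done: false }
          : { value: undefined, done: true };
      }
    };
  }
};

console.log([...range]); // [1, 2, 3]
```

You decide what "next" means — counting, filtering, fetching — and the language's iteration machinery does the rest.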
## Why Not Just Use Arrays or Objects?
Arrays and objects are great when:
- You have all the data up front
- You want to loop quickly and repeatedly
- You need random access (e.g. `arr[42]`)
But they can fall short when:
- You’re dealing with huge or infinite datasets
- You need to calculate values on demand
- You want to pause and resume a process
- You’re modeling a stateful process, not just a collection
Generators shine in lazy, sequential, and process-driven scenarios, especially when you don’t want to (or can’t) load or compute everything all at once.
Think of arrays as a box of things and generators as a factory that makes things when asked.
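To make the analogy concrete, here's a small sketch: the array exists in full before you touch it, while the generator computes each value only when asked (the `squares` generator is invented for the demo):

```javascript
// The "box": every value exists up front.
const box = [1, 4, 9, 16];

// The "factory": values are computed one at a time, on demand.
function* squares(limit) {
  for (let n = 1; n <= limit; n++) {
    yield n * n; // nothing runs until .next() is called
  }
}

const factory = squares(4);
console.log(factory.next().value); // 1
console.log(factory.next().value); // 4
```

Nothing after the second `yield` has executed yet — the factory only makes what you ask for.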
## Custom Iterators: Manual but Precise
Let’s say you want to build a countdown from 5 to 1. You could do this:
```javascript
function createCountdown(start) {
  let current = start;
  return {
    next() {
      if (current > 0) {
        return { value: current--, done: false };
      } else {
        return { done: true };
      }
    }
  };
}

const countdown = createCountdown(5);
console.log(countdown.next()); // { value: 5, done: false }
```
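One wrinkle with the hand-rolled version: it's an iterator but not an *iterable*, so `for...of` and spread won't accept it. A common fix is to add a `[Symbol.iterator]()` method that returns `this` — a sketch:

```javascript
function createCountdown(start) {
  let current = start;
  return {
    next() {
      return current > 0
        ? { value: current--, done: false }
        : { value: undefined, done: true };
    },
    // Returning `this` makes the iterator usable in for...of and spread.
    [Symbol.iterator]() {
      return this;
    }
  };
}

console.log([...createCountdown(3)]); // [3, 2, 1]
```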
It’s a bit clunky. Lots of boilerplate.
That’s where generators shine.
## Generators: Laziness With Style
A generator is a special function you can pause and resume. It's written with `function*` and uses the `yield` keyword to produce values one at a time.
Here’s that same countdown, generator-style:
```javascript
function* countdown(start) {
  while (start > 0) {
    yield start--;
  }
}

const counter = countdown(5);
console.log(counter.next()); // { value: 5, done: false }
```
Each call to `.next()` resumes the function where it left off. It's like a paused Netflix show that starts right where you stopped.
No boilerplate. Just pure flow.
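A nice bonus over the hand-rolled iterator: generator objects are iterable out of the box, so `for...of` and spread just work:

```javascript
function* countdown(start) {
  while (start > 0) {
    yield start--;
  }
}

console.log([...countdown(3)]); // [3, 2, 1]

for (const n of countdown(2)) {
  console.log(n); // 2, then 1
}
```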
## When You Actually Need Them

One more power move worth knowing first: generator composition. You can delegate to another generator using `yield*`. This is useful for breaking up logic into reusable pieces.
```javascript
function* inner() {
  yield 'a';
  yield 'b';
}

function* outer() {
  yield 'start';
  yield* inner();
  yield 'end';
}

[...outer()]; // ['start', 'a', 'b', 'end']
```
This keeps your generators modular and expressive, especially when building sequences or behaviors out of smaller building blocks.
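Worth knowing: `yield*` isn't limited to generators — it delegates to *any* iterable, including arrays and strings. A quick sketch:

```javascript
function* mixed() {
  yield* [1, 2]; // delegate to an array
  yield* 'ab';   // delegate to a string, character by character
  yield 'done';
}

console.log([...mixed()]); // [1, 2, 'a', 'b', 'done']
```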
Cool tricks, sure. But when would you actually use this in real life?
### 1. Lazy Evaluation
Great for large datasets or infinite sequences.
```javascript
function* fibonacci() {
  let [a, b] = [0, 1];
  while (true) {
    yield a;
    [a, b] = [b, a + b];
  }
}

const fib = fibonacci();
console.log(fib.next().value); // 0
console.log(fib.next().value); // 1
```
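Since the sequence is infinite, you never iterate it to completion — you slice off what you need. A small `take` helper (not part of the language, just a common utility) makes that tidy:

```javascript
function* fibonacci() {
  let [a, b] = [0, 1];
  while (true) {
    yield a;
    [a, b] = [b, a + b];
  }
}

// Pull at most `n` values from any iterable, then stop.
function* take(iterable, n) {
  let count = 0;
  for (const value of iterable) {
    if (count++ >= n) return;
    yield value;
  }
}

console.log([...take(fibonacci(), 6)]); // [0, 1, 1, 2, 3, 5]
```

Because `take` returns early, the infinite generator is simply abandoned — no wasted computation.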
### 2. Controlling Async Flow (Pre-async/await)

Before `async/await` landed, libraries like `co` used generators to handle async flow:
```javascript
// Run with a generator runner (like co), which resolves each
// yielded promise and passes the result back in.
function* loadData() {
  const users = yield fetch('/api/users');
  const posts = yield fetch('/api/posts');
  return [users, posts];
}
```
### 3. State Machines with `.next(value)`
Generators let you pass values back into the generator.
```javascript
function* loginFlow() {
  const username = yield "What's your username?";
  const password = yield "What's your password?";
  const confirmation = yield `Confirm login for ${username}? (yes/no)`;

  if (confirmation.toLowerCase() === "yes") {
    return `Logged in as ${username}`;
  } else {
    return `Login cancelled.`;
  }
}

const flow = loginFlow();
console.log(flow.next().value);          // What's your username?
console.log(flow.next("alice").value);   // What's your password?
console.log(flow.next("hunter2").value); // Confirm login for alice? (yes/no)
console.log(flow.next("yes").value);     // Logged in as alice
```
This can power forms, wizards, chatbots, or turn-based logic.
## Async Generators: Lazy + Async
Until now, our generators have been synchronous. You call `.next()`, and it gives you a value. But what if each value needs to be awaited? That's where async generators come in.
```javascript
async function* fetchPages(start = 1) {
  let page = start;
  while (true) {
    const res = await fetch(`/api/items?page=${page}`);
    const data = await res.json();
    if (!data.length) break;
    yield data;
    page++;
  }
}

// Inside an async function (or a module with top-level await):
for await (const items of fetchPages()) {
  process(items);
}
```
Key differences:
- `async function*` lets you `await` inside the generator
- `for await...of` iterates over the async values outside
Perfect for paginated APIs, streaming responses, or infinite scroll.
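To see the shape run without a real API, here's a self-contained sketch that fakes the network: the `pages` array and `fakeFetchPage` helper are invented stand-ins for the `fetch` + `res.json()` pair above:

```javascript
// Made-up in-memory "API": two pages of data, then an empty page.
const pages = [['a', 'b'], ['c'], []];

// Hypothetical stand-in for fetch + res.json().
async function fakeFetchPage(page) {
  return pages[page - 1] ?? [];
}

async function* fetchPagesDemo(start = 1) {
  let page = start;
  while (true) {
    const data = await fakeFetchPage(page);
    if (!data.length) break;
    yield data;
    page++;
  }
}

(async () => {
  for await (const items of fetchPagesDemo()) {
    console.log(items); // ['a', 'b'], then ['c']
  }
})();
```

The consumer never sees pagination — it just iterates, and the generator quietly fetches the next page when needed.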
## Gotchas With Generators
They’re elegant, but not without quirks.
### 1. Generator objects are one-time use
```javascript
const gen = countdown(3);
[...gen]; // [3, 2, 1]
[...gen]; // [] — already exhausted
```
You need to call the function again for a fresh run.
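If you want something you can spread repeatedly, one pattern is to wrap the generator in an iterable object whose `[Symbol.iterator]` creates a fresh generator each time — a sketch:

```javascript
function* countdown(start) {
  while (start > 0) yield start--;
}

// An iterable that hands out a fresh generator on every iteration.
const reusable = {
  [Symbol.iterator]() {
    return countdown(3);
  }
};

console.log([...reusable]); // [3, 2, 1]
console.log([...reusable]); // [3, 2, 1] — works again
```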
### 2. Error handling is manual

You can throw an error into a generator with `.throw()`, but if the generator isn't expecting it, the error will crash it.
```javascript
function* fragile() {
  try {
    yield 'safe so far';
    yield 'still good';
  } catch (err) {
    console.log('Caught inside generator:', err.message);
  }
}

const g = fragile();
console.log(g.next()); // { value: 'safe so far', done: false }
console.log(g.throw(new Error("boom")));
// logs: Caught inside generator: boom
```
This pattern is especially helpful in real-world scenarios where you want to cancel a generator mid-flow—like aborting a multi-step process, retrying a failed step, or reacting to a timeout from the outside.
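For cancellation specifically, there's also `.return()`, which ends a generator early from the outside; a `try...finally` inside gives you a cleanup hook. A sketch:

```javascript
function* steps() {
  try {
    yield 'step 1';
    yield 'step 2';
    yield 'step 3';
  } finally {
    // Runs whether the generator finishes normally or is cancelled early.
    console.log('cleaning up');
  }
}

const s = steps();
console.log(s.next().value);   // 'step 1'
console.log(s.return('stop')); // logs 'cleaning up', then { value: 'stop', done: true }
console.log(s.next());         // { value: undefined, done: true }
```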
### 3. Not always worth it

Generators are powerful, but overkill if you just need to loop through an array. If you're not taking advantage of lazy evaluation, pausing execution, or streaming results, a plain `for` loop or `.map()` is usually simpler and more readable.
## A Final Thought

Generators feel like a hidden language inside JavaScript. A gentler kind of loop, a quieter kind of power. They let you model flows that don't fit into `map`, `for`, or `reduce`.
Most of the time, you don’t need them.
But when you do, they make the code read like a story instead of a switchboard.
And sometimes, that’s exactly what you want.
Enjoyed this? Follow for more dev musings, quiet power tools, and lessons from the weird corners of the stack.