Gabriel Anhaia
Native Iterator Helpers Just Shipped. Here's What You Stop Doing


You have a list of 50,000 users. You want the names of the first 10 active ones. You write the line you have written a hundred times:

const names = users
  .filter(u => u.active)
  .map(u => u.name)
  .slice(0, 10);

Three array allocations. filter walks all 50,000 entries and builds a new array of, say, 32,000 active users. map then walks those 32,000 and builds a new array of 32,000 names. slice keeps the first 10 and builds the final array. You needed 10 strings. You materialized about 64,000 intermediate array elements on the way, and the chain visited roughly 82,000 elements to do it.

Until last year, that was the price of Array.prototype chaining. There were two ways out: write an imperative for loop, or pull in lodash/fp, ramda, or iter-tools for lazy versions. Both of those answers are now stale.

TC39's iterator helpers proposal reached Stage 4 at the October 2024 plenary and was merged into ECMA-262 as part of ES2025. The methods are lazy by default. They run in Node 22 LTS and Node 24, in Bun 1.1.31 and later, in Deno 2, and in every current Chrome, Firefox, and Safari (Baseline Newly available, March 31 2025). TypeScript 5.6 added the type story with lib.esnext.iterator.d.ts and the IteratorObject interface.

So the line above stops being idiomatic. The replacement:

const names = users.values()
  .filter(u => u.active)
  .map(u => u.name)
  .take(10)
  .toArray();

.values() returns an iterator over the array. .filter, .map, and .take return iterator helpers — they do not materialize anything. .toArray is the only step that allocates, and it allocates exactly 10 strings. The runtime walks the source array until 10 active users have been seen and then stops. On a list of 50,000 users where the first 10 actives appear in the first 200, the loop visits 200 elements. The original chain visited around 82,000.

This is what you stop doing.

What actually shipped

The proposal adds methods to a single object: Iterator.prototype. Anything whose prototype chain passes through Iterator.prototype picks up the full set: the iterators returned by Array.prototype.values(), Set.prototype.values(), Map.prototype.entries(), and String.prototype[Symbol.iterator](), the generator object returned by every generator function, and the result of Iterator.from(anyIterable).

The lazy methods, the ones that return another iterator helper:

  • .map(fn) — yields fn(value) for each value
  • .filter(fn) — yields values where fn(value) is truthy
  • .take(n) — yields at most n values, then stops
  • .drop(n) — skips the first n values, then yields the rest
  • .flatMap(fn) — yields each item from the iterables fn returns

The eager methods, the ones that consume the iterator and return a final value:

  • .toArray() — collects everything into an Array
  • .reduce(fn, init?) — same shape as Array.prototype.reduce
  • .forEach(fn) — runs fn for side effects, returns undefined
  • .some(fn), .every(fn), .find(fn) — short-circuit predicates

Plus one static helper:

  • Iterator.from(value) — wraps any iterable (or iterator-like object) so you can chain helpers on it.

That is the full ES2025 surface. The async counterparts (AsyncIterator.prototype.map and friends) are a separate proposal still moving through TC39. Synchronous helpers shipped. Async helpers are coming.

Lazy versus eager: when it matters

The Array chain is eager. Every method allocates a fresh array and walks the whole input. The work is N + N + N + … where N is the input size.

An iterator chain is lazy. .filter, .map, and .take return immediately with an iterator object. Nothing happens until something pulls — a for…of, a .toArray(), a .reduce(), a .forEach(), or a [...spread]. The terminal call asks for the next value, the chain pulls one source element, threads it through every step, and yields one result.

Three places where the difference flips a program from "fine" to "fast":

  1. Early exit. .take(n) or .find() stops the source walk as soon as the answer is known. The Array version walks everything first.
  2. Infinite sequences. A generator can yield forever. With iterator helpers, .take(n) bounds the chain.
  3. One-pass sources. A response stream, an async generator, a database cursor — these cannot be re-walked. Iterator helpers consume them once.

The cost of laziness is per-element protocol overhead — each step is a method call returning an { value, done } object. For a tight chain over a small array of plain numbers, the Array version is sometimes faster on a microbenchmark. The win shows up the moment the input grows, you can short-circuit, or the source is not a fully-materialized array.

Five places iterator helpers replace something verbose

1. Pagination over a cursor API

The "fetch all pages and map them" loop used to need either a while with a mutable accumulator or a for await plus a manual push ladder. With Iterator.from and .flatMap, the shape collapses.

function* paginate(start: string | null) {
  let cursor: string | null = start;
  while (cursor !== null) {
    const page = fetchPageSync(cursor); // stand-in for a synchronous page fetch
    yield page.items;
    cursor = page.nextCursor;
  }
}

const firstHundredIds = Iterator.from(paginate(null))
  .flatMap(items => items.values())
  .map(item => item.id)
  .take(100)
  .toArray();

The generator yields one page of items at a time. .flatMap(items => items.values()) walks each page lazily. .take(100) stops the generator the moment 100 IDs are out. If the first hundred IDs live in the first two pages, the third page is never fetched.

2. Streaming JSON via NDJSON

Newline-delimited JSON is the format you hit when a server cannot afford a buffered array response. The classic shape is "read lines, parse, filter, collect." Iterator helpers compress it to a chain.

import { readFileSync } from "node:fs";

function* lines(text: string) {
  let from = 0;
  for (let i = 0; i < text.length; i++) {
    if (text.charCodeAt(i) === 10) {
      yield text.slice(from, i);
      from = i + 1;
    }
  }
  if (from < text.length) yield text.slice(from);
}

interface Event {
  type: string;
  user: string;
  ts: number;
}

const recentLogins = Iterator.from(lines(readFileSync("events.ndjson", "utf8")))
  .filter(line => line.length > 0)
  .map(line => JSON.parse(line) as Event)
  .filter(ev => ev.type === "login")
  .drop(10) // skip the first 10
  .take(50)
  .toArray();

The same chain works on any character source you can expose as a synchronous iterator. Async sources (a ReadableStream's reader, a node:readline interface, a websocket frame loop) will need the async helpers once they land; until then they take a hand-rolled for await. Either way the shape stays "split, parse, filter, slice, collect."

3. Infinite sequences without an upper bound in the source

Array.from({ length: N }, …) forces you to know N upfront. A generator does not.

function* fibonacci() {
  let [a, b] = [0n, 1n];
  while (true) {
    yield a;
    [a, b] = [b, a + b];
  }
}

const bigFibs = fibonacci()
  .filter(n => n > 1_000_000n)
  .take(5)
  .toArray();

// [1346269n, 2178309n, 3524578n, 5702887n, 9227465n]

The generator is infinite. .filter is infinite. The chain still terminates because .take(5) pulls exactly 5 satisfying values and then closes the upstream iterator. Without iterator helpers, you write the same logic with a while (results.length < 5) loop and a manual array, and you end up writing the bounding condition twice.
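For comparison, a sketch of that pre-helpers version, where the bound shows up once in the push and again in the break:

```javascript
function* fibonacci() {
  let [a, b] = [0n, 1n];
  while (true) {
    yield a;
    [a, b] = [b, a + b];
  }
}

// The bounding condition appears twice: once to collect, once to stop.
const results = [];
for (const n of fibonacci()) {
  if (n > 1_000_000n) results.push(n);
  if (results.length === 5) break;
}
// results is [1346269n, 2178309n, 3524578n, 5702887n, 9227465n]
```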

4. Early-exit predicate chains

Anywhere you previously wrote arr.some(x => predicate(transform(x))) and felt mildly guilty about the implicit double pass, the iterator-helper version is cheaper and reads cleaner.

interface Order {
  id: string;
  total: number;
  customer: { country: string };
}

function hasLargeEUOrder(orders: readonly Order[]): boolean {
  return orders.values()
    .filter(o => o.customer.country.startsWith("EU-"))
    .map(o => o.total)
    .some(total => total > 10_000);
}

.some short-circuits on the first true. The lazy chain in front of it only runs as far as needed. The Array version would .filter first, allocating a new array of every EU order, then .map, allocating again, and only then .some over the result. Same answer; two extra passes and two extra allocations.

5. Composable transforms over a Map or Set

Map.prototype.values() and Set.prototype.values() already returned iterators. Until 2024 you could not chain anything on them without spreading into an Array first. Now they get the full helper surface for free.

const eventCounts = new Map<string, number>([
  ["login", 1240],
  ["signup", 180],
  ["click", 90_400],
  ["purchase", 320],
]);

const topThree = eventCounts.entries()
  .filter(([, count]) => count >= 100)
  .map(([type, count]) => ({ type, count }))
  .toArray()
  .sort((a, b) => b.count - a.count)
  .slice(0, 3);

Note the shape: lazy iterator chain up to .toArray(), then standard Array operations for ordering. Iterator helpers do not include .sort because sorting is inherently eager — you cannot sort a stream without seeing all of it. The boundary is honest about that.

Where iterator helpers do not help

Five places to keep using Array methods or a plain for loop.

Small fixed-size arrays. A list of 8 menu items does not benefit from laziness. The Array chain is shorter to type and the per-element overhead of the iterator protocol is real, even if small. If your input has tens of items and there is no early exit, .filter().map() on the array is fine. Reach for iterator helpers when the input is large, the source is a stream, or the chain ends in .take / .find / .some.

You need to sort or reverse. No .toSorted, .toReversed, or .sort on iterators. You cannot order a stream you have not finished reading. Materialize with .toArray() and sort the array.

You need random access. Iterators are linear. If you need arr[i] mid-pipeline, stay on Array.

You need to walk the source twice. Iterators are single-use. Once arr.values(), a generator, or Iterator.from(generator) has been exhausted, it stays done; there is no rewind. If your pipeline forks ("count them and also collect them"), either materialize once with .toArray and walk the array twice, or use .reduce to do both in a single pass.

You are mid-async. The synchronous helpers do not work on async generators. ES2025 only shipped the sync set. The async iterator helpers proposal is at Stage 2.7 as of April 2026, and it is the natural completion of this work. Until it lands, async pipelines still use libraries (it-pipe, iter-tools, custom utilities) or hand-rolled for await loops.

The TypeScript 5.6 type story

If you tried iterator helpers between September and December 2024 and got blanket any returns or "Property 'map' does not exist on type 'IterableIterator<…>'" errors, the type story has cleaned up.

TypeScript 5.6 introduced IteratorObject<T, TReturn, TNext> and the new lib.esnext.iterator.d.ts. The shipped libs were updated so that Array.prototype.values(), Map.prototype.entries(), Set.prototype.values(), generator return types, and friends all return IteratorObject subtypes (ArrayIterator, MapIterator, SetIterator, Generator) rather than the old IterableIterator. Those IteratorObject subtypes carry the helper methods.

The minimal tsconfig to use them:

{
  "compilerOptions": {
    "target": "ES2024",
    "lib": ["ES2024", "ESNext.Iterator"],
    "module": "ESNext",
    "moduleResolution": "Bundler",
    "strict": true
  }
}

"ESNext.Iterator" pulls in the helper signatures without dragging in the rest of ESNext (which, depending on the TypeScript version you are on, may include things you do not want yet). If you are already on "lib": ["ESNext", …] you have it.

A typed pipeline:

interface User {
  id: string;
  name: string;
  active: boolean;
}

function activeNames(users: readonly User[]): string[] {
  return users.values()
    .filter(u => u.active)
    .map(u => u.name)
    .toArray();
}

The return type of users.values() is ArrayIterator<User>. .filter(predicate) narrows nothing by default — there is a known TypeScript limitation where the filter callback does not become a type predicate without an explicit annotation. If you want narrowing across the chain, write the predicate as one:

type ActiveUser = User & { active: true };

function activeNames(users: readonly User[]): string[] {
  return users.values()
    .filter((u): u is ActiveUser => u.active)
    .map(u => u.name)
    .toArray();
}

The other gotcha is BuiltinIteratorReturn. TypeScript 5.6 also added --strictBuiltinIteratorReturn, which (under --strict) makes the TReturn of every built-in iterator default to undefined instead of any. That is what you want — the return value of an iterator's done: true step is normally undefined and treating it as any was lying. If a generator you wrote returns a non-undefined value, annotate it.

Closing

Iterator helpers are not new thinking. Every language with first-class sequences has been chaining over iterators for fifteen years. They are now the platform default in JavaScript too, on every current runtime, with no library to import and no polyfill to drop. The synchronous side alone is enough to retire a generation of utility code; the first .filter().map().slice(0, …) chain you convert will produce a smaller diff than you expect, and, as in the opening example, the runtime can visit 200 elements where the array version visited 82,000.


If this was useful

TypeScript Essentials is the entry point of The TypeScript Library, a 5-book collection that covers the language from "I write some TS at work" to "I ship libraries that run on every JS runtime." The iterator-helper story sits in the daily-driver chapter — the same chapter that walks through narrowing, async, and the shape of a modern tsconfig. Start with Essentials for the language, jump to Type System for the deep-dive, take the Kotlin/Java or PHP bridge if those are your home languages, finish with In Production when you are shipping libraries.

  • TypeScript Essentials — entry point. Types, narrowing, modules, async, daily-driver tooling: Amazon
  • The TypeScript Type System — deep dive. Generics, mapped/conditional types, infer, template literals, branded types: Amazon
  • Kotlin and Java to TypeScript — bridge for JVM developers. Variance, null safety, sealed-to-unions, coroutines-to-async/await: Amazon
  • PHP to TypeScript — bridge for modern PHP 8+ developers. Sync-to-async paradigm, generics, discriminated unions: Amazon
  • TypeScript in Production — production layer. tsconfig, build tools, monorepos, library authoring, dual ESM/CJS, JSR: Amazon
  • Hermes IDE — an IDE for developers who ship with Claude Code and other AI coding tools: hermes-ide.com
  • Me: xgabriel.com | GitHub

All five books ship in ebook, paperback, and hardcover.

The TypeScript Library — the 5-book collection

Top comments (1)

Ben K.

Just a side note: Filter early. There's no need to map 32k items when only 10 will be used. Slice first, map after.