Executive Summary
TL;DR: A Dart developer's streaming JSON parser in TypeScript, using event-emitter callbacks, is not idiomatic for modern TS. The article guides the transition from callback-based patterns to Async Iterators using `for await…of` loops, offering adapter patterns or full refactors for cleaner, more maintainable code.
🎯 Key Takeaways
- Modern TypeScript favors Async Iterators (`for await…of`) over event-emitter callbacks (`onValue`, `onError`, `onDone`) for handling data streams, leading to cleaner and flatter code.
- The Adapter Pattern allows wrapping an existing callback-based streaming parser with an `AsyncGenerator` to immediately provide an idiomatic `for await…of` API without a full refactor.
- Refactoring to a native `async function*` is the most idiomatic solution, enabling direct `yield` of parsed values and standard `try/catch` error handling for streaming JSON.
A senior engineer breaks down the shift from event-driven patterns to modern async iterators, guiding a Dart developer on how to write truly “idiomatic” TypeScript for a streaming JSON parser.
From Dart to TypeScript: Is Your API “Idiomatic” or Just Lost in Translation?
I remember a 2 AM incident call. The auth-service-v2 was melting down, and the on-call, a sharp junior dev, was completely lost. The service was written in Node.js, but it was structured like a Spring Boot application, complete with dependency injection containers and factory patterns that felt completely alien. The original author was a Java dev who brought their patterns over, and while the code *worked*, nobody on the team knew how to debug it. It was a ghost ship in our own fleet. This is the exact feeling I get when I see code that’s technically correct but culturally foreign. It’s not about being “wrong,” it’s about being maintainable for the team you’re on.
The Core of the Problem: Callbacks vs. Iterators
So, you’ve come from Dart and built a slick streaming parser in TypeScript. You used an event-emitter style API with `onValue`, `onError`, and `onDone` callbacks. This is a classic, battle-tested pattern. It’s how things were done in Node.js for years.
However, the JavaScript/TypeScript world has evolved significantly with the introduction of async/await. The modern, idiomatic way to handle streams of data isn’t through callbacks, but through Async Iterators. They let you treat a stream of data just like an array, using a simple `for await…of` loop. This makes the code cleaner, easier to reason about, and avoids the nesting that can come with callbacks.
Your API isn’t *bad*, it just speaks an older dialect. Let’s get you fluent in the modern tongue.
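To see what that looks like in practice, here is a tiny self-contained sketch: `numberStream` is a made-up stand-in for any async data source, and the consumer reads it with a plain `for await…of` loop, no callbacks in sight.

```typescript
// A minimal async generator standing in for a streaming source.
// The values and delay are illustrative, not the article's parser.
async function* numberStream(): AsyncGenerator<number> {
  for (const n of [1, 2, 3]) {
    // Simulate an async chunk arriving
    await new Promise((resolve) => setTimeout(resolve, 1));
    yield n;
  }
}

async function main(): Promise<number[]> {
  const received: number[] = [];
  // The consumer reads the stream like an array
  for await (const value of numberStream()) {
    received.push(value);
  }
  return received;
}

main().then((received) => console.log(received.join(','))); // prints "1,2,3"
```

Any object implementing the async iteration protocol (including Node.js Readable streams) can be consumed exactly this way.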
Three Ways to Bridge the Gap
Here are three ways to tackle this, from a quick patch to a full-blown refactor.
1. The Quick Fix: The Adapter Pattern
Let’s be real: you don’t always have time for a full rewrite, especially if the core logic is complex. The fastest way to make your existing parser feel more idiomatic is to wrap it. Create a function that instantiates your event-based parser and returns an `AsyncGenerator`. This acts as an adapter, translating the “old” event style into the “new” iterator style without touching the core implementation.
```typescript
// Your existing parser class (simplified)
class StreamingParser {
  constructor() { /* ... */ }
  write(chunk: Buffer | string) { /* ... */ }
  end() { /* ... */ }
  on(event: 'value' | 'error' | 'done', callback: (...args: any[]) => void) { /* ... */ }
}

// The adapter function
export async function* parseJsonStream(
  readable: NodeJS.ReadableStream
): AsyncGenerator<unknown> {
  const parser = new StreamingParser();

  // A little queue to buffer values between consumer pulls
  const queue: unknown[] = [];
  let done = false;
  let error: Error | null = null;
  let notify: (() => void) | null = null;
  const wake = () => {
    notify?.();
    notify = null;
  };

  parser.on('value', (value: unknown) => {
    queue.push(value);
    wake();
  });
  parser.on('error', (err: Error) => {
    error = err;
    wake();
  });
  parser.on('done', () => {
    done = true;
    wake();
  });

  // Pipe the source readable stream into the parser
  readable.on('data', (chunk) => parser.write(chunk));
  readable.on('end', () => parser.end());

  while (true) {
    // Drain everything we have before checking for errors or completion
    while (queue.length > 0) {
      yield queue.shift();
    }
    if (error) throw error;
    if (done) return;
    // Wait for the next event
    await new Promise<void>((resolve) => { notify = resolve; });
  }
}
```
Pro Tip: This is a great non-destructive strategy. You can ship the adapter immediately to provide an idiomatic API for new consumers, while planning a deeper refactor of the core class for a future release. It keeps everyone happy.
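To make the adapter pattern concrete end to end, here is a runnable miniature. The `LineParser` class is a toy stand-in (it splits text on newlines instead of parsing JSON), and `toAsyncIterable` and `demo` are illustrative names, but the event-queue-to-generator mechanics are the same as above.

```typescript
import { EventEmitter } from 'node:events';

// Toy stand-in for the article's parser: emits each complete line
// as a 'value' event, then 'done'. Purely illustrative.
class LineParser extends EventEmitter {
  private buffer = '';
  write(chunk: string): void {
    this.buffer += chunk;
    let i: number;
    while ((i = this.buffer.indexOf('\n')) !== -1) {
      this.emit('value', this.buffer.slice(0, i));
      this.buffer = this.buffer.slice(i + 1);
    }
  }
  end(): void {
    if (this.buffer.length > 0) this.emit('value', this.buffer);
    this.emit('done');
  }
}

// The adapter: callback events in, async iteration out
async function* toAsyncIterable(
  parser: LineParser,
  chunks: string[]
): AsyncGenerator<string> {
  const queue: string[] = [];
  let done = false;
  let notify: (() => void) | null = null;
  const wake = () => { notify?.(); notify = null; };

  parser.on('value', (v: string) => { queue.push(v); wake(); });
  parser.on('done', () => { done = true; wake(); });

  // Feed the parser asynchronously so parsing and iteration interleave
  (async () => {
    for (const chunk of chunks) {
      await Promise.resolve();
      parser.write(chunk);
    }
    parser.end();
  })();

  while (true) {
    while (queue.length > 0) yield queue.shift()!;
    if (done) return;
    await new Promise<void>((resolve) => { notify = resolve; });
  }
}

async function demo(): Promise<string[]> {
  const out: string[] = [];
  // Note the chunk boundary splitting a "record" in half
  for await (const line of toAsyncIterable(new LineParser(), ['a\nb', '\nc'])) {
    out.push(line);
  }
  return out;
}
```

Running `demo()` yields the lines in order even though chunk boundaries don't align with line boundaries, which is exactly the property you want from the real parser.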
2. The “Right” Way: The Full Refactor to an Async Generator
This is the goal. You refactor the parser’s internal logic to be a native `async function*`. This eliminates the need for managing event listeners and state manually. The `yield` keyword effectively “pauses” your function and hands a value back to the consumer, resuming only when the consumer asks for the next item in the `for await…of` loop.
The change in consumer code is dramatic and beautiful:
**Before (Callback Style):**

```typescript
const parser = new StreamingParser();
parser.on('value', (val) => { console.log('Got a value:', val); });
parser.on('error', (err) => { console.error('Oh no:', err); });
parser.on('done', () => { console.log('All done!'); });
stream.pipe(parser);
```

**After (Idiomatic Async Iterator):**

```typescript
try {
  for await (const value of parseJsonStream(stream)) {
    console.log('Got a value:', value);
  }
  console.log('All done!');
} catch (err) {
  console.error('Oh no:', err);
}
```
The refactored code is flat, uses standard try/catch for error handling, and is much easier to follow. Your core parser function would look something like this (conceptual):
```typescript
export async function* parseJsonStream(
  stream: AsyncIterable<Buffer | string>
): AsyncGenerator<unknown> {
  let buffer = '';
  // ... other state variables ...
  for await (const chunk of stream) {
    buffer += chunk.toString();
    // Loop to find and parse complete JSON objects from the buffer
    while (true) {
      const result = findAndParseJsonObject(buffer);
      if (!result) break;                     // Need more data
      yield result.value;                     // Send a value to the consumer
      buffer = buffer.slice(result.endIndex); // Drop the consumed prefix
    }
  }
  // Handle any remaining data in the buffer...
}
```
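If the hypothetical `findAndParseJsonObject` helper feels abstract, here is a self-contained sketch for the common newline-delimited JSON (NDJSON) case, where each complete line is one JSON value. The newline framing is a deliberate simplification; a real streaming parser would replace it with an incremental JSON tokenizer. `parseNdjsonStream`, `demoChunks`, and `collectIds` are illustrative names, not from the article.

```typescript
// Sketch: an async-generator parser for newline-delimited JSON (NDJSON).
// Assumes one JSON value per line -- a simplification of full streaming JSON.
async function* parseNdjsonStream(
  chunks: AsyncIterable<string>
): AsyncGenerator<unknown> {
  let buffer = '';
  for await (const chunk of chunks) {
    buffer += chunk;
    let newlineIndex: number;
    // Emit every complete line currently in the buffer
    while ((newlineIndex = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, newlineIndex).trim();
      buffer = buffer.slice(newlineIndex + 1);
      if (line.length > 0) {
        yield JSON.parse(line); // Bad JSON throws into the consumer's try/catch
      }
    }
  }
  // Flush a trailing value that wasn't newline-terminated
  const rest = buffer.trim();
  if (rest.length > 0) {
    yield JSON.parse(rest);
  }
}

// Demo source: chunk boundaries deliberately split a JSON object in half
async function* demoChunks(): AsyncGenerator<string> {
  yield '{"id": 1}\n{"id"';
  yield ': 2}\n{"id": 3}';
}

async function collectIds(): Promise<number[]> {
  const ids: number[] = [];
  for await (const obj of parseNdjsonStream(demoChunks())) {
    ids.push((obj as { id: number }).id);
  }
  return ids;
}
```

Here `collectIds()` resolves to `[1, 2, 3]`: the second object is reassembled across the chunk boundary before being parsed, which is the whole point of buffering inside the generator.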
3. The “Nuclear” Option: Bring in RxJS
Sometimes, you’re not just parsing one stream. You’re combining, filtering, mapping, and debouncing multiple streams of data. If your use case is genuinely complex (think real-time analytics pipelines, not just parsing a file from prod-db-01), then trying to manage it with async iterators alone can get messy.
This is where a library like RxJS comes in. It provides a powerful vocabulary (Observables, Operators) for handling complex asynchronous event streams. It’s the “enterprise-grade” solution.
```typescript
import { fromEvent, merge, throwError } from 'rxjs';
import { mergeMap, takeUntil } from 'rxjs/operators';

const parser = new StreamingParser(); // Your original class (must be EventEmitter-compatible for fromEvent)

const value$ = fromEvent(parser, 'value');
const done$ = fromEvent(parser, 'done');
// Route 'error' events into the stream so subscribers see them in `error:`,
// rather than throwing from a separate subscription
const error$ = fromEvent(parser, 'error').pipe(
  mergeMap((err) => throwError(() => err))
);

merge(value$, error$).pipe(
  takeUntil(done$), // Stop listening when the 'done' event fires
  // ... other powerful operators like filter(), map(), debounceTime() ...
).subscribe({
  next: (value) => console.log('Got value:', value),
  error: (err) => console.error('Stream error:', err),
  complete: () => console.log('Stream complete!'),
});
```
Warning: Don’t reach for this first. RxJS is a powerful tool, but it has a steep learning curve and adds a significant dependency. It’s like using a sledgehammer to hang a picture frame for simple cases. But when you need to orchestrate multiple complex streams, it’s a lifesaver.
So, What’s the “Right” Answer, Darian?
For your situation, Solution #2 is the destination. It produces the most idiomatic, maintainable, and modern TypeScript code that any developer on your team will immediately understand. Start with Solution #1 if you need to ship a better API today without a big refactor.
“Idiomatic” code isn’t just about following rules. It’s about empathy. It’s about writing code that aligns with the expectations and patterns of the ecosystem, so the next person on call at 2 AM (who might be you!) can solve the problem instead of fighting the code’s dialect. Welcome to TypeScript. We’re glad to have you.
Read the original article on TechResolve.blog