DEV Community

Darian Vance

Posted on • Originally published at wp.me

Solved: From Dart to TypeScript: Is Your API “Idiomatic” or Just Lost in Translation?

🚀 Executive Summary

TL;DR: A Dart developer’s streaming JSON parser in TypeScript, using event-emitter callbacks, is not idiomatic for modern TS. The article guides transitioning from callback-based patterns to Async Iterators using for await…of loops, offering adapter patterns or full refactors for cleaner, more maintainable code.

🎯 Key Takeaways

  • Modern TypeScript favors Async Iterators (for await…of) over event-emitter callbacks (onValue, onError, onDone) for handling data streams, leading to cleaner and flatter code.
  • The Adapter Pattern allows wrapping an existing callback-based streaming parser with an AsyncGenerator to immediately provide an idiomatic for await…of API without a full refactor.
  • Refactoring to a native async function* is the most idiomatic solution, enabling direct yield of parsed values and standard try/catch error handling for streaming JSON.

A senior engineer breaks down the shift from event-driven patterns to modern async iterators, guiding a Dart developer on how to write truly “idiomatic” TypeScript for a streaming JSON parser.

From Dart to TypeScript: Is Your API “Idiomatic” or Just Lost in Translation?

I remember a 2 AM incident call. The auth-service-v2 was melting down, and the on-call, a sharp junior dev, was completely lost. The service was written in Node.js, but it was structured like a Spring Boot application, complete with dependency injection containers and factory patterns that felt completely alien. The original author was a Java dev who brought their patterns over, and while the code *worked*, nobody on the team knew how to debug it. It was a ghost ship in our own fleet. This is the exact feeling I get when I see code that’s technically correct but culturally foreign. It’s not about being “wrong,” it’s about being maintainable for the team you’re on.

The Core of the Problem: Callbacks vs. Iterators

So, you’ve come from Dart and built a slick streaming parser in TypeScript. You used an event-emitter style API with onValue, onError, and onDone callbacks. This is a classic, battle-tested pattern. It’s how things were done in Node.js for years.
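For concreteness, here is a minimal, self-contained sketch of that callback style. This is not the original parser — the onValue/onError/onDone names match the pattern above, but the newline-delimited JSON framing is an assumption for illustration:

```typescript
// A toy callback-based streaming parser: feed it chunks of
// newline-delimited JSON; it fires a callback per parsed value.
class CallbackParser {
  private buffer = "";
  private valueCb: (v: unknown) => void = () => {};
  private errorCb: (e: Error) => void = () => {};
  private doneCb: () => void = () => {};

  onValue(cb: (v: unknown) => void) { this.valueCb = cb; }
  onError(cb: (e: Error) => void) { this.errorCb = cb; }
  onDone(cb: () => void) { this.doneCb = cb; }

  write(chunk: string) {
    this.buffer += chunk;
    let idx: number;
    while ((idx = this.buffer.indexOf("\n")) !== -1) {
      const line = this.buffer.slice(0, idx).trim();
      this.buffer = this.buffer.slice(idx + 1);
      if (!line) continue;
      try {
        this.valueCb(JSON.parse(line));
      } catch (e) {
        this.errorCb(e as Error);
      }
    }
  }

  end() {
    // Flush any trailing value, then signal completion.
    if (this.buffer.trim()) {
      try { this.valueCb(JSON.parse(this.buffer)); }
      catch (e) { this.errorCb(e as Error); }
    }
    this.doneCb();
  }
}
```

Notice how control flow is scattered across three separate handlers — exactly what the iterator style flattens into one loop.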

However, the JavaScript/TypeScript world has evolved significantly with the introduction of async/await. The modern, idiomatic way to handle streams of data isn’t through callbacks, but through Async Iterators. They let you treat a stream of data just like an array, using a simple for await…of loop. This makes the code cleaner, easier to reason about, and avoids the nesting that can come with callbacks.
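As a toy illustration (not taken from the parser itself), an async generator can be consumed exactly like an array:

```typescript
// An async source that yields numbers one at a time, simulating
// chunks arriving over time.
async function* numbers(limit: number): AsyncGenerator<number> {
  for (let i = 1; i <= limit; i++) {
    await new Promise(resolve => setTimeout(resolve, 1));
    yield i;
  }
}

async function sum(limit: number): Promise<number> {
  let total = 0;
  // No callbacks, no nesting: just a loop over the stream.
  for await (const n of numbers(limit)) {
    total += n;
  }
  return total;
}
```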

Your API isn’t *bad*, it just speaks an older dialect. Let’s get you fluent in the modern tongue.

Three Ways to Bridge the Gap

Here are three ways to tackle this, from a quick patch to a full-blown refactor.

1. The Quick Fix: The Adapter Pattern

Let’s be real: you don’t always have time for a full rewrite, especially if the core logic is complex. The fastest way to make your existing parser feel more idiomatic is to wrap it. Create a function that instantiates your event-based parser and returns an AsyncGenerator. This acts as an adapter, translating the “old” event style into the “new” iterator style without touching the core implementation.

// Your existing parser class (simplified)
class StreamingParser {
  constructor() { /* ... */ }
  write(chunk) { /* ... */ }
  end() { /* ... */ }
  on(event, callback) { /* ... */ }
}

// The adapter function
export async function* parseJsonStream(readable) {
  const parser = new StreamingParser();

  // A small queue buffers values emitted between iterator pulls;
  // resolvePromise wakes the loop when the next event arrives
  const queue = [];
  let done = false;
  let error = null;
  let resolvePromise = () => {};

  parser.on('value', (value) => {
    queue.push(value);
    resolvePromise();
  });

  parser.on('error', (err) => {
    error = err;
    resolvePromise();
  });

  parser.on('done', () => {
    done = true;
    resolvePromise();
  });

  // Pipe the source readable stream into the parser
  readable.on('data', (chunk) => parser.write(chunk));
  readable.on('end', () => parser.end());
  readable.on('error', (err) => { error = err; resolvePromise(); });

  while (true) {
    // Drain everything that has arrived — including values queued
    // just before the 'done' event fired
    while (queue.length > 0) {
      yield queue.shift();
    }

    if (error) {
      throw error;
    }

    if (done) {
      return;
    }

    // Park until the next event wakes us up
    await new Promise(resolve => { resolvePromise = resolve; });
  }
}

Pro Tip: This is a great non-destructive strategy. You can ship the adapter immediately to provide an idiomatic API for new consumers, while planning a deeper refactor of the core class for a future release. It keeps everyone happy.
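To see the wrapping technique in isolation, here is a stripped-down, self-contained miniature of the same adapter, using Node’s built-in EventEmitter as a stand-in for the parser. All names here are illustrative, not part of the original code:

```typescript
import { EventEmitter } from "node:events";

// Wrap any 'value'/'error'/'done' emitter into an AsyncGenerator.
function toAsyncIterable<T>(source: EventEmitter): AsyncGenerator<T> {
  const queue: T[] = [];
  let done = false;
  let error: Error | null = null;
  let wake = () => {};

  source.on("value", (v: T) => { queue.push(v); wake(); });
  source.on("error", (e: Error) => { error = e; wake(); });
  source.on("done", () => { done = true; wake(); });

  return (async function* () {
    while (true) {
      // Drain buffered values before checking for error/completion
      while (queue.length > 0) yield queue.shift()!;
      if (error) throw error;
      if (done) return;
      // Park until the next event wakes us up
      await new Promise<void>(resolve => { wake = resolve; });
    }
  })();
}
```

A consumer then just writes `for await (const v of toAsyncIterable(source)) { ... }` — the event plumbing is invisible.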

2. The “Right” Way: The Full Refactor to an Async Generator

This is the goal. You refactor the parser’s internal logic to be a native async function*. This eliminates the need for managing event listeners and state manually. The yield keyword effectively “pauses” your function and hands a value back to the consumer, resuming only when the consumer asks for the next item in the for await…of loop.

The change in consumer code is dramatic and beautiful:

Before (Callback Style):

const parser = new StreamingParser();

parser.on('value', (val) => {
  console.log('Got a value:', val);
});
parser.on('error', (err) => {
  console.error('Oh no:', err);
});
parser.on('done', () => {
  console.log('All done!');
});

stream.pipe(parser);

After (Idiomatic Async Iterator):

try {
  for await (const value of parseJsonStream(stream)) {
    console.log('Got a value:', value);
  }
  console.log('All done!');
} catch (err) {
  console.error('Oh no:', err);
}

The refactored code is flat, uses standard try/catch for error handling, and is much easier to follow. Your core parser function would look something like this (conceptual):

export async function* parseJsonStream(stream) {
  let buffer = '';
  // ... other state variables ...

  for await (const chunk of stream) {
    buffer += chunk.toString();

    // Loop to find and parse complete JSON objects from the buffer
    while (true) {
      const result = findAndParseJsonObject(buffer);

      if (result) {
        yield result.value; // Send a value to the consumer
        buffer = buffer.slice(result.endIndex); // Consume buffer
      } else {
        break; // Need more data
      }
    }
  }
  // Handle any remaining data in the buffer...
}
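The helper findAndParseJsonObject is left undefined above. Here is one hedged sketch of what it might do — scan the buffer for the first complete top-level object by tracking brace depth, taking care not to count braces inside string literals:

```typescript
interface ParseHit {
  value: unknown;   // the parsed JSON object
  endIndex: number; // index just past the object, for slicing the buffer
}

function findAndParseJsonObject(buffer: string): ParseHit | null {
  const start = buffer.indexOf("{");
  if (start === -1) return null;

  let depth = 0;
  let inString = false;
  let escaped = false;

  for (let i = start; i < buffer.length; i++) {
    const ch = buffer[i];
    if (escaped) { escaped = false; continue; }
    if (ch === "\\" && inString) { escaped = true; continue; }
    if (ch === '"') { inString = !inString; continue; }
    if (inString) continue;
    if (ch === "{") depth++;
    else if (ch === "}") {
      depth--;
      if (depth === 0) {
        return {
          value: JSON.parse(buffer.slice(start, i + 1)),
          endIndex: i + 1,
        };
      }
    }
  }
  return null; // no complete object yet; the caller waits for more data
}
```

A production version would also need to handle top-level arrays and report malformed input, but the shape — return a value plus how much of the buffer it consumed, or null — is what the generator above relies on.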

3. The ‘Nuclear’ Option: Bring in RxJS

Sometimes, you’re not just parsing one stream. You’re combining, filtering, mapping, and debouncing multiple streams of data. If your use case is genuinely complex (think real-time analytics pipelines, not just parsing a file from prod-db-01), then trying to manage it with async iterators alone can get messy.

This is where a library like RxJS comes in. It provides a powerful vocabulary (Observables, Operators) for handling complex asynchronous event streams. It’s the “enterprise-grade” solution.

import { fromEvent, merge, throwError } from 'rxjs';
import { mergeMap, takeUntil } from 'rxjs/operators';

// Note: fromEvent expects Node EventEmitter semantics (addListener /
// removeListener), so StreamingParser should extend events.EventEmitter.
const parser = new StreamingParser(); // Your original class

const value$ = fromEvent(parser, 'value');
const done$ = fromEvent(parser, 'done');

// Route parser errors into the stream itself, so subscribers receive
// them through the error channel instead of an out-of-band throw
const error$ = fromEvent(parser, 'error').pipe(
  mergeMap((err) => throwError(() => err))
);

merge(value$, error$).pipe(
  takeUntil(done$), // Complete when the 'done' event fires
  // ... other powerful operators like filter(), map(), debounceTime() ...
).subscribe({
  next: (value) => console.log('Got value:', value),
  error: (err) => console.error('Stream error:', err),
  complete: () => console.log('Stream complete!')
});

Warning: Don’t reach for this first. RxJS is a powerful tool, but it has a steep learning curve and adds a significant dependency. For simple cases, it’s like using a sledgehammer to hang a picture frame. But when you need to orchestrate multiple complex streams, it’s a lifesaver.

So, What’s the ‘Right’ Answer?

For your situation, Solution #2 is the destination. It produces the most idiomatic, maintainable, and modern TypeScript code that any developer on your team will immediately understand. Start with Solution #1 if you need to ship a better API today without a big refactor.

“Idiomatic” code isn’t just about following rules. It’s about empathy. It’s about writing code that aligns with the expectations and patterns of the ecosystem, so the next person on call at 2 AM (who might be you!) can solve the problem instead of fighting the code’s dialect. Welcome to TypeScript—we’re glad to have you.


Darian Vance

👉 Read the original article on TechResolve.blog


☕ Support my work

If this article helped you, you can buy me a coffee:

👉 https://buymeacoffee.com/darianvance
