DEV Community

Teguh Coding

Stop Writing Spaghetti Async Code: Master JavaScript Concurrency with These Patterns

Async code is where most JavaScript developers quietly lose control. You start with a clean fetch(), then add another, then a loop, and suddenly you've got a wall of callbacks or a Promise chain that reads like a mystery novel.

This article is your guide to writing async JavaScript that's clean, fast, and actually understandable six months from now.


Why Async Code Gets Messy So Fast

JavaScript is single-threaded, but the world it talks to is not. Network requests, file reads, timers — they all operate outside the main thread. The language evolved through three generations of async handling:

  1. Callbacks — the original, and the source of callback hell
  2. Promises — cleaner chaining, but still verbose
  3. async/await — syntactic sugar over Promises that changed everything

Most developers stop at async/await and call it a day. But there's a lot more leverage available if you understand the patterns underneath.
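To make the three generations concrete, here's the same toy operation (wait briefly, then double a number — the function names are purely illustrative) written in each style:

```javascript
// 1. Callback style: Node-style (err, result) callback
function doubleLater(n, callback) {
  setTimeout(() => callback(null, n * 2), 10);
}

// 2. Promise style: same operation, now chainable
function doubleLaterPromise(n) {
  return new Promise(resolve => setTimeout(() => resolve(n * 2), 10));
}

// 3. async/await: consuming the Promise version linearly
async function doubleTwice(n) {
  const once = await doubleLaterPromise(n);
  return doubleLaterPromise(once);
}

doubleTwice(5).then(result => console.log(result)); // 20
```

The logic is identical in all three; only the control flow around it changes — which is exactly why the later patterns in this article compose so well with async/await.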


Pattern 1: Don't Await in a Loop (Unless You Have To)

This is the most common mistake I see in code reviews:

// BAD: Sequential - each request waits for the previous
const users = [1, 2, 3, 4, 5];
const results = [];

for (const id of users) {
  const user = await fetchUser(id); // blocks here each time
  results.push(user);
}

If each fetchUser takes 200ms, this takes 1000ms total. Unnecessary.

// GOOD: Parallel - all requests fire at once
const users = [1, 2, 3, 4, 5];
const results = await Promise.all(users.map(id => fetchUser(id)));

Now all 5 requests run concurrently and finish in ~200ms total.

But here's the catch: Promise.all fails fast. If any promise rejects, the whole thing throws. For independent operations where partial success is acceptable, use Promise.allSettled:

const results = await Promise.allSettled(users.map(id => fetchUser(id)));

results.forEach((result, index) => {
  if (result.status === 'fulfilled') {
    console.log(`User ${users[index]}:`, result.value);
  } else {
    console.error(`Failed for user ${users[index]}:`, result.reason);
  }
});

Pattern 2: Concurrency Limiting

Firing 1000 requests simultaneously can hammer an API or exhaust memory. You need a concurrency limit.

Here's a lightweight implementation without any library:

async function runWithConcurrency(tasks, limit) {
  const results = [];
  const executing = new Set();

  for (const task of tasks) {
    const promise = task().then(result => {
      executing.delete(promise);
      return result;
    });

    executing.add(promise);
    results.push(promise);

    if (executing.size >= limit) {
      await Promise.race(executing);
    }
  }

  return Promise.all(results);
}

// Usage
const tasks = userIds.map(id => () => fetchUser(id));
const users = await runWithConcurrency(tasks, 5); // max 5 at a time

This is essentially how libraries like p-limit work under the hood. Understanding this pattern means you can tune it for your exact situation.
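If you want to convince yourself the limit actually holds, a quick sketch is to count peak in-flight tasks (the limiter is repeated here so the snippet runs on its own):

```javascript
// Same limiter as above, repeated so this snippet is self-contained.
async function runWithConcurrency(tasks, limit) {
  const results = [];
  const executing = new Set();

  for (const task of tasks) {
    const promise = task().then(result => {
      executing.delete(promise);
      return result;
    });

    executing.add(promise);
    results.push(promise);

    if (executing.size >= limit) {
      await Promise.race(executing);
    }
  }

  return Promise.all(results);
}

// Track how many tasks run at the same time.
async function measurePeak(taskCount, limit) {
  let inFlight = 0;
  let peak = 0;
  const tasks = Array.from({ length: taskCount }, (_, i) => async () => {
    inFlight++;
    peak = Math.max(peak, inFlight);
    await new Promise(res => setTimeout(res, 10)); // simulated work
    inFlight--;
    return i;
  });
  await runWithConcurrency(tasks, limit);
  return peak;
}

measurePeak(20, 5).then(peak => console.log(peak)); // 5 — never exceeds the limit
```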


Pattern 3: Timeout Wrapping

Networks are unreliable. A promise that never resolves is a silent killer. Always add timeouts:

function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Timed out after ${ms}ms`)), ms);
  });
  // Clear the timer either way, so it doesn't linger (or keep Node alive)
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage
try {
  const data = await withTimeout(fetchExpensiveData(), 5000);
} catch (err) {
  if (err.message.includes('Timed out')) {
    // handle timeout specifically
  }
}
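One caveat with the Promise.race approach: the losing fetch keeps running in the background — only your await moves on. In runtimes that support AbortSignal.timeout (Node 17.3+ and modern browsers), you can cancel the underlying request itself:

```javascript
// The timeout aborts the request itself, not just the await.
// fetch rejects with a DOMException named "TimeoutError" when the signal fires.
async function fetchWithHardTimeout(url, ms) {
  return fetch(url, { signal: AbortSignal.timeout(ms) });
}
```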

Combine this with retry logic for resilient data fetching:

async function fetchWithRetry(url, options = {}, retries = 3, delayMs = 500) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      const response = await withTimeout(fetch(url, options), 5000);
      // fetch only rejects on network errors; treat HTTP errors as failures too
      if (!response.ok) throw new Error(`HTTP ${response.status}`);
      return response;
    } catch (err) {
      if (attempt === retries) throw err;
      await new Promise(res => setTimeout(res, delayMs * (attempt + 1)));
    }
  }
}

The delay multiplier (delayMs * (attempt + 1)) is a simple linear backoff: 500ms, then 1000ms, then 1500ms between attempts. That's fine for small retry counts, though production systems usually prefer exponential backoff with jitter.
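With exponential backoff the delay doubles per attempt, and adding jitter (a random factor) spreads retries out so many clients don't hammer the server in lockstep. A minimal sketch:

```javascript
// Exponential backoff with "full jitter": the maximum delay doubles each
// attempt (capped at capMs), and the actual delay is a uniform random
// value below that maximum.
function backoffDelay(attempt, baseMs = 500, capMs = 30_000) {
  const max = Math.min(capMs, baseMs * 2 ** attempt);
  return Math.random() * max; // full jitter: uniform in [0, max)
}

// attempt 0 → up to 500ms, attempt 1 → up to 1s, attempt 2 → up to 2s, ...
```

To use it, swap the delay line in fetchWithRetry for `await new Promise(res => setTimeout(res, backoffDelay(attempt)))`.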


Pattern 4: Async Generators for Streaming Data

Paginated APIs and streaming responses are perfect for async generators — a pattern many developers haven't fully explored:

async function* paginatedFetch(baseUrl) {
  let page = 1;
  let hasMore = true;

  while (hasMore) {
    const response = await fetch(`${baseUrl}?page=${page}&limit=100`);
    const data = await response.json();

    yield data.items;

    hasMore = data.hasNextPage;
    page++;
  }
}

// Process pages as they arrive without loading everything into memory
for await (const items of paginatedFetch('https://api.example.com/products')) {
  await processItems(items);
  console.log(`Processed page of ${items.length} items`);
}

This is memory-efficient for large datasets — you never hold the full result set in memory at once.
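A companion trick: a second generator can flatten pages into a stream of individual items, so consumers never think about pagination at all. A sketch, using a stand-in page source (fakePages is just for demonstration — in the article's example it would be paginatedFetch):

```javascript
// Flatten any async iterable of arrays into a stream of single items.
async function* flattenPages(pages) {
  for await (const page of pages) {
    yield* page;
  }
}

// Stand-in page source for demonstration.
async function* fakePages() {
  yield [1, 2];
  yield [3];
}

// Consumers now iterate items, not pages:
// for await (const item of flattenPages(paginatedFetch(url))) { ... }
```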


Pattern 5: The Mutex (Preventing Race Conditions)

Race conditions happen when two async operations both read and write shared state. Classic example: incrementing a counter via API.

class AsyncMutex {
  constructor() {
    this.queue = Promise.resolve();
  }

  lock(fn) {
    const result = this.queue.then(() => fn());
    this.queue = result.catch(() => {});
    return result;
  }
}

const mutex = new AsyncMutex();

// These will now run sequentially, not race
const [a, b] = await Promise.all([
  mutex.lock(() => incrementCounter()),
  mutex.lock(() => incrementCounter()),
]);

Without the mutex, both operations could read the same initial value and overwrite each other. The mutex queues them.
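To see the race concretely, here's a counter with an async read-modify-write step. Without the lock, both increments read the same initial value and the result is 1 instead of 2; with it, they serialize. (The mutex class is repeated so the snippet runs on its own.)

```javascript
// Same mutex as above, repeated so this snippet is self-contained.
class AsyncMutex {
  constructor() {
    this.queue = Promise.resolve();
  }
  lock(fn) {
    const result = this.queue.then(() => fn());
    this.queue = result.catch(() => {});
    return result;
  }
}

// Shared state with an async read-modify-write, like a counter behind an API.
function makeCounter() {
  let value = 0;
  return {
    async increment() {
      const current = value; // read
      await new Promise(res => setTimeout(res, 10)); // simulated latency
      value = current + 1; // write — stale if another increment interleaved
    },
    get value() { return value; },
  };
}

async function demo() {
  const racy = makeCounter();
  await Promise.all([racy.increment(), racy.increment()]);

  const safe = makeCounter();
  const mutex = new AsyncMutex();
  await Promise.all([
    mutex.lock(() => safe.increment()),
    mutex.lock(() => safe.increment()),
  ]);

  return { racy: racy.value, safe: safe.value };
}

demo().then(r => console.log(r)); // { racy: 1, safe: 2 }
```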


Putting It All Together: A Real-World Data Pipeline

Here's a realistic example combining several of these patterns:

async function syncProductCatalog(productIds) {
  const mutex = new AsyncMutex();
  let synced = 0;
  let failed = 0;

  // Create tasks for each product
  const tasks = productIds.map(id => async () => {
    try {
      // Fetch with timeout and retry
      const response = await fetchWithRetry(
        `https://api.supplier.com/products/${id}`
      );
      const product = await response.json();

      // Serialize writes to avoid DB conflicts
      await mutex.lock(() => saveToDatabase(product));
      synced++;
    } catch (err) {
      console.error(`Failed to sync product ${id}:`, err.message);
      failed++;
    }
  });

  // Run with controlled concurrency
  await runWithConcurrency(tasks, 10);

  console.log(`Sync complete: ${synced} synced, ${failed} failed`);
}

Clean, composable, and production-ready.


Quick Reference Cheatsheet

Situation                        Pattern to use
Multiple independent requests    Promise.all()
Partial failure is OK            Promise.allSettled()
Too many concurrent requests     Concurrency limiter
Unreliable networks              withTimeout + retry
Large paginated datasets         Async generators
Shared mutable state             AsyncMutex
First settled result wins        Promise.race()
First successful result wins     Promise.any()

Final Thoughts

Async JavaScript doesn't have to be a mess. The patterns above are composable building blocks — understand them individually and you can combine them to handle almost any real-world scenario.

The biggest shift is thinking in flows of data rather than sequences of steps. When you start asking "what can run at the same time?" and "what needs to be coordinated?", the right patterns become obvious.

Start small: next time you find an await inside a for loop over independent operations, refactor it to Promise.all. That single change might cut your response times by 80%.

Happy coding.
