You've seen it. Maybe you've written it. That deeply nested async code that starts reasonable and then slowly turns into something that nobody — including the original author — can debug three weeks later.
Let's fix that.
In this article, we'll walk through real-world JavaScript async patterns that scale cleanly, stay readable, and don't make your teammates cry during code review.
## The Problem With "Just Use Async/Await"
Async/await is genuinely great. But it's also a tool that gives you just enough rope to hang yourself with if you're not careful.
Here's a pattern that shows up constantly in real codebases:
```javascript
async function loadDashboard(userId) {
  const user = await getUser(userId);
  const posts = await getPosts(user.id);
  const comments = await getComments(user.id);
  const notifications = await getNotifications(user.id);
  const settings = await getSettings(user.id);
  return { user, posts, comments, notifications, settings };
}
```
This looks clean. It reads like synchronous code. But there's a hidden performance problem: every `await` blocks the next one. If each call takes 200ms, you're looking at a 1000ms load time, when the real answer is closer to 400ms: one call to get the user, then the four independent calls in parallel.
The fix is obvious once you see it:
```javascript
async function loadDashboard(userId) {
  const user = await getUser(userId);

  // These don't depend on each other — run them in parallel
  const [posts, comments, notifications, settings] = await Promise.all([
    getPosts(user.id),
    getComments(user.id),
    getNotifications(user.id),
    getSettings(user.id),
  ]);

  return { user, posts, comments, notifications, settings };
}
```
But `Promise.all` fails fast: if one rejects, everything fails. Sometimes that's not what you want.
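To see the fail-fast behavior concretely, here's a minimal sketch (the `delay` and `failAfter` helpers are illustrative stand-ins for real API calls):

```javascript
// Stand-in for a call that succeeds after `ms` milliseconds.
const delay = (ms, value) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

// Stand-in for a call that fails after `ms` milliseconds.
const failAfter = (ms, message) =>
  new Promise((_, reject) => setTimeout(() => reject(new Error(message)), ms));

async function demo() {
  try {
    await Promise.all([
      delay(100, 'posts'),
      failAfter(50, 'notifications service down'), // rejects first
      delay(200, 'settings'),
    ]);
  } catch (err) {
    // Promise.all rejects as soon as any input rejects;
    // the other promises keep running, but their results are discarded.
    return err.message;
  }
}
```

Here `demo()` resolves to `'notifications service down'`: the whole batch is lost even though two of the three calls would have succeeded.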
## Pattern 1: Resilient Parallel Fetching With `Promise.allSettled`
Imagine a dashboard where the notification count failing shouldn't tank the entire page load. This is where `Promise.allSettled` shines:
```javascript
async function loadDashboardResilient(userId) {
  const user = await getUser(userId);

  const results = await Promise.allSettled([
    getPosts(user.id),
    getComments(user.id),
    getNotifications(user.id),
    getSettings(user.id),
  ]);

  const [posts, comments, notifications, settings] = results.map((result) => {
    if (result.status === 'fulfilled') return result.value;
    console.warn('A dashboard section failed to load:', result.reason);
    return null; // graceful fallback
  });

  return { user, posts, comments, notifications, settings };
}
```
Now you get partial data even when some calls fail. The UI can render what it has, and you can show a subtle "some data unavailable" message instead of a full error screen.
## Pattern 2: The Timeout Wrapper
External APIs are unreliable. You can't control them. What you can control is how long you wait for them.
```javascript
function withTimeout(promise, ms, label = 'Operation') {
  const timeout = new Promise((_, reject) =>
    setTimeout(() => reject(new Error(`${label} timed out after ${ms}ms`)), ms)
  );
  return Promise.race([promise, timeout]);
}

// Usage
const data = await withTimeout(
  fetchFromSlowAPI(),
  3000,
  'SlowAPI fetch'
);
```
`Promise.race` resolves or rejects with whichever promise settles first. Pair it with `Promise.allSettled` and you have a resilient system that degrades gracefully under load.
```javascript
const results = await Promise.allSettled([
  withTimeout(fetchUserProfile(), 2000, 'UserProfile'),
  withTimeout(fetchRecommendations(), 5000, 'Recommendations'),
  withTimeout(fetchAds(), 1000, 'Ads'),
]);
```
## Pattern 3: Async Queue for Rate Limiting
Here's a scenario: you need to process 500 items, but the API you're calling only allows 10 concurrent requests. Sending all 500 at once will get you rate-limited or banned.
A naive sequential loop is too slow. `Promise.all` is too aggressive. You need a concurrency limiter.
```javascript
async function processWithConcurrency(items, asyncFn, concurrency = 10) {
  const results = [];
  let index = 0;

  async function worker() {
    while (index < items.length) {
      const currentIndex = index++;
      results[currentIndex] = await asyncFn(items[currentIndex]);
    }
  }

  // Spin up N workers that each grab the next item as they finish
  await Promise.all(
    Array.from({ length: Math.min(concurrency, items.length) }, worker)
  );

  return results;
}

// Usage
const processed = await processWithConcurrency(
  userIds,
  (id) => fetchAndTransformUser(id),
  10
);
```
This pattern is elegant: you create N worker promises. Each one pulls the next available item from the shared index. No external library needed.
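One caveat: if any `asyncFn` call rejects, the `Promise.all` over the workers rejects too and the other results are lost. If you want `allSettled`-style semantics per item, you can catch inside the worker. A sketch of that variant (the `Settled` suffix is just an illustrative name):

```javascript
async function processWithConcurrencySettled(items, asyncFn, concurrency = 10) {
  const results = [];
  let index = 0;

  async function worker() {
    while (index < items.length) {
      const currentIndex = index++;
      try {
        results[currentIndex] = {
          status: 'fulfilled',
          value: await asyncFn(items[currentIndex]),
        };
      } catch (reason) {
        // Record the failure instead of letting it kill the other workers.
        results[currentIndex] = { status: 'rejected', reason };
      }
    }
  }

  await Promise.all(
    Array.from({ length: Math.min(concurrency, items.length) }, worker)
  );
  return results;
}
```

Each slot in `results` now mirrors the shape `Promise.allSettled` produces, so downstream code can handle both the same way.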
## Pattern 4: Async Memoization (Cache That Plays Nice With Promises)
Caching is easy when values are synchronous. It gets tricky with async code — you might fire the same async call multiple times before the first one finishes.
```javascript
function asyncMemo(fn) {
  const cache = new Map();

  return async function (...args) {
    const key = JSON.stringify(args);

    if (cache.has(key)) {
      return cache.get(key); // returns the same Promise if in-flight
    }

    const promise = fn(...args).catch((err) => {
      cache.delete(key); // evict on error so next call retries
      throw err;
    });

    cache.set(key, promise);
    return promise;
  };
}

// Usage
const cachedFetchUser = asyncMemo(fetchUser);

// Even if called 5 times simultaneously, fetchUser only runs once per userId
const users = await Promise.all([
  cachedFetchUser(42),
  cachedFetchUser(42),
  cachedFetchUser(42),
]);
```
The key insight: we cache the Promise, not the resolved value. Multiple callers awaiting the same key all get the same promise — so the underlying function only runs once, even under concurrent calls.
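The Map above also grows forever and never refreshes. If staleness matters, a TTL variant is straightforward. A sketch, where `ttlMs` is an added parameter not present in the original `asyncMemo`:

```javascript
function asyncMemoWithTTL(fn, ttlMs = 60_000) {
  const cache = new Map(); // key -> { promise, expiresAt }

  return async function (...args) {
    const key = JSON.stringify(args);
    const entry = cache.get(key);

    if (entry && entry.expiresAt > Date.now()) {
      return entry.promise; // still fresh (or still in flight)
    }

    const promise = fn(...args).catch((err) => {
      cache.delete(key); // evict on error so the next call retries
      throw err;
    });

    cache.set(key, { promise, expiresAt: Date.now() + ttlMs });
    return promise;
  };
}
```

Concurrent callers still share one promise per key; entries simply stop being served once their TTL elapses, and the next call refetches.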
## Pattern 5: Async Iterator for Streaming Data
Sometimes data arrives in chunks — paginated APIs, WebSocket streams, file reads. Async generators make this clean:
```javascript
async function* paginatedFetch(url) {
  let nextUrl = url;
  while (nextUrl) {
    const response = await fetch(nextUrl);
    const data = await response.json();
    yield data.items; // yield each page's worth of data
    nextUrl = data.nextPageUrl || null;
  }
}

// Consume it naturally with for-await-of
for await (const items of paginatedFetch('https://api.example.com/posts')) {
  await processBatch(items);
  console.log(`Processed ${items.length} items`);
}
```
You can add early exit conditions, error handling, and rate limiting without polluting the iteration logic itself.
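Early exit, for instance, is just a `break`: leaving a `for await...of` loop closes the generator, so no further pages are fetched. A self-contained sketch with a stand-in page source (`pages` and `collectFirst` are illustrative names):

```javascript
// Stand-in for paginatedFetch: yields 100 pages of 2 items each.
async function* pages() {
  for (let page = 1; page <= 100; page += 1) {
    yield [`item-${page}a`, `item-${page}b`];
  }
}

async function collectFirst(limit) {
  const collected = [];
  for await (const items of pages()) {
    collected.push(...items);
    if (collected.length >= limit) break; // generator is closed here
  }
  return collected.slice(0, limit);
}
```

The consumer decides when to stop; the generator never needs to know about the limit.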
## Putting It Together: A Real-World Data Pipeline
Here's how these patterns combine in a realistic scenario — fetching, processing, and storing data from multiple sources:
```javascript
const cachedGetUser = asyncMemo(getUser);

async function runDataPipeline(userIds) {
  console.log(`Processing ${userIds.length} users...`);

  const users = await processWithConcurrency(
    userIds,
    async (id) => {
      try {
        const user = await withTimeout(cachedGetUser(id), 3000, `User ${id}`);
        return transformUser(user);
      } catch (err) {
        // A single slow or failing user shouldn't abort the whole batch.
        console.warn(err.message);
        return null;
      }
    },
    15
  );

  const results = await Promise.allSettled(
    users.filter(Boolean).map((u) => saveToDatabase(u))
  );

  const failed = results.filter((r) => r.status === 'rejected');
  if (failed.length > 0) {
    console.warn(`${failed.length} users failed to save`);
  }
  return results.filter((r) => r.status === 'fulfilled').length;
}
```
## Quick Reference
| Pattern | When to Use |
|---|---|
| `Promise.all` | All must succeed, run in parallel |
| `Promise.allSettled` | Partial success is acceptable |
| `Promise.race` | First to finish wins (timeout pattern) |
| `withTimeout` | Protect against hanging promises |
| `processWithConcurrency` | Batch processing with rate limits |
| `asyncMemo` | Deduplicate concurrent identical calls |
| Async generators | Streaming / paginated data |
## Final Thought
Good async code isn't just about making things work — it's about making them work predictably under real-world conditions: slow networks, rate limits, partial failures, and concurrent users.
The patterns above aren't exotic. They're things you'll reach for constantly once you internalize them. Bookmark this, add these utilities to your toolkit, and your future self (and teammates) will thank you.
What async patterns have you found most useful? Drop them in the comments — always looking to expand the toolkit.