Most JavaScript developers use async/await every day. It reads cleanly, it feels synchronous, and it removed the callback hell that made early Node.js code a nightmare to maintain.
But there's a problem: async/await makes it dangerously easy to write sequential code when you actually want parallel execution. And most developers don't notice until something is embarrassingly slow in production.
Let's dig into the traps, why they happen, and how to write async JavaScript that actually performs.
## The Classic Trap: Sequential Awaits
Here's code that looks perfectly fine:
```javascript
async function getUserDashboard(userId) {
  const user = await fetchUser(userId);
  const posts = await fetchUserPosts(userId);
  const notifications = await fetchNotifications(userId);
  return { user, posts, notifications };
}
```
Looks clean, right? Three awaits, three results. But here's what's actually happening:
- Fetch user — wait for it to finish
- Then fetch posts — wait for it to finish
- Then fetch notifications — wait for it to finish
If each request takes 200ms, your function takes 600ms total. But none of these fetches depend on each other. They could all run at the same time and finish in 200ms.
This is the most common async mistake in JavaScript, and it's hiding in codebases everywhere.
## The Fix: Promise.all for Independent Operations
```javascript
async function getUserDashboard(userId) {
  const [user, posts, notifications] = await Promise.all([
    fetchUser(userId),
    fetchUserPosts(userId),
    fetchNotifications(userId),
  ]);
  return { user, posts, notifications };
}
```
All three fetches fire simultaneously. You wait for all of them together. Total time: the slowest one, not the sum of all of them.
This is a 3x performance improvement with zero extra complexity.
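To make the timing difference concrete, here's a minimal, self-contained sketch. The delay helper and the two function names are illustrative, not from any real API; delay stands in for a 200ms network request:

```javascript
// delay() simulates a request that resolves after ms milliseconds.
const delay = (ms, value) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

async function sequentialDashboard() {
  const start = Date.now();
  await delay(200, 'user');
  await delay(200, 'posts');
  await delay(200, 'notifications');
  return Date.now() - start; // roughly 600ms: the sum of all three
}

async function parallelDashboard() {
  const start = Date.now();
  await Promise.all([
    delay(200, 'user'),
    delay(200, 'posts'),
    delay(200, 'notifications'),
  ]);
  return Date.now() - start; // roughly 200ms: the slowest of the three
}
```

Run both and the sequential version takes about three times as long, even though the work is identical.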
## But Promise.all Has Its Own Trap: It Fails Fast
Promise.all rejects the moment any promise rejects. If fetchNotifications fails, you lose the user data and posts too — even if those succeeded.
For a dashboard where partial data is acceptable:
```javascript
async function getUserDashboard(userId) {
  const results = await Promise.allSettled([
    fetchUser(userId),
    fetchUserPosts(userId),
    fetchNotifications(userId),
  ]);

  const [userResult, postsResult, notificationsResult] = results;

  return {
    user: userResult.status === 'fulfilled' ? userResult.value : null,
    posts: postsResult.status === 'fulfilled' ? postsResult.value : [],
    notifications: notificationsResult.status === 'fulfilled' ? notificationsResult.value : [],
  };
}
```
Promise.allSettled waits for every promise to settle — fulfilled or rejected — and gives you the result of each. Partial failures don't blow up the whole thing.
Rule of thumb: Use Promise.all when all results are mandatory. Use Promise.allSettled when partial data is acceptable.
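A tiny self-contained demo of that rule of thumb. The ok and fail helpers are stand-ins for requests that succeed or reject:

```javascript
// ok() and fail() simulate a successful and a failing request.
const ok = (value) => Promise.resolve(value);
const fail = (msg) => Promise.reject(new Error(msg));

async function compare() {
  try {
    await Promise.all([ok('user'), ok('posts'), fail('notifications down')]);
  } catch (err) {
    // One rejection discards the fulfilled results too.
    console.log('Promise.all rejected:', err.message);
  }

  const settled = await Promise.allSettled([
    ok('user'),
    ok('posts'),
    fail('notifications down'),
  ]);
  console.log(settled.map((r) => r.status)); // [ 'fulfilled', 'fulfilled', 'rejected' ]
}

compare();
```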
## The Loop Trap: await Inside forEach
Here's another one that catches developers off guard:
```javascript
// This does NOT work the way you think
async function processOrders(orders) {
  orders.forEach(async (order) => {
    await processOrder(order); // This runs in parallel, not sequentially
  });
  console.log('All done!'); // Prints immediately — before any orders finish
}
```
forEach doesn't know about promises. It fires each callback and moves on. Your awaits are happening inside callbacks that forEach never waits for. The console.log runs before any processing finishes.
If you need sequential processing:
```javascript
async function processOrdersSequentially(orders) {
  for (const order of orders) {
    await processOrder(order); // Genuinely sequential
  }
  console.log('All done!'); // Now this is accurate
}
```
If you want parallel processing:
```javascript
async function processOrdersInParallel(orders) {
  await Promise.all(orders.map((order) => processOrder(order)));
  console.log('All done!');
}
```
Use for...of with await when order matters or you need to process one at a time. Use Promise.all with .map() when operations are independent.
## Controlled Concurrency: The Middle Ground
Sometimes full parallelism is too aggressive. Imagine processing 1000 database records simultaneously — you'll likely overwhelm your database or hit rate limits.
You need a concurrency limit. Here's a practical utility:
```javascript
async function processWithConcurrency(items, asyncFn, limit = 5) {
  const results = [];
  const executing = new Set();

  for (const item of items) {
    const promise = asyncFn(item).then((result) => {
      executing.delete(promise);
      return result;
    });
    executing.add(promise);
    results.push(promise);

    // Once the pool is full, wait for any in-flight promise to settle.
    // Note: a rejection propagates here, matching Promise.all's fail-fast behavior.
    if (executing.size >= limit) {
      await Promise.race(executing);
    }
  }

  return Promise.all(results);
}

// Usage
await processWithConcurrency(orders, processOrder, 10);
```
This processes up to 10 items at a time. When one slot frees up, the next item starts. You get the speed benefits of parallelism without drowning your downstream services.
Alternatively, the p-limit library does this cleanly if you don't want to write it yourself:
```javascript
import pLimit from 'p-limit';

const limit = pLimit(10);
const results = await Promise.all(
  orders.map((order) => limit(() => processOrder(order)))
);
```
## Error Handling: Don't Let Unhandled Rejections Silently Rot
One more trap worth calling out:
```javascript
// Dangerous: unhandled rejection if someAsyncFn() rejects
const promise = someAsyncFn();
// ... you forget to await it
```
In older Node.js versions, unhandled rejections only produced a warning. Since Node.js 15, they crash the process by default. Either way, you don't want them.
Always handle rejections:
```javascript
const result = await someAsyncFn().catch((err) => {
  console.error('Failed:', err);
  return null; // or a default value
});
```
Or use try/catch for blocks of awaits:
```javascript
try {
  const result = await someAsyncFn();
  // handle result
} catch (err) {
  // handle error
}
```
Both are valid. Pick whichever fits the flow of your code.
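As a last line of defense, Node.js also exposes a global hook for rejections that nothing else caught. A minimal sketch, for logging and alerting only; it is not a substitute for handling errors where they occur:

```javascript
// Last-resort global hook for rejections no local handler caught.
process.on('unhandledRejection', (reason) => {
  console.error('Unhandled rejection:', reason);
  // Optionally report to your error tracker here, then decide whether
  // to exit or keep running.
});
```

Registering this handler also prevents Node's default crash-on-unhandled-rejection behavior, so use it deliberately.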
## Quick Reference: The Decision Framework
When writing async code, ask yourself:
| Scenario | Use |
|---|---|
| Operations depend on each other | Sequential await |
| Operations are independent, all required | Promise.all |
| Operations are independent, partial failure OK | Promise.allSettled |
| First successful result is enough | Promise.any |
| Timeout or first-to-settle pattern | Promise.race |
| Large batch, need a concurrency limit | p-limit or custom pool |
| Loop where order matters | for...of with await |
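The Promise.race and Promise.any rows deserve quick sketches too. withTimeout and fetchFromFastest are hypothetical helper names, not a real API:

```javascript
// Promise.race settles with the first promise to settle, fulfilled or
// rejected, which makes it a natural fit for timeouts.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Timed out after ${ms}ms`)), ms);
  });
  // Whichever settles first wins; clear the timer either way.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Promise.any resolves with the first *fulfilled* promise and only rejects
// (with an AggregateError) if every promise rejects.
function fetchFromFastest(urls, fetchFn) {
  return Promise.any(urls.map((url) => fetchFn(url)));
}
```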
## The Bigger Picture
Async/await is one of JavaScript's best features. The syntax hides the complexity of promises and makes asynchronous code readable. But that readability is a double-edged sword — it makes it easy to write code that looks fast but isn't.
The performance gaps here aren't academic. Sequential awaits on a dashboard that makes 5 API calls could mean 1000ms vs 200ms — a full second of unnecessary waiting that your users feel every time they load the page.
Get in the habit of asking: do these operations actually depend on each other? If the answer is no, there's almost always a way to run them together.
Your future users (and your future self, debugging production slowness at 2am) will thank you.
Found a pattern I missed? Drop it in the comments — always curious what async traps people have hit in the wild.