Jeremy Dorn

Beware of Promise.all

In JavaScript, Promise.all lets you run a bunch of Promises in parallel and get an array of results back.

const responses = await Promise.all([
  fetch("/api/1"),
  fetch("/api/2")
])

Pretty straightforward. However, if you were to do the above with 100 fetch calls instead, you might accidentally take down your server in a self-inflicted Denial of Service attack. Even if you protect against this in the API with rate-limiting, you're still going to see a lot of errors from failed requests as you scale up.
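
An innocent-looking map over a big list is enough to trigger this, since every request starts at the same instant. A quick sketch (the 100-URL list here is hypothetical):

// Hypothetical: 100 endpoints, all requested simultaneously
const urls = Array.from({ length: 100 }, (_, i) => `/api/${i}`)

const responses = await Promise.all(
  urls.map(url => fetch(url))
)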

And APIs are the exception. Most other types of external calls, like filesystem operations and system calls, have no concept of rate-limiting at all.

For example, in Node.js you can spawn new shells to call out to other programs on the computer. I use this in my open source A/B testing platform, GrowthBook, to call a Python script. Something like this:

const results = await Promise.all(
  metrics.map(m => callPython(m))
)

The above will happily spawn hundreds of Python shells if given a large array and start executing them all in parallel. My dev machine is pretty powerful, so I didn't notice during testing that all 8 CPU cores would go to 100% for a couple seconds. When I deployed the code to a Docker container on AWS though, I definitely noticed when it started crashing and restarting all the time.
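
For reference, here is roughly what a callPython helper could look like. This is a minimal sketch using Node's built-in child_process module; the script name and the JSON input/output are assumptions, and the real GrowthBook code does more:

import { execFile } from "child_process"
import { promisify } from "util"

const execFileAsync = promisify(execFile)

// Hypothetical helper: spawns one Python process per call.
// Assumes an "analysis.py" script that takes a JSON argument and prints JSON.
async function callPython(metric) {
  const { stdout } = await execFileAsync("python3", [
    "analysis.py",
    JSON.stringify(metric)
  ])
  return JSON.parse(stdout)
}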

The solution is to add rate-limiting or concurrency limits to your Promise.all calls. There are a few ways to do this.

For API calls where you want to limit the number of calls per second, you can use the simple p-throttle library:

import pThrottle from 'p-throttle'

// Limit to 2 calls per second
const throttle = pThrottle({
  limit: 2,
  interval: 1000
})

// Wrap fetch so every call goes through the throttle
const throttledFetch = throttle((url) => fetch(url))

const responses = await Promise.all([
  throttledFetch("/api/1"),
  throttledFetch("/api/2"),
  ...
])

For system calls where you want to limit the number of parallel executions, no matter how long they take, there is the simple p-limit library:

import pLimit from 'p-limit'

// Only 5 promises will run at a time
const limit = pLimit(5)

const results = await Promise.all(
  metrics.map(
    m => limit(() => callPython(m))
  )
)

For more advanced use cases, you might want to look into using a full-featured job queue like bree, bull, or agenda instead.
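
As a rough sketch of that approach (not part of the original setup; it assumes a Redis instance at the default localhost address), bull lets you cap concurrency and also persist jobs across restarts:

import Queue from "bull"

// Connects to Redis at the default 127.0.0.1:6379
const pythonQueue = new Queue("python-jobs")

// Process at most 5 jobs at a time
pythonQueue.process(5, async (job) => {
  return callPython(job.data.metric)
})

// Enqueue one job per metric
for (const m of metrics) {
  pythonQueue.add({ metric: m })
}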

As developers, we spend a lot of time worrying about external attacks and not enough time protecting our apps from naive internal code. I hope this helps others avoid the same CPU-crashing bugs in production that I had to work through. Good luck out there!


Top comments (4)

Seanghay

Thanks for sharing. If possible, I think you should have used a message queue like Bull, Agenda, or Amazon SQS instead for long-running tasks like this.

John Woodruff

Ah you’re awesome. I had used p-throttle a long time ago and literally today needed it again but could not for the life of me remember what I used! Perfect timing and great article!

