Sibelius Seraphini for Woovi

Processing promises in Batch

Running code concurrently

To make execution faster, we usually run code concurrently.
One way to run code concurrently in JavaScript is to start several promises without awaiting each one individually, and then use Promise.all to wait for all of them to finish. Check the example below:

const promiseA = asyncFnA();
const promiseB = asyncFnB();

const results = await Promise.all([promiseA, promiseB]);

The code above starts asyncFnA and asyncFnB concurrently, and Promise.all waits for both executions to resolve.
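
For comparison, awaiting each call before starting the next one runs them sequentially, so the total time is roughly the sum of both durations (a minimal sketch reusing the same asyncFnA and asyncFnB from above):

// Sequential version: asyncFnB only starts after asyncFnA has resolved,
// so the total time is the sum of both durations.
const resultA = await asyncFnA();
const resultB = await asyncFnB();

With the concurrent version, the total time is closer to that of the slower of the two calls.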

Running many promises at the same time

Let's take a look at this code:

const users = await User.find(); // return all users in the database

const results = await Promise.all(users.map((user) => processUser(user)));

This code will start as many promises as there are users in your database. Node and JavaScript do not handle a huge number of promises running concurrently very well (Go handles this kind of workload better).
This code will probably consume a lot of CPU and memory, and may even run out of memory.
To solve this, we need to process these promises in batches.

Processing promises in Batch

export async function processPromisesBatch<T, R>(
  items: Array<T>,
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<Array<R>> {
  let results: Array<R> = [];
  for (let start = 0; start < items.length; start += limit) {
    const end = start + limit > items.length ? items.length : start + limit;

    // run at most `limit` promises concurrently, then wait for the whole batch
    const slicedResults = await Promise.all(items.slice(start, end).map(fn));

    results = [
      ...results,
      ...slicedResults,
    ];
  }

  return results;
}

Usage

const results = await processPromisesBatch(users, 100, processUser);

processPromisesBatch slices your items into chunks of size N and executes at most N promises at the same time.
This ensures it won't consume excessive CPU and memory or starve the event loop.
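
To make the behavior concrete, here is a small self-contained sketch (fakeProcess is a hypothetical stand-in for processUser, not part of the original post):

// Process 10 items in batches of 3: at most 3 promises are in flight at a time,
// so the work completes in 4 sequential batches.
const fakeProcess = async (n: number): Promise<number> => {
  await new Promise((resolve) => setTimeout(resolve, 100));
  return n * 2;
};

const items = Array.from({ length: 10 }, (_, i) => i);
const doubled = await processPromisesBatch(items, 3, fakeProcess);
// doubled is [0, 2, 4, ..., 18], in the same order as items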

In Conclusion

Understanding the limitations of your programming language and runtime can help you design solutions to work around them.

Share solutions that you design based on limitations of your programming language, runtime or framework.


Woovi
Woovi is a startup that enables shoppers to pay as they like. To make this possible, Woovi provides instant payment solutions for merchants to accept orders.

If you want to work with us, we are hiring!


Photo by Elevate on Unsplash

Top comments (6)

Roman Vladimirov

It's a good example of how to limit parallelism (if that term is proper for JS). But what about rejections? If a reject happens in one of the groups inside processPromisesBatch, it leaves things in an inconsistent state: some groups completed successfully and filled results, but others did not. Promise.all works on an all-or-nothing principle. Maybe extra conditions are needed to handle rejections as well.
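
One way to address this rejection concern (a sketch, not from the original post) is to swap Promise.all for Promise.allSettled inside the loop, so a rejection is recorded per item instead of aborting the batch:

// Sketch: same batching loop, but each batch uses Promise.allSettled,
// so every item yields either a fulfilled or a rejected result.
export async function processPromisesBatchSettled<T, R>(
  items: Array<T>,
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<Array<PromiseSettledResult<R>>> {
  let results: Array<PromiseSettledResult<R>> = [];
  for (let start = 0; start < items.length; start += limit) {
    const batch = await Promise.allSettled(items.slice(start, start + limit).map(fn));
    results = [...results, ...batch];
  }
  return results;
}

Callers can then split the results by status === 'fulfilled' or 'rejected' and decide how to retry or report failures.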

echofly

This article is great: it explains some important concepts of JavaScript concurrent programming and the type system. I especially like how the author uses sets as a conceptual model for types, and how TypeScript automatically narrows down the type of a variable based on control flow. This knowledge is very helpful for improving the readability and maintainability of code. Thanks for sharing!

Corners 2 Wall

Maybe the best way is to use Promise.allSettled instead of Promise.all. This gives more flexibility.
Check this or this

Thiago Marinho

Nice article, thanks!
I like to use this library: caolan.github.io/async/v3/docs.htm...
I can run promises in batch using chunks in parallel.
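
For reference, a sketch of the same batched call with that library (assuming async v3 is installed; mapLimit keeps at most limit iteratee calls in flight at once):

import { mapLimit } from 'async';

// Roughly equivalent to processPromisesBatch(users, 100, processUser),
// except a new item starts as soon as a slot frees up, not in fixed chunks.
const results = await mapLimit(users, 100, async (user) => processUser(user));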

React Hunter

It's a good article!
Thank you

Ricardo

Bad code anyway. Your function does not work with maximum parallelism. It also doesn't solve the memory problem, it just postpones it.
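
For reference, here is a sketch (illustrative names, not from the post) of the kind of pooled approach that keeps concurrency at the limit continuously, starting the next item as soon as any running one settles instead of waiting for the slowest item in each batch:

// Sketch: `limit` workers pull items from a shared index, so concurrency
// stays at `limit` instead of dropping to zero while each batch drains.
export async function processPromisesPool<T, R>(
  items: Array<T>,
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<Array<R>> {
  const results: Array<R> = new Array(items.length);
  let next = 0;

  const worker = async (): Promise<void> => {
    while (next < items.length) {
      const index = next++; // index allocation is synchronous, so no two workers share an item
      results[index] = await fn(items[index]);
    }
  };

  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, worker));
  return results;
}

Like the original, this still accumulates every result in memory; streaming or paginating the users query would be needed to fully address the memory concern.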