In the same spirit, I often use this helper, which limits concurrency while still keeping work parallel.
For instance, if you want to fetch 4938935 users from a list of IDs, it's not advisable to launch 4938935 requests at once, but you can use parallel(100, userIds, fetchUser) to keep 100 requests in flight until all of them are processed.
/**
 * Similar to Promise.all(),
 * but limits parallelization to a certain number of parallel executions.
 */
export async function parallel<T>(
  concurrent: number,
  collection: Iterable<T>,
  processor: (item: T) => Promise<any>,
): Promise<any[]> {
  // queue of currently in-flight calls
  const queue: Promise<any>[] = [];
  // all promises, in input order, so results can be returned
  const ret: Promise<any>[] = [];
  for (const item of collection) {
    // fire the async function, add its promise to the queue, and remove
    // it from the queue when it completes
    const p = processor(item).then(res => {
      queue.splice(queue.indexOf(p), 1);
      return res;
    });
    queue.push(p);
    ret.push(p);
    // if we are at max concurrency, wait for one call to finish
    if (queue.length >= concurrent) {
      await Promise.race(queue);
    }
  }
  // wait for the remaining calls to finish and return all results
  return Promise.all(ret);
}
Interesting, that's a very cool way to use Promise.race!