
Concurrent Iteration

Arijit Bhattacharya ・ 2 min read

Iteration intuitively seems synchronous: when the iteration completes, we expect our results to be ready.

There are many native array methods that help us iterate over an array.

Let's say we have an array of student details. Each student has a field stating their date of birth, and we want to calculate their age.

const ageOfStudents = studentDetails.map(({dob}) => calculateAgeFromDOB(dob));

The ageOfStudents will be ready for us immediately.

calculateAgeFromDOB is a synchronous operation. So, we will calculate the age of each student strictly one after the other.
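As a concrete illustration, here is a minimal sketch of what such a synchronous `calculateAgeFromDOB` could look like (this exact implementation is an assumption, not from the article):

```javascript
// Hypothetical synchronous implementation: derives an age from a
// date-of-birth string and returns immediately with a number.
function calculateAgeFromDOB(dob) {
  const birth = new Date(dob);
  const now = new Date();
  let age = now.getFullYear() - birth.getFullYear();
  // adjust down if the birthday hasn't happened yet this year
  const hadBirthday =
    now.getMonth() > birth.getMonth() ||
    (now.getMonth() === birth.getMonth() && now.getDate() >= birth.getDate());
  if (!hadBirthday) age -= 1;
  return age;
}
```

Because nothing here waits on I/O, the mapped array is fully populated the moment `map` returns.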

But what if the operation applied to each student does not resolve synchronously?

Let's say we need a yearly performance record for each student, and fetching each student's record requires a network request.

const studentPerformanceRecordsPromises = studentDetails  
  .map(({id}) => getPerformanceRecordOfStudent(id));

Each iteration will spawn a concurrent task. And these tasks will resolve in their own arbitrary order.

We have to wait for the performance records even after the iteration completes. This is the critical distinction between ordinary iteration and concurrent iteration.

If getPerformanceRecordOfStudent returns a Promise which resolves after a successful network request, studentPerformanceRecordsPromises will be an array of Promises.
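We can see this with a small self-contained demo; here `getPerformanceRecordOfStudent` is a hypothetical stand-in that simulates the network request with a delayed Promise:

```javascript
// Hypothetical async stand-in for the network call: resolves
// with a record object after a short delay.
const getPerformanceRecordOfStudent = id =>
  new Promise(resolve =>
    setTimeout(() => resolve({ id, grades: [] }), 10)
  );

const studentDetails = [{ id: 1 }, { id: 2 }];

// The iteration itself finishes immediately...
const studentPerformanceRecordsPromises = studentDetails
  .map(({ id }) => getPerformanceRecordOfStudent(id));

// ...but what we hold is an array of still-pending Promises.
console.log(
  studentPerformanceRecordsPromises.every(p => p instanceof Promise)
); // true
```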

We can use Promise.all to wait on an array of Promises.

Promise.all(studentPerformanceRecordsPromises)  
   .then(doSomethingWithPerformanceRecordsOfAllStudents)

Since we are contrasting synchronous and asynchronous iteration, it would be useful to have an async counterpart of Array.prototype.map.

We would like to use it like this:

Promise  
.map(studentDetails, getPerformanceRecordOfStudent)  
.then(doSomethingWithPerformanceRecordsOfAllStudents)

And here is what a trivial definition of Promise.map might look like:

if (!Promise.map) {
  Promise.map = function (vals, cb) {
    return Promise.all(
      vals.map(function (val) {
        // we expect `cb` to return a promise;
        // even if it does not, we convert its return value
        // into a promise using `Promise.resolve`
        return Promise.resolve(cb(val));
      })
    );
  };
}
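Exercising the polyfill with hypothetical stand-ins for the data and the network call (the names below are illustrative, not from the article) might look like this:

```javascript
if (!Promise.map) {
  Promise.map = function (vals, cb) {
    return Promise.all(vals.map(val => Promise.resolve(cb(val))));
  };
}

// Hypothetical async stand-in for the network request.
const getPerformanceRecordOfStudent = ({ id }) =>
  Promise.resolve({ id, record: `record-${id}` });

const studentDetails = [{ id: 1 }, { id: 2 }];

Promise.map(studentDetails, getPerformanceRecordOfStudent)
  .then(records => console.log(records.map(r => r.id))); // logs [ 1, 2 ]
```

Note that because `Promise.all` preserves input order, the results come back in the same order as `studentDetails`, regardless of which request resolves first.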

This thought was spawned while I was reading YDKJS by @getify, particularly the part aptly titled "Concurrent Iterations."

I was scratching my head for a little while, so I thought of sharing my resolved understanding. Maybe it will help somebody.

Thanks for reading.

I am a big fan of YDKJS. Highly recommend it!

Resolve your relationship with JavaScript. Sorry!

Discussion

I think what would be really nice to have, instead of Promise.map, is Array.prototype.amap/Array.prototype.afilter and Array.prototype.areduce. They would all return a promise. This would be really handy.

studentDetails
  .amap(getPerformanceRecordOfStudent)
  .then(doSomethingWithPerformanceRecordsOfAllStudents)

This is essentially the same thing, but I think it is more intuitive to use.
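A minimal sketch of the `amap` idea could look like the following (purely illustrative — extending built-in prototypes is generally discouraged, and `amap` is not a real API):

```javascript
// Hypothetical Array.prototype.amap: maps each element through `cb`
// and returns a single Promise for all the results, in input order.
Object.defineProperty(Array.prototype, 'amap', {
  value(cb) {
    // Promise.resolve tolerates callbacks that return plain values
    return Promise.all(this.map(v => Promise.resolve(cb(v))));
  },
});
```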

What do you think about this idea?

 

BTW, based on your comment, you would be super interested in the new async iteration proposal

Kushan also gives a small example in his comment.

 

That would be interesting.

But adding something to the native JS API is a lot to hope for :)

There are lots of Promise libraries that offer handy utilities like this out of the box.

 

I have often seen the need for a Promise.map when dealing with the concurrency of promises, since the native Promise.all doesn't allow controlling how many promises are in flight at once.
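One way to sketch such a concurrency-limited map (the name and shape below are illustrative, not from any particular library) is a pool of workers pulling from a shared index:

```javascript
// Runs `fn` over `items` with at most `limit` promises in flight
// at once; results come back in input order.
async function mapWithConcurrency(items, fn, limit) {
  const results = new Array(items.length);
  let next = 0;
  // Each worker repeatedly claims the next unclaimed index.
  // Claiming is safe because JS is single-threaded between awaits.
  async function worker() {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    worker
  );
  await Promise.all(workers);
  return results;
}
```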

I also find the async for...of loop a pretty neat way of controlling concurrency over an array of promises.

Concurrency = 1

const studentDetails = [{ name: 'kj' }, { name: 'ab' }]; // an array of sync values

// (inside an async function)
for (const student of studentDetails) {
  const data = await getPerformanceRecordOfStudent(student);
  // do whatever
}

Concurrency = N

const studentDetails = [{ name: 'kj' }, { name: 'ab' }]; // an array of sync values

const concurrency = 10;
let batch = [];
// (inside an async function)
for (const student of studentDetails) {
  batch.push(student);
  if (batch.length < concurrency) continue; // keep filling the current batch
  // runs only a batch of promises at a time, i.e. `concurrency` of them
  const batchResult = await Promise.all(
    batch.map(getPerformanceRecordOfStudent)
  );
  batch = []; // start a fresh batch

  // do whatever
}
// handle the final, possibly partial batch
if (batch.length > 0) {
  const batchResult = await Promise.all(
    batch.map(getPerformanceRecordOfStudent)
  );
  // do whatever
}

The fancy for-await-of

You can also use for-await-of, but remember it is designed for async iterables; you can use it with an array of promises, but then you aren't really using it to its full extent.

Here's how you would create an async iterable.

const fetchNextPage = page => Promise.resolve(page);

async function* asyncGen() {
  let page = 0;
  while (page < 10) yield fetchNextPage(page++);
}

for await (const g of asyncGen()) {
  console.log(g);
}

The best part about this approach is, again, concurrency: the async generator asyncGen only creates a new promise when asked for one, and for-await-of automatically awaits each yielded promise at the top of the loop, binding the resolved value to g.

 

Hey Kushan

The batching idea is neat. Didn't occur to me that you can play with the degree of concurrency.

Async iterables are pretty awesome. Just discovered them. I guess they officially landed in ES2018.

Thanks for the addendum.

Fun Fact: I was there for your talk at last year's ReactFoo. It proved helpful for my own redux saga journey.

 

Here is what I ran into with reduce and Promise:
gyandeeps.com/array-reduce-async-a...