reduce or for…of?


Recently Jake Archibald made a "bold claim" on Twitter about the use of Array.prototype.reduce, and it inspired me to summarize what I think (and what I tweeted) in a nice blog post.

The inspiration

(Jake Archibald's tweet, embedded in the original post.)

TL;DR

It depends on your priorities, but there is no "perfect" solution:

  • If you value immutability and the functional style, and performance is not a priority, then between for…of and reduce, pick reduce.
  • If you value performance and readability for the vast majority of devs, and you're sure mutation will not be an issue, then use for…of.
  • If you want "the best of both worlds", then you could try libraries like Immer or Immutable.js.

Let's dive in!

So, first we will talk about mutations. Let's say we want a function that takes an object and a key, and returns the same object with that key added, set to null. We can do it either with or without mutations:

const object = {};

// Without mutations
const addNullKey = (target = {}, key) => ({
  ...target,
  [key]: null
});

// With mutations
const insertNullKey = (target = {}, key) => {
  target[key] = null;
  return target;
};

const foo = addNullKey(object, "foo"); // `object` isn't mutated
const bar = insertNullKey(object, "bar"); // `object` is mutated

After running this code, foo holds a copy of object with the added property foo (so its value is { foo: null }), while bar holds a reference to object itself with the added property bar (value { bar: null }), and the mutation also changed the original object. Even if you don't care about the mutation itself, you have the comparison problem:

foo === object; // false, because foo is a new object
bar === object; // true, because it is the same object
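
To see the problem in practice, here's a quick sketch (not from the original post) using JSON.stringify as a naive stand-in for a real deep comparison; this only works for simple, JSON-serializable objects like these:

const snapshot = JSON.stringify(object);
const baz = insertNullKey(object, "baz");
baz === object; // true, even though the contents changed
JSON.stringify(object) === snapshot; // false: only a content comparison sees it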

So you need to do a deep comparison to actually tell whether bar has different properties compared to the original object. You could argue that, to avoid that comparison problem and the mutation, we can change insertNullKey to something like this:

const insertNullKey = (target = {}, key) => {
  const copy = Object.assign({}, target);
  copy[key] = null;
  return copy;
};

But with that change you're basically back in the same territory as addNullKey, only with more boilerplate code.

The way of the for…of

We are targeting readability and performance, so let's go with for…of! Picture this: we have an array of 5000 elements (those good ol' and super realistic benchmark arrays), and we now want to create an object in which every element of that array is a key with the value null. We can reuse our friend insertNullKey here:

const array = [/* 5000 elements */];

const insertNullKey = (target = {}, key) => {
  target[key] = null;
  return target;
};

const object = {};
for (const key of array) {
  insertNullKey(object, key);
}

This is fine and dandy until we realize that somewhere else in the same scope there's an async function messing with our nice object, with something like:

setTimeout(() => {
  insertNullKey(object, "derp")
}, 100);

And boom, object suddenly has a derp property we don't want. To fix this, we need to move the for…of to a separate function, like this:

const array = [/* 5000 elements */];

const insertNullKey = (target = {}, key) => {
  target[key] = null;
  return target;
};

const arrayToNulledKeys = source => {
  const output = {};
  for (const key of source) {
    insertNullKey(output, key);
  }
  return output;
};

const object = arrayToNulledKeys(array);

Yay! We got it: a for…of that uses mutation safely! ...but now it's kinda hard to read, right? So the benefit of readability is lost. The cleanest version of the for…of is actually:

const array = [/* 5000 elements */];

const object = {};
for (const key of array) {
  object[key] = null;
}

No reuse other than copy and paste, but far easier to read.

The way of the reduce

Now, let's take a look at the reduce approach. Generally if you prefer this approach, you also try to avoid mutations, so for this one we can use our other friend addNullKey:

const array = [/* 5000 elements */];

const addNullKey = (target = {}, key) => ({
  ...target,
  [key]: null
});

const object = array.reduce(addNullKey, {});

That's it. It doesn't need any extra abstraction to make it safe, and you don't need to move the reduce to an external function: it's just that.

Now, the thing is: this actually has a horrible performance penalty (people way smarter than me have explained it with big-O notation and everything; spreading the accumulator on every iteration makes the whole loop O(n²)). In short: we are generating an entirely new copy of the object on every lap of that reduce loop, so we are generating 5000 objects, each one bigger than the previous, just to be "immutable/safe".
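
If you want to see the difference yourself, here's a rough benchmark sketch (assuming the array and addNullKey from above; exact numbers will vary by engine and machine):

console.time("reduce with spread");
array.reduce(addNullKey, {}); // copies the whole accumulator on every lap: O(n²)
console.timeEnd("reduce with spread");

console.time("for…of with mutation");
const nulled = {};
for (const key of array) {
  nulled[key] = null; // one assignment per lap: O(n)
}
console.timeEnd("for…of with mutation");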

So everything sucks?

Not really. If you're working with Vanilla JS only, then yup, you need to decide: either you want strict immutability/chaining/functional style with very poor performance and use reduce, or you want something more readable and performant without immutability and use for…of. For this specific example (and several others that use reduce to transform an array into an object), you could also use Object.entries/Object.fromEntries with map, which is like a middle point between for…of and reduce (functional style, immutability, and good-enough performance):

const array = [/* 5000 elements */];

const object = Object.fromEntries(
  array.map(key => [key, null])
);
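
And if your source is an object rather than an array, Object.entries completes the picture (a sketch of my own, with source standing in for any input object):

const nulledValues = Object.fromEntries(
  Object.entries(source).map(([key]) => [key, null])
);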

Then again, that's if you're only using Vanilla JS. With libraries like Immer or Immutable.js, you can take either the for…of or the reduce approach and get both good performance and immutability.

The way of the libraries

I love using the platform whenever possible, and I'm not a big fan of frameworks or of adding libraries just for the sake of it, so I'm not saying you should use them for this (maybe one of the snippets above already works for you). But if you do want to use libraries, you can get a for…of with immutability using Immer, like this:

import { produce } from "immer";

const array = [/* 5000 elements */];

const object = produce({}, draft => {
  for (const key of array) {
    draft[key] = null;
  }
});
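
The draft inside produce is a proxy you can mutate freely, while the returned object is a plain JavaScript object that Immer freezes by default, so a stray setTimeout like the one from earlier can't silently mutate it afterwards.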

Or you can get a reduce with great performance using Immutable.js, like this:

import { Map } from "immutable";

const array = [/* 5000 elements */];

const object = array.reduce(
  (previous, current) => previous.set(current, null),
  Map({})
);
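
Just keep in mind that object here ends up being an Immutable.js Map rather than a plain object, so if the rest of your code expects a plain one, you can convert at the boundary:

const plainObject = object.toJS(); // back to a regular { key: null, … } object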

This is the way

(Image: The Mandalorian, "This is the way".)

Sorry for the nerdy reference to The Mandalorian, but I think Jake's tweet was taken as an "attack against reduce" when it was only his opinion, based on his point of view; it's not like he has banned the use of reduce or anything like that.

We web developers recently had a huge debate on Twitter about let vs const, and we need to understand that the best and worst thing about JavaScript is that it lets you do anything you want: you can code in the style you want, using the tools you want. You just need to be aware of the effects of your choices, taking into account performance, mutations, and other technicalities, but also the human side of coding: the readability of the final code.

From my personal point of view, it's more important to have a concise style and good readability than to choose between for…of and reduce.

Thanks for taking the time to read this!


Discussion

 

Nice article. One thing to note with reduce is that there's a gotcha with async/await. Since async functions always return a promise, using an async callback with reduce turns the accumulator into a promise, which we need to resolve manually at every step. for…of, on the other hand, can simply follow the standard await pattern.
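
To illustrate the gotcha (a sketch of my own, to be run inside an async function, with fetchValue as a hypothetical stand-in for any async operation):

const fetchValue = async key => null; // hypothetical async lookup

// With reduce, the accumulator itself is a promise,
// so every step has to await the previous one:
const viaReduce = await array.reduce(async (previous, key) => {
  const target = await previous;
  target[key] = await fetchValue(key);
  return target;
}, Promise.resolve({}));

// With for…of, the standard await pattern just works:
const viaForOf = {};
for (const key of array) {
  viaForOf[key] = await fetchValue(key);
}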

 

I don't think I've ever seen code that goes from a reduce to a Promise (generally I use map, filter, or something like that). If you end up with a list of promises, remember async/await is just syntactic sugar, so you can use Promise.all or Promise.allSettled 😄
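
For example, with the hypothetical fetchValue from the sketch above:

const values = await Promise.all(array.map(fetchValue));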

 

Sure. I think map and filter have the same behavior as well. We can always get around it, it's just a bit tricky sometimes.

I think this article explains the behavior quite well.

 

Thanks for the post! Great breakdown.

I'm gonna chime in here to be that guy that says "do the thing that makes the most sense semantically".

If what you're doing is best expressed as "working through each item", use for…of. If it's best expressed as "condensing a series of items down to a single item", use reduce.