1 Simple Trick to Boost Performance Using Reduce

Have you ever realised that using a map followed by a filter, or vice versa, is quite common? Did you know that you could halve the computation time needed if you just used a reduce instead?

We will begin by recapping the three array methods. Feel free to jump ahead to section 5 if you are already comfortable with these.

Contents

  1. Overview
  2. Map
  3. Filter
  4. Reduce
  5. Map + Filter = Reduce
  6. Performance

Overview

Map, filter and reduce are all methods on the Array prototype. They are used for different purposes, but they all involve iterating over the elements within an array using a callback.
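As a quick illustration before we dive in, here are the three methods side by side on one array (a minimal sketch with made-up values):

```javascript
const nums = [1, 2, 3, 4];

// map: transform every element (result has the same length)
const doubled = nums.map((n) => n * 2); // [2, 4, 6, 8]

// filter: keep only the elements that pass a test
const evens = nums.filter((n) => n % 2 === 0); // [2, 4]

// reduce: fold the array down into a single value
const sum = nums.reduce((total, n) => total + n, 0); // 10
```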

Map

Map returns a new array of the same length as the original array it was called on. It takes one parameter, a function, which can take 3 arguments:

  1. The current item to iterate over
  2. The index of the current item
  3. The original array

Map does not mutate the original array; it creates a new one, so the return value of a map must be assigned to a variable, like so:

const nums = [10, 20, 30, 40, 50];

// assign value of map into result variable
const result = nums.map(function(item, index, arr) {});  // RIGHT

nums.map(function(item, index, arr) {}); // WRONG!

Let's take a look at an example where we have an array of years, and we want to get the age that each year represents, while keeping the original year value too. This means our array of integers will be mapped to an array of objects, each with 2 properties: year and age.

Example:

const years = [1991, 1999, 2000, 2010, 2014];
const currentYear = (new Date).getFullYear();

const ages = years.map(function(year) {
  const age = currentYear - year;

  // each element will now be an object with 2 values: year & age   
  return {
    year,
    age
  }
});

We now have an array, ages, which (assuming the current year is 2020) looks like this:

ages = [
  {year: 1991, age: 29},
  {year: 1999, age: 21},
  {year: 2000, age: 20},
  {year: 2010, age: 10},
  {year: 2014, age: 6}
]

Filter

Filter, as it sounds, filters the elements we want from an array into a new array, disregarding any elements we do not want.
It takes one parameter, a function, which can take 3 arguments:

  1. The current item to iterate over
  2. The index of the current item
  3. The original array

The function acts as a predicate: it returns true or false for each element, and only the elements for which it returns true are copied, unchanged, into the new array. Unlike map, filter does not necessarily return an array with the same length as the array it was called on.

Like map, filter does not mutate the original array, so the value of a filter must be assigned to a variable.

Let's take a look at an example where we have a years array, which represents the years people were born, and we want a new array which contains only the years which would equate to a person being over the age of 18.

Example:

const years = [1991, 1999, 2000, 2010, 2014];
const currentYear = (new Date).getFullYear();

const over18 = years.filter(function(year) {
  // if the year equates to over 18, then put that year
  // into our new over18 array
  return (currentYear - year) > 18;
});

We now have an array, over18, which looks like this:

over18 = [1991, 1999, 2000];

Reduce

Reduce reduces an array down into a single value. That single value could be any JavaScript type, such as a string or a number, or even an array or an object.

It takes two parameters:

  1. A function which takes 4 arguments:

    a. An accumulator

    b. The current item to iterate over

    c. The current item's index

    d. The source array

  2. The initial value of the accumulator (the single value we want to build up and return)

Like map and filter, reduce does not mutate the original array, so the value of a reduce must be assigned to a variable.

Example:

const nums = [10, 20, 30, 40, 50];
const sum = nums.reduce(function(total, num) {
  total += num;

  return total;
}, 0);

console.log(sum);  // 150

This could also be written like this:


const nums = [10, 20, 30, 40, 50];

const reducer = (total, num) => total + num;
const sum = nums.reduce(reducer, 0);

We initialise our array, nums, and our reducer function, which just adds a number to the current total. We then initialise sum by calling the reduce method, passing our reducer as the first argument and our initial value, 0, as the second. As 0 is our initial value, this will be the value of total during the first iteration.
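As a side note, the initial value is optional. If it is omitted, reduce uses the first element of the array as the starting accumulator and begins iterating from the second element, which is why passing an explicit initial value is usually the safer habit:

```javascript
const nums = [10, 20, 30, 40, 50];

// with no initial value, the first element (10) becomes the starting
// accumulator, and iteration begins from the second element
const sum = nums.reduce((total, num) => total + num); // 150

// beware: calling reduce with no initial value on an empty array throws
// [].reduce((total, num) => total + num); // TypeError
```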

Map + Filter = Reduce

Now that we have recapped what map, filter and reduce do, and how they differ from each other, let us now understand the title of this article.

It is quite common in programming that you may want to filter elements in an array, as well as change their contents slightly. I use the word change lightly, as we know these methods do not mutate our original array.

Remember:

  • A filter retains the elements from an array that we are interested in, and returns them in a new array
  • A map always returns a new array of the same length as the array it was called on

So why call a filter and a map, when you can essentially get the task done in half the time using a reduce?

Using a reduce we can complete the task we need of filtering and mapping the contents of an array in one step, instead of two.

Let us look at an example where we have a years array, which represents the years people were born in, and we want to retain only those people who are over the age of 18, as well as work out each person's age.

Example:

const years = [1991, 1999, 2000, 2010, 2014];
const currentYear = (new Date).getFullYear();

const reducer = (accumulator, year) => {
  const age = currentYear - year;

  if (age <= 18) {
    return accumulator;
  }

  accumulator.push({
    year,
    age
  });

  return accumulator;
}

const over18Ages = years.reduce(reducer, []);

We have now essentially combined the examples from the filter section and the map section into a single reduce. This is the result (again assuming the current year is 2020):

over18Ages = [
  {year: 1991, age: 29},
  {year: 1999, age: 21},
  {year: 2000, age: 20}
]

Our original array, years, had 5 elements. If we used a map followed by a filter, we would have completed 10 iterations to get the same result that we got in 5 iterations with a reduce. However, under the hood, map, filter and reduce do slightly different things, so does it actually make a difference in performance?

Performance

Let's see what an extreme, unrealistic, but simple example shows...

let arr = [];

// populate array with 100,000,000 integers
for (let i = 0; i < 100000000; i++) {
  arr.push(i);
}

// calculate time taken to perform a simple map,
// of multiplying each element by 2
const mapStart = performance.now();
const mapResult = arr.map((num) => num * 2);
const mapEnd = performance.now();

// calculate time taken to perform a simple filter,
// of only returning numbers greater than 10,000
const filterStart = performance.now();
const filterResult = mapResult.filter((num) => num > 10000);
const filterEnd = performance.now();

// calculate time taken to perform a simple reduce,
// of populating an array of numbers whose initial value
// is greater than 10,000, then doubling this number
// and pushing it to our total
const reduceStart = performance.now();
const reduceResult = arr.reduce((total, num) => {
  const double = num * 2;

  if (double <= 10000) {
    return total;
  }

  total.push(double);

  return total;
}, []);
const reduceEnd = performance.now();

console.log(`map time (ms): ${mapEnd - mapStart}`);
console.log(`filter time(ms): ${filterEnd - filterStart}`);
console.log(`reduce time(ms): ${reduceEnd - reduceStart}`);

// map time (ms): 2415.8499999903142
// filter time(ms): 3142.439999995986
// reduce time(ms): 3068.4299999993527

Pushing 100,000,000 integers into an array is extreme, but I wanted to show you the performance difference in seconds. The results show that it took 3.07 seconds, using a reduce, to compute what was essentially done in 5.56 seconds using a map and a filter. And bear in mind this is just dealing with an array of integers; the computation would have taken longer if we were dealing with strings or objects.


Conclusion

When you see yourself using a map followed by a filter, or vice versa, consider using a reduce instead and complete the computation in half the iterations! As a programmer you need to weigh up the pros and cons: what you gain in performance, you may lose in readability.

Reduce has many use cases, this is just one.

Happy programming 😊

Header photo by chuttersnap on Unsplash

Top comments (11)

Facundo Soria

Cool research! πŸ‘
As you mention at the end you have to weigh up the pros and cons, I personally prefer readability over performance...
BTW you can even go beyond and implement almost all "array transform methods" (.map, .filter, .find, .every...) using reduce
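For instance, map and filter can each be sketched in terms of reduce (an illustration of the idea, not how the built-ins are actually implemented):

```javascript
// map expressed as a reduce
const mapViaReduce = (arr, fn) =>
  arr.reduce((acc, item, index) => {
    acc.push(fn(item, index, arr));
    return acc;
  }, []);

// filter expressed as a reduce
const filterViaReduce = (arr, predicate) =>
  arr.reduce((acc, item, index) => {
    if (predicate(item, index, arr)) {
      acc.push(item);
    }
    return acc;
  }, []);

mapViaReduce([1, 2, 3], (n) => n * 2);    // [2, 4, 6]
filterViaReduce([1, 2, 3], (n) => n > 1); // [2, 3]
```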

Jr. Dev πŸ‘¨πŸΎβ€πŸ’»

Thank you Facundo.
I myself also prefer readability, but if the performance to be gained is great, then I would definitely reconsider. I don't think just because reduce in general uses more code, means that readability is lost.

Daan Wilmer

The readability issue here is that map, filter, and reduce communicate an intent. When you read map, you know that it takes an array, maps each element of that array to a new value, and returns that new array. When you read filter, you know that it takes an array and returns an array of all elements that satisfy a condition. When you read reduce, you know that it takes an array and returns a single value. Except, in this case, this single value is another array, containing some values that are derived from the original array and that satisfy a condition. It's not that it's just more code, it's that you're saying one thing and doing another.

And if performance of this code is of concern, I'd chuck the whole thing in a for-loop anyway and eliminate all function calls.
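A for-loop version of the article's over-18 example might look like this (a sketch of the same logic, with no callback invocations at all):

```javascript
const years = [1991, 1999, 2000, 2010, 2014];
const currentYear = new Date().getFullYear();

// single pass, plain loop, no function calls per element
const over18Ages = [];
for (let i = 0; i < years.length; i++) {
  const age = currentYear - years[i];
  if (age > 18) {
    over18Ages.push({ year: years[i], age });
  }
}
```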

Facundo Soria

For sure. I think one of the main concepts we as developers should apply is expressiveness: the how-to implementation should be hidden in its own function, method or chunk of code. For example, in this case, you could create a function called mapFilter and implement it however you like, with a for loop, a reduce, or by composing a map and filter (I would personally choose this one), and the caller benefits from using a function whose name represents the intention.

Alain Van Hout • Edited

Quite interesting! Do note though that these results could also be reduced (thanks, I'll be here all week) to 'one operation takes half as much time as two operations'.

Have you by any chance looked at how a classic for loop does in this test?

Daan Wilmer • Edited

Good point! I just did that, and this is what I get:

map time (ms): 13755.813225030899
filter time(ms): 3706.2631920576096
reduce time(ms): 3794.9764149188995
for-loop time(ms): 2467.463256955147

Somehow, for me, the map-time takes about four times as long as the other functions (it's weird and I have no clue why), but the for-loop outperforms all others.

Ben Sinclair

I appreciate this, and see the benefit to performance, but unless it's an actual issue I prefer to keep them separate. I only like using reduce for something simple, because it gets unreadable very quickly.

Enrique Zamudio

If you filter first (as you should always do, when possible) then map iterates over a smaller array.

Others have already mentioned readability. But composition is also important: sometimes you already have the mapping and filtering functions somewhere and they are used in different contexts, so you just pass them to map and filter. Of course you can still implement a reduce that calls the mapping and filtering functions, but we go back to readability.
This seems like a useful trick when you need to filter and map over enormous arrays, but as a general guideline I would keep the mapping and filtering functions separate for readability and composability (if that's even a word).
Maybe some day Javascript will have lazily evaluated streams and then this discussion will be moot.
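The filter-first ordering can be sketched like this, so that map only iterates over the elements that survive the filter:

```javascript
const years = [1991, 1999, 2000, 2010, 2014];
const currentYear = new Date().getFullYear();

// filter first: map then only visits the surviving elements
const over18Ages = years
  .filter((year) => currentYear - year > 18)
  .map((year) => ({ year, age: currentYear - year }));
```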

Ted Bell

Excellent breakdown. Thank you for your detailed approach to this! Bookmarking this for sure.

Jr. Dev πŸ‘¨πŸΎβ€πŸ’»

Thanks Ted, I appreciate that

Jr. Dev πŸ‘¨πŸΎβ€πŸ’»

Great to hear, I am glad it helped πŸ‘Œ