Dominik D

Posted on • Originally published at tkdodo.eu

Why I don't like reduce

The popular eslint-plugin-unicorn recently added a no-reduce rule, and it is set to error by default. The argument is that Array.reduce will likely result in code that is hard to reason about, and that it can be replaced with other methods in most cases (read this Twitter thread for a lengthy discussion if you like).

I have to say: I wholeheartedly agree, and I have personally turned on that rule in some projects.

What is wrong with reduce?

For me, there are many reasons why I rarely like to see reduce when reviewing code. First and foremost, it is hard to grasp. I believe one of the reasons for this is that reduce can do way too much.

  • Need to sum up values?
  • Need to transform Arrays into Objects?
  • Need to build a string?

Array.reduce can do it all.

While it might sound nice to have such a tool at your disposal, when looking at something implemented with reduce, you don't immediately see what that code is for.
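To make this concrete, here is a quick sketch of my own (not from the original post): two reduce calls that look almost identical at a glance, yet produce completely different things - you only find out by reading the callback and the trailing initial value.

const fruits = ['apple', 'banana', 'cherry']

// builds an object that maps each fruit to its length
fruits.reduce((accumulator, fruit) => {
    accumulator[fruit] = fruit.length
    return accumulator
}, {} as Record<string, number>)
// { apple: 5, banana: 6, cherry: 6 }

// builds a comma separated string - same shape, entirely different result
fruits.reduce(
    (accumulator, fruit) => (accumulator === '' ? fruit : accumulator + ', ' + fruit),
    ''
)
// 'apple, banana, cherry'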

What also adds to the confusion for me is that you cannot read reduce from left to right, top to bottom - at least not in JavaScript. Whenever I see reduce, I usually skim to the very end to get ahold of the initial value, because it will tell me what this reduce is trying to do. Then, I can go back to the beginning and try to understand it.
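In JavaScript, summing a list looks like this (a small sketch for comparison) - the 0 that determines what the whole expression produces only shows up at the very end:

const numbers = [1, 2, 3]

// the initial value 0 comes last, after the callback
numbers.reduce((accumulator, currentValue) => accumulator + currentValue, 0) // 6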

This is not the case in other languages, for example Scala, where the initial value is the first parameter:

val numbers = List(1, 2, 3)

numbers.fold(0)(_ + _) // 6

Try me in scastie

Reduce is so mighty that you can implement all the other Array functions you use on a daily basis with it:

const mapWithReduce = (array, callback) =>
    array.reduce((accumulator, currentValue, index) => {
        accumulator[index] = callback(currentValue, index, array)
        return accumulator
    }, [])

mapWithReduce([1, 2, 3], (value) => value * 2) // [2, 4, 6]

I have even seen people re-implement join with reduce:

const joinWithReduce = (array, delimiter) =>
    array.reduce(
        (accumulator, currentValue, index) =>
            accumulator + currentValue + (index === array.length - 1 ? '' : delimiter),
        ''
    )

joinWithReduce(['foo', 'bar', 'baz'], ';') // foo;bar;baz

The question is: why would you? For almost all cases, there are methods that:

  • are not as powerful, with a limited scope
  • have a clear API
  • have a good name, so you know what it is doing

Array.join is a very good example of such a limited method. Everyone understands what is going on when we read:

values.join(';')

Compare that to the above implementation - I think we can agree that the simplicity is preferred.

When is it okay to reduce?

For me, (mostly) only when implementing reusable util methods. It usually doesn't matter how they are implemented. You give them a good name, a clear purpose, write some tests and that's it.
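A sum helper is the kind of util I mean - here is a sketch of what it could look like (the name and the tests are my own invention):

export const sum = (numbers: ReadonlyArray<number>): number =>
    numbers.reduce((total, value) => total + value, 0)

// a couple of tests pin down the behaviour, and callers never see the reduce
// expect(sum([])).toEqual(0)
// expect(sum([1, 2, 3])).toEqual(6)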

Most usages of reduce I have reviewed lately fall into one of three categories:

1. Transforming Arrays to Objects

Yes, there is no easy native way to do that, and even popular util libraries like lodash have no good way of achieving this (keyBy is okay, but doesn't transform values).

In one project, we frequently had the need for such transformations, so we made our own util for it. The implementation is something like this:

export const toObject = <T, K extends string | number | symbol, V>(
    array: ReadonlyArray<T>,
    iteratee: (element: T, index: number, array: ReadonlyArray<T>) => [K, V]
): Record<K, V> =>
    array.reduce((result, element, index) => {
        const [key, value] = iteratee(element, index, array)
        result[key] = value
        return result
    }, {} as Record<K, V>)

toObject(['foo', 'bar', 'baz'], (element) => ['key-' + element, 'value-' + element])

Good name, strong types, ease of use. The rest is implementation detail (including the type cast for the initial value).

2. Grouping Arrays

Again, pick a util library (lodash, ramda, remeda, ...) or write your own util. Encapsulate that complex reduce so that you don't have to re-implement it every time you need it.
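To give an idea, a hand-rolled groupBy could look roughly like this (a sketch of mine, not taken from any of those libraries):

export const groupBy = <T, K extends string | number | symbol>(
    array: ReadonlyArray<T>,
    getKey: (element: T) => K
): Record<K, Array<T>> =>
    array.reduce((groups, element) => {
        const key = getKey(element)
        // create the bucket the first time we see a key, then push into it
        if (!groups[key]) {
            groups[key] = []
        }
        groups[key].push(element)
        return groups
    }, {} as Record<K, Array<T>>)

groupBy([1, 2, 3, 4, 5], (n) => (n % 2 === 0 ? 'even' : 'odd'))
// { odd: [1, 3, 5], even: [2, 4] }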

3. Do many things at once

Iterating over big lists many times can be costly, so people often fall back to reduce because it can do everything in one go.

The truth is: usually, it doesn't matter. Even when working with very large lists (tens of thousands of entries), my experience has been that performance is rarely negatively impacted as long as you keep iterations linear.

Whether your toObject util does one iteration with a reduce or two iterations with a map followed by Object.fromEntries is irrelevant, unless you have measured it and found it to be a bottleneck.
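For comparison, the two-pass version could be as simple as this (a sketch with a hypothetical toObjectTwoPass that reuses the signature from above):

export const toObjectTwoPass = <T, K extends string | number | symbol, V>(
    array: ReadonlyArray<T>,
    iteratee: (element: T, index: number, array: ReadonlyArray<T>) => [K, V]
): Record<K, V> =>
    // one pass for map, one for Object.fromEntries - still linear overall
    Object.fromEntries(array.map(iteratee)) as Record<K, V>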

Reduce performance pitfalls

Speaking of performance and linear iterations, I've learned the hard way not to do this when working with reduce:

export const toObject = <T, K extends string | number | symbol, V>(
    array: ReadonlyArray<T>,
    iteratee: (element: T, index: number, array: ReadonlyArray<T>) => [K, V]
): Record<K, V> =>
    array.reduce((result, element, index) => {
        const [key, value] = iteratee(element, index, array)
        return {
            ...result,
            [key]: value,
        }
    }, {} as Record<K, V>)

Why should I be dirty and mutate the result, when I can be super fancy instead and create a new object every time 🤔🤦‍♂️.

Here is a perf analysis of how the two compare when run over an Array with 10k entries:

1,700 operations per second vs. 47 operations per second.
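Something along these lines reproduces the comparison (a crude sketch of mine, not the original benchmark setup - it measures raw timings rather than operations per second, but the gap is just as visible):

const entries = Array.from({ length: 10_000 }, (_, index) => `item-${index}`)

// crude timing helper - precise enough to see the order-of-magnitude gap
const measure = (label: string, fn: () => unknown) => {
    const start = performance.now()
    fn()
    console.log(label, (performance.now() - start).toFixed(1), 'ms')
}

measure('mutating reduce', () =>
    entries.reduce((result, element, index) => {
        result[element] = index
        return result
    }, {} as Record<string, number>)
)

measure('spreading reduce', () =>
    entries.reduce(
        (result, element, index) => ({ ...result, [element]: index }),
        {} as Record<string, number>
    )
)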

Yes, it's that slow, because it has to re-create an ever-growing object with every iteration, so it gets quadratically slower the more entries the array has. Mutation is not the root of all evil, and it does not have to be avoided at all costs. If the scope is small and the intent is clear - mutate away 🚀.

But still - avoid reduce


Do you like reduce or not? Let me know in the comments below ⬇️

Top comments (5)

Madza

I need to look up its syntax every time I use it 😀😀

arr.reduce(callback(accumulator, currentValue[, index[, array]])[, initialValue])

Michel Renaud

When I saw the title of the article my first thought was, "right... how does that thing work again?" It's one of those things that I'm unable to memorize and always have to look up.

Heiker

I would like to add that reduce is very good with binary operations that are closed under one type, like "add", where you have (Number, Number) -> Number. The kind of functions where you don't even care which argument is the accumulator and which is the current value. You just plug it in like arr.reduce(binary_op) and you're done.

Dominik D

Yes, summing numbers is a classic example for reduce. It's actually reasonable because it really "reduces" the input into one value. I still prefer to use a util function for readability, like sum(numbers). It can be taken from lodash, but it can also just be my own util that implements it with reduce. The name alone is worth the abstraction, so that I don't have to read reduce a bunch of times and grasp the implementation :)

Eric Sundquist

In general it is difficult to read, which is a good reason not to use it. If a simple for loop gets the job done, your teammates will probably thank you.