The Power of Reduce: Optimizing JavaScript Code for Speed and Efficiency

Diogo Almeida on May 14, 2024

Hey everybody! This is my first-ever post! In this article, I will be talking about the reduce method of JavaScript arrays, which I feel is someti...
 
Jon Randy 🎖️ • Edited

I haven't looked in any great detail, but there's something a bit odd in the forEach test in your code... it isn't actually using forEach!

You might also want to compare just using a normal for loop and accessing the array using a numeric index. That can very often be the fastest method if you're really concerned about speed.
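Something like this, for example (a sketch borrowing the people array from the article's examples):

// Plain for loop with numeric indexing, often the fastest option
const names = [];
for (let i = 0; i < people.length; i++) {
  const p = people[i];
  if (p.age >= 18) names.push(p.name);
}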

For benchmarking, you might want to consider a pre-existing tool like perf.link:

Perflink | JS Benchmarks: JavaScript performance benchmarks you can share via URL (perf.link)

I made a test of the different methods here, and using the index method is by far the fastest, with forEach and reduce jostling for 2nd place (sometimes one is faster, sometimes the other) on Chrome (desktop); Firefox (desktop) consistently puts reduce in 2nd place, followed by forEach.

 
Mihail Malo • Edited

After the first run, №4 (filter.map) consistently outperforms even indexing!
Go figure!
Turns out it was due to the tiny array. With large input data, index absolutely dominates to this day.
[Image: benchmark screenshot showing the indexed loop dominating on large inputs]
And iterators (I changed №4 to the below for this picture) are extremely unoptimized in my mobile V8:

// Iterator helpers: lazy filter/map over the array's iterator, collected at the end
const mapFilterTest = () => people.values()
    .filter(p => p.age >= 18)
    .map(p => p.name)
    .toArray()
 
Diogo Almeida • Edited

Hi!
You're right! That really isn't a forEach, but more of a for in. I'll make that correction.
Also thank you for sharing the index alternative and the Perflink tool!
I guess I didn't think of using the for loop with the index since it's not something I would typically use.

 
John Watts

Never prematurely optimize. You save 3ms processing 100,000 numbers, but the next developer who looks at the code will spend three minutes trying to figure out what's supposed to happen in your reduce code. That's going to cost the business a whole lot more than the difference in processing time, especially if they don't figure it out and introduce a bug.

In a real application those milliseconds will be nothing compared to the latency of network and database calls, but the clarity gained from using filter/map is always priceless.
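Side by side, using the article's people example, the difference in intent is clear:

// Two declarative steps: what you want, then how to shape it
const adultNames = people.filter(p => p.age >= 18).map(p => p.name);

// One pass with reduce: the same result, but the intent is buried in the callback
const adultNames2 = people.reduce((names, p) => {
  if (p.age >= 18) names.push(p.name);
  return names;
}, []);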

All of that said, your observations around the spread operator are the real gem in this article and worth sharing. Keep up the good work, but only fix the problems that actually exist :)

 
AndrewBarrell1

I'm not sure about the TL;DR of "use reduce instead of map and filter".

The whole point of using map and filter is that you separate the logic into parts: the what (the condition that determines what you need) and the how (the translation).

For a tiny benefit in an unusual situation, you've made the code much less intuitive at a glance. I don't think this is a good blanket takeaway.

Writing readable and maintainable code should be the first port of call unless time and/or memory optimisation is a specific requirement of the task. Over-optimising for time on a project where it doesn't matter whether something takes 200ms or 2 minutes, at the expense of making you or your successor work longer to make sense of it in the future, is not ideal.

Also, why did you use for in? It's not the most ideal one; it has some pitfalls, unless that has been corrected in recent updates. Either use for of or a normal for loop (if arr.forEach is out of the question).
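A quick sketch of the pitfalls, with a hypothetical array just for illustration:

const arr = [10, 20, 30];
arr.extra = 'surprise';

// for...in iterates enumerable property KEYS, as strings, including non-index ones
for (const key in arr) {
  console.log(key); // '0', '1', '2', 'extra'
}

// for...of iterates the element VALUES, and only the actual elements
for (const value of arr) {
  console.log(value); // 10, 20, 30
}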

 
teamtoyumi

Great article! You made reduce sound so much simpler! I also liked how you called out that "reduce is not for sums only", because yes, the naming of reducer and total/acc can make it so easy to misunderstand.

 
Karlis Melderis

I find .reduce hard to reason about, and I've too often seen code where people create new arrays and objects inside .reduce instead of mutating the accumulated value.

Hence we made an agreement to default to for...of loops.

Maybe it does end up being a couple more lines of code, but reasoning about the flow is easier.
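For example, a sketch assuming an items array of objects with an id field:

// Spreading the accumulator builds a brand-new object every iteration: O(n²) copying
const byId = items.reduce((acc, item) => ({ ...acc, [item.id]: item }), {});

// Mutating the accumulator keeps it a single pass
const byId2 = items.reduce((acc, item) => {
  acc[item.id] = item;
  return acc;
}, {});

// The for...of default we agreed on: a couple more lines, but easy to follow
const byId3 = {};
for (const item of items) {
  byId3[item.id] = item;
}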

 
Chu Lục Ninh • Edited

If you really want to do the kind of operation that does several things in a single iteration, check out generators. You can build generator versions of map and filter, then use them to compute lazily as you iterate through the items. That achieves both the logical separation of map and filter and the optimized performance and memory usage.
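A minimal sketch of what that could look like (hypothetical filterGen/mapGen helpers, reusing the people example from the article):

// Lazy filter: yields only the items that pass the predicate
function* filterGen(iterable, predicate) {
  for (const item of iterable) {
    if (predicate(item)) yield item;
  }
}

// Lazy map: transforms each item as it is pulled
function* mapGen(iterable, transform) {
  for (const item of iterable) {
    yield transform(item);
  }
}

// Single pass, no intermediate arrays; spreading materializes the result
const adultNames = [...mapGen(filterGen(people, p => p.age >= 18), p => p.name)];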

 
Big Will

TL;DRs generally appear at the beginning of articles.

 
G1itcher

Using reduce instead of map or filter robs your code of a clear intent narrative, which is arguably more important in the majority of cases than the performance gains.

It also doesn't help that, in my experience, many developers seem to have difficulty parsing what a reduce function is doing.

 
Revenity

That is slower than a normal for loop:

// Cache arr.length once via destructuring so it isn't re-read on every iteration
for (let i = 0, { length } = arr; i < length; ++i) {
  // ...
}

Nobody uses for in for arrays; it doesn't make any sense.

 
Prashant Verma

This one is marginally faster.

 
Revenity

Not really marginal if it gets JITed.

 
James

You lost me as soon as you talked about accessing arrays with for..in rather than for..of.