Before modern JavaScript was a thing, the only conceivable way of iterating over arrays was the classic C-style `for` loop. It was cumbersome, to say the least: verbose and full of boilerplate. With the rising popularity of functional programming concepts came the array methods we love and enjoy today. Thanks to `forEach`, `map`, `filter`, and `reduce`, iterating over arrays has never been easier, and coupled with ES6 arrow functions, never more concise.
In addition to their brevity, these array methods, which are essentially just glorified `for` loops, allow us to chain as many array operations as we need without sacrificing readability (depending on your code). It's a real work of art to see a beautiful chain of sequential method calls. Seeing how an array is manipulated step by step with each method call makes the code all the more natural to read. What used to take several lines of code can now be done in a single one.
Although they have virtually eliminated the need for `for` loops, array methods bring new problems to the table. As elegant as a chain of method calls can be, we have to remember that each method we attach to the chain adds a whole new iteration over the array. To write performant code, we must keep in mind that longer chains mean more iterations.
Combine your Math operations
To illustrate the problem of unnecessarily long chains, consider an array of numbers from `-2` to `2`. Our objective is to find the sum of thrice the square of each number. At first glance, we might tackle the problem with a chain of `map` and `reduce` calls.
const nums = [-2, -1, 0, 1, 2];
const sum = nums
  .map(x => x * x)
  .map(x => x * 3)
  .reduce((prev, curr) => prev + curr, 0);
This will indeed meet our objective. The only problem is that it chains three methods, and three chained methods mean three whole new iterations over the array. We can prove that by adding an intermediary `console.log` inside each callback; you probably get the point by now, but a quick sketch of that check follows the next snippet for the curious. If that sounds time-inefficient, especially at scale, you'd be correct. To make this code more performant, we simply have to combine the method calls in a way that minimizes the number of passes over the same array of data.
const nums = [-2, -1, 0, 1, 2];

// Level 1: Combine the `map` calls
const level1Sum = nums
  .map(x => 3 * x ** 2)
  .reduce((prev, curr) => prev + curr, 0);

// Level 2: Combine _everything_
const level2Sum = nums
  .reduce((prev, curr) => prev + 3 * curr ** 2, 0);
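Here is that check as a minimal sketch; the `chainedCalls` and `combinedCalls` counters are my own additions for illustration, not part of the original example:

let chainedCalls = 0;
const chainedSum = nums
  .map(x => { chainedCalls++; return x * x; })
  .map(x => { chainedCalls++; return x * 3; })
  .reduce((prev, curr) => { chainedCalls++; return prev + curr; }, 0);

let combinedCalls = 0;
const combinedSum = nums
  .reduce((prev, curr) => { combinedCalls++; return prev + 3 * curr ** 2; }, 0);

console.log(chainedSum === combinedSum); // true (both are 30)
console.log(chainedCalls);  // 15 (three passes over 5 elements)
console.log(combinedCalls); // 5 (a single pass)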
Use compound Boolean expressions
The same rule can be applied to Boolean expressions and the `filter` method. Let's say we have an array of `User` objects. We want to find the `User` objects that currently have premium accounts. Then, from those accounts, we look for administrators whose ages are over `18`.
class User {
  constructor(isAdmin, hasPremium, age) {
    this.isAdmin = isAdmin;
    this.hasPremium = hasPremium;
    this.age = age;
  }
}

// Array of `User` accounts
const users = [
  new User(false, false, 9),
  new User(false, true, 30),
  new User(true, true, 15),
  new User(true, true, 19),
  new User(false, true, 3)
];
Instead of combining Math operations, we can use compound Boolean expressions to combine each condition. This way, we can minimize the number of array iterations.
// Level 0: Chain _everything_
const level0 = users
  .filter(user => user.isAdmin)
  .filter(user => user.hasPremium)
  .filter(user => user.age > 18);

// Level 1: Combine _everything_
const level1 = users
  .filter(user => (
    user.isAdmin
    && user.hasPremium
    && user.age > 18
  ));
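Both versions select the same single user. As a hedged tally of the work involved (the counts are my own annotation, not from the original):

// The chained version makes three passes: 5 + 2 + 2 = 9 predicate calls
// plus two intermediate arrays. The compound version makes one pass:
// 5 predicate calls, one result array, and `&&` short-circuits early.
console.log(level0); // [ User { isAdmin: true, hasPremium: true, age: 19 } ]
console.log(level1); // same single user, fewer iterations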
Take advantage of operand omission
It is also worth noting that similar code can still be optimized further. By arranging Boolean conditions in a clever way, the code can run slightly faster. This is because the ECMAScript specification requires the logical AND operator (`&&`) to short-circuit: it immediately stops evaluating the succeeding operands as soon as it encounters one that evaluates to a falsy value.
function willRun() {
  console.log('I just stopped the `&&` operator from evaluating the next operand.');
  return false;
}

function neverRuns() { console.log('This function will never run.'); }

// 'I just stopped the `&&` operator from evaluating the next operand.'
true && willRun() && neverRuns();
To write (slightly) more performant code, the Boolean expressions that are most likely to evaluate to `false` must be placed at the beginning of the compound condition, in order to prevent the unnecessary execution and evaluation of the succeeding operands.
// Arranging conditions properly will
// make your code run slightly faster.
arr.filter(x => (
  x.mostLikelyToBeFalse
  && x.moreLikelyToBeFalse
  && x.likelyToBeFalse
  && x.leastLikelyToBeFalse
));
Conclusion
Of course, the examples I presented are trivial. Running them won't reveal a huge performance difference, if any. The performance impact of an unnecessarily long chain of iterations only becomes apparent at scale, with more computationally expensive calculations. For most cases, we don't need to worry about it. Besides, most chains don't even exceed four methods.
The point of this article is to serve as a reminder that just because we can chain method calls doesn't mean we should overdo it. It is our responsibility as developers to make sure we do not abuse this power. No matter how negligible, there really is a performance cost for each method we attach to a chain. If there is one thing you should take away from this article, it is that longer chains mean more iterations.
Unless you want to face the wrath of unnecessary iterations, please don't "overchain" array methods.
Latest comments (61)
Thank you for an interesting post. It is perhaps worth mentioning that, on the opposite side, one `reduce` or `flatMap` to rule them all is sometimes not the best option in the context of maintainability.
JSPerf is down at the moment, but I'm pretty sure this is faster and easier to read:
I mean... yes, it definitely is, but the point of my article is not to promote the most performant code; rather, it is to remind the reader to be wary of "overchaining" array methods because of its implications for the number of array iterations. 😅
I mean, sure, but my arrays are generally pages of blog posts or user information. If I have more than 10 items in an array at a time, sure, I'll start optimising. Until then, readability wins. Optimising has a cost.
A rule that I like about optimisation is this:
You are only allowed to fix a performance problem without profiling if you have already fixed that same performance problem with production data. If you haven't, leave the code alone, in a readable form.
Once production shows signs of slowness and you have profiled the code and confirmed that it is indeed the chained calls on an array, you can fix it. Until then, leave it alone, it's probably fine as it is.
There will be times when a loop will be much better than chained array methods in terms of elegance and readability.
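For example (my own sketch, not the commenter's): an early-exit search reads naturally as a loop and avoids whole extra passes:

function firstBigSquare(nums) {
  // A chain such as `nums.map(n => n * n).find(sq => sq > 100)` squares
  // every element up front; the loop below stops at the first match.
  for (const n of nums) {
    const sq = n * n;
    if (sq > 100) return sq;
  }
  return undefined;
}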
Good article, but this shows JS is not functional. By the isomorphism of FP with CT we know that `(fmap g) ∘ (fmap f)` equals `fmap (g ∘ f)`, so a compiler would be able to optimize this away easily. This is not the case in JS because `g` and `f` might have order-dependent side effects, i.e. it is not functional.

Precisely! That is a good point. I believe this is exactly why JavaScript is considered a "multi-paradigm" programming language, at least according to Wikipedia. It may not have the purest functional programming practices, but it sure does blend them pretty well with object-oriented programming, considering that the two are kind of like polar opposites of each other.
I know this is a popular stance, but I disagree with it.
Just being able to pass functions around as arguments doesn't make something functional. Claiming otherwise seems disingenuous to me, as you'd be claiming the benefits of FP without delivering (barring some syntactic sugar). You might call JS "functional style" or "closure-centric procedural". I just made the latter up, but I think it's rather descriptive.
Now that you mention it, I think I agree with you in that regard. It does seem more accurate to consider JavaScript a "functional-style" programming language rather than a "functional programming language mixed in with some object-oriented features".
Very useful to remind people of that.
This is basic algorithm time complexity: with an array of N elements,
one loop = O(N)
two loops = O(N + N)
X loops = O(X · N)
So for an array of 10 elements and 4 `.map` calls, the execution time is 4 times what is needed.
And you need to add to that the time to push new values into the new arrays.
I think a general rule can be: "NEVER USE THE SAME ARRAY METHOD TWICE".
Also, maybe always use `filter` before `map`? (A quick sketch follows.)
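A sketch of the `filter`-before-`map` point (my own example; it assumes the predicate gives the same verdict before and after the mapping):

const nums = [-2, -1, 0, 1, 2];

// `filter` first: the mapping callback runs only for surviving elements.
const lean = nums.filter(x => x > 0).map(x => x * 3);     // 5 + 2 = 7 callback calls

// `map` first: every element is transformed, then some are discarded.
const wasteful = nums.map(x => x * 3).filter(x => x > 0); // 5 + 5 = 10 callback calls

console.log(lean, wasteful); // [ 3, 6 ] [ 3, 6 ] (same result, less work)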
And I would also like to point out: if a 3-term operation is a readability problem for some people, god, are people becoming this lazy?
I too like readable code, but our primary goal is efficiency.
Efficiency means performance, and it should ALWAYS take precedence over whatever idea of "beauty" someone may have.
I'm not talking about micro-optimizations but about avoiding unnecessary operations.
First do the job in an efficient manner, then refactor to make it as human-readable as possible.
Sometimes complex things are complex to read.
We are not making art.
I like the perspective that we have to decide what we're optimizing: the writing, the execution, or the reading. Each has its place and tradeoffs.
I mean, one can't really say that those who disagree with this article are completely wrong. Readability really is an important aspect of writing code. Performant code means nothing if it is unreadable.
Also, yup, I definitely agree with using `filter` before `map`. That should honestly be a default for all JavaScript programmers.

This is what I call unnecessary optimization. If you follow time complexity optimization, O(3n) is still just O(n). Now, I'm not sure what the space complexity is on this, but unless your arrays are massive, I'm sure it's negligible for integers.
`.map` and `.filter` do create new arrays, so that definitely is something to be aware of, but doing 3 operations in 1 loop is effectively the same thing as 3 loops with 1 operation.
I think you're misleading beginner JS developers with this article.
Waiting 1m is still better than waiting 3m. Disregarding the less significant factors in O(3n) is not always a good idea in practice.
I agree all of this is a pretty useless exercise when you have 3 consecutive maps over 10 or even 1000 items, but this optimization is the reason many FP libraries offer super easy to use pipe/compose functions.
I'd say, if the optimization is that easy, do it, even if the gains are small. Without pipe/compose available, I'd be in favor of readability over performance though.
What are you talking about? The time complexity doesn't change by moving everything into the reducer; it's the space complexity that changes, because `.map` creates a new array. It's not 1ms vs 3ms...
You are right that it doesn't take three times as long, my mistake.
But it's never just "1+1+1=3" either.
Especially if the arrays are large, and the operations are simple, then the overhead of copying the array doesn't just affect memory use, it also takes longer.
Yes, that is right. But even though they ultimately boil down to an O(n) time complexity, it can still have a significant performance impact if your arrays are massive, as you said. In a way, I'm not really misleading beginners here, because it's the truth of the matter.

But for most cases, one does not need to worry about these performance impacts. It really is just an unnecessary optimization.
I don't believe you make any comments about `.map` using more space in your article, do you? You also don't say that this is only worth doing in one step if your arrays are massive. That's why I suggest that you are misleading beginners. Your example is contrived. For this article to be really useful, you should use a really big array as an example and demonstrate the performance benefits of moving all your operations into the reducer. At the same time, you should speak about the code using significantly less space, and that's why it's faster.

Ah, I see now. There's a reason why I didn't add the #beginners tag to this article; it is exactly for the reason you stated. I wasn't targeting beginners. I was assuming that readers would most likely know how to optimize their algorithms themselves, with what they learned from this article as a guide on how to do so. I suppose that was wrong of me to assume.

Also, I was actually thinking about discussing the space inefficiency of long chains of array methods, but I felt that doing so would have caused me to stray from the actual message I was trying to convey throughout the whole article: longer chains mean more iterations.
The only comment based on the O notation in both time and space complexity. Looks like not all folks know about that.
That's why GHC does "loop fusion". JavaScript has the problem of allowing impure functions, but it should be able to notice pure functions and do loop fusion when it can, so the programmer needn't do the compiler's job for it.
This sounds like a great idea. Really, the only obstacle that stands in the way is implementation. I don't think it will be easy to make a JavaScript engine intelligent enough to detect pure functions and "fuse loops". Given the workload of JavaScript engines nowadays (execution, optimization, event loops, garbage collection, etc.), it seems to be a stretch to see this feature come in the near future.
It is exactly this problem, the trade-off between readability and efficiency, pertaining to chaining array methods that has inspired me to create a solution. I've been working on a small library that transforms iterables into a form that can have chained array methods with a single iteration. Something similar to Java 8's Stream API.
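The library itself isn't shown here, but a minimal generator-based sketch of the idea (all names below are illustrative, not the actual API) might look like this:

function* lazyMap(iterable, fn) {
  for (const x of iterable) yield fn(x);
}

function* lazyFilter(iterable, pred) {
  for (const x of iterable) if (pred(x)) yield x;
}

// Each element flows through the whole pipeline before the next one is
// pulled, so the source is traversed once and no intermediate arrays exist.
const tripledEvens = [...lazyMap(lazyFilter([1, 2, 3, 4], x => x % 2 === 0), x => x * 3)];
console.log(tripledEvens); // [ 6, 12 ]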
I disagree with this article.
We shouldn't sacrifice readability and composability for performance gains, and we don't have to.
Our goal here is to iterate over things in a way that is both readable and performant. Let's have a look at different methods of achieving this.
We want to multiply stuff by 2, and then add 4. Let's define these goals as functions:
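(The code blocks in this comment thread weren't captured; the snippets below are plausible reconstructions, not the commenter's exact code.)

// Reconstruction; the commenter's exact snippet wasn't shown.
const multiplyBy2 = x => x * 2;
const add4 = x => x + 4;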
This, as mentioned in the article, is readable, but may become a performance issue:
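// Reconstruction: readable, but two passes and an intermediate array
// (assuming some input array `nums`).
const viaChain = nums.map(multiplyBy2).map(add4);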
The proposed solution, however, is not very readable, and doesn't scale/compose well:
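// Reconstruction: a single pass, but the logic is inlined and not reusable.
const viaInline = nums.map(x => x * 2 + 4);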
This uses composition, but is far from readable because of the reversed, "inside-out" order of execution:
Solution? Best of both worlds. Let's create a higher-order function which would allow us to `pipe` other function calls in a nice, clean, left-to-right order:
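// Reconstruction: feed each function the previous result, left to right.
const pipe = (...fns) => x => fns.reduce((acc, fn) => fn(acc), x);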
We can now do stuff!
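// Reconstruction: a single pass with reusable, left-to-right steps.
const viaPipe = nums.map(pipe(multiplyBy2, add4));
console.log(viaPipe); // for [1, 2, 3]: [ 6, 8, 10 ]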
In fact, multiple functional/semi-functional libraries have a `pipe` function in their toolbelt. It's also a part of my very minimal FP library.

We can as well define `all`, `some`, or `none` functions and use them to compose our `filter` logic.

Funny, I usually define my pipe as follows:
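// Sketch only; the actual snippet wasn't captured. The returned function
// deliberately takes a single argument, so `map`'s extra (index, array)
// parameters can't leak into the pipeline.
const pipe = (...fns) => arg => fns.reduce((acc, fn) => fn(acc), arg);
[1, 2, 3].map(pipe(n => n * 2, n => n + 4)); // [ 6, 8, 10 ]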
I also make args unary, so that I don't have to think about methods like `map` injecting spurious parameters.

Next comes giving it a TS type signature, which involves overloads. It's horrible, but the results are great.
What is the performance of this solution? It is guaranteed to be slower than the solution in the article, because the JavaScript engine almost certainly doesn't do functional optimization. (Plus, you create a capture, which is slow and needs more optimization, adding to potential JIT time.)

It is also less simple, as you have to understand more concepts to understand the code, especially because JS is an imperative language and the functional style is a bad fit. In imperative languages, a for loop with the proper if statements and calculations will always be the better option for both performance and simplicity.
Please don't bully my phone's battery more than JS already has to. :)
Now this is an elegant solution. The only problem with it is that it isn't exactly the most beginner-friendly code. You must have a significant amount of experience with functional programming to be able to fathom the notion of functions inside functions inside functions for composition.
Besides that, it really is an elegant solution that combines the best of both worlds. Pipes truly are a marvel of (software) engineering.
Maybe it's because I started out learning programming from the FP side, but composition feels like pretty intuitive beginner stuff, and I've always had trouble reading ugly loops which store all kinds of stuff in variables.
I think "beginner friendly" highly depends on what you've been exposed to so far.
Yuppp, loops are always ugly. There is no question about that. You have no idea how many times I've gotten myself stuck inside infinite loops. It was a horrible experience when I was a beginner.
And I think you're right about that. It does depend on what you've been exposed to as a beginner.
This is a bit like saying that nails are always ugly, you should use screws. Beginners should learn when to use which, in addition to how not to hit their fingers.
FP is cool and all but FP in JS has become an unbearable infomercial lately: "Have you suffered from infinite loops? Ever feel like your code doesn't have enough dots or arrows? Tired of using plus signs for addition? Try FP"