I'll spare you the long boring intro; here's the meat of the article:
Let's say you have an array like this:
[
{id: 1, ca...
Honestly, I really don't like naming the first parameter of the callback function acc. It's very ambiguous, at least for me. I prefer to name it prev because it's much clearer to me that prev contains the result of the previous iteration (or the initial value if it is the first iteration).

That makes sense for .map(), but for .reduce() the previous value is also the accumulated value which will eventually be returned. Making that distinction in the naming convention is a nice visual cue imo.

That's the imperative name for it though, no? You're not supposed to know that iteration is taking place, just that the values are being absorbed into the accumulator :v

That is indeed the true and only technical name for it, but semantically speaking, I prefer naming it prev. To each their own, I suppose.

Yeah, I'm totally just being a smartass.
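For anyone following the naming debate, here is a minimal, purely illustrative reduce where the first callback parameter is simultaneously the previous iteration's return value and the accumulator that reduce() eventually returns, so both names fit:

```javascript
// "acc"/"prev" is both the value returned by the previous iteration
// and the running result that reduce() ultimately returns.
const sum = [1, 2, 3, 4].reduce((acc, n) => acc + n, 0);

// Trace of the accumulator: 0+1 → 1, 1+2 → 3, 3+3 → 6, 6+4 → 10
console.log(sum); // 10
```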
Many, most even, of the reducers we write are not commutative.

This one could be parallelizable, actually, if we add a merging function, but still we start with an empty object for acc/prev, not one of the items.

@_bigblind You've seen people write something like

right?
Excuse my lack of knowledge on the subject, but what does it mean for a reducer to be "commutative" and "parallelizable"? And what do you mean by "merging function"?
Oh, now I understand your point about not thinking about the fact that iteration is being used! If they're run in parallel, you don't get a previous value :).
If we don't care about the order of the incoming ids, and just want to get the sets of ids of each article, we could split the counting between multiple threads or even machines. Something like this silly thing:
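The original snippet didn't survive here, so below is a hypothetical reconstruction of the idea being described: the data shape and field names (id, category) are assumed, and the "parallel" part is only simulated by reducing each chunk independently and then combining the partial results with a merging function.

```javascript
// Assumed input shape, just for illustration.
const items = [
  { id: 1, category: "frontend" },
  { id: 2, category: "backend" },
  { id: 3, category: "frontend" },
  { id: 4, category: "backend" },
];

// Reduce one chunk into { category: Set of ids }.
const reduceChunk = (chunk) =>
  chunk.reduce((acc, { id, category }) => {
    (acc[category] ??= new Set()).add(id);
    return acc;
  }, {});

// Merging function: combine two partial results. The merge order
// doesn't matter, which is what makes this parallelizable.
const merge = (a, b) => {
  const out = { ...a };
  for (const [category, ids] of Object.entries(b)) {
    out[category] = new Set([...(out[category] ?? []), ...ids]);
  }
  return out;
};

// "Parallel": each chunk could run on its own thread or machine.
const chunks = [items.slice(0, 2), items.slice(2)];
const result = chunks.map(reduceChunk).reduce(merge, {});
// result is { frontend: Set {1, 3}, backend: Set {2, 4} }
```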
And by "commutative" I mean order-independent: if you pushed an array into a number you'd get an error, and 'a'+'b' and 'b'+'a' give you different strings, so string concatenation is not commutative. Whereas integer addition without overflow is commutative: 1+2 gives the same result as 2+1, and const s = new Set(); s.add(1); s.add(2) ends up the same regardless of order as well.

Oh, wow. You're right about calling it "silly".
It's silly in the sense that we have only 10 items instead of billions, they're all in memory at once, and it doesn't actually spawn threads or workers.
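The commutativity examples from a few comments up can be checked directly; this is just an illustrative snippet, not anything from the article:

```javascript
// String concatenation is not commutative:
console.log('a' + 'b' === 'b' + 'a'); // false

// Integer addition (without overflow) is commutative:
console.log(1 + 2 === 2 + 1); // true

// Adding elements to a Set is order-independent too:
const s1 = new Set(); s1.add(1); s1.add(2);
const s2 = new Set(); s2.add(2); s2.add(1);
const sameSet = s1.size === s2.size && [...s1].every((x) => s2.has(x));
console.log(sameSet); // true
```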
This is really inefficient because of the nested spread operator in the reduce function. Here's some more info: prateeksurana.me/blog/why-using-ob...
Hey, thanks for your comment, and great post! I had no idea that the spread operator was O(n) in terms of number of properties, though that totally makes sense. I wonder if some JS engines would optimize this code into a mutation if they could somehow make sure that the value before mutation is never accessed, but of course, we shouldn't rely on JS engine optimizations to fix our bad JS habits :).
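To make the linked point concrete, here is an illustrative sketch (the grouping task and field names are assumed, not taken from the article): spreading the accumulator copies every accumulated key on each iteration, making the reduce quadratic overall, while mutating the accumulator stays linear.

```javascript
const items = [
  { id: 1, category: "a" },
  { id: 2, category: "b" },
  { id: 3, category: "a" },
];

// O(n²): each iteration re-copies all keys accumulated so far.
const slow = items.reduce(
  (acc, item) => ({
    ...acc, // copies every existing key on every iteration
    [item.category]: [...(acc[item.category] ?? []), item.id],
  }),
  {}
);

// O(n): mutate the accumulator instead. Since reduce() created the
// object itself, the mutation is never observable from outside.
const fast = items.reduce((acc, item) => {
  (acc[item.category] ??= []).push(item.id);
  return acc;
}, {});

// Both produce { a: [1, 3], b: [2] }
```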
That's a really cool fully immutable solution.
I understand it, but I would personally avoid it in JS, if not for readability by juniors then at least because I have an addiction to micro optimization.
Amazing thx!
Solid explanation, thanks!
Thx, fixing that now :)
Loved it, thank you! I'm just dipping my toes into reduce and it's hard to wrap my head around.
Your example makes it so much easier to grasp. Thanks.
Would help if you showed the result... I know I can just do a console.log but, yeah