
Discussion on: Handling Array Duplicates Can Be Tricky

Ben Sinclair

I find this quite difficult to read because of the reuse of names like resultItem and the double negation of things like !notFound. I think the first example (the one without the reduce()) is better, because it's more readable.
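
For instance, naming the flag positively reads more directly to me. A rough sketch (with a made-up id property, not your actual code):

```js
const items = [{ id: 1 }, { id: 2 }, { id: 1 }];
const seen = [];

for (const item of items) {
  // "alreadyAdded" reads straight through, instead of checking !notFound
  const alreadyAdded = seen.some(existing => existing.id === item.id);
  if (!alreadyAdded) {
    seen.push(item);
  }
}

console.log(seen); // [{ id: 1 }, { id: 2 }]
```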

Aren't you deferring the problem, though? You've moved the comparison to checking the properties of an object for equality, but if those properties are also objects... you're back to square one. So you'd need to recurse and do a deep comparison, which is expensive and comes with compromises of its own (like picking a max depth, or deciding what to do if the object contains circular references).
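
To make that concrete, here's roughly what such a deep comparison ends up looking like. The names and the max-depth cut-off are purely illustrative, not anything from your post:

```js
// Deep comparison that bails out at a maximum depth.
// The cut-off is the compromise: it stops a cyclic object from recursing
// forever, but anything past the limit is simply reported as "not equal".
function deepEqual(a, b, depth = 0, maxDepth = 5) {
  if (a === b) return true;
  if (depth >= maxDepth) return false;
  if (typeof a !== 'object' || typeof b !== 'object' || a === null || b === null) {
    return false;
  }
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  if (keysA.length !== keysB.length) return false;
  return keysA.every(key => deepEqual(a[key], b[key], depth + 1, maxDepth));
}
```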

Milos Protic

Yes, I see your point about the double negation. I admit it should be done the other way around to improve readability, especially since the post was written to show what is going on under the hood while finding duplicates.

About the recursion, the assumption for the given example was that we have only one level. As you said, a deep comparison is a topic of its own, and it wasn't the focus here.
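
In other words, with that one-level assumption a shallow, key-by-key check is all the example needs. Something like this (a sketch, not the exact code from the post):

```js
// Assumes every property is a primitive, so strict equality per key is enough.
function shallowEqual(a, b) {
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  return keysA.length === keysB.length &&
         keysA.every(key => a[key] === b[key]);
}

const people = [{ name: 'Ann', age: 30 }, { name: 'Ann', age: 30 }];
console.log(shallowEqual(people[0], people[1])); // true
```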

If you are interested, take a look at the same post on devinduct.com and see Scott Sauyet's comment. It's quite an interesting way to do the same thing.