Kudos for posting your code so that others can see what you did, and critique it :-). I would have liked a copy of the `workbench.js` used in the original post.
Anyway, you should really output the results to an array and check that they all give the same (correct) answer.
For example, your `findObjectFor`, `findObjectForCached` and `findObjectReduce` have a simple bug that makes them give the wrong answer and artificially makes them appear faster than they really are (the `seen[item]` test never passes).
You'll need to be a little careful fixing these, as some ways of fixing them would not work correctly if the sample data includes the number 0.
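To make the 0 pitfall concrete, here's a hedged sketch (the function names are mine, not ones from the benchmark): testing `seen[item]` for truthiness silently fails when the stored value is `0`, while testing for key presence with `in` does not.

```javascript
// Broken variant: stores the value itself and tests it for truthiness,
// so seen[0] === 0 is falsy and 0 is never reported as a duplicate.
function findDupTruthy(sampleData) {
  const duplicates = [], seen = {};
  for (const item of sampleData) {
    if (seen[item]) duplicates.push(item); // never true when item is 0
    seen[item] = item;
  }
  return duplicates;
}

// Safer fix: test for key presence rather than truthiness.
function findDupIn(sampleData) {
  const duplicates = [], seen = {};
  for (const item of sampleData) {
    if (item in seen) duplicates.push(item);
    seen[item] = true;
  }
  return duplicates;
}
```

With `[0, 0, 1, 1]` the truthy version reports only `[1]`, while the `in` version correctly reports `[0, 1]`.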
Most of the functions don't behave as I would expect if something is repeated more than once, although the code in the original article has the same problem, e.g.

```
> findArrayReduce([1, 2, 1, 2, 3, 1])
[ 1, 2, 1 ]
```
In a few places, there would be equivalent versions where the Maps could be replaced with Sets. The performance should be more or less the same, but I think the code would reflect intent better.
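As a sketch of what I mean (`findDupSet` is my own name, not one from the benchmark), a Set expresses "have I seen this?" more directly than a Map whose values are never used:

```javascript
// Set-based variant: the Set records membership only, which is all
// these algorithms actually need from their Map.
function findDupSet(sampleData) {
  const duplicates = [], seen = new Set();
  for (const item of sampleData) {
    if (seen.has(item)) duplicates.push(item);
    else seen.add(item);
  }
  return duplicates;
}
```

Note it still reports an item once per extra occurrence (`[1, 2, 1]` for the example above), the same behaviour as the original article's code.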
In terms of approaches rather than micro-optimisations, I have a couple that haven't been mentioned. One sorts and mutates the passed array, and so creates very little garbage.
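The original snippet isn't reproduced here, but the idea might look something like this (a sketch under my own assumptions, not the exact code): sorting in place makes duplicates adjacent, so a single pass finds them with no auxiliary map or set.

```javascript
// Sort-and-scan sketch: mutates the caller's array, allocates only
// the result array.
function findDupSorted(sampleData) {
  sampleData.sort((a, b) => a - b); // in-place: duplicates become adjacent
  const duplicates = [];
  for (let i = 1; i < sampleData.length; i++) {
    if (sampleData[i] === sampleData[i - 1]) duplicates.push(sampleData[i]);
  }
  return duplicates;
}
```

Like the others, this reports an item once per extra occurrence, just in sorted order.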
And here's one that uses sparse arrays and builds a count:
```javascript
function findDupByCount(sampleData) {
  const counts = [], seen = []
  for (let i = 0, len = sampleData.length; i < len; i++) {
    const value = sampleData[i]
    if (counts[value] === undefined) {
      seen.push(value)
      counts[value] = 0
    }
    counts[value]++
  }
  // could use Object.keys(counts) here instead of keeping track of seen,
  // but that makes everything into strings!
  return seen.filter(x => counts[x] > 1)
}
```
Neither approach has particularly amazing performance for the scenario posted, but it's interesting to consider the different ways you can solve the problem.