
Discussion on: Assigning [ ] performs better than Array(n) - Reports attached.

Vani Shivanand • Edited

That's right. I'm sorry that I didn't observe that. But as per the explanation, a holey array should be slower unless there's a trade-off in memory allocation on a repeated basis. I'll get back on this. Thanks for correcting!

Update

I have updated the image with the latest example. Please check.

[Image: updated benchmark example]

You may argue that the length of the array is 2000 in the other case. But in reality, there are no 2000 elements in memory.

Also, if we do allocate memory for 2000 elements, then the initial 20 operations will be slower. But yes, there's a trade-off in memory allocation performance.
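
For reference, a quick way to see this in the console (a minimal sketch, assuming a modern engine's holey-array behaviour):

```javascript
// Array(n) only sets the length; it does not store n elements.
const holey = Array(2000);

console.log(holey.length);              // 2000
console.log(0 in holey);                // false - index 0 holds no element
console.log(Object.keys(holey).length); // 0 - no own indexed properties yet

// By contrast, this really does create 2000 stored values.
const packed = Array.from({ length: 2000 }, () => 0);
console.log(0 in packed);                // true
console.log(Object.keys(packed).length); // 2000
```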

Mike Talbot ⭐ • Edited

Well, you are doing an array.join on 2000 elements in the second example and only 20 in the first. The fact that they are undefined, and therefore add up to very little output, doesn't change the fact that join has to process them. It's your innerHTML that is taking the time now...

Remove that inconsistency (and make it so there's enough data to really compare): jsperf.com/assign-vs-push-d2/1
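
Roughly what that looks like (a sketch of the idea only, not the exact code in the jsperf link): both variants build and join the same 2000 real elements, so only the allocation strategy differs.

```javascript
const N = 2000;

// Pre-allocate with Array(N), then assign by index.
function withAssign() {
  const arr = Array(N);
  for (let i = 0; i < N; i++) arr[i] = i;
  return arr.join(',');
}

// Start from [], then push.
function withPush() {
  const arr = [];
  for (let i = 0; i < N; i++) arr.push(i);
  return arr.join(',');
}

// Both do identical join work, so any timing gap comes from allocation.
console.time('assign');
for (let k = 0; k < 1000; k++) withAssign();
console.timeEnd('assign');

console.time('push');
for (let k = 0; k < 1000; k++) withPush();
console.timeEnd('push');
```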

I guess if you are saying that you have to account for empty elements in a bigger array, then I get that: over-allocating an array would certainly cause problems, as everything has to deal with the empty space when running across the whole array. The point I'm making is that it's more performant for allocation than pushing on demand, and individual item lookup performance remains optimal, but I agree that whole-array comprehensions are slower if the empty space is meaningless.
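
To put that distinction in code (a minimal sketch, assuming the usual holey-array semantics): individual index reads stay cheap, but anything that walks the whole array still has to consider every hole.

```javascript
const holey = Array(2000);
holey[0] = 'a';
holey[1999] = 'b';

// Individual lookups are fine: reading a hole simply yields undefined.
console.log(holey[1000]); // undefined

// Whole-array operations still touch every slot, holes included.
console.log(holey.join('').length); // 2, but join still walked all 2000 indices
console.log(holey.filter(Boolean)); // ['a', 'b'] - holes are skipped, yet each
                                    // index is still checked for presence
```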

Vani Shivanand

Please refer to:

dev.to/svaani/comment/12443

As you scroll down the comments, you will see that the same discussion has already taken place there. It looks like the Chrome team has recently optimized holey arrays. Firefox gives different results even with the link that you have provided.