and although you're right that the first example is inefficient, slower, and may consume more memory than the other one, I have a hard time believing it would cause a browser to stop working. For that to be an actual problem you'd need the feature array to have millions of entries, or have the function that invokes the .reduce run in a loop or something.
Another (more elegant, imo) solution would be to use Object.fromEntries with an Array.map:
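A minimal sketch of that approach, assuming `features` is an array of objects with `id` and `value` properties (the property names here are illustrative, not from the original question):

```javascript
const features = [
  { id: "a", value: 1 },
  { id: "b", value: 2 },
  { id: "c", value: 3 },
];

// Instead of building the object with .reduce and an object spread
// (which re-copies the accumulator on every iteration, making it O(n^2)),
// map each feature to a [key, value] pair and let Object.fromEntries
// assemble the whole object in a single O(n) pass:
const byId = Object.fromEntries(features.map((f) => [f.id, f.value]));

console.log(byId); // { a: 1, b: 2, c: 3 }
```

Each intermediate pair array is tiny and short-lived, so this also avoids the memory churn of repeatedly spreading a growing accumulator.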
About 200Mil features to be exact >_<
Wait, what? Are you serious? A list that big would weigh multiple gigabytes as JSON; there's no way you're transferring that via API requests.
Sorry, I meant it was 20k features that I loop over in N^2, which is 400 million iterations.
I will edit the answer.