I'm from Aguascalientes, Mexico, now based in New York. Most of my experience is building websites and applications with a JavaScript-based stack.
I agree, it depends on the case: some issues are related to maintainability, others to performance, but most of the time it's because people overcomplicate things over imaginary scalability concerns.
Even when the code is more performant, we should consider:
1 - Number of items
2 - Amount of operations
3 - The different benefits coming from different worlds: FP, OOP, etc.
I've seen more issues caused by extremely premature optimization than by actual performance problems, even with large datasets...
In addition, the code can still be readable:
// Example 1
const total = numbers.reduce((t, c) => t + (!Number.isInteger(c) ? 0 : c * 2), 0);

// Example 2
let acc = 0;
for (let i = 0, size = numbers.length; i < size; i++) {
  const cur = numbers[i];
  acc += !Number.isInteger(cur) ? 0 : cur * 2;
}

// Example 3
const total = numbers
  .filter(cur => !!cur && Number.isInteger(cur))
  .reduce((t, c) => t + c * 2, 0);
// We clean up the falsy values (including 0), and we avoided another branch inside reduce
Again, it depends on the conditions.