Discussion on: Are we "developers" gatekeeping "knowledge" from our juniors and peers? 🤦

Dennis Mohan

The first thing that popped to mind with the concat vs push example was similar to your first reaction: this should be basic comp sci, so it's somewhat surprising people aren't aware of it.
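
For anyone following along, a minimal illustration of the difference in question:

const a = [1, 2]

// concat returns a NEW array, copying every existing element: O(n) per call
const b = a.concat(3)   // a is still [1, 2]; b is [1, 2, 3]

// push appends in place, no copy: O(1) amortized per call
a.push(3)               // a is now [1, 2, 3]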

Interesting write-up, and probably something I can be more conscious about. Sometimes even very knowledgeable people will misunderstand certain implementation details. These mistakes can happen to anyone, and code reviews could/should be used to question all things. They are supposed to serve as a medium for constructive criticism and learning. Unfortunately, all too often I see people looking at code reviews and PRs as a chore to get over with.

On a side note, I'm curious if you run into cases like this more with informal training/education vs a more formal comp sci degree.

Eugene Cheah • Edited

> Unfortunately, all too often I see people looking at code reviews and PRs as a chore to get over with.

Sadly, very true, due to various constraints.

> On a side note, I'm curious if you run into cases like this more with informal training/education vs a more formal comp sci degree.

I would say, it really depends on the company culture, maturity of product, and/or process.

I will admit that even I would make such a "performance issue" and would let it be approved in a PR. I would comment on it, but I would not force the change. Personally, I run by the statement...

> Premature optimization is the root of all evil (or at least most of it) in programming.
> ~ Donald Knuth (The Art of Computer Programming)

Something I learned first-hand is that, when building an application where everyone has a strong formal education, we can end up endlessly chasing O(1) or O(n). It can be a huge time sink, especially on more complex problems.

Because until you have real data and real use cases, you simply don't know: counter-intuitively, when n is a small number, an O(n^2) solution can perform faster than the O(n) one.
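
To illustrate with a hypothetical micro-benchmark (not from our codebase; exact numbers vary wildly by engine and hardware): a zero-allocation O(n^2) duplicate check can beat an O(n) Set-based one on tiny arrays, and lose badly on large ones.

// O(n^2): nested loops, zero allocations per call
function hasDuplicateQuadratic(arr) {
    for (let i = 0; i < arr.length; i++) {
        for (let j = i + 1; j < arr.length; j++) {
            if (arr[i] === arr[j]) return true
        }
    }
    return false
}

// O(n): Set-based, but pays for a Set allocation on every call
function hasDuplicateLinear(arr) {
    const seen = new Set()
    for (const item of arr) {
        if (seen.has(item)) return true
        seen.add(item)
    }
    return false
}

// crude timer (performance.now() is global in browsers and modern Node)
function time(label, fn, arr, iterations) {
    const start = performance.now()
    for (let i = 0; i < iterations; i++) fn(arr)
    console.log(label, (performance.now() - start).toFixed(1) + 'ms')
}

const tiny  = [3, 1, 4, 1, 5]                            // n = 5
const large = Array.from({ length: 2000 }, (_, i) => i)  // n = 2000

time('quadratic, tiny: ', hasDuplicateQuadratic, tiny, 1e6)
time('linear,    tiny: ', hasDuplicateLinear, tiny, 1e6)
time('quadratic, large:', hasDuplicateQuadratic, large, 100)
time('linear,    large:', hasDuplicateLinear, large, 100)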

Personally, I have spent time on a team that did a whole two weeks of work changing a system from O(n^2) to O(n), only to revert it back a month after launch, because it was much slower in practice (oversimplifying the problem: the O(n^2) version had cacheable steps in between, while the O(n) version did not).

Since then, having learned the hard way, for new features I run through the following checks in sequence:

  • Is it under 500ms (for API calls, for example)?
  • If not, why? Is there a quick fix using a cache (see the sketch after this list)? Is it good enough to ship for now (UI loading bars, etc.)?
  • Do we have actual use case data?
  • How do we make it better then?
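
On the "quick fix using a cache" point: that is often as simple as memoizing the expensive call. A minimal sketch, assuming a pure function taking a single primitive argument (no eviction or TTL, so not production-grade):

// minimal memoization sketch: pure function, single primitive argument
function memoize(fn) {
    const cache = new Map()
    return function (key) {
        if (!cache.has(key)) cache.set(key, fn(key))
        return cache.get(key)
    }
}

// hypothetical expensive call, wrapped once; repeat calls hit the cache
const getReport = memoize(function (userId) {
    // ... expensive computation or API aggregation here
    return { userId, generatedAt: Date.now() }
})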

Internally, we are constantly monitoring our user flows and looking into areas to improve based on actual usage, which is precisely how this whole "concat" vs "push" issue came about: it was detected in our monitoring process.


It's also something that can happen in not-so-obvious ways, even for a skilled team.

function addCustomObjX( inArray, objX ) {
    // ... does some obj processing
    // note: concat returns a NEW array; the input array is never modified
    return inArray.concat(objX)
}

The above would be approved in a merge request because, if anything, it's safer to assume one should never modify the input array. If a function does modify it, the developer is required to take the additional step of commenting and documenting it, due to the potential unintended side effects.

What was not predicted (nor even in the scope of the above function) was that the resulting usage would be ...

while( someCondition ) {
    // ... does some complex stuff
    domArray = addCustomObjX( domArray, objX )
}

Which caused the huge performance hit! Each concat call copies the entire array, so building up an N-element array this way touches O(N^2) elements in total. And it's OK, because all we needed to do next was make a slight change and document it:

/**
 * [Original comments]
 *
 * Note: This modifies the input array, and returns it. The developer is expected
 * to clone the input array if they expect it to be unmodified.
 **/
function addCustomObjX( inArray, objX ) {
    // ... does some obj processing
    inArray.push(objX) // push appends in place: no full-array copy per call
    return inArray
}
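
With that, callers choose the semantics they need. For example (hypothetical call sites):

// hot loop: reuse and mutate the same array (fast)
domArray = addCustomObjX( domArray, objX )

// when the original must stay untouched, the caller clones explicitly
const extended = addCustomObjX( domArray.slice(), objX )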