Thanks for the article. I'm sure it's very important for the frontend community.
People these days often use `.reduce` with spread (`...`), which makes the whole thing O(n²). There are lots of articles (even on dev.to) that showcase this approach. My experience interviewing tells me that people don't realize there's a problem.
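A minimal sketch of the pattern in question (the `users` array here is just a hypothetical input):

```js
const users = [{ id: 1, name: 'Ada' }, { id: 2, name: 'Linus' }];

// Quadratic: the spread copies the whole accumulator on every iteration,
// so n iterations of O(n) copying add up to O(n^2).
const byId = users.reduce((acc, user) => ({ ...acc, [user.id]: user }), {});

// Linear: mutating the accumulator avoids the per-iteration copy.
const byIdLinear = users.reduce((acc, user) => {
  acc[user.id] = user;
  return acc;
}, {});
```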
Also, I think the Big-O of `.splice` depends on the arguments. I don't think, e.g., that a 1-to-1 replacement would cost O(n), because that's a pretty simple optimization for an engine to make.
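For example (whether a given engine actually optimizes this case is an assumption on my part, not something the spec guarantees):

```js
const arr = [1, 2, 3, 4, 5];

// 1-to-1 replacement: remove one element and insert one in its place.
// No surrounding elements need shifting, so an engine could do this in O(1).
arr.splice(2, 1, 99); // arr is now [1, 2, 99, 4, 5]
```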
I would update this to *can* be a problem. Often time/space complexity only matters at scale, and in many cases it will never be a problem: for instance, when populating a select dropdown. In other cases, for instance big data, time/space complexity may always be a consideration, especially for an iterative process like graphing or machine learning.
> I think that the Big-O of `.splice` depends on the arguments
Absolutely. That's often the case with Big-O. Big-O describes the worst-case scenario, but the worst case may rarely be reached. Formally, f(n) = O(g(n)) means f(n) is bounded above by a constant multiple of g(n) for sufficiently large n. So in the worst case `splice` may touch every element, but on average it won't, and in the best case it touches zero or one element (one, if zero isn't practical) at the end of the stack.
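A quick illustration of those cases (the O(1)/O(n) costs assume the usual contiguous array-backed implementation, which is an assumption about the engine):

```js
const stack = [1, 2, 3, 4, 5];

// Worst case: removing the first element shifts every remaining one, so O(n).
stack.splice(0, 1);

// Best case: removing from the end shifts nothing, effectively O(1).
stack.splice(stack.length - 1, 1);
```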
For the best case, big-omega is used (e.g., Ω(n)), and for the average case, big-theta (e.g., Θ(n)); strictly speaking, Ω is a lower bound and Θ a tight bound, but that's how they're commonly quoted. Notice the horizontal bar inside the big-theta versus the open big-O, as it can be easy to mistake one for the other.
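For reference, here's a minimal formal statement of the three bounds (standard textbook definitions, not specific to this thread):

```latex
\begin{align*}
f(n) = O(g(n))      &\iff \exists\, c > 0,\ n_0 \text{ s.t. } f(n) \le c\, g(n)\ \forall n \ge n_0 && \text{(upper bound)}\\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 \text{ s.t. } f(n) \ge c\, g(n)\ \forall n \ge n_0 && \text{(lower bound)}\\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n)) && \text{(tight bound)}
\end{align*}
```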