re: What does "Big-O notation" mean anyway?

I like the idea of a fixed step size for explaining the concept. One idea that helped me grasp it was that it's not so much about the time of specific runs, but about how the runtime behaves as a function of input size. Then it's obvious why constants don't matter, because the notation only describes how runtime changes as the input grows. And it also points at the limitations of the concept 😉
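To make the "constants don't matter" point concrete, here's a minimal Python sketch (the function names and the timing harness are my own illustration, not from the post). Both functions are O(n); one does three passes over the input instead of one, so it's about 3× slower in absolute terms, but doubling the input size roughly doubles the runtime of *both*. Big-O only describes that growth behavior, not the absolute times.

```python
import timeit

def single_pass(xs):
    """One pass over the input: roughly c * n steps."""
    total = 0
    for x in xs:
        total += x
    return total

def triple_pass(xs):
    """Three passes over the input: roughly 3c * n steps, still O(n)."""
    total = 0
    for _ in range(3):
        for x in xs:
            total += x
    return total

# Doubling n roughly doubles both runtimes; the constant factor
# (1 pass vs. 3 passes) shifts the curve but doesn't change its shape.
for n in (100_000, 200_000, 400_000):
    data = list(range(n))
    t1 = timeit.timeit(lambda: single_pass(data), number=10)
    t3 = timeit.timeit(lambda: triple_pass(data), number=10)
    print(f"n={n:>7}: single={t1:.4f}s  triple={t3:.4f}s  ratio={t3/t1:.2f}")
```

The ratio column stays near 3 across all sizes, which is exactly the constant that O(n) throws away.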
