I like the idea of a fixed step size for explaining the concept. One idea that helped me grasp it was that it's not so much about the time of specific runs, but about how the runtime behaves as a function of input size. Then it's obvious why constants don't matter, because the notion only describes how runtime changes as the input grows. And it also points at the limitations of the concept 😉
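One way to make that concrete (a quick sketch of my own, with made-up "step count" functions): two linear-time routines with very different constant factors still show the exact same growth, which is why both are O(n).

```python
# Two toy linear-time functions with different constant factors.
# Doubling n doubles the step count for BOTH, so their growth
# (and hence their Big-O class) is identical.

def steps_cheap(n):
    # 1 "unit of work" per element
    return sum(1 for _ in range(n))

def steps_expensive(n):
    # 5 "units of work" per element: bigger constant, same growth
    return sum(5 for _ in range(n))

for f in (steps_cheap, steps_expensive):
    ratio = f(2000) / f(1000)
    print(f.__name__, ratio)  # ratio is 2.0 for both
```

The absolute numbers differ by a factor of five, but the *ratio* when the input doubles is 2.0 in both cases — that ratio is what Big-O describes, and it's exactly where the constants cancel out.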