When you need to measure how long some code takes to execute, the first thing that usually comes to mind is System.currentTimeMillis:

```java
long start = System.currentTimeMillis();
// ... code to measure ...
long elapsedMillis = System.currentTimeMillis() - start;
```
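In full, that pattern might look like the following sketch (the summing loop is just placeholder work standing in for the code being measured, not part of the article):

```java
public class MillisTiming {
    public static void main(String[] args) {
        long start = System.currentTimeMillis();

        // Placeholder work being measured -- an arbitrary loop.
        long sum = 0;
        for (int i = 0; i < 10_000_000; i++) {
            sum += i;
        }

        long elapsedMillis = System.currentTimeMillis() - start;
        System.out.println("Elapsed: " + elapsedMillis + " ms (sum=" + sum + ")");
    }
}
```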
Both System.currentTimeMillis() and System.nanoTime() measure "wall clock" time, at least in the sense that the difference between 2 measurements is how much time has passed on a clock. There are 2 main differences between them. The first is that they are at different levels of precision. The second, which your clock-changing trick demonstrates, is that System.currentTimeMillis() gives you the current time (as in date and time of day) since Jan 1, 1970, which is why changing your computer's clock changes what it returns; while System.nanoTime() uses that JVM instance's time source to give you time since some origin time that varies per instance of a running JVM. It still gives you time that has elapsed on a clock, so in that sense it is still sort of "wall clock" time. It is just not the literal wall clock, so changing the computer's clock shouldn't affect it.

If you really want to time code, ideally use a microbenchmarking framework like JMH. But otherwise, better than using either System.currentTimeMillis or System.nanoTime is to time elapsed CPU time with something like the following. This will give you how much time the code utilized the CPU.
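The exact snippet isn't shown above, so here is a reconstruction (a sketch, not the commenter's code) using the standard ThreadMXBean from java.lang.management to read the current thread's CPU time:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class CpuTiming {
    public static void main(String[] args) {
        ThreadMXBean bean = ManagementFactory.getThreadMXBean();

        // getCurrentThreadCpuTime() returns CPU time in nanoseconds,
        // or -1 if CPU time measurement is unsupported on this JVM.
        long start = bean.getCurrentThreadCpuTime();

        // Placeholder work being measured.
        long sum = 0;
        for (int i = 0; i < 10_000_000; i++) {
            sum += i;
        }

        long cpuNanos = bean.getCurrentThreadCpuTime() - start;
        System.out.println("CPU time used: " + cpuNanos + " ns (sum=" + sum + ")");
    }
}
```

Unlike wall-clock timing, this excludes time the thread spent descheduled or blocked, which is usually what you want when asking "how expensive is this code".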
There is a difference in resolution, not precision.
No. Precision is the units the method returns: in this case one method is in milliseconds and the other is in nanoseconds. Thus, precision is different for the 2 methods.
Resolution is how frequently the values change.
System.nanoTime is defined as nanosecond precision in the documentation, but its resolution may be (and probably is) coarser than that. From the javadocs: "This method provides nanosecond precision, but not necessarily nanosecond resolution (that is, how frequently the value changes); no guarantees are made except that the resolution is at least as good as that of currentTimeMillis()."

So the resolution of the 2 methods is probably also different. The resolution of System.nanoTime is whatever the resolution of the JVM's "high-resolution clock" is, which likely varies by system, but it is guaranteed by definition to be at least the resolution of System.currentTimeMillis. So precision is definitely different for the 2 methods, while resolution is probably different but could be the same.
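As a rough illustration (not from the thread), the resolution of System.nanoTime can be estimated empirically by spinning until the returned value changes and recording the smallest observed tick:

```java
public class NanoResolution {
    public static void main(String[] args) {
        // Estimate resolution: the smallest observed change in nanoTime.
        long smallestDelta = Long.MAX_VALUE;
        for (int i = 0; i < 1000; i++) {
            long t0 = System.nanoTime();
            long t1;
            do {
                t1 = System.nanoTime();
            } while (t1 == t0);           // spin until the clock ticks
            smallestDelta = Math.min(smallestDelta, t1 - t0);
        }
        // Precision is always 1 ns (the unit the method reports in);
        // resolution is the measured tick size, which may be larger.
        System.out.println("Observed resolution: ~" + smallestDelta + " ns");
    }
}
```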
Hmm. That is somewhat different from what I was taught. Precision defines repeatability of the result: even if you report measurements in nanoseconds but the value actually changes in thousands of nanoseconds, precision remains "milliseconds". Resolution is what you see "written at the scale" (in our case, the units for the value returned by each method). Another definition of resolution is the smallest detectable difference, which also agrees with my version.
Another view on the issue: you can multiply the value returned by .currentTimeMillis by 1000 and get nanosecond resolution, but this doesn't give you the ability to measure half of a millisecond, because precision remains unchanged.
I think you might be mixing up precision with accuracy. Here is a good concise explanation of the difference (from wp.stolaf.edu/it/gis-precision-acc...).
You'll notice that the javadocs for the 2 methods in question say absolutely nothing about accuracy.
I think your notion of resolution, "resolution - smallest detectable difference", is correct. Although System.nanoTime is nanosecond precision, its resolution may vary by system. The "smallest detectable difference", in your words, is basically how frequently the high-resolution clock changes.

Your example of multiplying milliseconds by 1000 increases precision, but doesn't increase accuracy or resolution. Although to go from milliseconds to nanoseconds you'd actually need to multiply by 1 million (1 second = 1000 milliseconds = 1 billion nanoseconds).
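To illustrate that point in code (an illustrative sketch, not from the thread): scaling currentTimeMillis into nanosecond units only ever yields whole multiples of 1,000,000 ns, so the resolution stays at one millisecond even though the units are now nanoseconds:

```java
public class ScaledMillis {
    public static void main(String[] args) {
        // Express currentTimeMillis in nanosecond units.
        long scaledNanos = System.currentTimeMillis() * 1_000_000L;

        // The value now has nanosecond units, but it is still only
        // millisecond resolution: always a whole number of milliseconds.
        System.out.println("Scaled: " + scaledNanos + " ns");
        System.out.println("Remainder mod 1 ms: " + (scaledNanos % 1_000_000L)); // always 0
    }
}
```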
I don't think that I'm mixing these two. Even the text you quoted mentions "decimal places" for precision, as the number of valid digits in a measurement.

You're right about scaling from millis to nanos. And no, scaling doesn't change precision, but it does change resolution.
Here is one randomly chosen quote about the difference between the two: