[Tiny] Elapsed Time in Java: currentTimeMillis vs nanoTime

Petr Filaretov on May 25, 2023

When you need to measure how long some code takes to execute, the first thing that usually comes to mind is currentTimeMillis: long start = Syst...
Vincent A. Cicirello • Edited

Both System.currentTimeMillis() and System.nanoTime() measure "wall clock" time, at least in the sense that the difference between 2 measurements is how much time has passed on a clock.

There are 2 main differences between them. The first is that they have different levels of precision. The second, which your clock-changing trick demonstrates, is that System.currentTimeMillis() gives you the current time (as in date and time of day) as milliseconds since Jan 1, 1970, which is why changing your computer's clock changes what it returns. System.nanoTime(), by contrast, uses that JVM instance's time source to give you time since some origin that varies per instance of a running JVM. It still gives you time that has elapsed on a clock, so in that sense it is still sort of "wall clock" time. It is just not the literal wall clock, so changing the computer's clock shouldn't affect it.
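That per-JVM origin is exactly what you want for elapsed-time measurement. A minimal sketch (the sleep is just a stand-in for whatever code you're timing):

```java
public class ElapsedTimeDemo {
    public static void main(String[] args) throws InterruptedException {
        // nanoTime's origin is arbitrary per JVM instance, but the
        // difference between 2 readings is a valid elapsed-time measurement.
        long start = System.nanoTime();
        Thread.sleep(50); // stand-in for the code being timed
        long elapsedNanos = System.nanoTime() - start;
        System.out.println("elapsed: " + (elapsedNanos / 1_000_000) + " ms");
    }
}
```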

If you really want to time code, ideally use a microbenchmarking framework like JMH. Otherwise, better than either System.currentTimeMillis or System.nanoTime is to measure elapsed CPU time with something like the following:

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

ThreadMXBean bean = ManagementFactory.getThreadMXBean();
// Note: bean.isCurrentThreadCpuTimeSupported() can verify availability first
long start = bean.getCurrentThreadCpuTime();
// do something to time here
long end = bean.getCurrentThreadCpuTime();
long elapsedCpuTimeInNanoseconds = end - start;

This will give you how much time the code utilized the CPU.
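To see the distinction, here's a rough sketch contrasting the two: a thread that is blocked in sleep accumulates wall-clock time but almost no CPU time (exact numbers will vary by system):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class CpuVsWallTime {
    public static void main(String[] args) throws InterruptedException {
        ThreadMXBean bean = ManagementFactory.getThreadMXBean();
        long wallStart = System.nanoTime();
        long cpuStart = bean.getCurrentThreadCpuTime();
        Thread.sleep(100); // blocked: wall time passes, CPU time barely moves
        long wallElapsed = System.nanoTime() - wallStart;
        long cpuElapsed = bean.getCurrentThreadCpuTime() - cpuStart;
        System.out.println("wall: " + (wallElapsed / 1_000_000)
                + " ms, cpu: " + (cpuElapsed / 1_000_000) + " ms");
    }
}
```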

Sergiy Yevtushenko

There is a difference in resolution, not precision.

Vincent A. Cicirello • Edited

No. Precision is the units the method returns. In this case the one method is in milliseconds and the other is in nanoseconds. Thus, precision is different for the 2 methods.

Resolution is how frequently the values change. System.nanoTime is defined as having nanosecond precision in the documentation, but its resolution may be (and probably is) coarser than that. From the javadocs:

This method provides nanosecond precision, but not necessarily nanosecond resolution (that is, how frequently the value changes)

The resolution of the 2 methods is probably also different. The resolution of System.nanoTime is whatever the resolution of the JVM's "high-resolution clock" is, which likely varies by system, but it is guaranteed by definition to be at least as good as the resolution of System.currentTimeMillis. From the javadocs:

no guarantees are made except that the resolution is at least as good as that of currentTimeMillis()

So precision is definitely different for the 2 methods, while resolution is probably different but could be the same.
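You can probe the resolution empirically. A rough sketch (the result varies by OS and hardware, and the measurement also includes call overhead, so treat it as an upper-ish bound on the tick size, not a definitive value): spin until System.nanoTime() changes, and keep the smallest observed step.

```java
public class NanoTimeResolutionProbe {
    public static void main(String[] args) {
        long smallestStep = Long.MAX_VALUE;
        for (int i = 0; i < 1000; i++) {
            long before = System.nanoTime();
            long after;
            do {
                after = System.nanoTime(); // spin until the clock ticks
            } while (after == before);
            smallestStep = Math.min(smallestStep, after - before);
        }
        // On many systems this is well above 1 ns, even though
        // the unit (precision) of the returned value is nanoseconds.
        System.out.println("smallest observed step: " + smallestStep + " ns");
    }
}
```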

Sergiy Yevtushenko

Hmm. That is somewhat different from what I was taught. Precision defines the repeatability of the result, so even if you report measurements in nanoseconds but the value actually changes in thousands of nanoseconds, the precision remains "milliseconds". Resolution is what is "written on the scale" (in our case, the units of the value returned by each method). Another definition of resolution is the smallest detectable difference. That also agrees with my version.
Another view on the issue: you can multiply the value returned by .currentTimeMillis by 1000 and get nanosecond resolution, but this doesn't give you the ability to measure half a millisecond, because the precision remains unchanged.

Vincent A. Cicirello • Edited

I think you might be mixing up precision with accuracy. Here is a good concise explanation of the difference (from wp.stolaf.edu/it/gis-precision-acc...):

Precision is how close measure values are to each other, basically how many decimal places are at the end of a given measurement. Accuracy is how close a measure value is to the true value.

You'll notice that the javadocs for the 2 methods in question say absolutely nothing about accuracy.

I think your notion of resolution, "resolution - smallest detectable difference", is correct. Although System.nanoTime has nanosecond precision, its resolution may vary by system. The "smallest detectable difference", in your words, is basically how frequently the high-resolution clock changes.

Your example of multiplying milliseconds by 1000 increases precision, but doesn't increase accuracy or resolution. Although to go from milliseconds to nanoseconds you'd actually need to multiply by 1 million (1 second = 1000 milliseconds = 1 billion nanoseconds).
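A tiny sketch of that point: scaling a millisecond reading into nanosecond units adds decimal places, but every scaled value still lands exactly on a millisecond boundary, so no sub-millisecond information appears.

```java
public class ScaledMillisDemo {
    public static void main(String[] args) {
        long millis = System.currentTimeMillis();
        long scaledNanos = millis * 1_000_000L; // nanosecond units now
        // The scaled value is always a whole multiple of 1,000,000 ns:
        System.out.println(scaledNanos % 1_000_000L); // always 0
    }
}
```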

Sergiy Yevtushenko

I don't think that I'm mixing these two. Even the text you quoted mentions "decimal places" for precision, as the number of valid digits in a measurement.

You're right about scaling from millis to nanos. And no, scaling doesn't change precision, but it does change resolution.
Here is one randomly chosen quote about the difference between the two:

Precision lets the operator know how well-repeated measurements of the same object will agree with one another. Resolution is the total weighing range of a scale divided by the readability of the display.
(taken here). As you can see, this definition uses the terms in the same way I do.