It seems System.currentTimeMillis
is not very accurate.
See this sample:
public class MillisTime {
    public static void main(String[] args) {
        long start = System.currentTimeMillis();
        long end;
        // Spin until the reported time changes; the difference is the clock's granularity.
        while (true) {
            long current = System.currentTimeMillis();
            if (current != start) {
                end = current;
                break;
            }
        }
        System.out.println("The time interval of your OS: " + (end - start) + "ms");
    }
}
The result is (on Windows XP):
The time interval of your OS: 15ms
Why isn't it 1ms? And how can I get the current time accurate to the millisecond?
Best Answer
This is entirely expected. You'd see the same thing on .NET using
DateTime.Now
. (See Eric Lippert's blog post on the topic for a .NET-oriented view of this same issue.)

You can use
System.nanoTime()
to get a more accurate timer, but for measurements only: it isn't meant to give an absolute time, only to measure intervals.

I don't know of any way to get a more accurate absolute time, either from Java or from Win32. To be honest, how accurate is the system clock going to be anyway? Even with regular syncing against an NTP server I'd expect at least a few milliseconds of inaccuracy.
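As a minimal sketch of the interval-measurement usage, here is how System.nanoTime() is typically used: take two readings and subtract them. (The Thread.sleep(100) is just a stand-in for whatever work you want to time; the printed value will vary from run to run.)

```java
public class NanoTimer {
    public static void main(String[] args) throws InterruptedException {
        long start = System.nanoTime();

        Thread.sleep(100); // placeholder for the work being measured

        // nanoTime() values are only meaningful as differences,
        // never as absolute wall-clock times.
        long elapsedNanos = System.nanoTime() - start;
        System.out.println("Elapsed: " + (elapsedNanos / 1_000_000) + "ms");
    }
}
```

Note that a single nanoTime() value in isolation is meaningless; only the difference between two readings from the same JVM is defined.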
Basically, if you're relying on getting an absolute time really accurately, you should probably change your design.