Why is it not accurate?
Because you are using `setTimeout()` or `setInterval()`. They cannot be trusted: there are no accuracy guarantees for them. They are allowed to lag arbitrarily, and they do not keep a constant pace but tend to drift (as you have observed).
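To see the drift for yourself, you can log how far each tick of a plain `setInterval()` deviates from its ideal schedule (a minimal sketch; the tick counter and the one-second interval are just for illustration):

```javascript
// Log how far each setInterval() tick is from where it "should" be.
var start = Date.now();
var tick = 0;

var id = setInterval(function () {
    tick++;
    var ideal = tick * 1000;           // where this tick should land
    var actual = Date.now() - start;   // where it actually landed
    console.log('tick ' + tick + ': off by ' + (actual - ideal) + ' ms');
    if (tick >= 10) clearInterval(id); // stop after ten ticks
}, 1000);
```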
How can I create an accurate timer?
Use the `Date` object instead to get the (millisecond-)accurate current time. Then base your logic on the current time value, instead of counting how often your callback has been executed.
For a simple timer or clock, keep track of the time difference explicitly:
```javascript
var start = Date.now();

setInterval(function() {
    var delta = Date.now() - start; // milliseconds elapsed since start
    …
    output(Math.floor(delta / 1000)); // in seconds
    // alternatively just show wall clock time:
    output(new Date().toUTCString());
}, 1000); // update about every second
```
Now, that has the problem of possibly jumping values. When the interval lags a bit and executes your callback after 990, 1993, 2996, 3999 and 5002 milliseconds, you will see the second count 0, 1, 2, 3, 5 (!). So it would be advisable to update more often, say about every 100 ms, to avoid such jumps.
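For example, the same timer can be driven from a faster interval while only redrawing when the displayed second actually changes (a sketch reusing the `output()` function assumed above):

```javascript
var start = Date.now();
var lastSecond = -1;

setInterval(function () {
    var seconds = Math.floor((Date.now() - start) / 1000);
    if (seconds !== lastSecond) { // redraw only when the value changes
        lastSecond = seconds;
        output(seconds);
    }
}, 100); // poll ~10 times per second so at most ~100 ms of jitter shows
```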
However, sometimes you really need a steady interval executing your callbacks without drifting. This requires a slightly more advanced strategy (and more code), but it pays off well (and registers fewer timeouts). Such timers are known as self-adjusting timers. Here, the exact delay for each of the repeated timeouts is adapted to the actually elapsed time, compared to the expected interval:
```javascript
var interval = 1000; // ms
var expected = Date.now() + interval;
setTimeout(step, interval);

function step() {
    var dt = Date.now() - expected; // the drift (positive for overshooting)
    if (dt > interval) {
        // something really bad happened. Maybe the browser (tab) was inactive?
        // possibly special handling to avoid futile "catch up" run
    }
    … // do what is to be done

    expected += interval;
    setTimeout(step, Math.max(0, interval - dt)); // take into account drift
}
```
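If you need this in several places, the pattern can be wrapped into a small reusable helper. This is just a sketch; the name `selfAdjustingInterval` and its stop-function API are made up for illustration:

```javascript
// Hypothetical helper wrapping the self-adjusting pattern above;
// returns a function that stops the timer.
function selfAdjustingInterval(callback, interval) {
    var expected = Date.now() + interval;
    var timeout = setTimeout(step, interval);

    function step() {
        var dt = Date.now() - expected; // drift, as in the snippet above
        callback();
        expected += interval;
        timeout = setTimeout(step, Math.max(0, interval - dt));
    }

    return function stop() {
        clearTimeout(timeout);
    };
}

// usage: tick once per second, stop after one minute
var stop = selfAdjustingInterval(function () {
    console.log(new Date().toISOString());
}, 1000);
setTimeout(stop, 60000);
```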