I am currently upgrading our business's production metrics dashboard. Previously, the data was delivered once and to get new data you had to hit the refresh button on the browser.
Now, they want the data to auto-refresh every 5 minutes and stay synchronized between users, so in theory it should be keyed off server time.
That basic code works, and does so fairly well for the most part.
The problem I'm having with it, though, is one I only noticed after letting it run for a while in the background in two browsers (IE and FF) and then monitoring it.
The IE instance seems to catch the very last bit of the minute it's in when it goes to redraw the "last updated" message, so instead of displaying the expected 11:55, it displays 11:54. As time goes on it slips further back, so eventually it's redrawing at something like 12:32 instead of 12:35. The FF instance, on the other hand, seems to grab the far end of the timeframe: it will initially redraw at 11:55 and say so, but as time elapses it'll report a refresh time of 12:16 or similar.
So, on the surface, it looks like FF might be running a tiny bit fast and IE a bit slow. Either way, it's causing the sync to not happen when and how it should, which is obviously undesirable.
Has anyone else come into contact with similar issues, and if so, how have you resolved them?
PS: Here's my timer code, as an FYI. Let me know if any other supplementary code would be helpful!
The code is likely just accumulating error over time. Rather than always using a relative value (5 minutes), calculate the delta to the desired next point on each iteration. Or do all the calculations on the server and just have the server tell the page how long until it should reload.
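To illustrate that suggestion, here's a minimal sketch (my own, not the poster's actual timer code) of a self-correcting timer: instead of always scheduling setTimeout with a fixed 5-minute delay, it recomputes the delay to the next 5-minute wall-clock boundary on every tick, so any lateness in one callback can't accumulate into the next. Note this uses the client clock via Date.now(); for true cross-user sync you'd subtract a server-supplied offset first. The refreshDashboard function is a hypothetical stand-in for whatever redraws the data.

```javascript
// Milliseconds from `now` until the next multiple of `intervalMs`
// on the wall clock (e.g. :00, :05, :10, ... for a 5-minute interval).
function msUntilNextBoundary(now, intervalMs) {
  return intervalMs - (now % intervalMs);
}

function scheduleRefresh() {
  var intervalMs = 5 * 60 * 1000; // 5 minutes
  // Recompute the remaining delay from the current time each iteration,
  // rather than reusing a fixed relative delay, so drift never accumulates.
  var delay = msUntilNextBoundary(Date.now(), intervalMs);
  setTimeout(function () {
    refreshDashboard(); // hypothetical: whatever fetches and redraws the data
    scheduleRefresh();  // reschedule against the clock, not the last timer
  }, delay);
}
```

Even if one callback fires a few seconds late (say, because rendering blocked the timer), the next delay is computed from the actual current time, so the following tick still lands on the :00/:05/:10 boundary.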
It is probably rendering that is delaying the timers' execution. I agree with Bear: calculate the delta each time. You could also check whether setInterval works better for you than setTimeout.