Timers not counting at the same speed on different systems

I’ve noticed that timers seem to count at slightly different speeds on different systems. I thought timers tracked absolute time?

I’ve got a demo mode in my game, and it’s set up so the last level in the demo (it cycles through every level) lands on the crescendo of the music. Each level is a different scene. A timer starts at the beginning of each scene/level and flips to the next level after a specific time.

On my PC the timing is perfect, but on my mobile the music finishes before the last level.

Interestingly, I have three level modes (one set of twenty, one set of twenty-one and a full set of forty). It’s only the full set of forty that seems to show the issue.

Any ideas?

To clarify, timers are just the time delta between frames added to a single number (stored under the timer name).

They have always been impacted by game slowdown below the minimum framerate, so if your game drops below the minimum framerate on one system (mobile) but not on another (PC), the system that didn’t drop frames will have the more accurate timer (and it will therefore increase “faster” than the other).

I don’t believe they should be impacted if you don’t drop below the minimum framerate though.
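Not engine code, but here’s a minimal sketch of why that drift happens, with an assumed minimum framerate and made-up names: when a frame takes longer than the minimum-framerate budget, the delta added to the timer is capped, so the timer falls behind real time on the slower device.

```python
# Illustrative sketch (not engine code): a frame-delta timer whose per-frame
# delta is clamped by a minimum framerate, compared against real elapsed time.

MIN_FPS = 20                    # assumed minimum framerate
MAX_DELTA = 1.0 / MIN_FPS       # largest delta the timer will ever add

def simulate(frame_times):
    """frame_times: real duration of each frame in seconds."""
    timer = 0.0   # what the in-game timer reports
    real = 0.0    # actual wall-clock time elapsed
    for dt in frame_times:
        real += dt
        timer += min(dt, MAX_DELTA)   # slow frames are clamped, so the timer lags
    return timer, real

# A fast device at 60 FPS stays accurate; a slow one at 10 FPS drifts.
print(simulate([1 / 60] * 600))   # timer ~10.0 s, real ~10.0 s
print(simulate([1 / 10] * 100))   # timer ~5.0 s,  real ~10.0 s -> music outruns the levels
```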

Thanks for the reply.

I guess the only way around this would be to use demo music that isn’t timing-critical, unless you have a different suggestion?

The only thing I can think of to make it more accurate would be using Time() expressions with a timestamp instead, since that uses real-world time. You could store a “start time” at the beginning of the scene (saving Time(“timestamp”) to a scene variable as soon as the scene starts) and compare it against the “current time” (Time(“timestamp”)) to get a real-world timer, I suppose?
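A minimal sketch of that idea outside the engine, using Python’s clock in place of the Time(“timestamp”) expression (the names and the level duration here are illustrative, not engine API): record a start timestamp when the scene begins, then subtract it from the current timestamp to get elapsed real time regardless of framerate.

```python
import time

# Illustrative sketch of the timestamp approach (names are made up, not engine API):
# store a start timestamp when the scene begins, then derive elapsed time from
# the current timestamp instead of accumulating per-frame deltas.

LEVEL_DURATION = 12.5   # seconds this scene should last (example value)

scene_start = time.time()   # "start time" saved once, when the scene starts

def level_should_end():
    elapsed = time.time() - scene_start   # real-world seconds since the scene began
    return elapsed >= LEVEL_DURATION      # stays accurate even if the framerate drops

# Each frame, the game would check level_should_end() and switch scenes when it's true.
```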
