Manage DeltaTime above 60FPS

How do I…

I am trying to do a timed bar that fills in x seconds after the player takes action.

What is the expected result

I am using a tiled object as the bar and normalizing the size to 70 pixels. I have the expected total time (e.g. 10 seconds) and the elapsed time, so the width of the tiled object is:
70 * elapsed time / total time
The only variable is the "elapsed time". I have tried making it a timer, and also a variable that updates every frame as: elapsed time = elapsed time + TimeDelta()
The expected result is that, no matter the frame rate (which is 1/TimeDelta()), the bar fills in the total time (e.g. 10 seconds).
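The logic above can be sketched as a small simulation (names like `simulateFill` and `barWidth` are illustrative, not GDevelop's actual API): as long as the per-frame delta reflects the true frame time, the accumulated elapsed time, and therefore the bar, reaches the total in the same number of real seconds at any frame rate.

```javascript
// Hedged sketch of the intended frame-rate-independent bar fill.
// totalTime plays the role of VEL.max (10 s); the bar is the 70 px tile.
function simulateFill(fps, totalTime) {
  const dt = 1 / fps;          // what TimeDelta() should return each frame
  let elapsed = 0;
  let frames = 0;
  while (elapsed < totalTime) {
    elapsed += dt;             // elapsed time = elapsed time + TimeDelta()
    frames++;
  }
  const barWidth = 70 * Math.min(elapsed / totalTime, 1); // 70 * elapsed / total
  return { realSeconds: frames / fps, barWidth };
}

console.log(simulateFill(30, 10).realSeconds); // ≈ 10 real seconds
console.log(simulateFill(90, 10).realSeconds); // ≈ 10 real seconds
```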

What is the actual result

The action works as intended ONLY at frame rates up to 60 FPS. When the frame rate is higher (e.g. 90 FPS), the fill takes not 10 but 15 seconds. The in-game timers and everything else behave as if 10 seconds had elapsed, but measuring with a real-life stopwatch shows 15 seconds. This is consistent with elapsed time = elapsed time + 1/60… (but it should be 1/90). All in all, it feels like the engine has a 60 FPS threshold.

Is there any explanation for this behavior? What is wrong here?
Note: if I use a fixed number (like 1/60) instead of TimeDelta() in the formula, the opposite happens: the bar fills in 10 seconds at any frame rate ABOVE 60 FPS, but fails at slower rates (e.g. taking 20 seconds to fill at 30 FPS). A possible workaround would be to check whether the FPS is above or below 60 and set the variable accordingly, but I wonder if there is a general solution and whether I am doing something wrong or missing something.
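The 20-seconds-at-30-FPS figure from the note can be reproduced with a sketch (illustrative names, same assumptions as above): with a hard-coded 1/60 step, the bar needs 600 frames regardless of frame rate, so the wall-clock time scales with 60/FPS instead of staying constant.

```javascript
// Hedged sketch: why a hard-coded 1/60 step is not frame-rate independent.
// Returns the wall-clock seconds (what a real stopwatch would measure).
function fixedStepFill(realFps, totalTime) {
  const dt = 1 / 60;           // fixed step instead of TimeDelta()
  let elapsed = 0, frames = 0;
  while (elapsed < totalTime) {
    elapsed += dt;
    frames++;
  }
  return frames / realFps;     // frames rendered / frames per real second
}

console.log(fixedStepFill(60, 10)); // ≈ 10 s: correct only at exactly 60 FPS
console.log(fixedStepFill(30, 10)); // ≈ 20 s: the slowdown described above
```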

Related screenshots

This is the screenshot of the related events. These are only two that use the variables and/or the bar object.

What is the value of VEL.max?

10 seconds, for the sake of the example.
It is a constant for the purpose of the bar filling.
I have it as a variable because I expect it to be a player stat, so as the player levels up, the waiting time reduces, making the character faster over time.

When I run a game at 240 FPS on a 240 Hz display, TimeDelta() still matches the actual seconds passed if I compare against a watch or time tracker (just set up a variable that adds TimeDelta() to it every frame).

There has to be something odd with the formula or variables causing the issue.

Thank you Silver-Streak. As you can see in the image, the variable Master.VEL.actual is just such a timer. I made a new project with nothing but text objects displaying the FPS, the timer, and the variable. No other variables, no other events. The problem persists.

My screen is 60 Hz, so I don't know if that is part of the issue. After your comment, I changed it to 120 Hz, and it works fine when I force the game to 30, 60 or 120 FPS, but if I try 45, 90 or 110 FPS (or any other odd number) it does not seem to work.
Note that when I say "force" I mean setting both the minimum and the maximum FPS to the desired rate, and that might be part of the issue. In any case, this seems to be a hardware technicality more than a formula issue (or a software issue, for that matter).

Electron and Chromium are vsync locked to the display refresh rate. You cannot go above 60 FPS on a 60 Hz monitor. You also will never get 110 FPS on a 120 Hz monitor (it will usually drop to 60, if I remember right. Maybe 90? I can't remember for sure).

That is likely your issue.
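This hypothesis even reproduces the 15-second measurement. A hedged sketch (an assumption about how min/max FPS forcing behaves, not verified against GDevelop's source): if the engine steps time by 1/forcedFPS per frame but vsync only delivers frames at the refresh rate, the wall-clock fill time stretches by forcedFPS/refreshHz.

```javascript
// Hedged model of the vsync hypothesis: forced FPS sets the per-frame step,
// but the vsynced display caps how many frames actually happen per second.
function vsyncedFill(forcedFps, refreshHz, totalTime) {
  const engineDt = 1 / forcedFps;                  // step the engine adds per frame
  const realFps = Math.min(forcedFps, refreshHz);  // vsync caps the real frame rate
  const frames = Math.ceil(totalTime * forcedFps); // frames until elapsed >= total
  return frames / realFps;                         // wall-clock seconds
}

console.log(vsyncedFill(90, 60, 10)); // 15 s: the stopwatch measurement above
console.log(vsyncedFill(60, 60, 10)); // 10 s: correct when forced FPS = refresh
```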
