Are timers affected by processor speed and/or frame rate? (I don’t really understand frame rate stuff, so sorry if putting those two things together makes no sense.) My experience suggests that they are.
I have a tween that’s meant to take 5 minutes, and I also have a countdown timer running to show the 5 minutes. The tween takes exactly 5 minutes on the game timer, but in real time it takes much longer, for example 9 minutes. Soon after I start the game my laptop fan comes on, and after the first minute of the game timer being accurate, the timer lags and slows down. And I would actually want the tween and timer to run for much longer than 5 minutes.
I went looking for explanations: older posts said that timers have nothing to do with frame rate, while newer ones said that they do. Did something change in GDevelop? I have included some quotes below.
If timers are affected by device speed, is there a reliable way to get around this? I know there’s a thing called time delta, and I tried adding it, but either I don’t understand it properly or it’s not suitable for my needs. And according to Slash in the third quote, time delta is also inconsistent. The only thing I can think of is to get the current clock time and use that as the basis for determining the length of time.
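The wall-clock idea in that last sentence is workable. A minimal sketch in plain JavaScript (not GDevelop's event system; `startMs`, `elapsedSeconds`, and `remainingSeconds` are made-up names for illustration):

```javascript
// Illustrative only: measure elapsed time from the system clock instead
// of accumulating per-frame deltas.
const startMs = Date.now();            // record when the countdown began

function elapsedSeconds() {
  // Wall-clock time since start; unaffected by slow or dropped frames.
  return (Date.now() - startMs) / 1000;
}

function remainingSeconds(totalSeconds) {
  // Countdown value to display, clamped at zero.
  return Math.max(0, totalSeconds - elapsedSeconds());
}
```

Even if frames arrive late, each call reads the real clock, so the countdown can't drift behind real time; it can only update on screen less often.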
And sorry if I’ve taken the quotes out of context and they don’t mean what I think they mean.
time delta Nov 2019: (No one answered Slash’s question)
A lot of the older posts have a few things being discussed differently.
Timers/timedelta progress in real time. (Excluding timescale changes in the game)
Your ability to interact with timers or time delta IS framerate-limited. Events are only polled once per frame. In a 60 fps game, every frame is 0.016ms (correction edit: 0.016 seconds, i.e. 16 milliseconds), so you can only check a timer once every 16 ms and won’t be able to check exact times. This is why the timer conditions only allow less than/greater than, and not exact equals.
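A small sketch of why an exact-equality check is unreliable, assuming a steady 60 fps loop (plain JavaScript, illustrative only, not GDevelop code):

```javascript
// Simulate 60 frames of a 60 fps game; the timer advances in ~16.7 ms jumps.
let timer = 0;
const dt = 1 / 60;               // seconds elapsed per frame at 60 fps
let firedAtExactlyHalf = false;  // a "timer equals 0.5 s" style condition
let firedAtOrAfterHalf = false;  // a "timer is greater than 0.5 s" style condition

for (let frame = 0; frame < 60; frame++) {
  timer += dt;                                   // advance one frame
  if (timer === 0.5) firedAtExactlyHalf = true;  // rarely, if ever, true
  if (timer >= 0.5) firedAtOrAfterHalf = true;   // reliably true
}
```

The timer jumps from just below 0.5 s to just above it between two polls, so only the greater-than style condition is guaranteed to fire.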
There are also situations where, if your game is bogged down enough that it drops below your set minimum framerate, the entire game slows down. My understanding is that in those cases the engine deliberately slows the timescale when the framerate falls below the minimum.
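If the engine does clamp the per-frame delta at the minimum framerate, the effect could be sketched like this (assumed behaviour, made-up names, plain JavaScript):

```javascript
// Assumption: deltas are clamped so a long stall is reported to the game
// as a shorter one, which slows game time relative to real time.
const minFps = 20;                 // hypothetical minimum framerate setting
const maxDt = 1 / minFps;          // largest delta the game will ever see

function clampedDelta(realElapsedSeconds) {
  return Math.min(realElapsedSeconds, maxDt);
}
```

Under this assumption a frame that really took 0.2 s is reported as 0.05 s, so a game-time 'minute' stretches over more real minutes, which matches the 5-minutes-becoming-9 symptom.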
Thank you, I think I understand. So the game slows down, including its timers, because of the performance of the computer. But other things on the computer keep working fine, such as the clock. So if I did manage to base my timer on the computer’s clock, would it still be limited by the game’s ability to ‘access’ it?
I was studying the 60 fps stuff and how long one frame takes from what you wrote. I understand it except for one thing: would the units of one frame taking 0.016 be seconds rather than milliseconds?
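For reference, the unit arithmetic works out like this (plain JavaScript, just the conversion):

```javascript
// One frame at 60 fps, expressed in both units.
const frameSeconds = 1 / 60;              // ~0.0167 seconds
const frameMillis = frameSeconds * 1000;  // ~16.7 milliseconds
```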
I guess my only solution is to try to streamline my game (but it has no for-each-object events and very few and small pictures) or just forget about it.
You can also run it through the debugger, and use the analyser to get an idea of where most of the processing is taking place. The analyser will also break it down by event group, so using a few of these helps narrow it down when identifying the bottlenecks.
Thank you Silver-Streak and MrMen for your help. I ran the profiler and found that two physics joints that were being tweened were causing the huge resource drain. I deleted the physics behaviours and instead pinned the joints with the ‘Put the object around another’ action. Everything worked fine after that, with no problems. I really was ready to throw this project away, as I’m only making it for myself, so thank you once again.