Question about Day 18 Enforcing framerate

Greetings,

I just wanted to make sure I understand what Casey is talking about at 1:13:00 here:
https://hero.handmadedev.org/videos/win32-platform/day018.html

I'm not sure I fully understand what he's saying not to do. Is he referring to the 'DeltaTime' value (the time in seconds that the last frame took to complete)? I usually use that value when handling player movement so that it's time-dependent instead of frame-dependent.
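For reference, this is the kind of thing I mean (just a sketch; the variable names are mine):

// move by elapsed wall-clock time, not by a fixed amount per frame
playerX += playerSpeed * DeltaTime;  // DeltaTime = seconds the last frame took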

If anyone could clarify it would be appreciated, thanks!

-elxenoanizd/vexe

I'll give it a shot but honestly, I'm still trying to wrap my head around all the different ways to handle different timelines.

So far, the best way I have found to think about it is as three different timelines: the game timeline, the rendering timeline, and the real-life timeline. Let's start at 1:14:00 on day 18, with frame 1. Say you run your game timeline at a fixed update rate of 16ms. So you say, okay, I'm starting a frame, so hey Mr. Game Code, advance your timeline by 16ms, and we'll also say that the rendering timeline advances by 16ms to match the game timeline. BUT during that frame, the real timeline advanced by 33ms. "Wrong wrong wrong wrong, it is all wrong." The game and rendering timelines are lagging behind the real timeline.

Frame 1 (shown to player):
Game timeline:      16ms
Rendering timeline: 16ms
Real timeline:      33ms
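To make the bookkeeping concrete, here is a minimal sketch of those three timelines as plain millisecond counters (the struct and names are mine, not from the video):

#include <stdio.h>

typedef struct
{
    int gameMS;    // how far the game simulation has advanced
    int renderMS;  // how far the rendered state has advanced
    int realMS;    // wall-clock time that has actually passed
} Timelines;

int main(void)
{
    Timelines t = {0, 0, 0};

    // Frame 1 from the example: game and render advance by the
    // fixed 16ms step, but the real frame took 33ms.
    t.gameMS += 16;  t.renderMS += 16;  t.realMS += 33;

    printf("lag after frame 1: %dms\n", t.realMS - t.gameMS);  // 17ms
    return 0;
}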

Now, frame two is where I am a bit confused myself, and I'm not sure if it is my understanding or whether his explanation was mixing a variety of things. He says a lot of people will add 33ms to the next frame. I have no idea why you would do that. The lagging timelines are only 17ms behind the real timeline, so I would think you should do your fixed 16ms update plus however many whole 16ms catch-up updates fit into that 17ms of error (one, in this case; see the sketch after the table). Then IF you have a variable refresh rate monitor, and IF the next flip only takes 16ms, you end up with:

Frame 2 (shown to player):
Game timeline:      48ms
Rendering timeline: 48ms
Real timeline:      49ms
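Since updates only come in whole 16ms steps, the 17ms of error buys exactly one extra update and strands 1ms. A quick sketch of that arithmetic, using the example's numbers:

#include <stdio.h>

int main(void)
{
    int fixedStep    = 16;                // fixed update rate (ms)
    int error        = 33 - 16;           // 17ms behind after frame 1
    int extraUpdates = error / fixedStep; // whole catch-up updates: 1
    int leftover     = error % fixedStep; // residual error: 1ms

    // Frame 2 runs the normal update plus the catch-up update:
    // (1 + extraUpdates) * fixedStep = 32ms, so the game timeline
    // lands on 16 + 32 = 48ms while the real timeline is at 49ms.
    printf("extra updates: %d, leftover: %dms\n", extraUpdates, leftover);
    return 0;
}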

Now, you may think that the missing millisecond should be accounted for if you assume the actual rates are 16.666666...ms and 33.333333...ms (I did when I first wrote this), BUT those aren't the numbers we are using in this example. We have stated that the update rate IS 16ms and that the monitor did its refreshes at real-time marks of 33ms and 16ms. So after frame two you are still wrong. The problem is that you never know how long the current frame will take to update/render, so you are always using a guess (your fixed frame rate) plus the error from the last frame, and there is no way you will ever be exactly right. Just closer to or further from right. To extrapolate his example to a third frame, let's say you do your fixed update plus frame 2's error, for a total of 17ms, but frame 3 actually took 24ms:

Frame 3 (shown to player):
Game timeline:      65ms
Rendering timeline: 65ms
Real timeline:      73ms

I have to say that, to me, this seems okay. I honestly don't understand the problem with allowing the game and rendering timelines to run somewhat behind the real timeline, as long as they stay in sync with each other, always stay behind the real timeline, and never fall behind it by an amount greater than or equal to the fixed update rate (more on why I say the fixed update rate later).

Taking your case, where you use the length of the last frame to update the current frame: let me first assume that you do NOT fix the rate at which you advance your game timeline. One problem here is that you have no idea what the inputs (time) into your system are. As an example of how this could be a problem, let's say your game has a lava river, and across the river is a treasure. You have designed the lava river to be just large enough that the player is supposed to jump across but always land a little in the lava and take some damage (treasure ain't free, ya know!). How big do you make the river? You don't know, because you don't know what the input (time) into your system will be for any given frame. So if a player initiates a jump, the height they reach is undetermined, since you advance the player's movement by a variable amount each frame (a small sketch of this follows).
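Here is a small, purely hypothetical sketch of that (the numbers and names are mine): the same jump, integrated at two different frame times, reaches two different heights, so there is no single river width you can tune against.

#include <stdio.h>

// A made-up jump, integrated with a simple Euler step of size dt.
// Returns the highest point the player reaches.
static float JumpApex(float dt)
{
    float y = 0.0f, v = 10.0f;  // start on the ground, jump at 10 m/s
    float g = -30.0f;           // made-up gravity
    float apex = 0.0f;
    while (v > 0.0f || y > 0.0f)
    {
        v += g * dt;
        y += v * dt;
        if (y > apex) apex = y;
    }
    return apex;
}

int main(void)
{
    // Same jump, two different frame times: the apex differs, so a
    // gap tuned for one frame rate plays differently at another.
    printf("apex at 16ms steps: %.3f\n", JumpApex(0.016f));
    printf("apex at 33ms steps: %.3f\n", JumpApex(0.033f));
    return 0;
}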

Now, let's remove the assumption above and instead say that you measure the time the last frame took but update everything at a fixed time step. I think this is what the person who asked the question at 1:14:00 meant. As an example, let's say you update your game code in 16ms steps and determine the number of updates you do per frame as

NumberOfUpdatesThisFrame = LengthOfLastFrameInMS / 16ms

or you could do something like:

while(1)
{
    // Measure how long the last frame actually took.
    currentTime = getTime();
    lastFrameLength = currentTime - lastTime;
    lastTime = currentTime;
    accumulatedTime += lastFrameLength;

    // Run as many fixed-size updates as the elapsed real time allows.
    while(accumulatedTime >= fixedTimeStep)
    {
        update();
        accumulatedTime -= fixedTimeStep;
    }

    render();
}
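One caveat worth flagging, which is my addition rather than anything from the video: if update() itself ever takes longer than fixedTimeStep, accumulatedTime grows faster than the inner loop can drain it, and you do more and more catch-up updates every frame (the so-called "spiral of death"). A common defensive tweak is to cap the debt right after the accumulatedTime += line, accepting that the game timeline deliberately slips further behind the real one (maxAccumulatedTime is my name, e.g. a few fixed steps' worth):

if(accumulatedTime > maxAccumulatedTime)
{
    accumulatedTime = maxAccumulatedTime;  // drop the excess on purpose
}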

This allows you to run your game code multiple times per frame so that you can always keep the game and rendering timelines close to the real timeline, but you will more often than not have some error between the game/rendering timelines and the real timeline. Essentially, the same problem described for a variable refresh rate monitor.

The bigger issue (I think) is that you are allowing your simulation to run multiple times per render to keep up with the real timeline. This results in situations happening in the game that the player never sees. For example, let's say you have a fast-moving ball about to collide with a wall, and you have determined that you need to update your simulation twice on this frame. During the first update, you determine the ball hits the wall, so you reposition and bounce it. Then you do your second update, and the ball moves away from the wall. Now you render the current state. The player never saw the ball hit the wall. They saw it moving toward the wall and then away from the wall, but missed the bounce. I suspect this has led to many a gamer raging about how they were hit by something that never hit them.
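A tiny hypothetical trace of that missed bounce (the numbers and names are mine):

#include <stdio.h>

int main(void)
{
    // A fast ball heading for a wall at x = 10, moving 8 per update.
    float x = 0.0f, v = 8.0f, wall = 10.0f;

    // This frame needs two fixed updates to catch up to real time.
    for (int update = 0; update < 2; ++update)
    {
        x += v;
        if (x > wall)  // hit: reflect position and velocity
        {
            x = wall - (x - wall);
            v = -v;
        }
    }

    // Only now do we render, so the player sees x = 4 moving away
    // from the wall and never sees the ball touch it.
    printf("rendered position: %.1f, velocity: %.1f\n", x, v);
    return 0;
}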

I think it all comes down to knowing what your options are, what the requirements for your game are, and what problems you're going to have to solve no matter which solution you use. After all this, I am of the opinion that Casey is right: determine your frame rate and hit that frame rate.

Sorry this got so long. Hopefully something here was helpful (and right :p). If nothing else, this explains one reason why garbage collection sucks: you can never guarantee a frame rate.

Curiosity is time-consuming! And creates long posts.