2 frames of input lag

In one of the early videos Casey said something like "You always have one frame of input lag, and if you do silly things, it may even be 2 frames".

My question is - how is it possible to have only one frame of lag?

Suppose you have a separate thread that collects the input during one frame, then in the next frame GameUpdateAndRender() runs using the collected data, and only the frame after that will you see the consequences of your input. So if you press a button at the beginning of a frame, you'll have to wait 2 frames.
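
To make the scenario concrete, here is a rough sketch of the pipeline I'm describing (purely illustrative, the names and structure are mine, not code from the series):

/* Illustrative only: a press during frame N is collected by the input thread,
   simulated by GameUpdateAndRender() during frame N+1, and its image only
   reaches the screen when frame N+1 is presented, i.e. during frame N+2. */
#include <stdio.h>

int main(void)
{
    int PressFrame   = 0;              /* button pressed at the start of frame 0 */
    int SimFrame     = PressFrame + 1; /* simulated with last frame's input      */
    int VisibleFrame = SimFrame + 1;   /* the simulated frame is displayed here  */

    printf("pressed in frame %d, simulated in frame %d, visible in frame %d\n",
           PressFrame, SimFrame, VisibleFrame);
    printf("=> %d frames of waiting\n", VisibleFrame - PressFrame);
    return 0;
}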

Edited by Ivan Ivanov on
The game loop does something like this:
while(running)
{
  GetInput();
  RunGameLogic();
  DrawFrame();
}

Any input collected by the OS after the GetInput() call won't be processed until the current frame is already being displayed and the next frame begins processing.

A  GetInput
   RunGameLogic
   DrawFrameA
B  GetInput
   RunGameLogic
   DrawFrameB
C  GetInput
   RunGameLogic
   DrawFrameC



Any key pressed between A and B will get drawn on Frame B.
Any key pressed between B and C will get drawn on Frame C.

If you pressed a button right after A, it would be almost 2 frames of lag before it was displayed.
If you pressed a button right before B, it would be slightly less than 1 frame of lag before it was displayed.

Edited by Anthony on
Thanks, that is exactly what I meant, and it looks like this is the minimum lag we can count on.
Just wanted to clarify something here, because I'm not sure if you guys are thinking of the term "frame of lag" in the same way that I usually hear it being used.

If the display is updating at 60Hz, let's say, there are ~16ms in between changes of what the user sees on the screen. If a button is pressed any time in the 16ms before frame A is shown, and it is not shown in frame A, but _is_ shown in frame B, that would generally be called only one frame of lag, even though it may be closer to 32ms of lag if it was pressed very close to the beginning of the original 16ms interval.

Stated alternately, the number of "frames of lag" as I usually hear it used is measured by flooring, not rounding or ceilinging, the time from the press to the display (for better or for worse). I believe this is because there is no way to have any less than 16ms of delay in the best case scenario, so it's not typically called "lag" until you actually miss the first frame on which you could _reasonably_ be expected to show the results.
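
So, as a rough sketch of that counting convention (illustrative code, not anything from the series):

/* Illustrative only: "frames of lag" = floor(press-to-display time / frame time). */
#include <math.h>
#include <stdio.h>

int FramesOfLag(double PressToDisplayMS, double FrameTimeMS)
{
    /* Floor, not round or ceiling: the first ~16ms is unavoidable,
       so it doesn't get counted. */
    return (int)floor(PressToDisplayMS / FrameTimeMS);
}

int main(void)
{
    double FrameTimeMS = 1000.0 / 60.0;  /* ~16.7ms per refresh at 60Hz */

    /* Pressed just before frame A went up, shown on frame B: ~17ms. */
    printf("%d frame(s) of lag\n", FramesOfLag(17.0, FrameTimeMS));   /* -> 1 */

    /* Pressed at the very start of that interval, shown on frame B: ~32ms. */
    printf("%d frame(s) of lag\n", FramesOfLag(32.0, FrameTimeMS));   /* -> 1 */

    return 0;
}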

Hopefully that makes some sense. I don't think there's any confusion about what's actually going on, just about how the term tends to be applied.

- Casey
Thanks very much for the clarification; all is clear now.