I like Casey's attitude of simplifying things by asserting on fixed behavior. Many Amiga games were very smooth because they were designed around a constant frame budget of 50 Hz (PAL) and so never skipped a frame. But it remains an open question whether supporting only 120/60/30 Hz (plus temporary transitions between modes) is enough. It will be interesting to see what Casey has in mind here. With clever design everything can run at multiples of a base rate (which makes a constant physics step easy), and the frame budget can perhaps be guaranteed by placing strict per-frame budgets on all resources (particle spawning etc.), allowing for "progressive jpeg" style degradation while always hitting the target frame budget.
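To make that concrete, here is a minimal TypeScript sketch of a locked-rate loop with strict per-frame budgets. This is my own illustration, not anything Casey has shown; the budget constants and the World/spawnParticle/stepPhysics names are hypothetical stand-ins.

    // A locked-rate loop with strict per-frame resource budgets (a sketch;
    // the budget numbers are made up).
    const STEP_MS = 1000 / 60;            // assume a 60 Hz lock
    const MAX_PARTICLE_SPAWNS = 64;       // hypothetical per-frame budget

    interface World { pendingSpawns: number; particles: number[]; t: number; }

    function spawnParticle(w: World): void { w.particles.push(w.t); } // stand-in
    function stepPhysics(w: World, dt: number): void { w.t += dt; }   // stand-in
    function render(_w: World): void { /* draw at the fixed rate */ }

    function frame(world: World): void {
      // Spend at most the budget; leftover spawns carry over to later frames
      // ("progressive jpeg" degradation) instead of stretching this frame.
      const spawns = Math.min(world.pendingSpawns, MAX_PARTICLE_SPAWNS);
      for (let i = 0; i < spawns; i++) spawnParticle(world);
      world.pendingSpawns -= spawns;

      stepPhysics(world, STEP_MS / 1000); // constant physics step, every frame
      render(world);
    }

    // Driven at the fixed rate; the loop itself never varies its step.
    const world: World = { pendingSpawns: 200, particles: [], t: 0 };
    setInterval(() => frame(world), STEP_MS);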
The opposite approach is more complex: preparing for a variable frame rate by admitting/regretting that one cannot control resource usage (complex physics interactions in a 3D engine, for example), which is where decoupling the physics and render loops becomes relevant. But then one effectively has two snapshots of the world at render time and a proportional blend factor between them (because the physics time step is constant and separate), sort of like a subpixel rendering situation. I guess that kind of game loop shows up in many browser games, for example, where GC and the like may interrupt at any point, so skipped frames cannot be avoided. The gaffer link above has a nice explanation, and for an operational example see
https://github.com/vladikoff/cube.../blob/master/lib/game.js#L97-L150 (actual game with nice tracker/mod-like music at
https://cubeslam.com).
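For reference, here is a rough TypeScript sketch of that pattern (fixed physics step, accumulator, and a blend factor between the previous and current snapshots) in the style of the gaffer article. It assumes a browser environment (requestAnimationFrame), and the World/integrate names are placeholders, not taken from the cube slam code.

    // Fixed-timestep physics with render-time interpolation (a sketch).
    interface World { x: number; v: number; }

    const DT = 1 / 60;                   // fixed physics step in seconds
    let previous: World = { x: 0, v: 1 };
    let current: World = { ...previous };
    let accumulator = 0;
    let lastTime = performance.now();

    function integrate(w: World, dt: number): World {
      return { x: w.x + w.v * dt, v: w.v }; // trivial stand-in for real physics
    }

    function render(w: World): void {
      console.log(`x = ${w.x.toFixed(3)}`); // stand-in for actual drawing
    }

    function frame(now: number): void {
      // Accumulate real elapsed time; clamp huge gaps (tab switch, GC pause).
      accumulator += Math.min((now - lastTime) / 1000, 0.25);
      lastTime = now;

      // Physics always advances by the constant DT, as many times as needed.
      while (accumulator >= DT) {
        previous = current;
        current = integrate(current, DT);
        accumulator -= DT;
      }

      // Blend the two snapshots proportionally to the leftover time.
      const alpha = accumulator / DT;
      render({ x: previous.x * (1 - alpha) + current.x * alpha, v: current.v });

      requestAnimationFrame(frame);
    }

    requestAnimationFrame(frame);

The render call always lags at most one physics step behind, which is the price paid for smoothness when the render rate and physics rate are decoupled.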
EDIT: It may be worth noting that 8 years later the author of that gaffer article
commented: "A much easier solution [than interpolation] is to just always lock to 60HZ or so, if you can get away with that, it’s the best and simplest option." :)