I've done pretty much the same as Casey so far (day 48). I create a 960x540 buffer and a window of the same size, but I get a 1180x626 window with blurry pixels that doesn't show the whole buffer. If I drag the borders of the window to make it bigger, I see that the buffer is drawn at 1200x675, exactly 1.25x its original size.
I found that this scaling factor is determined by my display settings. My monitor's "Change the size of text, apps, and other items" option was set to 125%, and when I changed it to 100% my window looked right. But how can I have control over my pixels on monitors that don't have the scale set at 100%?
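To confirm it's the display scale causing this, you can query the system DPI directly. This is only a sketch: `GetDeviceCaps` with `LOGPIXELSX` is an old, widely available call, and at 125% scaling it reports 120 DPI (120 / 96 = 1.25). The caveat is that a process that hasn't declared itself DPI-aware is lied to and just sees 96, which is exactly the virtualization that makes the window blurry.

```c
// Sketch: query the DPI scale Windows is applying to the desktop.
// NOTE: a DPI-unaware process sees a virtualized 96 here regardless of the
// real setting; the process must be DPI-aware to see the true value.
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HDC ScreenDC = GetDC(0);
    int DpiX = GetDeviceCaps(ScreenDC, LOGPIXELSX); // 96 at 100%, 120 at 125%
    ReleaseDC(0, ScreenDC);
    printf("DPI: %d, scale: %.2f\n", DpiX, DpiX / 96.0);
    return 0;
}
```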
I didn't see this addressed in the 48 episodes I've watched, in the forums, or in some other episodes I've searched about the Windows API. I've checked the Windows API documentation and it seemed like calling SetProcessDpiAwarenessContext() might solve this, but when I tried calling it the compiler didn't recognize the function. Maybe that's because it's a Windows 10 function and my SDK headers are older.
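In case it helps anyone with the same problem, here is the workaround I'm considering: since `SetProcessDpiAwarenessContext()` only exists in the Windows 10 (1607+) SDK headers and in user32.dll on Windows 10, you can load it at runtime with `GetProcAddress` and fall back to the older Vista-era `SetProcessDPIAware()` when it's missing. This is just a sketch; the `#ifndef` block re-declares the Windows 10 type and constant for older SDKs, and the function names are from the Win32 docs, not from the episodes.

```c
// Sketch: opt out of DPI virtualization before creating the window, so a
// 960x540 client area really is 960x540 pixels even at 125% display scale.
#include <windows.h>

// Older SDKs don't declare these; the values match the Windows 10 headers.
#ifndef DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2
typedef HANDLE DPI_AWARENESS_CONTEXT;
#define DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2 ((DPI_AWARENESS_CONTEXT)-4)
#endif

typedef BOOL WINAPI set_process_dpi_awareness_context(DPI_AWARENESS_CONTEXT);

static void
Win32EnableDpiAwareness(void)
{
    // Look the function up at runtime so the exe still runs on Windows 7/8.
    HMODULE User32 = LoadLibraryA("user32.dll");
    if(User32)
    {
        set_process_dpi_awareness_context *SetContext =
            (set_process_dpi_awareness_context *)
            GetProcAddress(User32, "SetProcessDpiAwarenessContext");
        if(SetContext)
        {
            SetContext(DPI_AWARENESS_CONTEXT_PER_MONITOR_AWARE_V2);
            return;
        }
    }
    // Fallback, available since Vista: system-wide DPI awareness.
    SetProcessDPIAware();
}
```

You'd call `Win32EnableDpiAwareness()` at the top of `WinMain`, before `CreateWindowExA`. From what I've read, the "proper" long-term fix is declaring DPI awareness in the application manifest instead of calling an API, but the runtime call avoids touching the build setup.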