Day 11 - Platform API Design

So, I'm lagging a bit behind on the stream. I just finished Day 011, where Casey explains how he will go about the architecture of the platform code, and I have a question. If you do Casey's method, aren't you implicitly assuming that all platform data representation is resolved the same way? I.e., bitmap buffers and pixel representation are the same across every platform, etc. And if so, how would he adapt to a platform which has a different way of representing the data? Is it #ifdefs in critical parts of the Game "service"? This may be addressed in more recent episodes (or I may have really misunderstood the whole episode), in which case I'm very sorry, but I didn't want to forget to ask this.

BlueMagic
So, I'm lagging a bit behind on the stream. I just finished Day 011, where Casey explains how he will go about the architecture of the platform code, and I have a question. If you do Casey's method, aren't you implicitly assuming that all platform data representation is resolved the same way? I.e., bitmap buffers and pixel representation are the same across every platform, etc. And if so, how would he adapt to a platform which has a different way of representing the data? Is it #ifdefs in critical parts of the Game "service"? This may be addressed in more recent episodes (or I may have really misunderstood the whole episode), in which case I'm very sorry, but I didn't want to forget to ask this.


We have not seen everything Casey will do yet since it's a work in progress.

One sane way of doing it is to pick a single format that the game always produces, and then do the final conversion within the platform layer on platforms that internally expect a different format.
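For example (a hypothetical sketch, not code from the stream, and all names here are invented): if the game always wrote 0xAARRGGBB pixels, a platform whose display wanted red and blue swapped could do a single pass per frame at the edge, and nothing in the game code would change.

#include <stdint.h>

// Hypothetical platform-side conversion: the game's fixed 0xAARRGGBB
// pixels are swizzled into the ABGR order this platform's display wants.
static void
PlatformSwizzleToABGR(uint32_t *Source, uint32_t *Dest, int PixelCount)
{
    for(int Index = 0; Index < PixelCount; ++Index)
    {
        uint32_t C = Source[Index];
        // Swap red and blue; alpha and green stay where they are.
        Dest[Index] = (C & 0xFF00FF00) |
                      ((C & 0x00FF0000) >> 16) |
                      ((C & 0x000000FF) << 16);
    }
}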

Another way is to create a separate low-level API for rendering graphics/sound, make it part of the platform layer, and let it handle this differently on each platform.
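A rough sketch of what that second option could look like (every name here is invented for illustration, nothing from the stream):

#include <stdint.h>

// The platform layer exposes a small graphics/sound API and each platform
// implements these calls against its own native formats; the game only
// ever talks to this table, so format differences never leak into game code.
typedef struct
{
    void (*BlitPixels)(void *Pixels, int Width, int Height, int Pitch);
    void (*QueueSamples)(int16_t *Samples, int SampleCount);
} platform_render_api;
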
If I'm not mistaken, it doesn't matter. Literally the only thing the game is doing is filling standard buffers with the audio and pixel data. Then it's up to the platform layer to take those buffers and do whatever it needs to do to render to the screen and play the audio.
Troncoso
If I'm not mistaken, it doesn't matter. Literally the only thing the game is doing is filling standard buffers with the audio and pixel data. Then it's up to the platform layer to take those buffers and do whatever it needs to do to render to the screen and play the audio.


Of course, it does matter if we would like to use hardware acceleration, and one wants to keep the format in the game code easily and swiftly convertible to the format used by the platform layer. And one question pops up when I read your reply: what is a standard buffer? We saw in one of the Handmade Hero videos that bitmap buffers in the Win32 API are encoded as BGR rather than RGB. And if one platform had native support for 32-bit audio, we would get slightly worse audio quality by outputting 16-bit audio and letting the platform layer convert it. All things to keep in mind.

One can ignore this, do things however they are easiest in the game code, and just do the conversions in the platform layer, but there is a price to pay.
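To put a number on how cheap the audio part of that price is, here is a rough sketch (not stream code; the function name is invented): widening a frame's worth of 16-bit samples to 32-bit float is a trivial loop. The quality concern above is that the game already mixed at 16 bits, which this pass cannot undo.

#include <stdint.h>

// Hypothetical platform-side widening of the game's 16-bit samples into the
// 32-bit float format a platform mixer might want.
static void
PlatformWidenSamples(int16_t *Source, float *Dest, int SampleCount)
{
    for(int Index = 0; Index < SampleCount; ++Index)
    {
        Dest[Index] = (float)Source[Index] / 32768.0f;
    }
}
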
Uberstedt
Troncoso
If I'm not mistaken, it doesn't matter. Literally the only thing the game is doing is filling standard buffers with the audio and pixel data. Then it's up to the platform layer to take those buffers and do whatever it needs to do to render to the screen and play the audio.


Of course, it does matter if we would like to use hardware acceleration, and one wants to keep the format in the game code easily and swiftly convertible to the format used by the platform layer. And one question pops up when I read your reply: what is a standard buffer? We saw in one of the Handmade Hero videos that bitmap buffers in the Win32 API are encoded as BGR rather than RGB. And if one platform had native support for 32-bit audio, we would get slightly worse audio quality by outputting 16-bit audio and letting the platform layer convert it. All things to keep in mind.

One can ignore this, do things however they are easiest in the game code, and just do the conversions in the platform layer, but there is a price to pay.


When I say standard buffer, I don't mean how the bytes are laid out. I just mean it's not some platform-defined struct or alias that we are conforming to. It's just a block of memory that we are dumping our data into, so any platform can get it back and manipulate it however it needs to, and there wouldn't be a cross-compatibility issue.
Isn't that basically what you said in your first reply as well? The game would format the data its own way and then each platform would be coded to handle that format.
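For reference, the buffer the game fills is roughly this shape (an approximation of what the stream uses, not a verbatim copy):

// Nothing Win32- or platform-specific here: just a pointer to memory plus
// enough information for anyone to walk it.
typedef struct
{
    void *Memory;   // 32-bit pixels the game writes into
    int Width;
    int Height;
    int Pitch;      // bytes from one row to the next
} game_offscreen_buffer;
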
The answer is that only graphics is expensive to "reformat", so it's actually the only one you care about in terms of performance. Sound, at ~1600 samples a frame at most, is simply never an issue to change into whatever format you want.

Since graphics is the only performance-critical one, you typically take a 3-tiered approach, which I explained in the stream. You'll see this in action when we get there!

- Casey
Aside: I'm late to this series, having just started, and I'm trying to get caught up. I'm sorry if this has been covered since it first appeared.

As I understand this model, the game layer defines a generic interface GameUpdateAndRender that uses a generic type game_offscreen_buffer to get frame data. The game_offscreen_buffer defines a type that the game layer treats as uint32 buf[Height][Width], so every platform must deal with a frame buffer using that type and layout. The platform is responsible for any transformation from that type to a platform-specific type, win32_offscreen_buffer or equivalent. In this case the transformation is a direct mapping, but that won't always be the case.

Generally, the platform would need to translate a win32_offscreen_buffer object into a game_offscreen_buffer or the equivalent and do the reverse after the call to GameUpdateAndRender. The direct mapping between the two types obscures this.

One way to express this in C++ is:
struct win32_offscreen_buffer : public game_offscreen_buffer
{
    BITMAPINFO Info;
};

N.B. the memory layout changes with this inheritance, but that should not be vital. If it is, then composition would work:
struct win32_offscreen_buffer 
{
    BITMAPINFO Info;
    game_offscreen_buffer game_buffer;
};

The point is that such a relationship won't always exist, and it is the platform layer that must perform the translation.

- tim
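Putting tim's point in code form (a sketch only; apart from GameUpdateAndRender and the structs above, the names are invented for illustration):

static win32_offscreen_buffer GlobalBackbuffer;

extern void GameUpdateAndRender(game_offscreen_buffer *Buffer);
static void Win32DisplayBuffer(win32_offscreen_buffer *Buffer);  // assumed helper

static void
Win32RunFrame(void)
{
    // Forward direction: hand the game its generic view of the pixels,
    // using the composition version of win32_offscreen_buffer above.
    // Here that is a direct alias; on a platform with a different native
    // layout, this is where a real conversion would go.
    GameUpdateAndRender(&GlobalBackbuffer.game_buffer);

    // Reverse direction: give the filled buffer back to the OS-specific path.
    Win32DisplayBuffer(&GlobalBackbuffer);
}
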
It's very unlikely that you would have to do any conversion regardless of the platform. Most (all?) platforms support BGRA graphics submission so long as they control the stride, which we allow (the game code does not demand a particular stride IIRC).

- Casey
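To make the stride point concrete, here is roughly what the early gradient routine looks like (reconstructed from memory, so details may differ from the stream): the game walks rows by Pitch rather than assuming Width * 4, which is what lets a platform hand over padded or otherwise aligned rows without any conversion pass.

#include <stdint.h>

static void
RenderWeirdGradient(game_offscreen_buffer *Buffer, int XOffset, int YOffset)
{
    uint8_t *Row = (uint8_t *)Buffer->Memory;
    for(int Y = 0; Y < Buffer->Height; ++Y)
    {
        uint32_t *Pixel = (uint32_t *)Row;
        for(int X = 0; X < Buffer->Width; ++X)
        {
            uint8_t Blue = (uint8_t)(X + XOffset);
            uint8_t Green = (uint8_t)(Y + YOffset);
            *Pixel++ = ((uint32_t)Green << 8) | Blue;   // BGRx byte order in memory
        }
        Row += Buffer->Pitch;   // whatever stride the platform chose
    }
}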