Handmade Hero»Forums»Code
Adrian
46 posts
None
Rendering pipeline question
Edited by Adrian on
Hi, I started watching Handmade Hero and I'm really enjoying it. I'm not an experienced programmer. I know some C, Objective-C, and Swift, as I'm using a Mac. I always wanted to know how game engines really work beneath all these confusing frameworks almost everybody uses nowadays, and Handmade Hero really scratches that itch.

However, as I'm on a Mac, I have some questions. In the first part of the series the game was rendered completely in software, right? As Casey explained, the only platform-specific part of the graphics is to allocate some memory, write a bitmap into it, and draw it to the screen using an OS-specific function. This is all done in software: no OpenGL, no DirectX, right?
As I'm pretty new to programming, this seems quite easy to me, and I started asking why bother with OpenGL at all if it's that simple. OpenGL seems like a huge framework. There has to be more to it than just drawing a buffer to the screen, right?
It has vertex buffers, does rasterization, and so on. Is this all done manually by Casey before he starts using OpenGL? Performance aside, is Casey's software renderer as capable as OpenGL in terms of functionality? For example, can you draw single pixels, or draw a line or a triangle?

Moreover, am I right in assuming that you can't do software-based rendering on a Mac? As far as I know, almost everything graphics-related used OpenGL to draw to the screen; now it's Metal, Apple's own "faster" connection to the GPU. Will Casey use Metal?
And if software rendering is possible, will he render in software or in OpenGL by the time he writes the platform code for OS X?
Or is there no platform-specific rendering going on except drawing the bitmap to the window? But there has to be some rendering going on, right? He has to draw images and such. Or is everything drawn into the bitmap, which is then drawn to the screen using either OS-specific software (CPU) or hardware (GPU) functions?

As you can probably tell, I'm a little bit confused.

I already asked this on Stack Exchange, but the answers there didn't quite satisfy my hunger for knowledge:
How is a game drawn onto the screen?

Please correct me if I'm wrong in my assumptions.
511 posts
Rendering pipeline question
Edited by ratchetfreak on
OpenGL is just faster at rendering.

If he wants to draw single pixels and lines with his software renderer, he can easily write the code to do so.

It's very likely that he'll use OpenGL for accelerated rendering on OS X, given that he already has most of the code required. Once you have the OpenGL context, the platform-specific parts fall away.

I believe you can still do software rendering on Apple platforms with CGBitmapContext, though I have no experience with that.

Bryan Taylor
55 posts
Rendering pipeline question
When we talk about software rendering or hardware rendering, the deciding factor is where the rasterization (finding which pixels are inside a polygon and filling them) happens. Nowadays, everything goes through the GPU to get to the screen at some point. But when rendering in software, we simply give the GPU a buffer of pixels (the bitmap we allocate) and tell it to blit that to the screen. When using OpenGL (or DirectX, or Metal, etc.) we give the GPU buffers of vertex data and tell it to render them.

The hardware path is a lot faster -- not only are GPUs better suited to rasterization (thanks to their *really* wide SIMD pipelines), but by moving the bulk of that work off the CPU, we can use the extra time for other things (AI, physics simulation, etc.). Especially if you want full 3D with nice textures and lighting, you *need* that extra throughput (though this isn't particularly relevant to Handmade Hero itself). Casey's software renderer is fairly representative of how the GPU renders, with the exception that he renders quads, while GPUs almost exclusively deal with triangles.

There's nothing to stop you doing software rendering on a Mac, either. (That's kind of the point.) All software rendering does is create a buffer of pixels that then gets handed off to the OS to be put on the screen. That tiny bit of code is all that cares about the platform.
Adrian
46 posts
None
Rendering pipeline question
Edited by Adrian on
The first answer that really scratches my itch! Let me rephrase it to make sure I understand fully:

Both ways of rendering (software and hardware) use the GPU to some extent? The difference is just that on the software path you pass the already finished bitmap to the GPU, having done all the rasterization (and other stuff I don't know yet) yourself, while on the hardware path you pass the vertex buffers (and other stuff I don't know yet) to the GPU and let the hardware do the rasterization.

Right?

I thought software rendering didn't use the GPU at all. But then how were images or games rendered before the era of graphics processing units?
Mārtiņš Možeiko
2559 posts / 2 projects
Rendering pipeline question
Modern GPUs work differently from old video cards. On old video cards you could directly access the memory that contains the pixel values displayed on screen, so software rendering simply meant writing into those memory locations.

With modern GPUs and modern OSes it doesn't work like that anymore. First of all, a modern OS doesn't give you direct access to GPU memory or registers; everything has to go through the video driver, which has that access. GPUs also offer a different interface than a raw memory block of pixels: they support various primitives useful for 3D-accelerated rendering (shaders, buffers, textures, etc.). So software rendering on these GPUs means you prepare a bitmap as a block of pixels and render it as two triangles or one quad (or something similar).

The preparation of the bitmap pixels in software rendering is more or less the same for modern and old video cards, but the way you actually make the bitmap appear on screen is completely different. In the old days you simply wrote into a special memory location. Now you use a graphics API (like OpenGL, D3D, Vulkan, or Metal) to make the bitmap appear on screen.
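The "render it as two triangles" step can be sketched with legacy OpenGL calls, roughly the style Handmade Hero uses on Windows. This is a hedged illustration, not the stream's exact code: it assumes a GL context is already current for the window, and texture re-creation every frame is wasteful (real code would reuse the texture).

```c
// Sketch: pushing a CPU-rendered bitmap to the screen as a textured quad.
// Assumes an OpenGL context is already current for the window.
#include <stdint.h>
#include <GL/gl.h>

void display_bitmap(const uint32_t *pixels, int width, int height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    // Upload the whole CPU-side bitmap to the GPU...
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    // ...and draw it as two triangles covering the viewport.
    glEnable(GL_TEXTURE_2D);
    glBegin(GL_TRIANGLES);
    glTexCoord2f(0, 0); glVertex2f(-1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1,  1);

    glTexCoord2f(0, 0); glVertex2f(-1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1,  1);
    glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();

    glDeleteTextures(1, &tex);
}
```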
105 posts
Rendering pipeline question
It is absolutely possible to do software rendering on OS X without going through OpenGL or Metal. What you're looking for is Quartz: the Quartz 2D Programming Guide.

At the start of Handmade Hero I followed along very closely (also on a Mac), and this is how I set up my platform layer. Here's a brief look at how it worked:
- Allocate the memory for your bitmap and pass it to CGBitmapContextCreate. This gives you a raw buffer to write into in the bitmap context.
- Once you've drawn a frame into the bitmap memory, you create an actual bitmap (CGImageRef) by calling CGBitmapContextCreateImage.
- Then draw the bitmap into the graphics context of your NSView with CGContextDrawImage.

This hopefully gives you an idea of the steps involved and a place to start.
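The three steps above can be sketched in C against the CoreGraphics API. This assumes macOS with CoreGraphics linked, and a real platform layer would also need an NSView whose graphics context is passed in; the buffer size and pixel format here are illustrative choices.

```c
// Sketch of the Quartz software-rendering path described above.
#include <CoreGraphics/CoreGraphics.h>
#include <stdlib.h>

#define WIDTH  1280
#define HEIGHT 720

void render_frame(CGContextRef viewContext)
{
    // 1. Our own memory for the bitmap; the software renderer writes here.
    void *pixels = malloc(WIDTH * HEIGHT * 4);

    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGContextRef bitmapCtx = CGBitmapContextCreate(
        pixels, WIDTH, HEIGHT, 8, WIDTH * 4, space,
        kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little);

    // ... software renderer fills `pixels` for this frame ...

    // 2. Wrap the finished buffer in an actual bitmap (CGImageRef).
    CGImageRef image = CGBitmapContextCreateImage(bitmapCtx);

    // 3. Blit it into the view's graphics context.
    CGContextDrawImage(viewContext,
                       CGRectMake(0, 0, WIDTH, HEIGHT), image);

    CGImageRelease(image);
    CGContextRelease(bitmapCtx);
    CGColorSpaceRelease(space);
    free(pixels);
}
```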
Adrian
46 posts
None
Rendering pipeline question
Is there an up to date platform layer for OS X?
Does anybody know when Casey is going to do the platform layer for OS X? He could do it anytime, because what the game takes in and puts out is already all defined, right? Or will there be future changes on the game-engine side that would also affect the platform code?
105 posts
Rendering pipeline question
OS X Port
It's not up to date, but it's the only one I'm aware of that doesn't use any libraries (e.g. SDL). He does use OpenGL for rendering though. I don't know of any others that use software rendering. I would put mine up but it's in a sorry state since I have no time to work on it. :)

Casey won't do any porting work until the game is done because there is no point in spending time on it when things in the platform layer can (and will) easily change as the game is developed. So porting is still a good year away.
Abner Coimbre
320 posts
Founder
Rendering pipeline question
Edited by Abner Coimbre on Reason: Wording
There's a Resources and Thank You page that has yet to go live for Handmade Hero. It should include the links to all community ports. Aside from the forums, our web chat will give you the resources you need as well.
Adrian
46 posts
None
Rendering pipeline question
Thanks, I will take a look at that.

Is rasterization and the like even implemented in the rendering engine? I can't find it, and stuff like that would only be needed if you were doing 3D, right? Because 3D objects are drawn as triangles that then get filled with the correct color to make them look lit, right?
Bill Strong
50 posts
Rendering pipeline question
Edited by Bill Strong on
The current game has some rasterization functions in it, such as the box-drawing code that draws an outline of our screens.

The definition of rasterize is "convert (an image stored as an outline) into pixels that can be displayed on a screen or printed." We essentially create the vector image at runtime, so HMH handles rasterization of these basic shapes, and we can add more at any time.

In 3D, we rasterize polygons into screen space to create the image to be displayed. This is traditionally done with triangles.

The whole engine is essentially a rasterizer that only draws quads, all of them rectangles in the x-y plane. And our bitmap assets are essentially textures.
Mārtiņš Možeiko
2559 posts / 2 projects
Rendering pipeline question
Edited by Mārtiņš Možeiko on
Here's a pretty cool tutorial/guide on how to create a software renderer in C++ completely from scratch, with features similar to OpenGL's: https://github.com/ssloy/tinyrenderer/wiki
It includes all the source code, if you don't want to read the lessons.
Adrian
46 posts
None
Rendering pipeline question
Wow, that's awesome! I will definitely go through it!
Adrian
46 posts
None
Rendering pipeline question
Flyingsand
Isn't Quartz using Metal under the hood?
511 posts
Rendering pipeline question
adge

Isn't Quartz using Metal under the hood?


It doesn't matter for software rendering. All it will be doing is pushing a bitmap that you give it to the screen.