Line rendering on the GPU

This is something I've been curious about for some time, so I figured I would put the question out there to the handmade community. Based on some research, I have implemented line rendering for arbitrarily thick lines in my own software vector renderer and just recently adapted it for OpenGL. The results are for the most part pretty decent I think, but there are still a few issues to work out.

Based on what I found scouring the web and reading posts/articles, the predominant method is to render thick lines using (as you would expect) triangles. So given a series of points and a thickness, you would calculate the miter join then tessellate the line and submit that to the GPU for rendering. This is the approach that I took, and is illustrated in the following image (it's a bit crude, but it should get the point across).

http://imgur.com/a/E8cgP
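
In code, the miter calculation is roughly something like this (a simplified sketch of the idea, not my exact code; all the names are made up):

```c
#include <math.h>

typedef struct { float x, y; } vec2;

static vec2  v2_add(vec2 a, vec2 b)    { return (vec2){ a.x + b.x, a.y + b.y }; }
static vec2  v2_sub(vec2 a, vec2 b)    { return (vec2){ a.x - b.x, a.y - b.y }; }
static vec2  v2_scale(vec2 a, float s) { return (vec2){ a.x * s, a.y * s }; }
static float v2_dot(vec2 a, vec2 b)    { return a.x*b.x + a.y*b.y; }
static vec2  v2_norm(vec2 a)           { float l = sqrtf(v2_dot(a, a)); return v2_scale(a, 1.0f / l); }
static vec2  v2_perp(vec2 a)           { return (vec2){ -a.y, a.x }; }

// p0, p1, p2 are three consecutive points on the line, half_w is half the
// thickness. Outputs the two offset vertices at the join point p1.
static void miter_join(vec2 p0, vec2 p1, vec2 p2, float half_w,
                       vec2 *out_a, vec2 *out_b)
{
    vec2 n0 = v2_perp(v2_norm(v2_sub(p1, p0)));   // normal of the first segment
    vec2 n1 = v2_perp(v2_norm(v2_sub(p2, p1)));   // normal of the second segment
    vec2 miter = v2_norm(v2_add(n0, n1));         // direction of the join

    // Scale the miter so the line stays half_w thick across the join.
    // (This blows up when the path nearly doubles back on itself; real code
    // needs a miter limit or a fallback join.)
    float len = half_w / v2_dot(miter, n0);

    *out_a = v2_add(p1, v2_scale(miter,  len));
    *out_b = v2_add(p1, v2_scale(miter, -len));
}
```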

Furthermore, drawing is done using indexed primitives, so the two line segments in the image would be drawn using 6 vertices and 12 indices with a call to glDrawElements(GL_TRIANGLES, ...). Here are some examples of lines drawn in OpenGL at various line thicknesses (the results in my software renderer are nearly identical).

http://imgur.com/a/vVkia
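
To be concrete, for the two-segment case the index data would look something like this (the vertex numbering is just for illustration: 0/1 are the start, 2/3 the shared miter join, 4/5 the end; the vertex buffer and `ibo` handle are assumed to be set up already):

```c
// 6 vertices, 12 indices, 4 triangles covering the two segments.
GLushort indices[12] = {
    0, 1, 2,   2, 1, 3,    // first segment
    2, 3, 4,   4, 3, 5,    // second segment
};

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
glDrawElements(GL_TRIANGLES, 12, GL_UNSIGNED_SHORT, 0);
```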

One of the main issues I've struggled with is that lines at around a 45-degree angle appear slightly thinner than they should (this is most apparent in some of the 8-pixel-wide lines). I guess this should be expected when you consider a rectangular grid of pixels. I suppose it could be compensated for by adding a small value to the thickness for line segments near this angle, but what would the range of angles be? Are there other (better?) ways of drawing lines?
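
If I were to try that compensation hack, I imagine it would be something like the snippet below (purely speculative and untested; p0/p1 are the segment endpoints, and k is a made-up tuning constant, which is exactly the part I don't know how to pick):

```c
// Widen segments whose direction is near a diagonal (needs <math.h>).
float angle         = atan2f(p1.y - p0.y, p1.x - p0.x);
float diagonalness  = fabsf(sinf(2.0f * angle));   // 1 at 45/135 degrees, 0 when axis-aligned
float adjusted_thickness = thickness + k * diagonalness;
```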

I haven't tried introducing anti-aliasing yet, so I don't know what effect that will have on fixing or covering up this issue, or whether it's sufficient to use OpenGL's multi-sampling versus handling AA specially for the lines. Some of the line segment joints and ends have very minor issues as well, such as a stray pixel or appearing almost "rounded". It seems quite fussy to render nice, smooth, thick lines, so I'm wondering if I'm on the right track here and approaching this in the right way. I also want to make sure the non-AA rendering is as good as possible before adding AA on top, if that makes sense.
You are on the right track.

For antialiasing you don't necessarily need to use multisampling. You could do simple alpha blending on the sides of the polygons. Sure, that will require more polygons (or a fancier shader), but it removes the dependency on setting up multi-sampling.
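
For the "more polygons" variant the idea is: for each outer edge of the line, emit an extra thin quad whose outer vertices have alpha 0 and let blending fade it out. Rough sketch only, the vertex format and names here are made up:

```c
// Needs glEnable(GL_BLEND) and glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA).
typedef struct { float x, y; float r, g, b, a; } LineVertex;

// inner0/inner1 are the existing outer-edge vertices of the line (alpha = 1),
// (nx, ny) is the outward edge normal, fringe is roughly one pixel wide.
static void emit_fringe_quad(LineVertex inner0, LineVertex inner1,
                             float nx, float ny, float fringe,
                             LineVertex out[4])
{
    out[0] = inner0;
    out[1] = inner1;
    out[2] = inner0;  out[2].x += nx * fringe;  out[2].y += ny * fringe;  out[2].a = 0.0f;
    out[3] = inner1;  out[3].x += nx * fringe;  out[3].y += ny * fringe;  out[3].a = 0.0f;
    // Index these as two triangles (0,1,2) and (2,1,3) in the same index list
    // as the rest of the line geometry.
}
```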

For handling sharp corners you could consider using something like this: https://forum.libcinder.org/viewI...53&forumGroupId=23286000000003001

That's a good point about not having to rely on multi-sampling for AA.

And yes, doing a rounded (bevel?) join is on my TODO list. For that my plan is to do a triangle fan.

Interesting that glLineWidth is deprecated (and, as far as I can tell, only 1-pixel-wide lines are guaranteed to be supported). I would have thought line rendering was common enough to warrant wider support for this, but perhaps rendering these kinds of lines is not as common as I thought.
Don't do a triangle fan. Use the same indexed triangle list, so you can draw all the segments in the same draw call. Reducing draw calls is more important than having a slightly smaller index buffer.
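
For the round join that just means emitting the arc vertices into the same vertex buffer and appending fan-style triangles to the same index list, something like this (untested sketch, my own names):

```c
// center    = index of the join's center vertex,
// first_arc = index of the first arc vertex,
// arc_count = number of arc vertices already written to the vertex buffer.
static void append_join_triangles(GLushort *indices, int *index_count,
                                  GLushort center, GLushort first_arc, int arc_count)
{
    for (int i = 0; i < arc_count - 1; i++) {
        indices[(*index_count)++] = center;
        indices[(*index_count)++] = (GLushort)(first_arc + i);
        indices[(*index_count)++] = (GLushort)(first_arc + i + 1);
    }
}
```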

I guess drawing wide lines efficiently would require more specialized hardware, which is more expensive to put in the GPU. So they simply draw wide lines by expanding them in the driver, not in hardware. And in that case there's no point in keeping this functionality in the driver; just let users do it themselves, because it can most likely be done more efficiently when the user knows exactly what they are drawing and why (instead of the driver trying to guess that).
mmozeiko
Don't do a triangle fan. Use the same indexed triangle list, so you can draw all the segments in the same draw call. Reducing draw calls is more important than having a slightly smaller index buffer.

Ah yes, of course. Thank you!