Worth Learning Build Tools?

I'm pretty new to the stream but I was interested in the fact that Casey uses such a lean build script.

I agree with his argument that optimizing the build of my project has such marginal benefits, and consumes so much time in maintenance and updating, that it's not worth the effort.

As a non-C/C++ developer I was always kind of taken aback by the degree of sophistication of Makefiles in the projects I would look at. That being said, I've been looking at a lot of open-source clones of games, and almost all of them tend to use some sort of build tool, typically CMake.

To anyone who works in C/C++ professionally, what are your thoughts on the value of learning something like CMake? Is it necessary if you want to contribute to open source projects? Is CMake currently the most common build tool or is there something else that I should be looking at?

Thanks in advance.
You should probably know CMake. As much as I hate it, you sometimes need it just to build dependencies.
As someone working with CMake professionally for the last 5 years I can tell you that my hatred for CMake knows no bounds. It is a very, very poorly designed language with horrible syntax and an API which excels only at displaying how little thought went into it. It is definitely the worst language I have ever had to deal with. For the sake of your mental health I would recommend you stay away from it as much as you can, especially in your free time.

Oh, and there is no good build system for C/C++.

Alright. So much for the rant.

I think you should ask yourself if it is a good idea to even consider questions like this. Like "Is Java better than C?" Or "What is the best sorting algorithm?" Or "Which editor should I use?" Does the answer really matter? The real question is: What are you interested in? What do you want to do? Find that out, then ask how you can reach that goal.
And then you realize that CMake is just a front end for another, even crappier build system.

At one point I looked into the source code to see if I could make a build.bat-style backend for it, because for building a dependency you just need to get a single clean build going once (which is all that build.bat will do for you).

Then I found the source code and looked for my 10-foot pole...
Build systems are terrible.

CMake is terrible because not only do you, the developer, need the tools to write and maintain a build with it, so does your user. When I'm the user, I do *not* want to have to install *your* crap on my box just to build your code. Period.

Autotools is terrible because it's so hard to find any documentation to help you make anything. What you end up doing is copying and pasting from other people's scripts just to get your thing to work at all, and you end up with tonnes of crap in your build for stuff you don't even need but don't know how to get rid of without breaking your build at the same time.

The one plus side of Autotools is that your users do not need to have it installed to run the build. If you are really dedicated, you should read Autotools: A Practitioner's Guide to GNU Autoconf, Automake, and Libtool by John Calcote along with 21st Century C by Ben Klemens, start an empty "Hello, World" project, and take it all the way to a project buildable with Autotools. Expect around a week to learn this.

Casey's system avoids *both* shortcomings. It isn't too complex, documentation on the tools is easy to find, and neither your users nor you, the developer, need to install anything extra to build.
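For reference, the whole thing is just a batch file that invokes the compiler directly; a minimal sketch in that spirit (file names, flags, and libraries here are placeholders, not a copy of Casey's actual script) looks like:

```bat
@echo off
REM Create the output directory and compile everything in one cl invocation.
if not exist build mkdir build
pushd build
cl -nologo -Zi -FC ..\code\win32_main.cpp user32.lib gdi32.lib
popd
```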

Clear winner: Casey's build
Clear loser: CMake, just no. Don't. No no no no no
Runner up: Autotools, if you *have* to.
Build.bat isn't the greatest either because few people actually still do daily programming in bat files. Those that do also tend to copy and paste.

Frankly, my next step once the project becomes too complex for build.bat would be to create a compile.cpp that does a few system("cl ..."); calls (or CreateProcess).

I think that would still win over Autotools because you know exactly what's being run, you can dispatch multiple compiles at a time (for compilers that don't parallelize on their own), and you can redirect any error/info messages into a file.

Also that's a win over general build systems because there is no need to parse a build file and create the dependency graph before you can start to check file timestamps. Instead you can hardcode those dependencies.
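A minimal sketch of what such a compile.cpp could look like (file names and flags are hypothetical; CreateProcess could replace system() if you wanted to dispatch several compiles in parallel):

```cpp
// compile.cpp - build driver: compile this once, then run it to build the project.
#include <cstdlib>   // std::system
#include <cstdio>    // std::printf

int main()
{
    // Exactly the command line you would otherwise type by hand (MSVC cl),
    // with all compiler output redirected into a file for easy inspection.
    int err = std::system("cl -nologo -Zi -Fe:game.exe main.cpp renderer.cpp > build_log.txt 2>&1");
    if (err != 0)
    {
        std::printf("build failed, see build_log.txt\n");
        return 1;
    }
    std::printf("build succeeded\n");
    return 0;
}
```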
I work with complex C/C++ projects in my day job, and develop an open-source C/C++ application in my free time. My two cents:

Learn CMake. You do not get to choose how other people's code builds. CMake is used widely in the industry. No way around it.

On your own projects, whatever tools you choose, builds should be simple to understand, set up, and execute. If your CMake/make/Visual Studio/<Insert fancy tool here> build is complex, that is a code smell, and usually an indicator that you can improve your linkage units.

The "degree of sophistication of Makefiles" you talked about is usually quite the opposite of sophistication.
From what I understand, the only reason to use a build system is to reduce iteration time (compile times and the like), so that only the changed files have to be rebuilt.

Thanks to the awful C/C++ compilation model (every translation unit re-parses the same headers), Casey's single-translation-unit build ends up being multiple times faster for complete builds.
Getting away without using a build system in the first place seems like a huge win by itself.
But is it worth losing the ability to do incremental builds?

I've noticed that as a project becomes larger, incremental build times become annoyingly long anyway (mostly due to increasing link times - as there are more parts to pull together?), so it's hard to say if unity builds lose out or win in the long term.
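For context, a unity build in that sense is just one translation unit that #includes all the others, so a full rebuild is a single compiler invocation (file names hypothetical):

```cpp
// unity_build.cpp - the only file handed to the compiler.
// Every other .cpp is pulled into this single translation unit,
// so shared headers are parsed once instead of once per source file.
#include "platform.cpp"
#include "renderer.cpp"
#include "game.cpp"
```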
pragmatic_hero wrote:
"The only reason to use a build system is to reduce iteration time (compile times and the like)."

I wouldn't say that is the only reason, nor even the first one, although it is a useful feature. Every project needs to be built, and however you decide to manage that could be considered a build system, even if you type the commands in by hand (do check out 21st Century C for some tips on making ad-hoc manual compilation from the terminal a breeze).

The first thing a build system should provide is the documentation and execution of your build plan. Even if the one you end up with is horrible, it should at least do this much. The worst thing you could possibly do is forget how to build your code. Arguably the next worst is to have a build system that you can execute but don't know how or why it works. One of Casey's early arguments against implementation-independent architectural blueprinting is that it is simply a non-functional duplicate of the code itself. Similarly, a build system's artifacts in your project should serve as both documentation and executable.

In that respect, Autotools is terrible, maybe even the worst. Unless you've hand-rolled your Autotools build as I described in my previous post and know exactly how you did everything, it's a pretty steep learning curve. I've gone back to Autotools builds I authored myself and had to take a couple of hours to read enough of them to remember how they work.

That pretty much rules out anything other than the simplest possible script that's directly executable in the shell of your environment. To add the feature of reduced iteration, the next place I would go is a simple makefile. It does mean that both you and your downstream users have to have a compatible make executable installed, but that is a pretty trivial price to pay next to the opacity of Autotools or having to install the entirety of CMake.
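A minimal sketch of that kind of makefile, with hypothetical file names, might be:

```make
# Rebuild only what changed: each rule lists the files it depends on.
CC     = gcc
CFLAGS = -g -Wall

game: main.o renderer.o
	$(CC) $(CFLAGS) -o game main.o renderer.o

# Touching game.h rebuilds both objects; touching main.c rebuilds only main.o.
main.o: main.c game.h
	$(CC) $(CFLAGS) -c main.c

renderer.o: renderer.c game.h
	$(CC) $(CFLAGS) -c renderer.c

clean:
	rm -f game *.o
```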

Probably one of the largest implementations of a make-driven build is Kbuild from the Linux kernel project. Bear in mind, that system encompasses the complexity of thousands of developers all submitting code that is ultimately merged into a single project tree.
The most important thing is to do the simplest thing first. If/when the simplest thing starts to become troublesome, reevaluate.

My progression typically goes like this:

* one-off script: just run gcc/clang manually
* have a few different configs: shell script or simple makefile
* have build rules that aren't just compiling: makefile
* need to find and link against system libraries, or support other platforms: CMake
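For that last step, a minimal CMakeLists.txt usually stays small too (project, file, and library names here are only an example):

```cmake
cmake_minimum_required(VERSION 3.10)
project(mygame CXX)

add_executable(mygame main.cpp renderer.cpp)

# Locate a system library and link against it (zlib is just an example).
find_package(ZLIB REQUIRED)
target_link_libraries(mygame PRIVATE ZLIB::ZLIB)
```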

But that's for small stuff. As soon as a project becomes large enough, things change quite a bit.
For example, Chromium has its own meta-build system (GN). Similarly, a former Chromium engineer wrote Ninja because incremental builds in Make were too slow.

These are pretty cool, and have a lot of nice features. But they're typically not worth using unless your project is huge.