When build tools matter

I would have just left this alone if I hadn't actually experienced the problem trying to build someone else's code, and weren't myself building something using traditional, complex build tools, so with that in mind...

Ref to my github build tools scripts. It creates a super simple GNU Autotools project in a directory of your choosing. I'll be adding more cool things as I think of them.

One of the build system features that I forgot to mention in the other thread is reliably building the same code on a variety of different platforms with one command. The stability and complexity of your build tools are also proportional to the diversity of platforms on which you expect your code to build. The learning curve of something like GNU Autotools is soon rewarded by the ease with which you can add compiler flags, search paths, and feature checks for 3rd party libraries, and expect the code to build on as many systems as possible, out of the box.
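
To give a sense of scale, here is a minimal configure.ac sketch; the project name and the SDL2 check are placeholders I made up, not anything from a real project, and it pairs with a two-line Makefile.am:

dnl configure.ac
AC_INIT([myapp], [0.1])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
dnl Feature-check a 3rd party library instead of assuming it exists;
dnl SDL2 is only a placeholder example.
AC_CHECK_LIB([SDL2], [SDL_Init], [],
    [AC_MSG_ERROR([could not find SDL2])])
AC_CHECK_HEADERS([SDL2/SDL.h])
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am
bin_PROGRAMS = myapp
myapp_SOURCES = main.c

Run autoreconf -i once as the developer, and downstream never needs Autotools installed at all, only a shell and make.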

For the non-educational version of a project like Handmade Hero, where the expectation is to ship a game in binary form to players, the developer doesn't expect to distribute code that will ever be built by users. This is extremely different from code that is distributed with the express intent that it must be built by the users, or by package maintainers for the users. Even on very predictable platforms there will be variations, as seen early on in the HMH series with versions of XInput, and linker flag differences for WinXP.

The specific problem that spurred me to write this was a handwritten build that determined the word size of the build platform from semantics, without directly testing the hardware. The system distribution I use is sufficiently different from the developer's that the check would always fail, and probably on any other distribution, too.
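
(A sketch of the general pattern, not the actual script; the path convention here is made up for illustration:

# fragile: infers word size from a distro-specific filesystem convention
if test -d /usr/lib64; then BITS=64; else BITS=32; fi

Any distribution that doesn't happen to share that convention gets the wrong answer.)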

The easiest thing to do, of course, is ask the kernel what its architecture is. But that, too, is semantic. As it turns out, that was the solution I suggested and the one ultimately taken by upstream, but a better way to do it is to test the architecture for features. Eg.,

$ # Ask the compiler what it targets instead of trusting platform names:
$ gcc -xc - -include stdio.h << '...'
int main(){
#if __x86_64__
printf("64\n");
#else
printf("32\n");
#endif
}
...
$ # Comment lines don't clobber $?, which still holds gcc's exit status:
$ if test $? -eq 0; then ./a.out && rm a.out; else echo failed; fi


I'm sure there are even better ways than that. Feel free to reply with your method in the comments :D

Granted, any of these problems can be solved without big, hairy build tools. If you are just planning to distribute binaries to users, then you probably don't have much to worry about apart from ensuring those binaries will work. But if your intent is to bring the Handmade Philosophy into the Free Software world, and I know a lot of projects here speak about exactly that, then this observation is of much greater value to you than to anyone following HMH just to learn how to write good low-level code. Write good, portable builds, too.

Edited by k2t0f12d on Reason: Removed quotes from URL tags + clarify "this" for "that"
Both of your links are pointing to this topic.

Anyways - I strongly dislike autotools. CMake is at least tolerable, but autotools is plain ugly. And it doesn't work on Windows. Nowadays if I really need to support multiple platforms, I just create simple makefiles (like compile *.c files or something like that).
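
Something like this (a sketch using GNU make; names and flags invented):

# build every .c file in the directory into one binary
CFLAGS = -O2 -Wall
SRC = $(wildcard *.c)

game: $(SRC)
	$(CC) $(CFLAGS) -o $@ $(SRC)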
mmozeiko
Both of your links are pointing to this topic.


[EDIT: fixed by removing the quotes in the URL tags] There must be a bug in the site code.

mmozeiko
Anyways - I strongly dislike autotools. CMake is at least tolerable, but autotools is plain ugly. And it doesn't work on Windows.


That is not correct. Pretty much anything you need can run on Windows. It just might not be very convenient; I'm not sure, since I haven't tried this before.

http://gnuwin32.sourceforge.net/packages/m4.htm
http://gnuwin32.sourceforge.net/packages/autoconf.htm
http://gnuwin32.sourceforge.net/packages/automake.htm
http://gnuwin32.sourceforge.net/packages/libtool.htm

If someone has set up a GNU Autotools build in Windows using PE binaries, please share it! :D

mmozeiko
Nowadays if I really need to support multiple platforms, I just create simple makefiles (like compile *.c files or something like that).


Will someone other than you ever be expected to build your code? If not, I already mentioned in my original post that this advice is probably not for you. If yes, multiple makefiles can work, because ultimately big, hairy build tools like GNU Autotools just output makefiles. The point isn't whether or not the makefiles are simple for you; the point is whether or not they just work out of the box for people building your code downstream.
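
For them, the out-of-the-box experience is the classic three commands, with no Autotools installed at all:

$ ./configure   # probes the system and writes the makefiles
$ make
$ make install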

I included my expertise on Autotools as an example and a courtesy.


Edited by k2t0f12d on
Sure, you can get autotools to run on Windows, but that will require installing a bunch of other utilities. Then it will run. I'm not sure, but will it actually automatically generate a reasonable makefile that will be able to use VS?
mmozeiko
Sure you can get autotools to run on Windows, but that will require to install bunch of other utilities.


The developer always has to install the entire build tool suite. The point of Autotools is that downstream doesn't. The complete Autotools suite is a few megabytes. Visual Studio is a few gigabytes.

Since cl.exe is specifically mentioned in the Autoconf documentation, there is an expectation somewhere that it can work.
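
If anyone wants to try it, I'd guess the invocation looks something like this (untested on my part, just my reading of the docs):

$ ./configure CC=cl CFLAGS=-O2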

Edited by k2t0f12d on
Just to clarify, I don't use a build.bat on Handmade Hero because I need other people to be able to build it. I do that because that is how I build all my projects now. I find all build systems I've ever used (including Autotools, CMake, Ninja, etc., etc.) to be just more maintenance headaches that slow me down and make it less likely that my build will continue to work as the years go by, so it's just a cost-benefit analysis that leads me to eschew them.

- Casey

Edited by Jeroen van Rijn on Reason: Deleted inadvertent duplicate post
Not to mention that when your from-scratch build times are under 10 seconds, even running every command sequentially, you don't need a complex dependency-based build system to cache build artifacts.

Just a list of commands to run to get the final binary out the end.
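
For something HMH-shaped, that list is basically one line, e.g. (flags approximate, from memory):

cl -Zi win32_handmade.cpp user32.lib gdi32.lib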

Edited by ratchetfreak on
http://www.tedunangst.com/flak/post/its-hard-work-printing-nothing
The talloc configure script runs for a while, checking for all sorts of various features.

Checking for variable IPV6_V6ONLY            : ok 

IPv6 is the future of the internet, so obviously we want to make sure that our malloc replacement is ready to go in IPv6 only environments. Time to throw away that legacy IPv4 malloc!

This carries on for a few minutes. libtalloc itself takes less than five seconds to compile, but the configure script will spend 100x that long probing for functions to get and set filesystem extended attributes. Code reuse is the key to efficiency.
cmuratori
I find all build systems I've ever used (including Autotools, CMake, Ninja, etc., etc.) to be just more maintenance headaches that slow me down and make it less likely that my build will continue to work as the years go by, so it's just a cost-benefit analysis that leads me to eschew them.


Yeah, that's a caveat. When you are distributing source with build tools designed to take as much of the pressure as you can off of your downstream, the build itself is a project in its own right. It needs maintenance and development just like the project that it builds.

Just starting out with some code I'm really excited about, I don't even use a script. I have aliases set in my shell that I can just throw .c files at until I think it's code worth keeping around with some kind of a formal build, whether that's a small script or a full blown Rube Goldberg.
ratchetfreak
Just a list of commands to run to get the final binary out the end.


I personally don't care about build artifacts. The only time I've ever cared about them is when I build a kernel for my workstation and want to keep as many of the original object files around as possible, in case I want to build in a feature I didn't realize I needed and don't want to build the whole thing all over again.

Some source is distributed as just a bunch of stuff, with the understanding that it is what it is, the value is in there somewhere, and you can have it if you can figure out how to get it into your own code, but without any clues about how to build it - buh bla bla.

But if you are putting source code out there expecting other people to build it, use it, make packages and maintain it for their platform, you want to have a good, robust build. It is literally as important as the program itself. Trying to speed up a build by keeping build artifacts has nothing to do with anything at that point.

While your simple list of commands works for you, if I get your code, try to build it, and find there's some sort of undocumented assumption about the build platform that breaks on my system, it's a problem. Simple build scripts are Not Robust™. The thing that made me start this thread was the worst possible "simple" list of commands, because it did have assumptions and kept trying to do stuff with those assumptions. That's wrong, wrong, all wrong, and potentially screws with the systems of your downstream constituency. Don't do that.
Yeah, once your code is more than a bunch of source and header files, there should be a plain language description of the build process.

Or at least something clear in the build setup itself.
Well, I wanted to mention how JAI is going to have the build in the code itself. I like what that's all about; I'm only unsure how robustly that is going to work across platforms. And it doesn't solve anything for code in languages the rest of us already have.

In particular, I've looked at Autotools on Windows using various POSIX crutches, and none of them look awesome to me. I was getting all excited over MSYS2 and MinGW-w64, but those appear to be unmaintained. MKS promotes itself as the best cross-platform UNIX enabler, blah bluh blah bluh bluh.

The main problem with Autotools is that it is made out of scripts in four languages: M4, shell, Perl, and make, counted by hand from reading the source so far. I've been reading the source with the thought of porting the functionality to something that can be truly platform independent, and right off the bat everything I've read is just boilerplate that sets the shell up so it can even run the code I haven't gotten to yet, the code that ultimately generates the real build info.

I've been thinking about starting a project with Handmade Network, besides working on games, and I think maybe a Handmade Build System is the way to go. Something that:
* Compiles to a tool that just works everywhere
* Understands your platform's toolchain
* Sensible, overrideable defaults
* Reads scripts that are just as simple as "build.bat"
* Super fast performance
* Sets up new basic projects instantly
* Doesn't need to be installed to run the build, only to generate the build

Edited by k2t0f12d on Reason: clarity
k2t0f12d

I've been thinking about starting a project with Handmade Network, besides working on games, and I think maybe a Handmade Build System is the way to go. Something that:
* Compiles to a tool that just works everywhere
* Understands your platform's toolchain
* Sensible, overrideable defaults
* Reads scripts that are just as simple as "build.bat"
* Super fast performance
* Sets up new basic projects instantly
* Doesn't need to be installed to run the build, only to generate the build


Something like this sounds great. Keeping it very light and not requiring outside libraries would also be important, imo.
Keeping the tools light isn't as much of a priority as making them robust and cross-platform. Making the developer's interactions with them light and easy is more important than that, and making the build effortless for the people who build packages and maintain them for their platforms is paramount. Since building the code shall not require the presence of the tools, they can be whatever they need to be; only the developer need have them installed.

I do not like to have extra library dependencies. Even when I really like what some piece of software does, I still feel like the developer has punched me in the guts and stomped on my feet when I see something like Boost or CMake pop up in the dependency tree. I'm not likely to do that to anyone myself.