Speaking of "Build" -> Performance / Details?

In the first episode (AFAIR), Casey talks about people who love their build systems. In fact, they love them so much that they even use build systems to automate their build systems (not to mention the tools that optimize even those).

Since Casey does everything by hand, he consequently introduces something of his own to build Handmade Hero: basically a single call to the compiler, wrapped in a batch file merely so that he can at least make sure the proper folders exist.

He explains that he uses this kind of "build system" even on very (very) large projects and that it is way faster than anything other people could possibly introduce for automation purposes.

He also states that he does not actually have a problem with partial builds (which any "real" build system can manage while his batch script cannot), because he simply does not do any partial builds. Instead, he builds everything. Every time. That alone prevents any problems from arising due to misconfigured partial builds.

I don't know what you people think when you listen to that story. Because I... well... I was introduced to MSBuild (which, by the way, was open-sourced on GitHub yesterday) many years ago and have been using it implicitly, through Visual Studio and its project templates, in virtually every project. I also use Team Build as a service on build server systems to automatically build the sources I check into TVCS (integration builds). This way I get immediate feedback about what works and what does not, which is especially valuable when working in a team with more than one developer.

I also configure my builds to, let's say, create installers and CD images, perform deployment tasks, etc. And yes, I am perfectly aware of the fact that this process will most probably take longer than it would if I used batch scripts and configured everything by hand.

So I am very interested in hearing what you people think about this. On the one hand, we learn to understand the very basics of what we do... on the other hand, Casey does not spare his criticism of "modern" design and development practices.

It seems as if the way he works would fit perfectly if you are a solo developer working on a machine that does not have any spare resources, so that you simply HAVE to optimize for speed and performance (I am not talking about graphics routines, but about processes such as building the software once in a while).

Also, I quite like the way modern version control systems work... HOWEVER... (and now it becomes ironic...) I am starting to OVERTHINK what I have learned to believe.

Is it really true?

Does version controlling my source really solve any problems?
Does build management really help creating better quality software?
Does test automation really produce better code?
Is it really air that I breathe...

This guy is so damn serious about what he says and, without question, has so much experience, that I just cannot put aside his thoughts about how and why he does things the way he does.

And I would really like to know what other people think about this.

Sorry about this long post.
Thanks for reading.
Cheers,
Mephisztoe.
From what I can tell, unity builds are a fairly common practice in game programming circles (they are used by Ubisoft, for example, according to the talk CppCon 2014: Nicolas Fleury, "C++ in Huge AAA Games", although he did mention they use FastBuild, which I'm not familiar with). They are also used outside of games: SQLite, for example, recommends using a unity build (they call it "the amalgamation"), and they see a 5% performance gain when that is done because the compiler can see all the code at once.

I think the unity build approach basically means "modern" C++ (i.e. lots of templates, Boost) can't be used: those features make a full build impractical because of how badly they kill compilation time.

I don't have enough real-world experience with Casey's approach to know whether my concerns about how it would scale are legitimate or not. There is certainly less structure to the code than I am used to: it is not clear when things go into headers vs. source files and, because it is a unity build, the compiler doesn't care. For an individual developer that works fine, but I don't know how it would scale to the kind of teams I have worked with. You lose the ability to truly encapsulate things; everything is basically global. Again, that is fine if you are disciplined, but it opens the door to lots of nasty hacks (to be fair, the door is already "open", but you need to be willing to use a lock pick :) ).

I'm a big fan of source control, although I do sympathize with the complaints about the complexity of the existing systems (I think git became the new "darling" more because of who wrote it than because of its technical merits, but that is just my opinion). I think frequent check-ins with useful commit messages become a practical form of "literate programming".

Software testing has been discussed here before. The cost/benefit really depends on the nature of the code you are writing. I've done extensive unit testing in the past and currently do more of a "design by contract lite" style approach (i.e. lots of various kinds of asserts), and I find that fits the kind of work I'm currently doing. Casey has mentioned that he uses asserts more in the code he normally writes - I wish he would do that on the stream to demonstrate the benefits. Far too few people seem to be aware of just how useful DBC can be.
The fundamental point that Casey is making is really about using what you need and only what you need. If there is not some measurable value with a good return on investment (ROI), then ask yourself, "should I really be doing it?"

As for your "questions":

> Does version controlling my source really solve any problems?

I use VCS for three primary reasons: backup, shared access to code, and history to see why someone changed something (e.g. from a closed bugfix). I use GitHub because it provides all of those at very little cost to me (e.g. I do not need to maintain a backup server).

ROI is a win for me here.

> Does build management really help creating better quality software?

Of course not. Build management is only about producing builds for your target platform(s). Sometimes this is beneficial, but it really depends on what your actual needs are.

I have no need for build management for my personal projects, and my former team at work is aggressively trying to simplify their build management.

ROI on this could be pretty bad.

> Does test automation really produce better code?

Of course not. Tests can really only validate two broad categories: regressions (functionality and performance) and crashes. A test case is not good just because it exists; it needs to have a purpose with an ROI that makes it worthwhile to author, run, and, most importantly, maintain.

The other broad class of automated tests is around mutating input in random ways to help find security flaws in your software.

Remember, a test doesn't prove that your software is correct; it simply cannot do that. However, good tests can build confidence that your software behaves correctly. The distinction is critical.

ROI is variable here. Write the tests with high ROI and remove tests that aren't giving you good ROI.
I have used many different build systems on different platforms, and MSBuild is by far one of the most terrible ones. And I'm not the only one who thinks that; just read what devs are currently saying about MSBuild on Twitter or Hacker News: https://news.ycombinator.com/item?id=9228323

I will take plain Makefiles, with their sometimes slightly obscure syntax, over MSBuild any time of day. But for the past few years I have also been using builds similar to Casey's: just a single batch file or shell script that does everything it needs, be it calling the compiler, copying files, or making installers. If I need more complex operations or more cross-platform-ish build steps, I turn to a Python script.

> I think the unity build approach basically means "modern" C++ (i.e. lots of templates, boost) can't be used - they make a full build impractical because of how much they kill compilation time.
A unity build actually helps modern C++ with template bloat a lot. If you have big template headers like the STL or Boost, then instead of parsing them N times, once per translation unit, the compiler parses them only once in a unity build. So build speed increases tremendously.
Also, I think it's worth noting that in the future, we will probably finally get an integrated compiler/build-system thing that actually is a win and you would always use it. That's definitely the direction my tools layer is going. So the reason not to use "build systems" today is because they are awful, not because there is something inherently bad about build systems, I suspect.

- Casey
I appreciate this kind of post. Since the community has already answered your concerns with examples and specifics, I'll paint my answer with broader strokes. And I'll only answer the first and last question, since I believe the second and third were explained by @owensd better than I could have.

Does version controlling my source really solve any problems?
Does build management really help creating better quality software?
Does test automation really produce better code?
Is it really air that I breathe...
Given that you actually asked these questions in the first place, you are ahead of the game (I'll explain why in just a second) and already capable of answering them. But since you asked us, I'll give my two cents too:
Does version controlling my source really solve any problems?
It might do you well to frame "Does X solve Y" as a "What is X and what can it do" question. You'll invariably answer the former by considering the latter.

So, what is version control? A system capable of handling and tracking the changes of a collection of data, whether a document, source files, a website, or any information mutating over time. It can be as simple as tracking one revision to the next, or as complex as managing parallel features in your project (e.g. through branches) and taking them down different paths until they merge into a single one yet again. The system can be built for a single person or to support multiple teams. So on and so forth.

Once you understand what a Version Control System (VCS) can offer, you can use it to your advantage and restrict it to the uses that you actually need. Users like @rathersleepy, @owensd, and @mmozeiki know this, and made reasonable value judgments (not in its disparaging sense) accordingly.
Is it really air that I breathe...
Heh, this is why you're ahead of the game. In my experience working with other software engineers, they tend not to question things, rarely try things out themselves or even consider ROI, and immediately default to a set of prescribed rules and principles. This horse has been beaten to death on this forum, but it still rings true. The "global variables are evil" mentality is what keeps me up at night, reminding myself that I have to take a stand and complain when people spend hours thinking about the code much more than they think about the problem.

"Is it really air that I breathe..." questions are both funny and, dare I say, necessary.

Edited by Abner Coimbre on
> Unity build helps modern C++ with template bloat a lot. If you have big template files like STL or Boost headers, then instead of parsing them N times for each translation unit compiler parses them only once in unity build. So build speed increases tremendously.

That is absolutely true, but I still believe my original statement. Having to do full builds of code that makes heavy use of Boost, for example, would still be quite painful, I suspect (but I have no firm numbers to back that up, so YMMV).
I forget where I read it, but it suggested that the 4 levels of expertise are

  1. I don't know that I don't know
  2. I know that I don't know
  3. I don't know that I know
  4. I know that I know

I believe @abnercoimbre is suggesting that @Mephisztoe has reached level 3 :).
Does version controlling my source really solve any problems?

Not really; it just lets you store your old code in case something goes wrong or a feature didn't meet its requirements. Sometimes version control can be more of a pain, especially if you use SVN. It's that horrible, and Linus wasn't kidding. I have experienced the horror of SVN.

Does build management really help creating better quality software?
Most of them are a pain to configure and build, and you end up trying to figure out how they work, which just wastes your time.

Speaking of time, there are a lot of people bragging about how tools save time, but is that true? It really depends.

I thought building a linked list was hard and painful at first, but after doing it over and over for years, it has gotten easier, to the point where I don't really need any tools to build one because I can do it fast. My point is that you'll get faster and faster every time you do something over and over, and eventually you'll get fast enough that using tools is just a waste of time. It's called learning. :)

Edited by popcorn on
rathersleepy
I forget where I read it, but it suggested that the 4 levels of expertise are

  1. I don't know that I don't know
  2. I know that I don't know
  3. I don't know that I know
  4. I know that I know

I believe @abnercoimbre is suggesting that @Mephisztoe has reached level 3 :).


Ha, spot on. I wonder who the original author was.