Blkerf
4 posts
Unity Builds vs Compiling Only What Changed
Edited by Blkerf. Reason: Initial post
Hi Guys!

So I get what a Unity Build is, and I understand that for a full rebuild it would certainly be a lot faster than a standard Makefile or build system, but for a partial build, where only a few files changed, wouldn't a Makefile be faster? I feel like the savings from not having to compile, say, 60% of the files every time would outweigh the speed benefits of a Unity Build, no?
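
(For reference, a unity build is usually just one translation unit that #includes every other .c file - a minimal sketch with made-up file names:)

    // unity_build.c - the only file ever handed to the compiler
    // build: cl /O2 unity_build.c      (or: cc -O2 unity_build.c -o game)

    #include "platform.c"
    #include "renderer.c"
    #include "audio.c"
    #include "game.c"

(A makefile-style build instead compiles those files into separate object files and links them, recompiling only the ones whose timestamps changed.)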
Mārtiņš Možeiko
2559 posts / 2 projects
The problem with splitting files into many TUs is that the compiler needs to do a lot of redundant work - parsing windows.h, stdlib.h, etc. This time adds up quickly. So even if you only need to recompile two or three files, it could (depending on your project structure) take as long as or even longer than compiling the whole unity build.
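
(To make that concrete with made-up file names: every translation unit tends to start with the same heavy includes, and each compiler invocation parses them again from scratch.)

    // renderer.c, audio.c, game.c ... each one starts with something like:
    #include <windows.h>   // enormous after preprocessing
    #include <stdlib.h>

    // Separate TUs - the headers are parsed once per file:
    //   cl /c renderer.c
    //   cl /c audio.c
    //   cl /c game.c
    // Unity build - they are parsed exactly once:
    //   cl /c unity_build.c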
70 posts
Professional programmer, working in the video games industry since 2012
Hello Blkerf,

In my experience, on several big projects I've worked on, the link times always outweighed the partial build times, so you could lose several minutes per iteration even if you modified a single line in a single file. I guess in every professional setting I've worked in, build times were just not a big concern for the teams.

Linking is where you win the most time in a Unity build, because there is no need to do any.

I guess if your build times are getting too long using a Unity build, you could mitigate this by making a Unity build per subsystem and rebuilding only the subsystem you modified. The link times will still be shorter than with a "normal" build because you will have a reduced number of translation units.
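
(Roughly like this, with hypothetical subsystem names:)

    // renderer_unity.c - one TU for the whole renderer subsystem
    #include "renderer_core.c"
    #include "renderer_shaders.c"

    // audio_unity.c - one TU for the audio subsystem
    #include "audio_mixer.c"
    #include "audio_output.c"

    // Touching a renderer file recompiles only renderer_unity.c, and the
    // final link still only sees a handful of object files.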

Anyway, the only good answer to this question is: test, profile, measure, and see for yourself which is fastest.
ratchetfreak
511 posts
The problem is the link step.

Once you have a bunch of obj files, the linker tends to become really slow.

Discovering which items to build isn't free. It requires getting the last-modified timestamps on every file that contributes to the build.

It is not uncommon for builds like that to get bugged, where an object file wasn't rebuilt even though it should have been and you need to do a clean rebuild. This most often happens when a header is updated but the object file wasn't marked as depending on that header.
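
(A made-up example of that failure mode - the object file depends on a header the build system doesn't know about:)

    /* player.h */
    typedef struct Player { float x, y; } Player;   /* later you add: float z; */

    /* player.c */
    #include "player.h"
    void MovePlayer(Player *p) { p->x += 1.0f; }

    /* If the build rule only lists player.c as a prerequisite of player.o,
       adding the z field changes player.h but not player.c, so player.o is
       considered up to date; other TUs now disagree with it about
       sizeof(Player), and only a clean rebuild fixes it. */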
Miles
131 posts / 4 projects
mmozeiko
The problem with splitting files into many TUs is that the compiler needs to do a lot of redundant work - parsing windows.h, stdlib.h, etc. This time adds up quickly. So even if you only need to recompile two or three files, it could (depending on your project structure) take as long as or even longer than compiling the whole unity build.


In a single-threaded build, this is true. In a multithreaded build with as many cores as translation units, the redundant work isn't a problem because it's being done in parallel, so the total build time will generally be the same as the slowest single translation unit.
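
(As a rough illustration with made-up numbers: ten TUs that take about three seconds each - most of it spent on the same headers - need about thirty seconds serially, but only about three seconds plus the link when all ten compile in parallel on ten cores.)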

It would be best if the compiler itself was multithreaded, of course, but with C++ you have to take what you can get.

ratchetfreak
Discovering which items to build isn't free. It requires getting the last-modified timestamps on every file that contributes to the build.


True, but in practice this is always going to be faster than actually reading and parsing the file, which you have to do during a full rebuild.

ratchetfreak
It is not uncommon for builds like that to get bugged, where an object file wasn't rebuilt even though it should have been and you need to do a clean rebuild. This most often happens when a header is updated but the object file wasn't marked as depending on that header.


For whatever it's worth, this has never happened to me. -MMD is very reliable with clang and gcc in my experience. I'd guess that this problem is more common in some build environments than in others.
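
(For reference, a minimal sketch of how that works, with hypothetical file names:)

    // main.c
    #include "player.h"          // a project header we want tracked
    int main(void) { return 0; }

    // Compiling with dependency output (gcc/clang):
    //   clang -MMD -c main.c -o main.o
    // also writes main.d, a make rule listing the real dependencies:
    //   main.o: main.c player.h
    // The makefile includes the generated .d files, so editing player.h
    // correctly marks main.o as out of date on the next build.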