

Large- and medium-sized C++ projects often suffer from long build times. To a large extent this is caused by the insufficient module concept of C++: each source file includes a lot of headers, so after preprocessing there can be thousands or even millions of lines of C++ code the compiler has to process. Therefore each source file takes seconds to compile, and a large application can have thousands of source files. Compile clusters can speed things up by distributing the compile jobs across multiple machines.

We can distinguish these two scenarios:

After pulling in the latest changes from upstream, or after a major refactoring that affects central headers, a lot of source files need to be rebuilt.

You have already done a full build; then you make a small change to one file to fix a bug or add an enhancement, and you build and test it. Such an incremental build compiles only a handful of source files and then links the libraries and the application.
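To get a feeling for how much the headers blow up a translation unit, you can look at the preprocessed output of a trivial file. This is only a minimal illustration; the exact line count depends on your compiler and standard library version.

```cpp
// hello.cpp -- a single standard header already pulls in a lot of code
#include <iostream>

int main() {
    std::cout << "hello\n";
    return 0;
}
```

Running only the preprocessor, for example `g++ -E hello.cpp | wc -l` (or `cl /E hello.cpp` with MSVC), typically reports tens of thousands of lines for this handful of lines of code, and real source files that include project and third-party headers expand far more.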
