BUILD is quite capable of parallel builds. If it weren’t, NT builds would take far longer than they do. This is documented in MSDN: look at the macros SYNCHRONIZE_DRAIN, SYNCHRONIZE_BLOCK, BUILD_PRODUCES, and BUILD_CONSUMES, the command-line docs for BUILD, and the MSDN topic “Building on a Multiprocessor Computer”.
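As a sketch of how those macros read in practice (the component names here are invented, and you should check the MSDN topics above for the exact semantics), a library’s SOURCES file can advertise what it produces, and a consumer’s SOURCES file can declare what it needs:

```makefile
# SOURCES for a hypothetical library that other directories link against
TARGETNAME=mylib
TARGETTYPE=LIBRARY
BUILD_PRODUCES=mylib          # this directory produces "mylib"

# SOURCES for a hypothetical driver elsewhere in the tree
TARGETNAME=mydrv
TARGETTYPE=DRIVER
BUILD_CONSUMES=mylib          # BUILD orders (and parallelizes) around this
SYNCHRONIZE_DRAIN=1           # drain earlier compile threads before this dir
```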
Sometimes people (including me) have trouble with BUILD because the DIRS and SOURCES files are read and interpreted by two very different programs – NMAKE and BUILD. BUILD reads DIRS and SOURCES files, analyzes inter-component dependencies (including parallelization), and runs any number of NMAKE instances. But BUILD and NMAKE interpret the SOURCES file differently. NMAKE doesn’t have any special knowledge of SOURCES files; that’s why the one-line MAKEFILE includes MAKEFILE.DEF, which aligns NMAKE’s perception of the SOURCES file with BUILD’s. BUILD assigns specific meanings to many of the documented macros, such as the SYNC* ones I mentioned.
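Concretely, the split looks like this (the standard boilerplate MAKEFILE plus a minimal SOURCES file; the source file names are invented):

```makefile
# MAKEFILE -- the standard one-liner; all real logic lives in makefile.def
!INCLUDE $(NTMAKEENV)\makefile.def

# SOURCES -- read by BUILD for dependency analysis, and by NMAKE
# (via makefile.def) to generate the actual compile rules
TARGETNAME=mydriver
TARGETTYPE=DRIVER
SOURCES=init.c \
        pnp.c  \
        io.c
```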
This is why you can’t do some tricks, such as building the SOURCES macro (not the file) using !INCLUDE, or within !IF blocks, or a lot of other things that seem rational when you’re writing a makefile. BUILD does not implement the NMAKE preprocessor, so it interprets all lines unconditionally. So think of SOURCES as a declarative description of a component, not a makefile. Sure, there are escapes, such as makefil0 and makefile.inc. All this is documented in MSDN as well.
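For example, a construct like the following looks perfectly reasonable to an NMAKE eye but won’t behave conditionally for BUILD, which reads every line regardless of the !IF (a hypothetical illustration; the escape macro is the documented one):

```makefile
# In a SOURCES file this does NOT work the way you'd hope:
!IF "$(FREEBUILD)" == "1"
SOURCES=$(SOURCES) retail_only.c   # BUILD sees this line unconditionally
!ENDIF

# The documented escape: put real NMAKE logic in makefile.inc and
# trigger it from SOURCES, e.g. to build a generated header first:
NTTARGETFILE0=generated_header.h   # makefile.inc supplies the rule
```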
So, long story short, BUILD handles inter-component dependency analysis (ordering and running parallel NMAKE instances) for very large source trees, and NMAKE handles intra-component dependencies. Actually BUILD does *some* intra-component analysis (header files, etc.), but there are situations that confuse it.
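For completeness, the inter-component side starts from a DIRS file like this sketch (the subdirectory names are invented): BUILD walks these directories, works out ordering from the PRODUCES/CONSUMES declarations it finds, and parallelizes where it can.

```makefile
# DIRS file at the root of a hypothetical tree; each subdirectory
# contains its own SOURCES (or another DIRS) file
DIRS=common \
     bus    \
     func
```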
> Personally, I think that people who chose (2) are insane, as it breaks
> basically every fundamental rule and/or convention governing separate
> compilation, and still results in a far from transparent hack to work
> around a build system.
Not quite. Separate compilation doesn’t really buy you much, and it can even be a detriment. The majority of the large components I’ve worked on usually converge on having a single project-wide #include file, so that every .C file is working with the same environment. With precompiled headers, this usually reduces your compile time, too. (If you aren’t using PRECOMPILED_INCLUDE… then why aren’t you?!)

In projects where it is possible to rebuild a single .obj file and then relink the component, you often hit problems where the dependency analysis wasn’t quite right and several .obj files have different ideas of the layout of shared structures, function prototypes, etc. So I usually rebuild the entire component anyway; with precompiled headers, it’s very fast.

This is one area where I really, truly prefer C# and its relatives – file-by-file compilation is a maintenance headache, and it was only necessary as a work-around for machines with very little memory. We’re beyond that time.
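A sketch of the precompiled-header setup in a SOURCES file (precomp.h is an invented name; the macros are the documented WDK ones):

```makefile
# SOURCES fragment: every .c in this component starts with
# #include "precomp.h", and the compiler reuses the precompiled state
PRECOMPILED_INCLUDE=precomp.h
PRECOMPILED_PCH=precomp.pch
PRECOMPILED_OBJ=precomp.obj
```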
#including other C files is conceptually quite similar to precompiled headers. You expose the compiler to all of the prototypes, structure definitions, etc. in a single pass, so again you get consistency. And I don’t see how it breaks any “fundamental rules”. You can still organize your code any way you please.
Anyway, this is clearly one of those “taste” issues. Some like Coke, some like Pepsi. (And in some provinces, they’re both outlawed, at least right now!) We’re *kind* of off-topic, but since BUILD ships with the DDK, I guess we’re safe.
> Anyway, I think build.exe is a good tool for small-scale drivers
> and for getting a project started. But as your driver grows and
> you start auto-generating code, headers, or makefiles, you’ll
> need to fall back to a traditional build system.
Like 30 million lines of code?
– arlie
-----Original Message-----
From: xxxxx@lists.osr.com [mailto:xxxxx@lists.osr.com] On Behalf Of bank kus
Sent: Wednesday, August 30, 2006 1:51 AM
To: Windows System Software Devs Interest List
Subject: Re:[ntdev] trouble with SOURCES
“Martin O’Brien” wrote in message news:xxxxx@ntdev…
> JEREMY:
>
> As far as SOURCES/DIRS/BUILD goes, no you’ve got it right - it really,
> really sucks. To do what you wish you have three basic options:
>
> 1. The one you have already described
> 2. Include the .C/.CPP files from the folder in which you compile
> 3. Don’t use BUILD
>
> Personally, I think that people who chose (2) are insane, as it breaks
> basically every fundamental rule and/or convention governing separate
> compilation, and still results in a far from transparent hack to work
> around a build system.
>
> I’ve used something I’ve written (just a set of makefiles and a couple
> of perl scripts) for about seven years. As long as you check that the
> compiler and linker settings are still valid with each new DDK/WDK
> that you use, you shouldn’t have any trouble. The only nice feature
> of BUILD that gets lost here (assuming you use only CL, LINK, and
> NMAKE) is automatic dependency information.
Can’t agree more. GNU Make (and its Windows port, MinGW make) works for me, but it has its own restrictions.
In particular, recursive builds can be a hairy topic that modern build systems like Makepp and Jam handle better. The point of contention is whether you create a process (make.exe, or whatever your tool is) once for each directory, or include your subdirectory makefiles into the parent makefile and run make just once.
Apart from the process-creation overhead, continual directory recursion is bad for build-level parallelism.
I think in one of the replies somebody from Microsoft mentioned parallel builds. I wasn’t aware that build.exe/nmake was capable of parallel builds or dependency-tree walking. Is it?
Anyway, I think build.exe is a good tool for small-scale drivers and for getting a project started. But as your driver grows and you start auto-generating code, headers, or makefiles, you’ll need to fall back to a traditional build system.
> I can’t help you with your second question (WPP), as I know nothing
> about it. I looked at it once, and decided that KdPrint was just fine.
>
> Good luck,
>
> MM
>
>
>
>>>> xxxxx@telestream.net 2006-08-28 17:56:02 >>>
> I would like to organize my driver source code into subdirectories
> based on the code’s primary function, and I’m having a lot of trouble.
> I’ve read through OSR Online, Google, and the DDK docs, and I’ve found
> that the only good way to do this is to make each subfolder a library,
> and then link them all together. First, I just want to say that I think
> this sucks. It is frustrating enough that I have to jump through so
> many hoops just to use basic C++, and then, to add insult to injury, I
> can’t even organize my source files in an object-oriented way. Has
> everyone just gotten used to the ridiculous restrictions in SOURCES
> files, or am I just missing out on some important concept?
>
> Anyway, enough ranting. My new problem is trying to include *.tmh
> files in my library. I tried adding the RUN_WPP code to my library’s
> SOURCES file, but nothing was generated. Should I try to include the
> *.tmh file from my parent project? How do I add the correct include
> path to my library’s SOURCES file?
>
> Thanks,
> --Jeremy
>
> —
> Questions? First check the Kernel Driver FAQ at
> http://www.osronline.com/article.cfm?id=256
>
> To unsubscribe, visit the List Server section of OSR Online at
> http://www.osronline.com/page.cfm?name=ListServer
>