Rewriting INN's build system
Russ Allbery
rra at stanford.edu
Tue Feb 21 01:49:49 UTC 2012
Clearly, I didn't get to this as soon as I thought I might. But I'm still
playing with the idea, and it still seems like it would be worthwhile,
particularly for the ability to merge more of the portability layers,
Autoconf macros, and the like.
Julien ÉLIE <julien at trigofacile.com> writes:
> Hi Russ,
>> * Proper support for builds outside of the source tree or with a read-only
>> source tree.
> Ah, I bet that is the reason for the built-in SOURCE and BUILD variables
> in runtests. I wasn't sure how to set them in the Makefile, because
> they appear to be the same in our current build system (the test suite
> is run from the SOURCE tree).
Yes, indeed, that's why that's there. I've been doing more and more work
on this with my other packages to try to automate as much of the machinery
of finding files as possible so that the same code paths are used for both
types of builds and I don't keep breaking one while fixing the other.
The latest C TAP Harness, for example, now has functions like
test_tmpdir() in the TAP library, which creates a temporary directory for
files under BUILD.
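The idea behaves roughly like this shell sketch (test_tmpdir() itself is
C; the BUILD default and the directory name here are illustrative, not
the harness's exact behavior):

```shell
# Rough shell analogue of C TAP Harness's test_tmpdir(): create a
# scratch directory under BUILD so that tests write only into the
# build tree, never into a possibly read-only SOURCE tree.
# The default and the directory name are illustrative.
BUILD=${BUILD:-/tmp/demo-build}
tmpdir="$BUILD/tmp"
mkdir -p "$tmpdir"
echo "$tmpdir"
```

Because the path is always derived from BUILD, the same code path works
for both in-tree and out-of-tree builds.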
>> * Could stop manually importing libtool and config.{guess,sub} and just
>> install them in the tree using the autogen script, which means carrying
>> around fewer external files in the source tree.
> From where would autogen take these files (especially when the network
> is down)?
autoreconf will pull them from Automake. When you install Automake on the
local system, it comes with copies of all of these files. (Well, except
for libtool; that comes with Libtool, of course.) On Debian systems, for
example, you find them in /usr/share/automake-1.11 and
/usr/share/libtool/config (with some overlap). For years, INN hasn't
cared exactly which version of these files you pull in (other than
install-sh), so whatever is found on the local system would be fine.
Of course, for snapshot tarballs, we'd distribute the post-autogen
results, which would include whatever was on the system generating the
snapshots.
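The core of such an autogen script could be about this small (a sketch,
not INN's actual script; the flags assume a reasonably recent Autotools
installation):

```shell
#!/bin/sh
# Hypothetical autogen sketch: let autoreconf copy fresh config.guess,
# config.sub, install-sh, and the libtool machinery from the locally
# installed Automake/Libtool, instead of carrying them in the tree.
set -e
autoreconf --install --force
```

Everything it installs comes from the local Automake and Libtool
packages, so no network access is needed.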
>> * Generate man pages and text documentation from POD in autogen rather
>> than as part of the build with Makefiles. This is needed for proper
>> support of building outside the source tree. It's a bit more tedious if
>> only one file has changed, but I do this now with other projects and I
>> think it's cleaner.
> Couldn't a "make doc" command also be available, though not triggered
> by "make" (but probably triggered by "make all")?
That would also work, although one of the problems with using Automake is
that you don't have a makefile until after you've run configure. But one thing
we could do is to use a separate maintainer-only makefile if we want to
write the build rules in make rather than in shell scripts. The advantage
to writing them in make is that you don't regenerate all the docs all the
time, which can take a little while.
So you'd be able to build the documentation without configuring the tree
by running something like:
make -f maint.mk doc
and of course we could include that file in Makefile once it's been
generated by configure, making the same targets available.
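A minimal sketch of such a maintainer makefile might look like this (the
file names and man-page section are illustrative, not INN's actual
layout):

```make
# Hypothetical maint.mk fragment: regenerate a man page only when its
# POD source is newer, so "make -f maint.mk doc" stays cheap when only
# one file has changed. Paths and the section number are illustrative.
PODS = doc/pod/inn.conf.pod
MANS = $(PODS:doc/pod/%.pod=doc/man/%.5)

doc: $(MANS)

doc/man/%.5: doc/pod/%.pod
	pod2man --section=5 $< > $@
```

Writing the rules in make rather than shell is what gets you the
incremental behavior: only out-of-date pages are rebuilt.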
>> * Eliminate MANIFEST and instead use Automake's make dist support. This
>> means fewer support scripts and idiosyncratic ways of doing things, but
>> it does mean we won't have the file documenting what every file is for
>> (although I'm not sure how really useful that is).
> We could also keep MANIFEST, which could be an interesting thing to
> have as an overview of all the distributed files. The only difference
> would be that the MANIFEST file is no longer used for the generation of
> snapshots or releases. We could update it when doing a release: just
> generate the tarball and check that every file is mentioned in
> MANIFEST. Update it if needed, and regenerate the tarball. It does
> not take much time.
> Well, that's just a suggestion.
Yeah, that's not a bad idea. Do you find MANIFEST valuable to orient
yourself in the source tree? Does anyone else find it useful? If people
find it useful, I don't mind maintaining it (it's not that much work).
But if no one uses it, we may as well drop it.
>> * We'd lose the backup .OLD support for updated binaries and similar files
>> on make upgrade. I'm not sure what to think about this. We could try
>> to hack it into Automake, I suppose, but it's definitely a bunch of
>> extra complexity.
> A nice thing is that it will fix an "issue" with the -perm flag of
> inncheck :-)
> % inncheck -perm
> /home/news/bin/rnews.libexec:0: ERROR: illegal file `c7unbatch.OLD' in
> directory (it may be a valid backup if it ends with '.OLD')
> [...]
*heh*. That's kind of silly; we could just make it ignore files that end
in .OLD.
> On a side note, we have other subtleties in install-sh. According to the
> beginning of the script:
> #############################
> # NOTICE TO INN MAINTAINERS #
> #############################
> # This file has been modified from the standard libtool version by
> # adding the -B option to request saving of the original file
> # (if install is overwriting an existing file). -B takes an argument:
> # the suffix to use.
> # INN invokes this script as "install-sh -B .OLD". Also modified
> # to use cp -p instead of just cp to install programs when invoked as
> # install -c.
> # It also handles .exe extensions, for Cygwin portability, and prevents
> # existing directories from being chown'ed.
> # Search for comments below containing "INN".
The *.exe stuff for Cygwin isn't really too useful any more; I don't think
INN has worked on Cygwin for a while. Regardless, Automake has machinery
for handling that and will fix things for us automatically. It would
require a bit of work to fix a few edge cases, but we can do that if
anyone is interested in Cygwin again.
The -c implies cp -p support is there because it's sometimes useful to
preserve file timestamps when determining what happened, but I'm not sure
that's useful enough to keep the local modification (and to force install
actions to go through install-sh rather than using the system install
program when it's available).
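The effect of the -B option amounts to roughly this (a sketch with an
illustrative path, not the actual install-sh code):

```shell
# Sketch of what "install-sh -B .OLD" effectively does: before a new
# binary overwrites an old one, the old one is renamed with the backup
# suffix, giving one level of reversion. The path is illustrative.
dst=/tmp/demo-bin/innd
mkdir -p /tmp/demo-bin
echo "old binary" > "$dst"                  # pretend a previous install exists
[ -f "$dst" ] && mv -f "$dst" "$dst.OLD"    # the -B .OLD step
echo "new binary" > "$dst"                  # the actual install
```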
> So dropping the .OLD backup has other drawbacks.
> Anyway, re-adding support for all these local changes is usually
> quick. All the parts that need to be changed are clearly marked in
> install-sh and ltmain.sh.
> Of course, it would be best not to have to do all these changes. But if
> we cannot avoid them...
Yeah, we'd have to maintain our own copy and also configure Automake to
use it instead of detecting at configure time whether there's a system
install program already. It's more useful than a lot of our divergence,
because of the backups, but it's the sort of difference from the normal
build machinery of free software projects that I'd like to move away from.
>> Could we just tell people to make a backup of their install tree
>> before running make upgrade? How much do people actually use this?
> It was once useful when I ran "make install" by mistake instead of
> "make update" :-)
> Hmm... Maybe we could just add a check, when running "make install",
> that inn.conf is not already installed. If the file is present, we
> could just abort, saying that inn.conf should be removed before
> proceeding, or ask "Are you sure?"
> Otherwise, I do not see obvious uses for the .OLD backup.
It's basically a cheap packaging system that allows one level of
reversion. I admit, I've gotten the packaging religion and I'd never
install INN the way that I used to; I'd build an OS package. So I've
gotten out of the target audience for make install/update. I used to use
them all the time.
For just the configuration files, we could make a backup of the whole
directory before make install. I wonder if that would be a good idea.
Create $etcdir.bak and copy everything in $etcdir over before doing the
installation?
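A sketch of that idea, using a scratch directory rather than a real
pathetc (all of the names here are illustrative):

```shell
# Back up the whole configuration directory before installing, giving
# one level of reversion for config files. $etcdir stands in for INN's
# real configuration directory; this demo uses a scratch path.
etcdir=/tmp/demo-etc
mkdir -p "$etcdir"
echo "organization: example" > "$etcdir/inn.conf"
rm -rf "$etcdir.bak"
cp -rp "$etcdir" "$etcdir.bak"     # the pre-install snapshot
```

Using cp -p keeps the timestamps on the backed-up files, which helps
when working out later what changed and when.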
>> * Non-recursive make means that you can't just run make in the directory
>> you're working in. make only works at the top level.
> Can't a Makefile running "$(top)/make" be present in a subdirectory?
> (It may not be an Automake best practice so perhaps such a Makefile should
> not be created.)
Oh, we can do that, but it would need to be more complex than that to
build specifically the stuff inside that directory and not the whole
source base. I'm not sure there's much utility if it just runs top-level
make. The benefit would be if it duplicated the current behavior of
building just the stuff inside the current directory, but that would also
restore a chunk of the maintenance burden.
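For reference, the simple stub Julien describes would be about this much
(illustrative; note that it just forwards to the top level rather than
restoring per-directory builds):

```make
# Hypothetical stub Makefile for a subdirectory of a non-recursive
# Automake tree: forward to the top-level build. This rebuilds the
# whole tree, not just this directory.
all:
	cd .. && $(MAKE)
```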
>> * We'd lose the modular plug-in build system for history, storage, and
>> overview methods. Or, rather, we wouldn't have to, but I'd not bother
>> to convert that and would replace it with straightforward Automake rules
>> for the in-tree methods, since I don't think the complexity is worth it.
>> It's a really neat idea, but I don't think there are a bunch of
>> alternative backend developers out there who are actively using the
>> ability to drop their code into INN's tree but don't want us to just
>> check it into CURRENT.
> Or if there are, maybe they will add the extra complexity themselves
> to go on supporting their private methods(?)
Yup. And I really don't think there is anyone that's using this.
In summary, I think the only two big open issues are whether to keep our
own install-sh or otherwise do something about backups, and whether to add
stub makefiles to each directory, but the basic idea of the conversion
sounds like a good idea. I think that's true of the original discussion
when I first sent this message as well, although alas it's been a while.
--
Russ Allbery (rra at stanford.edu) <http://www.eyrie.org/~eagle/>