Git and Autotools – a hate relationship?
For a project I contribute to, I started rewriting the build system using the autotools (autoconf/automake), since they are one of the most popular build systems as far as I know. The actual conversion turned out to be not too complicated. But it also turned out that in the modern world of git(hub), the autotools seem to have lost their place.
The source of all the problems is the dreaded
WARNING: 'aclocal-1.16' is missing on your system.
You should only need it if you modified 'acinclude.m4' or
'configure.ac' or m4 files included by 'configure.ac'.
which, when you search for it, turns up all over projects hosted on some git instance. The reason is rather simple: git does not preserve timestamps, and the autotools try to rebuild each and every file from the most basic inputs upward whenever the timestamps are skewed.
The consequences of this PITA are:
- Every developer needs to install the whole autotools stack, and before building anything always has to run
autoreconf -i
- GitHub-generated tarballs may carry arbitrary timestamps, so release management via git tags is broken, and users who want to install the software package may see the same errors
The best answer I have found to this problem is to use the tarballs generated by make dist and replace the GitHub-generated ones with these home-generated ones. Not an optimal solution, and not one that is purely git based.
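In practice that workflow looks roughly like this (the project name is a placeholder, and the GitHub CLI is just one way to do the upload):

autoreconf -i      # regenerate the build system once, on the maintainer machine
./configure
make distcheck     # builds and self-tests foo-1.0.tar.gz (hypothetical name)
gh release create v1.0 foo-1.0.tar.gz   # attach the real tarball to the release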
Another option that seems to work is mentioned here: enable maintainer mode by adding AM_MAINTAINER_MODE to configure.ac. This is discouraged in the automake manual. But then, the automake manual’s only reference to a VCS is CVS – which is, well, slightly outdated nowadays.
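A minimal sketch of what this looks like (project name and version are placeholders):

dnl configure.ac – minimal sketch
AC_INIT([myproject], [1.0])
AM_INIT_AUTOMAKE([foreign])
dnl Without arguments this disables the automatic rebuild rules by default,
dnl so an unpacked git tarball never tries to re-run aclocal/automake.
AM_MAINTAINER_MODE
AC_CONFIG_FILES([Makefile])
AC_OUTPUT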
If anyone has a working solution that fixes the above two problems, I would be grateful to hear about it.
So where to go from here in the modern world? The answer is (for me) meson and ninja: no intermediate generation of files, no generated files that are necessary for distribution (configure, Makefile.in), no tangle of maintainer mode, distribution mode, extra complicated targets; readable source files, … For the named project I also created a meson/ninja build system, and it turned out to be much easier to handle.
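For comparison, the entire user-facing workflow with meson/ninja is just (the build directory name is arbitrary):

meson setup build        # configure into a separate build directory
ninja -C build           # compile
ninja -C build install   # install; DESTDIR is honoured here as well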
The other path I could take (and actually implemented) is dropping automake and using autoconf alone to build the Makefile from a pre-made Makefile.in, without any Makefile.am. In our case, where there is no need for a Makefile.am, this is probably the best approach when one wants to stay with the autotools.
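A sketch of this setup – a minimal configure.ac plus a hand-written Makefile.in whose @…@ placeholders configure substitutes (all names are placeholders):

dnl configure.ac – autoconf only, no automake
AC_INIT([myproject], [1.0])
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.in – written by hand instead of being generated from a Makefile.am
CC = @CC@
prefix = @prefix@

myprog: main.c
	$(CC) -o myprog main.c

install: myprog
	mkdir -p $(DESTDIR)$(prefix)/bin
	cp myprog $(DESTDIR)$(prefix)/bin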
Bottom line for me: give up on automake if you want to use git – either meson/ninja or autoconf only.
Well, Meson by itself has problems. Namely, almost always a too-new version is required; see e.g. this: https://gitlab.freedesktop.org/pulseaudio/pulseaudio/commit/1b7fab22a419ad284a28e84083ba29cae2fb4b2f
With Autotools and CMake, developers at least have some of the “respect users of old distributions” mindset by default.
Agreed, the relative “newness” of meson is problematic in itself.
Tried cmake?
Sorry, no. I always hate fighting with cmake because I cannot set the prefix only for the installation step and have to go via DESTDIR instead. It looks too complicated to me.
Simply set `CMAKE_INSTALL_PREFIX`, which is semantically the same as `--prefix`. Regarding `DESTDIR`: if you use the `Unix Makefiles` generator, it works identically to Automake, as a prefix to the installation prefix. If you have some really complex requirements, please do ask on the list.
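For example (paths are illustrative):

cmake -DCMAKE_INSTALL_PREFIX=/usr/local ..   # configure with the final prefix
make
make DESTDIR=/tmp/stage install              # files land under /tmp/stage/usr/local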
I switched away from the Autotools to CMake about 5 years ago for Debian, work and personal projects. After the initial learning curve, I haven’t looked back. It made me realise how limited and poorly maintained the Autotools really were (and I say this as a former GNU Autotools contributor who went through all the GNU copyright assignment paperwork). They are sorely outdated when it comes to modern portability problems.
Thanks Roger for your comment. Concerning CMAKE_INSTALL_PREFIX: maybe that was in older times, but in some projects it did trigger a recompilation, AFAIR. Since I use graft (similar to stow) to link stuff into /usr/local, I don’t want a recompilation, but simply a staged install. But that might be project specific.
Recently I have also heard lots of good things about cmake, and that it has improved considerably since back then, when it was extremely hard to grok.
If you re-run CMake to change the configuration, I think this is entirely expected and an intentional design choice (and would typically affect Autotools as well due to config.h and similar being touched, assuming your build uses generated headers). If you configure a clean build tree with the correct prefix up front, it should just do the one build as expected.
It’s definitely improved over the last few years. I’ve been using it since 2.8.x, and the recent 3.x releases have so many improvements that it’s vastly better. It still has rough edges and the documentation could be better, so there’s still room for improvement. However, it’s a good pragmatic choice, and unlike the Autotools it hasn’t stagnated: there’s a steady stream of improvements and new features, with reasonable care over backward and forward compatibility, and at least there’s just one language to learn instead of five-plus! I think the lack of copyright assignment has resulted in hundreds of individual contributors, which has led to a very active and healthy ecosystem around it. That never really happened for the Autotools; the Autoconf Archive was about it, and it wasn’t very active even at the best of times.
Well, what I do is the equivalent of
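# (reconstructed sketch – the original snippet was lost; the staging path for graft is a placeholder)
./configure --prefix=/usr/local
make
make install prefix=/usr/local/pkgs/myproject-1.0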
Normally the above install step does not run a recompilation; it only adjusts the destination where the files are installed.
Thanks for your comments on cmake history and development!
I’m always fighting CMake. The scripting language is at least as bad as Makefiles and all the simple things I want to do always turn into a painful -DCMAKE_ETC nightmare. I inwardly sigh every time I see it. By contrast, Meson featured none of my issues with CMake in some quick tests, though I’m sure it has its own unique problems when you actually start using it. But the fact that systemd, GTK+ and Norbert Preining use it certainly speaks in Meson’s favor.
From my perspective, it looks like the main reason CMake took off is because it has better Windows support than Autotools, or perhaps most people’s goals are just very different from my own.
CMake feels very magical in the rare cases it works, sure, but it’s very hostile to something boring and simple like quickly cross-compiling something or other for my ereader. In CMake that turns into an ordeal, as does slightly tweaking the build more generally. In Autotools/Make it doesn’t. In Meson it doesn’t seem to either.
Anyway, that’s just my opinion. You can take it with a grain of salt, but if you know how to tame CMake properly, let me know. 😉
And here is my favorite complaint about CMake: they have a tutorial on how to use generated files, with generators written in C or C++ – https://cmake.org/cmake-tutorial/#s5 – and it does not cross-compile, at all.
The CMake documentation is completely silent on this issue of generators plus cross-builds. I was able to find a solution on the mailing lists (it involves separating the generator out into a subfolder, connecting it via ExternalProject_Add, and importing the resulting executable). For new projects I recommend rewriting all the generators in Python.
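A hedged sketch of that mailing-list approach (the subfolder and tool names are assumptions):

include(ExternalProject)

# Build the generator with the host toolchain: no toolchain file is passed
# down, so this sub-build is not cross-compiled.
ExternalProject_Add(host_tools
  SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/generator   # hypothetical subfolder
  CMAKE_ARGS -DCMAKE_BUILD_TYPE=Release
  INSTALL_COMMAND "")

# Import the host-built generator and run it at build time.
ExternalProject_Get_Property(host_tools BINARY_DIR)
add_executable(mygen IMPORTED)
set_property(TARGET mygen PROPERTY IMPORTED_LOCATION ${BINARY_DIR}/mygen)

add_custom_command(OUTPUT generated.c
  COMMAND mygen ${CMAKE_CURRENT_SOURCE_DIR}/data.in generated.c
  DEPENDS host_tools)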
On the other hand, with recent meson such a setup is a piece of cake (6 lines of meson.build).
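Roughly like this (file names are placeholders; native: true builds the generator for the build machine even when cross-compiling):

project('demo', 'c')
mygen = executable('mygen', 'mygen.c', native: true)
gen_src = custom_target('gen',
  input: 'data.in',
  output: 'generated.c',
  command: [mygen, '@INPUT@', '@OUTPUT@'])
executable('myprog', 'main.c', gen_src)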
Isn’t it possible to make GitHub tag creation start a job which does “make distcheck; make dist” and then automatically publishes the tarballs?
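Something along these lines, for example (just a sketch; the release-upload action and versions are assumptions):

# .github/workflows/release.yml
name: release
on:
  push:
    tags: ['v*']
jobs:
  dist:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: sudo apt-get update && sudo apt-get install -y autoconf automake
      - run: autoreconf -v -f -i
      - run: ./configure
      - run: make distcheck   # also produces the dist tarball
      - uses: softprops/action-gh-release@v2
        with:
          files: '*.tar.gz'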
Historically, autotools encouraged the concept of “make dist” including all the built makefiles, whereas from day 1 tools like meson said you have to have meson installed and run it to generate makefiles or ninja files.
Ignore the historical autotools practice, and treat it like a tool you always have to have installed. *Always* run `autoreconf -v -f -i` before building.
Installing the autotools stack and running autoreconf is a good thing. This way you are actually building from source, not half-building and taking some random pre-generated stuff from old versions of autoconf. That model may have made sense in the 1980s and maybe the 1990s, when computers were slow and there was much more incompatibility between systems, but it certainly doesn’t make sense any more. Just build the whole thing using current tools and it’s all likely to work much better.
So I really don’t see a problem here.
BTW yet another candidate for a largely-declarative build system is QBS from the Qt people. It’s not terrible.
I somewhat disagree with this: autoconf is too sensitive to changes in the versions of the various auto* tools. In the TeX Live sources we often see people running autoreconf and messing everything up because they use a slightly old auto* stack, or just a version that differs by one – and boom, configuration fails.
Furthermore, several developers I work with do *not* have auto* installed.
But the biggest no-go is simply the GitHub issue. One cannot tag a release and let users get the GitHub-generated sources. This is 2018 (soon 2019); this is impossible, an absolute no-go. And what do the auto* manuals have to say about this? Nothing. They mention CVS … and nothing else. I mean, that is the previous century …
Yeah, there’s something to be said for version-locking dependencies in general. In the case of autotools, we have files – configure, Makefile.in, config.h.in, etc. – whose contents depend on the particular versions of the various auto* tools of the person generating them. Although autotools is rather stagnant, that at least means it is more “stable” regarding the versions of these tools, and perhaps in the best case it results in fewer breaking changes to the files they generate.
Unfortunately, we don’t live in a best-case world and that can break down quickly due to entropy & version dependency hell scenarios.
So perhaps there is something to be said for the “make dist” / “make distcheck” workflow for maintainers of packages based on autotools and Git repos.
Some of this was apparently discussed by the autotools folks in 2009, but there was some pushback against supporting git as the only SCM:
https://lists.gnu.org/archive/html/automake/2009-08/msg00038.html
Hi James,
thanks for your comment on this old post. Unfortunately, nothing has changed AFAIS. Tagging a release in git already creates “release tarballs”, and it would be great if they could be used as-is, *without* the need to have the autotools installed.
“make dist” is of course an option, but then you have to upload the tarball manually to GitHub, replace the auto-generated one, … – that is not what good devops and software development practice recommends, because it is error-prone.
Thanks for the link to the 2009 thread, very interesting read, but depressing that 11 years on we are still in the void 🙁
Hi Norbert,
Yeah, this post and discussion were an interesting read. Recently I’ve been experimenting a bit with autotools and a lot of other Unix esotericism. I fully realize the crazy situation that the advent of Git has created for old and crufty tools like the GNU Autotools suite. It’s certainly as you point out: the older “make dist” process for generating a distribution tarball isn’t really compatible with the Git workflow and with public repository sites like GitHub that auto-generate tarballs from the repo.
I was able to find another thread, from September 2010, about an updated version of those same “git-dist” Automake Makefile helper targets here:
https://www.mail-archive.com/automake@gnu.org/msg16384.html
After some fussing around with it, I was able to get a demo project working. It does a great job of committing the result of “make dist” into the git history under “distribution/$(PROJECT)-$(VERSION)” tags. After some more digging, I found this project that was using it at some point:
https://salsa.debian.org/rleigh-guest/schroot/-/blob/debian/master/scripts/git-dist.mk
Modern Autotools was complaining about a GNU-Make-specific issue on line 47:
GIT_DIST_BRANCH="$(basename distribution-$(VERSION))"
I was able to work around this by simply avoiding the call to the GNU Make “basename” function. This works well enough for my purposes, but it has the side effect of creating the dist branches as “distribution-N.N.N” instead of what was originally intended: “distribution-N.N”. I guess there was some need for a “Major.Minor”-style distribution branch.
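The fix boils down to one line in git-dist.mk (a sketch):

# Portable: no GNU-Make-only $(basename ...) call; branch names become
# "distribution-N.N.N" rather than "distribution-N.N".
GIT_DIST_BRANCH = distribution-$(VERSION)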
Anyway, after getting this working in the demo project, I also found some more modern DevOps methods using the new “GitHub Actions” workflows for cobbling together a nice little release pipeline with automatic changelog generation. Still, I’m sure I could make it a little easier… it totally feels clunky compared to “docker build” workflows.
Here’s the example project I tested it out on: https://github.com/trinitronx/range2cidr/tree/release/range2cidr-0.0.9
Hi James,
It was interesting reading your post, and I’m glad you are finding that this code actually works, well over a decade after I originally wrote it!
While I used this for several years for the schroot project, and was certainly happy with it for my needs at the time, I gained zero traction in getting any improvements made to Autoconf or Automake. They didn’t care about it and thought they shouldn’t add any VCS-specific functionality. (Why?! It’s not like it hurts CVS or Subversion, and it’s for use by project release managers, not end users. Projects only use one VCS at a time!) My take on this, unfortunately, is that the Autotools projects are effectively mothballed at this point. There’s barely any activity, and they ceased to solve contemporary portability problems nearly two decades back. I switched all my projects over to CMake a good six-plus years back and have not looked back. See my other post here for more details about that.
Just as a recent example, last week I found a small portability bug in the CMake ICU module. PR submitted, reviewed and merged within a day. It will be in the next minor release in a few weeks. Total turnaround time from finding the problem to fixed in a production release: a few weeks. With the Autotools it’s years to never. I’ve submitted dozens of CMake changes; they all get turned around really fast. It actually has developers who care about maintaining it and pushing the boundaries, and catering for new use cases. If I still cared about this functionality, it could be rewritten for CMake and merged upstream inside a week.
I originally wrote this for the Debian git source format, in the anticipation that we might phase out tarballs as an intermediate distribution format in the medium to long term. It never really took off. Other systems (FreeBSD ports, macOS Homebrew, Microsoft vcpkg) are more likely to allow a direct shallow clone of a release tag; it fits their workflow much better, and it’s becoming more popular. Why use a tarball when you can get it directly from the source? I think this lack of traction is Debian’s problem to deal with. It’s not a technical problem; it’s the way packages are maintained disjointly. All the above use a common VCS for everything.
When it comes to storing the Autotools’ generated files in a VCS, I’ve never been truly happy with it. We don’t typically store the generated files of any other tools in our VCS – lex/yacc output, doxygen documentation, generated manuals etc. (well, some do, but it’s frowned upon). We require the end user to install the build dependencies. What makes the Autotools a special case? I think historically convenience and poor compatibility were part of the problem. Today, they are so stagnant that this isn’t an issue; it’s easy to get the user to run “autoreconf”. It’s not a deal-breaker to get people to install other tools – cmake, doxygen, sphinx, compilers etc. – so I don’t think the Autotools really deserve this special treatment, despite them taking it upon themselves to embed themselves this way via “make dist”.
So I abandoned this a good while back, I’m afraid. Today, my releases are simple git tags of the main source branch. The CI builds everything from that tag automatically: user documentation, API reference, binaries etc. As a result, it no longer solves any problem I have. With current CI systems using docker and the like, it’s easier than ever to ensure all the needed tools are there, and that includes the Autotools.
I hope you continue to have success using it. But I would definitely suggest looking beyond the Autotools and investigating a migration away from them. Their heyday was the late ’90s, and they are still stuck in the previous century. The days when every single open source project used them are long gone. It’s now primarily inertia that keeps them around, rather than technical merit of any kind.
Kind regards,
Roger
I’ve found autotools tractable on Linux. On Windows, qmake. I tried cmake, and it seemed just as complex as autotools, and less mature. CMakeLists.txt? Gimme a break. Anyway, to me it’s a non-issue: autotools on Linux (and testing qmake on Linux as well, in preparation for …), and qmake on Windows.
As a developer and maintainer of a couple of C/C++ repos on GitHub, I have abandoned autoconf and automake where I could. However, that is not always a viable option. With a bit of experimentation, and based on suggestions by others to touch files, I found that the following method successfully works around this specific issue: create a bash build script, e.g. ‘build.sh’, that invokes ‘configure’ after touching some files:
#!/bin/bash
# Touch the inputs first (aclocal.m4 and the Makefile.am files) …
touch aclocal.m4 Makefile.am */Makefile.am
sleep 1
# … then the files generated from them, so they appear newer …
touch config.h.in Makefile.in */Makefile.in
sleep 1
# … and configure last, so none of automake’s rebuild rules fire.
touch configure
./configure
make -j clean all
I threw in the two sleeps just in case the timestamps have poor resolution on some file system.
Hope this helps.
Thanks for your comment. Indeed, that would work around most problems. The only sore point is that those hacking the .am sources need to remember not to use the build.sh script 😉 Nothing is perfect.