automake or not?

Posted 3 Aug 2000 at 10:04 UTC by mbp

What's your opinion on building using the GNU automake tools?

automake and autoconf are GNU tools that together produce the famous

# ./configure
# make
# make install

setup. They read meta-scripts called and or, and from them produce a Makefile at configure time.

The advantages are obvious:

  • The configuration script and the targets understood by the Makefile are standardized
  • The author is spared most of the tedious work of writing the Makefile
  • autoconf helps the program discover and work around peculiarities of the OS (header files, broken functions, etc)
  • automake and libtool hide most of the details of building shared libraries

In particular, I've been thinking about how to manage the build process for rproxy. I've run into a couple of disadvantages of using automake in particular, and wondered about the experiences of the Advogato crowd:

  • We require more tools installed to build from CVS (assuming you don't check the generated versions into CVS): at least GNU m4, automake, autoconf and libtool
  • The generated Makefiles from automake sometimes break on proprietary unix, and it's harder to work around this

What's your opinion?

(There's too much moderation-angst and too much politics here for my taste, so I thought a decent geeky article wouldn't go astray. ;-)

can't live without them, posted 3 Aug 2000 at 10:32 UTC by bagder » (Master)

I'd be the first in line to say that the current incarnations of automake and autoconf lack functionality and features, and sometimes act rather silly. There are improvements in the pipeline, but I can't name any other competing tools, and thus these are the only ones that do what they do.

You claim "the generated Makefiles from automake sometimes break on proprietary unix". Would you care to elaborate on this? I've successfully used autoconf+automake for years and my curl project builds on many platforms, many of them being "proprietary unix".

What I think is the most frustrating part is that we all have to re-do so much work when we write our configure.in files. I've been using autoconf for years and I still find myself editing my configure.in and writing custom rules every now and then.

I'd never be able to make a replacement for autoconf + automake + libtool for my project and make it as portable as these tools do it. Not within reasonable time.

Terribly unclean, posted 3 Aug 2000 at 10:40 UTC by davidw » (Master)

Shell and perl scripts that generate shell scripts and makefiles. Can you get much uglier than that?

I really wish there were a cleaner system that did more or less the same thing. This is just such a beast to use. :-/

break on proprietary unix, posted 3 Aug 2000 at 10:58 UTC by mbp » (Master)

I think a Makefile built by automake contained

foo := bar quux

which SGI make doesn't understand. However, it was late and I might have been confused, so I don't hold it against them. Certainly the documentation admits that you need perl and(!) m4 to rebuild the generated files, and they're not always present.
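
For reference, the portable spelling of that assignment is plain `=`; `:=` is a GNU make extension:

# GNU make only: immediate assignment
foo := bar quux

# portable to vendor makes such as SGI's
foo = bar quux

The two differ in when the right-hand side is expanded, but for a simple word list like this they are interchangeable.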

:= syntax, posted 3 Aug 2000 at 14:23 UTC by jamesh » (Master)

The makefiles generated by automake when you do "make dist" (another reason to like automake) should be very portable. If you look in the docs, you will see that GNU make (and gcc) is required for the automatic dependency tracking to work. When you do make dist, the dependencies are compiled into the makefile, and the result should not require GNU make.

The CVS version of automake has new dependency code, which can work without gcc. I don't know if it still requires GNU make for development.

So automake should be fine unless you don't have GNU make on your development machine (people who build from tarballs won't have a problem). The Software Carpentry competition may provide an alternative to these utilities at some point in the future.

automake/-conf don't make this world a better place, posted 3 Aug 2000 at 15:28 UTC by doobee » (Journeyer)

I have learned to hate automake/autoconf when using OS/2.

There is a set of tools called EMX for OS/2 which helps to compile Unix applications for OS/2. EMX is pretty complete, and even non-trivial programs often compile and run without changes. EMX was the basis of the port of XFree86 to OS/2.

So you can compile and use many Unix applications, unless they use automake/-conf. automake/-conf preach cross-platform development, yet tie your build process so closely to Unix that it is almost impossible to compile the application on another OS.

In addition, automake/-conf can add a significant layer of complexity to a simple application. It is hard to find out what all these config* and *.in files do. A simple Makefile becomes a multiple-hundred-line beast. So automake makes your computer one step further un-understandable, which is a Bad Thing(TM).

Maybe such a beast is needed for monster applications like emacs or whatever, but I often see more or less simple applications using automake/autoconf without real need.

In addition, it just relieves the pain of portability but doesn't cure it. I don't want to know whether this system supports poll() or select(); I want to use a single function. I don't want to know whether this system has snprintf; I want to be able to code without buffer overflows.
And I still want my code to be easily understandable and auditable. autoconf and automake seldom help getting there.

OS/2 not valid criticism basis against automake/autoconf, posted 3 Aug 2000 at 16:14 UTC by atai » (Journeyer)

OS/2 is not a Unix platform, and as such, should not be used as the basis of criticism against automake/autoconf.

Automake/autoconf basically solves the portability problem among Unix platforms.

Good for simple stuff, not so good for complex stuff, posted 3 Aug 2000 at 16:16 UTC by movement » (Master)

I've used automake a couple of times and have generally found it to be very useful. autoconf is even more useful though, but that's not what we're talking about :)

mbp: I'm not sure that building from CVS requiring more tools is really that much of a problem. If you have regular users rather than developers who use CVS versions, you can just make snapshots including the generated files - this is more convenient for them as well, and can be automated. As jamesh pointed out, your portability problem was probably because you didn't "make dist" - I've done this myself before.

Some things are *very* awkward though when you want something a little beyond the usual. For example, it's difficult if not impossible to split the source for a single target into several sub-directories. I assume there's some way to hack around this, but I couldn't find it.

I think the general principles of autoconf/automake are good ones, and I'd be very interested to see if something better comes out of Software Carpentry.

Also, if you do use automake, watch out for the "make dist" target. It will include the dependencies but *not* remake the dependencies. This means if you've configured in, say, -lefence for testing, you must delete all the .deps files, otherwise your generated tarball will have very broken dependencies and won't compile on a clean machine. I seem to remember asking about this on the mailing list, but I don't think I ever got a response ...

doobee: I agree the UNIXness of the auto system is a problem. I fundamentally disagree with your claim of complexity, though. Who cares how complex the generated files are? No one needs to look at them in the usual course of things. A typical Makefile.am is about 3 lines. It's much better to have a robust build/configure system than some crufty hacked-up makefiles you wrote at 4AM.

Your "single function" point is orthogonal to the discussion here. The fact is, your sources must still check for availability of this library function to be portable. Autoconf is ideal for this. Furthermore, this function must be built out of the platform primitives (poll()/select()), so there's a further advantage in having the *function's* source configured via autoconf. You can easily hide all the #ifdefs behind #define's so the main source is clean of this cruft (something GNU sources don't seem to have grasped yet, but the Linux kernel is generally exemplary at). Anyway, this is about automake ...
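
The availability check described above is a one-liner in autoconf; a sketch, using the functions mentioned in this thread:

dnl configure.in fragment (sketch)
AC_CHECK_FUNCS(poll select snprintf)

This leaves HAVE_POLL, HAVE_SELECT and HAVE_SNPRINTF defines in config.h for the wrapper function's source to test.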

Alternatives?, posted 3 Aug 2000 at 17:20 UTC by Damian » (Observer)

I know this doesn't fall under the category of automake/autoconf replacement, but cons is a Perl replacement for make; this might be a Good Thing, depending on your orientation toward Perl.

Of course, there's always the way Dan Bernstein handles UNIX portability.

Not everyone's cup of tea.

automake and single target with source in subdirectories, posted 3 Aug 2000 at 19:00 UTC by atai » (Journeyer)

The version of automake currently in CVS supports source files of a single target in several sub directories.

It just sucks less.... , posted 3 Aug 2000 at 19:04 UTC by bbense » (Journeyer)

Autoconf/make are just the most recent in a long line of attempts to solve the problem. My rough recollection of the sequence is something like:

Editing Makefiles and .h files for a week to compile package X.

Configure ( I think only rn/trn and perl ever really adopted this).

Imake ( Pure evil on a stick.... )

Autoconf/make ( Fueled the open source revolution, great as long as you don't have to write the configure.in files. )

One issue that I seem to be running into a lot these days is detecting the installation of another OS package, particularly one that has several different flavors (i.e. MIT K5 vs Heimdal, SSL variations, Netscape LDAP SDK vs OpenLDAP). While many of the OS variations have standard tests, it would be very useful if packages would distribute the appropriate tests to check for their installation in other packages.
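
Absent such distributed tests, each package ends up hand-rolling something like this hypothetical fragment (the header, library and function names are examples only):

dnl hypothetical configure.in check for an MIT K5 installation
AC_CHECK_HEADER(krb5.h,, AC_MSG_ERROR(cannot find krb5.h))
AC_CHECK_LIB(krb5, krb5_init_context)

If the package itself shipped this test, every dependent project could simply reuse it.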

Is there an "ultimate" autoconf?, posted 3 Aug 2000 at 19:10 UTC by Zaitcev » (Master)

I was bitten by Japhar requiring one version of autoconf and mpeg2dec requiring another version of autoconf. Hmm.

Autoconf yes, automake no., posted 3 Aug 2000 at 20:58 UTC by claudio » (Master)

I like autoconf (and libtool too), and despite the large scripts it can generate, it does its job well. Makefiles generated by automake, on the other hand, are bloated and inelegant. Perhaps I'm just from another era when you could write your own makefiles with feeling and emotion ;) Writing the makefiles is part of the creative process of working on a software project.

Anyway, it's important to know how to write makefiles, since automake won't always be the best solution, or even work, in some specific projects and environments. You can usually write your own makefiles very quickly, and they will be smaller and more customizable.

make distcheck, posted 4 Aug 2000 at 00:40 UTC by ole » (Journeyer)

I don't know any other build systems that offer this hook.

gnome-config and make distcheck, posted 4 Aug 2000 at 01:59 UTC by mbp » (Master)

I think the := problem on SGI was my fault. Certainly it's easier to get confused using automake than when there's just a simple Makefile.

make distcheck is extremely cool: basically it builds a tarball, then extracts it into a new directory and tries to configure, build and test the distribution in there. It's very nice for making sure you haven't omitted files from the distribution lists, for example. I run it every night from cron. Obviously it'd be pretty simple to write it from scratch, but it's good to have it there as standard.
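
The nightly run needs nothing fancier than a crontab entry along these lines (the path is invented):

# hypothetical crontab entry: distcheck at 3am, log kept in the tree
0 3 * * * cd $HOME/src/rproxy && make distcheck >distcheck.log 2>&1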

Also there seems to be a trend towards library packages distributing a script describing their installed configuration, like this:

$ gtk-config 
Usage: gtk-config [OPTIONS] [LIBRARIES]
$ gtk-config --libs
-L/usr/lib -L/usr/X11R6/lib -lgtk -lgdk -rdynamic -lgmodule -lglib -ldl
-lXi -lXext -lX11 -lm

This is probably a good thing, though also overkill if every little tool does it.
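
The usual way to consume such a script is to capture its output at configure time; a sketch:

dnl configure.in fragment (sketch)
GTK_CFLAGS=`gtk-config --cflags`
GTK_LIBS=`gtk-config --libs`
AC_SUBST(GTK_CFLAGS)
AC_SUBST(GTK_LIBS)

The substituted variables can then be used in Makefile.am as $(GTK_CFLAGS) and $(GTK_LIBS).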

Autoconf is OK, Automake is nasty and libtool is slow.., posted 4 Aug 2000 at 06:18 UTC by jgg » (Master)

My observation is fairly simple.. Configure has good uses to allow end-user configuration and selection of tools and options, environment verification (you must have libfoo) and testing for features the OS lacks so the build system can install substitutes.

Automake is IMHO very lame. It exists because 'standard' make is virtually useless. So when you use automake you are condoning the lack of advancement in build tools and are basically causing automake to perpetuate itself. The problem it is designed to work around will never be fixed as long as it exists.

There does exist a very powerful and extremely popular make from GNU that has a startling number of features that are not present in 'standard' make. If you make use of these features your makefiles can be very elegant and very readable. Coupled with a sort of 'function library', you can basically duplicate automake using only the macro language in GNU make.

The linux kernel has some good examples of how GNU make can be used, and I use it in my software packages.
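
For a taste of the GNU-make-only features in question, here is a makefile fragment using its wildcard and substitution functions (nothing here works in 'standard' make; the target name is invented):

# GNU make only: pick up every .c file in the directory
SRCS := $(wildcard *.c)
OBJS := $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)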

Libtool.. Libtool is useful, but if your main dev target is Linux then all it does is slow down your compile/test cycle - thus the best thing is to have your makefiles only use libtool if you are not running on linux (grin).

Libtool, to some extent, is also a workaround for GCC shortcomings. For platforms with similar shared library capabilities (ELF, as on Solaris), a standard set of command line options to gcc in linking mode should produce a proper shared library - but sadly it doesn't. Nobody seems to care, because you can always use libtool... Imagine the chaos if 'cc' weren't even quasi-standard between compiler vendors - you'd need a cctool! Aren't you glad you don't need that!

In conclusion, take a stand! Help make everyone's job easier by encouraging the use of better tools and not wrappers around crappy tools.

Just say no to gvoid syndrome! (grin)

libtool and automake work fine for me..., posted 4 Aug 2000 at 07:10 UTC by mettw » (Observer)

The extra time taken for XDBM to compile because of libtool is so small that it doesn't make any real difference to my hacking on it. As for automake, it may force you to learn some new commands, but it is much easier to use than trying to account for everything with autoconf alone. I think that trying to get standardisation across every platform is a lost cause. To do that you'd have to make every system essentially the same by forcing strict standards compliance. Now, standards compliance is a good thing, but non-compliance is a good way to try out new ideas without having to wait for the IETF et al. to come up with drafts. Shifting cross-platform code to another tool is IMO a better solution than forcing every system to be the same.

On the issue of automake being broken on some systems, that's probably just from non-GNU tools. GNU make etc are pretty ubiquitous nowadays so it shouldn't be hard to find someone with them installed to compile rproxy for you. After all, you only need these tools to make rproxy, not to install it.

autoconf/automake/libtool is shit, posted 4 Aug 2000 at 10:22 UTC by raph » (Master)

First, let me say that the auto* suite is a tremendous improvement over what came before in terms of a portable build infrastructure. Hand-hacking makefiles is simply not portable, and thus doesn't even count in any such comparison.

That said, auto* is shit. Let me enumerate some of the ways, in no particular order:

* Libtool is slow, make no mistake. In the measurements I've done, it's almost two times slower than a stripped-down makefile. When you count the time spent in ./configure itself, it can be much worse, especially for small projects. This is all the more shocking considering that gcc is a pig. If we had a decent check-out compiler (ie, decidedly mediocre code optimization, but zippy compilation speed), I'd expect auto* overhead to dominate.

* Libtool'ed binaries are a pain in the ass to debug. In addition to gdb being much harder to invoke (still haven't figured out how to get gdb, libtool, and emacs gud-mode to play happily together), gdb often simply breaks. In my own work, I've basically given up and just use a nice stripped-down makefile whenever I need to do serious debugging.

* Libtool seriously constrains the structure of your project. In particular, I've found it makes it quite a bit harder to split up source files into subdirectories.

* Yes, ./configure; make; make install is nice. However, this process only covers part of the issues. Packagers (in the Linux distro sense) still have a hell of a lot of work to do.

* Speaking of packaging, auto* is not integrated at all well with RPM. In particular, running a mixed system consisting of both RPM'ed modules and modules built from CVS is a sure recipe for disaster.

* Auto* is just way too complex. When (not if) things go wrong, it's nearly impossible to track down the problem through layers of makefile, shell script, m4, and other cruft.

* Auto* is not portable past Unix (I understand Win32 sorta works, though). Unix is not the only interesting platform to develop on.

In sum, we are direly in need of dramatically better build tools. I am personally very disappointed that there doesn't seem to be much promising work in this direction (and yes, I'm including the Software Carpentry design competition in this criticism).

autoconf is good - for nonXP applications, posted 4 Aug 2000 at 13:55 UTC by czw » (Apprentice)

If you plan to write all your applications for the Unix platform, autoconf is a very useful tool. It helps smudge out all those small but oh-so-annoying differences. However, for a project that is supposed to be usable on BeOS, MacOS, OS/2, Unix and Windows... This makes that util almost useless for me, who develops for OS/2, Solaris and Win32.

Automake and libtool? Can't say anything about them, actually, since I have never tried any of those two.

Good idea, too much history in the implementation, posted 4 Aug 2000 at 18:42 UTC by jhermann » (Master)

I started to use autoconf/automake on a larger (and growing) project about half a year ago. Took me roughly a day to get going, and since then we have a half-decent build environment. It works. It's ugly in many parts.

My major problem was lack of documentation (and I did not have the time to subscribe to the mailing lists, I already have enough of them). Especially use-cases of common project layouts (hierarchical source directories contributing to ONE library) and such things were very hard to extract out of the docs. My current solution for that is half-assed, at best.

What I'd really like is a portable and maintainable build system. That would mean concentrating on one portable tool (Python), not the shell/perl/m4 mix we have now. I really like the build system for Java (Ant), and a similar language-independent system written in Python would rock.

As a user or programmer?, posted 4 Aug 2000 at 21:14 UTC by Uruk » (Apprentice)

As a user, I love configure/autoconf/automake and family.

As a programmer, I hate them. For complicated programs with a lot of portability issues, I'm constantly having to drag out documentation, and it can take quite a while to put all of that stuff together if you're not too familiar with it.

It seems like most configure.in's and Makefile.am's are ripped off of other projects and modified. I don't see any problem with that, but it's a sign that the syntax of all of those tools is too annoying or too obtrusive for most people to learn and be proficient at. I mean, damn, I wanted to write a program in C; I didn't want to learn yet another obscure scripting language, m4, and everything else that comes along with building that stuff correctly.

And there is zero reuse other than the informal rip-offs I mentioned earlier. There are some canned m4 macros for commonly used libraries to detect their presence through configure, but not enough. And I have yet to find a good tutorial on using all of the tools. The GNU reference manuals are great, but they only help if you already know what's going on, which takes quite a while without a tutorial.

It certainly sucks less, posted 4 Aug 2000 at 21:40 UTC by imp » (Master)

The automake/autoconf stuff certainly does suck less. imake was a good idea, but it had its problems (the biggest one was that it was almost impossible to debug the twisty maze of macros).

These tools aren't ideal. libtool helps too, but there is certainly much room for improvement. There are many different intersecting problems in building: variations in make, variations in APIs, variations in shell scripting, variations in installed packages, etc. The blurring of these lines has given us the packages that we have. What needs to happen is for someone with about 6 months to kill, a clue gained from many years of battling these systems, a bunch of oddball systems, and the keen desire to fix all of this to come along. Until then, we'll have the cobbled-together system we have now. It is certainly better to use these tools than not, because they greatly enhance a program's portability.

But things still have a degree of suckage.

Programming/Project Conventions, posted 5 Aug 2000 at 00:51 UTC by ZachGarner » (Observer)

Is there any one place on the web for a developer or a project administrator or even an occasional programmer to learn the Shoulds and Should Nots of programming?

Think of questions like "How do I write portable code?" What about secure code? Things like CVS and the auto utils are, to me, nearly vital to programming anything big, but what if I didn't already know about them? That makes me very curious about what else I'm missing, or what is currently in development to replace the currently popular tools.

Is there anything on the web like a Developers' Repository (I'm more or less only concerned with Unix development)? A place to learn everything that a Good programmer Should know? From how to use GDB to deciding on the best Style to use, there Should be a place on the web to learn these things. (When it comes to things like editor choice and programming style, it's very opinion-based, but I have seen decent overviews.)

I've thought about posting an article about this... but I figure a comment under here will get some attention (it's not THAT offtopic...)

libtool, gdb, and emacs, posted 5 Aug 2000 at 03:53 UTC by mbp » (Master)

raph, I've found a wrapper script like this can work OK:

#! /bin/sh -x

exec libtool --mode=execute gdb "$@"

Install this in ~/bin, put that on your path, and then in emacs when prompted for the gdb command to run enter

ltgdb myprogram

This is a kludge, certainly, but I'm too lazy to read through the elisp.

One additional annoyance is that gdb won't automatically reload files that are rebuilt using libtool. I think this is because libtool moves the file into place, rather than truncating and rewriting the existing file. So you have to quit and restart gdb each time, or something similar.

automake rocks; autoconf sucks, posted 5 Aug 2000 at 13:48 UTC by LotR » (Master)

I seem to think the complete opposite of everyone here. Let me try to explain why. I think people here are mixing up what is what. Automake really is only about the Makefile.am files, which are pretty damn nifty. The problems happen because it is built on top of a lot of crap.

Rebar, Raph's Software Carpentry entry, looks amazingly like automake xml-ified. And for a reason, IMHO. Automake got the concepts of the build right.

The real problems are autoconf and make with their arcane syntax. Nobody wants to learn shell syntax, m4 and Makefile syntax just to be able to check for a library or which endianness the target machine has. Sure, there's a lot of m4 involved when you use automake, but that's only because it is based on autoconf. And yes, automake still has the ugly make syntax when you want a custom rule, but again, that's because the developers didn't have the courage to dump make.

Basically, I think no one would be complaining about automake if it were rewritten to rely on a sane autoconf replacement, and had some tool to invoke gcc etc. immediately, instead of having to generate files in the awkward make syntax.

Necessary evil, posted 6 Aug 2000 at 03:01 UTC by aaronl » (Master)

My main gripes about auto* are that the files they autogenerate can be larger than the sources themselves in small projects, and that they have arcane syntax. I know about 15 languages (not including human languages) and I am not interested in learning others that are only useful for describing build environments. My complaint about file size would be much more minor if the "configure" script were a standard, required system utility, just as gmake is for autoconf-generated Makefiles (AFAIK). "configure" would work well as a system utility that read a tiny config file from the source directory, because autoconf is so widespread that it is wasteful having each package contain so much autogenerated code.

Another problem that I have with autoconf is speed. autoconf and automake both run in reasonable time, but the shell scripts that autoconf produces are, as raph pointed out, very slow and huge.

That said, I don't really hate autoconf. I use it for major projects that I plan to distribute widely. When I only want a small environment for my own developing, I usually prop things up with Makefiles.

The AbiWord project had a discussion about whether to switch to autoconf (started by me). The current build system is based on makefiles that are huge hacks. They don't even have dependencies. I stayed up all night learning the 'make' language and finally gave up when I was no closer to getting compiler-managed dependencies working. This was actually due to the layout of our source tree. Autoconf and automake would handle stuff like this automatically.

Fortunately, a new developer has shown us some code demonstrating an unintrusive auto*-based build system for AbiWord. It works by generating makefiles named GNUmakefile, which gmake prefers over Makefile. If you run ./configure, you will get a set of GNUmakefiles that will be used by 'make', causing make to skip over the legacy build system. People on OSes like QNX and Win32, where we haven't really tested this autoconf setup yet, can just run make, skipping ./configure. I haven't worked much on AbiWord since I discovered LaTeX, but I wish them luck in arriving at a sane build mechanism.

performance hit of libtool, posted 7 Aug 2000 at 02:50 UTC by mbp » (Master)

I ended up cutting out libtool and leaving in automake and autoconf. The performance improvement while compiling is remarkable: it seems about twice as fast to rebuild after changing a global header. (I suppose this is not surprising given the size of the libtool shell script.)

So for the time being I will make do without shared libraries, which is fine for rproxy.

half the speed, twice the work?, posted 7 Aug 2000 at 17:36 UTC by rillian » (Master)

Um, a lot of people are complaining about libtool builds taking twice as long. Are you sure this isn't just from its default policy of building both shared and static libraries? (imake does this too).

Try timing a build configured with '--disable-static' for comparison. Otherwise I don't see how the wrapper script could slow down the build so significantly.

libtool--rewriting in C?, posted 7 Aug 2000 at 21:51 UTC by atai » (Journeyer)

If libtool is really that slow, would a re-write in C help?

libtool is a huge shell script, posted 8 Aug 2000 at 02:59 UTC by mbp » (Master)

rillian, I already had --disable-shared, but thank you for suggesting it. The default of compiling everything twice is another bad thing from the naive user's point of view, of course.

I think the reason it's slow is that ~/rproxy/libtool is, on my machine, a 4284-line bash script. Starting bash, running that script, and all of the processes it runs is pretty expensive: comparable, as raph observed, to the cost of compiling a small file with gcc.

It's a real shame that libtool does so much work for each file when in the end it's very likely just going to exec gcc with the same arguments every time. Perhaps libtool would be more reasonable if it calculated these arguments once for each equivalence class of files, and stored them either in CFLAGS in the Makefile, or in a little shell script generated at configure time. For example, ./.lt.compile.shared:

#! /bin/sh
exec gcc -Wall -W -shared -fPIC "$@"

That could be much faster. Perhaps there's some reason why it's impossible.

Autoconf archive, posted 8 Aug 2000 at 06:14 UTC by mcoletti » (Journeyer)

Someone recently told me of the Autoconf macro archive. It's a collection of (potentially) useful autoconf macros. I particularly like the C++ set.

The auto* tools are good for cross-compilation, posted 8 Aug 2000 at 10:48 UTC by Raphael » (Master)

There is something that has not been mentioned so far about automake and autoconf: they are designed to allow cross-compilation as well as compilation in a target directory that is not the source directory (VPATH). As long as you do not do bad things in your configure script (e.g. AC_TRY_RUN), then it is possible to cross-compile your software easily. Compared to that, the older tools such as imake/xmkmf are a nightmare.

For example, I have set up a Solaris machine so that it can compile native programs (Solaris/SPARC) and cross-compile executables for Linux/x86 as well as Linux/ARM. Thanks to automake and autoconf, it was quite easy to have only one source tree and several build directories for each platform. All it takes is to cd to each directory and to run ../srcdir/configure --target=i386-intel-linux-gnu (or sparc-sun-solaris2.7, or arm-*-linux-gnu, or whatever) and the makefiles and object files are created in the corresponding build directory. This makes it easy to build the same software for several platforms from a single machine (for small devices using the ARM processor, cross-compilation is often the only solution). Very often, the makefiles created by other tools or hacked by hand do not offer this solution. If you ever tried to cross-compile XFig, Ghostview, Perl or other packages that are not using the auto* tools, you will probably understand the problem.
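
As a session, the arrangement looks roughly like this (the directory names are invented):

$ ls
myproj/  obj-sparc/  obj-arm/
$ (cd obj-sparc && ../myproj/configure)
$ (cd obj-arm && ../myproj/configure --target=arm-unknown-linux-gnu)
$ make -C obj-arm

Each obj-* directory gets its own makefiles and object files, while the source tree stays untouched.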

Automake and autoconf are far from perfect, but at least they got this part right. I hope that the proposals in the Software Carpentry design competition do not forget about cross-compilation...

Why Software Carpentry is going to fail, posted 8 Aug 2000 at 16:02 UTC by raph » (Master)

Apologies for the somewhat provocative title. These are merely my impressions about how the process is flawed and thus, in my opinion, unlikely to come up with tools dramatically better than what we have now. Since they're not based on careful readings of the second-round winners, and they are based on my bitterness at having my entry rejected, please take with 1 grain NaCl.

1. Dependencies on unstable stuff. In particular, the fact that build files are Python programs means that sccons (and most of the other submissions, based on what was requested) is likely to break when Python is upgraded.

Note that automake has many of the same problems. People above have discussed automake's wonky syntax, but in my opinion the syntax issues are much less important than the fact that make, autoconf, shell, and handrolled sed stuff (@LIKETHIS@) shows through. You're not writing to one abstraction, you're writing to a hodgepodge. This hurts especially hard when you get error messages. It's not unlike trying to compile a C program with a syntax error and being given a strange error about some x86 weirdness.

2. Bad factorization. Basically, the existing factorization between make and auto* was kept in the SC structure, with separate entries and separate judging. This is the wrong factorization, folks. It's entirely plausible that we'll end up with two mismatched tools, both of which are necessary to cover what happens in a modern build.

3. Insufficient fear of complexity. One of the stated reasons rebar was rejected is that it was too simple, ie not ambitious enough. To me, this reasoning is insane. It is far easier to add needed functionality later than to remove unneeded complexity from a design.

So basically, I feel that the design contest format is a poor one for grappling with really difficult problems, such as those involved in creating a build system. Therefore, I'm going to try a different format - one person who more or less understands the issues creating a prototype. Right now, I'm busy as hell, so it won't be any time soon, but that probably won't matter much.

All these great minds..., posted 8 Aug 2000 at 20:12 UTC by tladuca » (Apprentice)

Well, I know I am insanely naive, but...

There are a lot of people out there (here?) who love Free Software. There are a lot of people willing to spend a lot of their time making things better for everyone. It would make sense if we, instead of wasting time fighting with the current system, put some things on hold to make a new one. I am just a lowly student, so I am sure I have absolutely no idea what would be involved, but it would be nice to set in motion a project to make it all better. The project, initially, would just be ideas. Tons of ideas. Everyone's ideas. Ideas like "this is all unnecessary, 'auto*' is just fine. This is not worth it.", and "We need a completely new syntax", or "It's gotta be done in Python", whatever. A project that is just about gripes and complaints at first, but that will, if anyone wants to do more than just complain, slowly but surely start to materialize into specifications, and maybe even code.

I guess what it comes down to is that someone has to be a shepherd. Not necessarily one person, but one entity. Moreover, is the investment really going to be worth it? Will it really save us enough time and headaches for a project like this to take off? Would someone out there be compelled to do something about it once and for all? (Sorry, I don't know anything about Software Carpentry yet.)

What I see is that we have gotten to the point where our tools are "good enough" for people to get the job done; we stand on the backs of giants (or something like that). The groundwork, for the most part, was completed by many others. But is anyone taking care of the groundwork any more? GNU/Linux is basically a lot of "stuff" ported from existing Unix "stuff". Can we improve on the good old standbys? Make a better Make?

It's gotta be done in Python :), posted 9 Aug 2000 at 20:42 UTC by jhermann » (Master)

... and use XML for project descriptions. Checking out the Software Carpentry stuff is on my TODO list, but Python + XML are _my_ K.O. criteria.

If you like Python..., posted 10 Aug 2000 at 18:09 UTC by gward » (Master)

...and are interested in new build tools, then you should take a look at my current major project, the Python Distribution Utilities (Distutils). The goal is a standard tool for building, distributing, and installing Python modules, extensions, and applications, and so far it's working pretty well. However, the architecture is quite flexible; for example, it can be used to build an arbitrary C library (the code is there, but the external hooks are a bit weak -- currently all you can do with any such libraries is link them into a Python extension). Most of the marbles for building whole C applications (i.e. a binary instead of libfoo.{a,so}) are already there, in order to support building a new Python binary with the current extension statically linked in.

Generalizing the Distutils to other classes of project than "Python module libraries" and "Python applications" is certainly doable: why not a Distutils flavour for "C console applications", or "GNOME C applications", or "KDE C++ applications"? All that's really required is to code up the appropriate "knowledge" about that class of project in Python, which is a heck of a lot cleaner (if a tad more verbose) than kluging together shell/m4/perl code.
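For concreteness, a minimal Distutils setup script for a C extension looks roughly like this (a sketch only; the project and file names here are hypothetical, not taken from any real project):

```python
# setup.py -- hypothetical minimal Distutils script for a
# project "foo" containing one C extension module.
from distutils.core import setup, Extension

setup(
    name="foo",
    version="0.1",
    description="Example C extension built with the Distutils",
    ext_modules=[Extension("foo", sources=["foomodule.c"])],
)
```

Running `python setup.py build` compiles the extension with the same compiler and flags that Python itself was built with, and `python setup.py sdist` and `python setup.py install` cover distribution and installation -- all of the platform "knowledge" lives in Python code rather than shell/m4.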

Alas, my entry to Software Carpentry was turfed out on the first round -- I don't know if it's because the entry was a rehash of my paper from the Python Conference in January, or if it's because it described a system already implemented rather than just a design, or if the scope was too narrow, or what. Regardless, it's still a pretty cool project (IMNSHO); worth looking at if you're interested in Python and configuration/build tools.
