HOWTO distribute a project

Posted 18 Mar 2003 at 13:18 UTC by Lehrig

.tar.gz seems to be the only way to distribute a project

.tar.gz seems to be the only way to distribute a project,

because all the stuff with rpm etc. only works on one distribution. My project depends on Qt, Tcl/Tk, VTK and some more libraries. When you link Qt statically, the app doesn't conform to the user's style and has no JPEG support, and there are different versions of Tcl/Tk around.

rpm's can only be built for the distributions that are available to you

Is this the ultimate answer? Is .tar.gz the only way to distribute your project?

.tar.bz2 ;-), posted 18 Mar 2003 at 14:14 UTC by sjanes71 » (Journeyer)

But seriously, .tar.gz is a tribute to the UNIX philosophy of using small tools to do specific jobs: tar multiplexes files into one file and leaves the task of compression to another tool. Before the open-source explosion that compression tool was compress but now there are many more choices with gzip and bzip2. The next task to be done is packaging and we're still in an evolutionary phase to define the best system-- but still in the UNIX way-- by wrapping the source archive with a packaging format.

Deep down inside, every "successful" packaging format still uses discrete components (tar, gzip, or bzip2) with some scripting support to handle patches and building. Gentoo Linux's Portage/ebuild system, Debian's DEB, and Red Hat's RPM all use .tar.gz or .tar.bz2 source archives in some way to build executables. GNU Autoconf and Automake have probably done more to improve the source-portability of UNIX applications than any application binary interface (ABI) could have possibly done-- except at the price of having to recompile your application, or of not supporting a platform because the ABI was not stable or did not exist. Many commercial software providers have simply chosen Red Hat as the standard and provide only RPMs.

If your need is to distribute binaries, then unfortunately yes, we are all in a DLL hell. Even with existing package management tools, the state of a Linux system can rapidly fall into chaos after many updates-- even more so when you are a developer trying to stay current with the libraries you use.

Tools like alien can translate source or binary packages into other package formats, and they help ease some of the pain of maintaining packages. If you include automated test, regression and functionality suites, it can be possible to use services such as SourceForge's Compile Farm to build (and maybe test?) your application on different platforms-- but Murphy's Law of course demands that whatever library revisions exist on the compile farm will be different from, and potentially broken on, the systems of your application's install base.

I've probably left out a number of alternative solutions for packaging source or binaries, which might be a good discussion to start here, because this issue affects everyone who distributes software they write.

Different Packages for Different Cultures, posted 18 Mar 2003 at 18:50 UTC by glyph » (Master)

The correct way to package your project is:

  1. Follow as many idioms from your language or environment as possible.

    In C, this means using autoconf. In Java, this means using ant. In Python, this means using distutils. Other languages have other idioms. A good way to make sure your project will be difficult to use is to write your own build system that only one project uses.

  2. Make multiple binary distributions for each platform you support.

    If you plan to support Red Hat, produce RPMs. If you want to support Debian, provide an apt source, or better yet get integrated into Debian. (DO NOT SIMPLY PROVIDE .deb FILES!!!) For MacOS, a .app.tar.gz/.dmg/.installer, depending on context. For Win32, a setup.exe.

    This probably sounds like a lot of work! Guess what: it is a lot of work to create good, cross-platform packages.

  3. Avoid hard-coding any distribution dependencies.

    This should go without saying, but it's a huge pain in the ass to create a portable library that builds and runs fine on win32, but crashes unless C:\VAR\SPOOL\MAIL is a valid directory.
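For the C/autoconf idiom in step 1, the minimal skeleton is small. A sketch, with hypothetical project and file names (the real files for your project will obviously list more checks and sources):

```
# configure.ac (sketch)
AC_INIT([hello], [1.0])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am (sketch)
bin_PROGRAMS = hello
hello_SOURCES = hello.c
```

With those two files, "aclocal && autoconf && automake --add-missing" generates the build system, and "make dist" produces the hello-1.0.tar.gz that all the binary package formats can then consume.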

In other words, this is a very small part of the larger question, "How do I write portable software?" This is a very difficult problem, and there is no simple 3-step HOWTO. It can grow to take up all of your available time, as the libhello experiment indicates.

Re:Different Packages for Different Cultures, posted 18 Mar 2003 at 23:45 UTC by dhess » (Observer)

What are some good tools for automating package distribution, especially on OS X and Windows? Are there any cross-platform open source projects that have particularly good examples to follow? I've been going through this same evaluation for the OpenEXR project, and I haven't found any yet.

On GNU/Linux, I use autoconf for .tar.gz distributions ("make dist").

I've also got a .spec file for building RPMs, though this isn't as smooth as I'd like because I have to su to root, copy the .tar.gz dist file to /usr/src/redhat/SOURCES, and then do the build. I'd like something that will unpack source and build from any directory and not require root so that I can just type "make rpm", but the rpmbuild manpage doesn't appear to address that.
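One common workaround is to point rpm's build tree at a per-user directory via ~/.rpmmacros, and to build straight from the tarball with the spec file included in it. A sketch, assuming a reasonably recent rpm (package name hypothetical):

```
# ~/.rpmmacros -- make rpm build under $HOME instead of /usr/src/redhat
%_topdir %(echo $HOME)/rpmbuild

# then, as an unprivileged user:
#   mkdir -p ~/rpmbuild/BUILD ~/rpmbuild/RPMS ~/rpmbuild/SOURCES \
#            ~/rpmbuild/SPECS ~/rpmbuild/SRPMS
#   rpmbuild -ta mypackage-1.0.tar.gz    # -ta: build from a tarball
```

The -ta mode unpacks the tarball, finds the embedded .spec file, and builds source and binary RPMs without touching /usr/src, so a "make rpm" rule can simply invoke it on the output of "make dist".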

There is someone already working on getting OpenEXR into Debian. I agree that's the best solution for getting Debian support for your package.

On OS X, I also use autoconf to build OpenEXR, but I'm not really sure where prebuilt libraries, includes, and binaries should go. What's the OS X idiom for that? I did find a script here that creates a .dmg automatically. I think I'll add a rule to the makefiles so I can type "make dmg" and have it automatically generated for me.
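Such a rule can be sketched with Apple's standard disk-image tool, hdiutil; the staging directory layout here is hypothetical:

```
# Makefile fragment (sketch) -- stage the files, wrap the folder in a .dmg
dmg: all
	rm -rf dist-dmg && mkdir -p dist-dmg/OpenEXR
	cp -R lib include doc dist-dmg/OpenEXR/
	hdiutil create -ov -srcfolder dist-dmg/OpenEXR OpenEXR.dmg
```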

I would also like to support fink, which uses the /sw prefix, but I don't want to require OS X users to have fink in order to use OpenEXR. That makes yet another target.

On Windows, almost everyone uses Visual Studio, so I have project files for that, but I have no idea where prebuilt libs and includes should go (not even sure there's a standard for that). Bram has mentioned the Nullsoft Super Pimp installer before, and it looks like it's kinda scriptable, but what I'd really like is something that can build self-installing .EXEs from GNU/Linux, since that's where I do all my own development. I'm not sure if mingw32 generates Visual C++-compatible libraries, maybe somebody here can fill me in on that.

If anyone has solved these problems in his or her open source project, I'd greatly appreciate pointers to it. Thanks!

Portability, posted 19 Mar 2003 at 02:43 UTC by djm » (Master)

This started off as a discussion on package formats, but the topic seems to have drifted far enough to "portability" that I can feel that this rant is justified :)

Achieving portability is hard. Here are some of the lessons I have learned when porting OpenSSH:

1. It is much easier and cleaner to provide replacement functions for systems that lack them than to use conditional compilation. By doing this, be aware that you trade executable size for better reliability. Make sure you also pick a good source for your replacement functions, otherwise you are just trading one problem for another (we use OpenBSD as the source).

2. Push as much of your environment-specific logic as possible into your autoconf configuration. Make your autoconf tests as platform-independent as possible; this will serve you later when people start porting your software to weird platforms (like ancient NCR BSD variants and old NeXTStep m68k).

2.a Autoconf is utterly horrid. Its syntax is gross (unless you are perverse enough to like m4) and extremely fragile. It is also the only tool which can do the job that it does. If this grosses you out (as it very nearly did to me), the alternative is to pick a bunch of "known good" platforms and provide makefile targets for them. Postfix, for example, does this to good effect.

3. If you have to deal with authentication or [uw]tmp{,x}|lastlog handling, suicide is an attractive option.

4. Remember that Cygwin is now a pretty popular target for many packages. It has its own peculiarities, especially on the stupid MS platforms (Win9X, WinME).

5. If you are coming from a Linux background, you will soon learn that Linux/glibc deviates from many standards/idioms in many ways.
5.a So does Sun
5.b So does HP

6. Finally, and most importantly: Involve your user community in testing, porting and QA. There are people out there with more platform-specific wisdom than I will ever accrue - most of the "hard stuff" ends up being done by these kind wizards.

Suggestion for a new distribution format, posted 19 Mar 2003 at 07:57 UTC by Lehrig » (Apprentice)

On OpenVMS and Windows my project brings all shared libraries with it. On OpenVMS there is not even a make utility by default, although make utilities are available. On OpenVMS I use a Backup saveset (similar to tar) for distribution; you then run a shell script to link and install the software (no compiling). This works fine. On Windows I use a self-extracting zip file for distribution. After the user has unpacked the software, he clicks SetupRegistry.exe, and then the software can be used. This works on different versions of Windows.

On GNU/Linux I have built rpm's for SuSE, but you will have problems installing them on other distributions. I don't want to link statically and duplicate all the shared objects that are already available (in a different version) anyway.

My suggestion is: why do we not define a new package format?

  - The package would include object code plus additional files
  - The executables within the package would be linked on the target system
  - The result would be an rpm file which could be installed
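The "linked on the target system" step could be as simple as shipping object files plus a script along these lines (file and library names are hypothetical; the point is that the link happens against whatever library versions the target system actually has):

```
# install.sh -- run on the target machine after unpacking (sketch)
gcc -o myapp main.o widgets.o -L/usr/lib -lqt -ltcl -ltk -lvtkCommon
install -m 755 myapp /usr/local/bin/
```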

This would be a portable approach for different GNU/Linux distributions. Unfortunately, the SourceForge Compile Farm offers only Debian for GNU/Linux.

Packaging projects and communities, posted 19 Mar 2003 at 20:24 UTC by realblades » (Journeyer)

Most OSes and distributions have packaging projects (Fedora and freshrpms for RH and YDL IIRC, Debian for Debian, IBM RPMs for AIX, for Irix...).

If you write and document well and it's a useful program, someone will probably do the packaging.

If you want to package, package it and enter it into a project or contrib site if possible. Not all package systems have contribs, but some do. It's usually better that people who track the system very closely, and who have reached a consensus on how packaging should be done (or even have strong policies), do the packaging, than that several different people butcher together single packages, even if a few of them are really good at it.

Why not YAPF?, posted 20 Mar 2003 at 12:11 UTC by glyph » (Master)

There is a good reason that we do not need yet another package format. In order to determine whether a package file would work on a given system, one needs to make assertions about what it depends on. Pretty much any package format worth its salt has some notion of dependency tracking. With a new package format, you would need a new package repository. Of course, it will be problematic to track dependencies into the "native" packaging systems of all the platforms that you wish your format to work on.

When you create a new package format, what you're really creating is a new platform for distributing software. That does not save you any work unless the platform is a good wrapper over several underlying systems. In fact, it makes more work: now software developers need to make .RPMs and .DEBs and .YAPF files.

It is a very difficult thing to correctly design a platform. There are a lot of issues you have to think about. To make matters worse, a platform is ultimately only worthwhile if it has good applications on it. As Paul Graham notes, it's pretty difficult to convince people to use your platform, even if it's a very good one to begin with.

To demonstrate what I mean by this: many folks have the notion that self-contained applications should just be directories full of files. This is actually a pretty good idea, but it's still unpopular on platforms that don't support it natively. The people who defined a platform to support this notion have had good success with it, but those who want to make it work on somebody else's haven't managed to convince anyone to do it their way.

Packaging and AppDirs, posted 21 Mar 2003 at 07:05 UTC by jlbec » (Master)

Every package format has its problems. I'm not going to go into my dislike of RPM or the problems with DEB. I'm not going to rant too much about how AIX's package format is the best (so why are they making RPMs now?). It doesn't matter.

Package formats are pretty much beyond the killer feature stage. You'll never get people to switch, and they're pretty religious about it. YAPF, as glyph points out, won't be taken up by others because it isn't significantly better than the alternatives. No matter how good it is.

ROX is barking up the wrong tree with AppDirs. It works on the Mac because no one shares anything. Think about Adobe. They make two great graphics programs, Illustrator and Photoshop. Both programs have some operations in common, and I suspect that they use the same code. In an AppDir configuration, the library for these operations is duplicated in the Illustrator AppDir and the Photoshop AppDir. There is no sharing at all.

This makes perfect sense for Adobe. A user might have bought only one of the products, and so the library has to ship with each. A user using these products needs a ton of disk space and RAM, so it is easy to justify wasting a little. Finally, only Adobe is using this code, so they don't have to worry about sharing.

In the base operating system, and in the world of free software, sharing is essential. All of a sudden, libreadline is in the AppDir for bash, and no other application can use it. Sure, you move the bash dir around and bash still works, but other applications have no idea. Whoops.

Yes, you can create library AppDirs. Inside a library AppDir (like an application AppDir) is a specific file mapping to the proper library. However, once you want to share this among users (and you do), you effectively have /usr/lib. Whether it is /usr/lib/ or /usr/lib/readline/, it's the same effect. The files have to be owned and maintained by root. No benefit.

This is why Unix and Linux have the complex but predictable and pretty-well-thought-out file hierarchy. AppDirs make sense for large, self-contained applications, but they can't work in the core system. Once you've defined a package format for the core system, you don't want to package non-core apps with a different format, so non-core apps end up packaged the same way.

A Little Homework, Perhaps?, posted 22 Mar 2003 at 06:55 UTC by glyph » (Master)

Apple uses a directory-based approach to both shared code and applications. It is better for both than the traditional UNIX single-file approach, because you can associate non-code resources with code without bundling them into a single monolithic file. Of particular benefit to users is the fact that you can edit those resources without mangling compiled code.

See /System/Library/Frameworks, if you have a mac.

New package format? No, just standards, posted 24 Mar 2003 at 06:09 UTC by chipx86 » (Journeyer)

A new package format won't do anybody any good. It'll simply be one more to support. However, if the metadata is all standardized between the formats and distributions, the format will essentially be irrelevant. It'll be like comparing the advantages of .tar.gz to .tar.bz2. They're basically the same. Different formats, but the same data.

This is actually being worked on. has set up a listserv to discuss the future of package management, and it has some pretty active, insightful discussions. Currently, the contributors on the listserv and the (unofficial) IRC channel (#packaging on ) include jbj of RPM, a few Gentoo Portage developers, a couple of Debian guys, Mike Hearn from autopackage, and myself from GNUpdate, to name a few.

Whether we'll succeed or not is another question, but we should be able to decide on at least some standards.

Not a common package, but maybe a packaging tool?, posted 31 Mar 2003 at 20:35 UTC by johnnyb » (Journeyer)

Maybe what we need, instead of a common package format, is a common packaging tool which can create all of the necessary formats. That way, from a single source we can compile, and then run our uber-tool to generate .rpm's, .deb's and Slackware-compatible .tgz's. With builds on other platforms, maybe we can even get it to create setup.exe's.

I think this is a better approach than the yet-another-package-format, because it doesn't require end-users to do anything new. In fact, only one person has to have it for it to be useful.

A Packager Hooked To CVS, posted 1 Apr 2003 at 22:19 UTC by nymia » (Master)

I was wondering if there is a packaging app that can talk to CVS in such a way that any tagged version can be pulled from it and packaged into different formats like .tgz, .tar.gz, .bz2, .zip, etc.

That would be really cool, though.

A hypothetical example of it would be like this:

CVS --dump --tag=my0.0.1 | bzip2 > mypackage_0.0.1.tar.bz2
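Stock CVS can already approximate this: cvs export checks out a tagged revision without the CVS/ administrative directories, and tar and bzip2 do the rest. A sketch, with hypothetical repository, module and tag names:

```
# export the tagged tree into a versioned directory, then pack it
cvs -d /var/lib/cvsroot export -r my0_0_1 -d mypackage-0.0.1 mypackage
tar cf - mypackage-0.0.1 | bzip2 > mypackage-0.0.1.tar.bz2
```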

Re: Not a common package, but maybe a packaging tool, posted 7 Apr 2003 at 05:57 UTC by Lehrig » (Apprentice)

> Maybe what we need instead of a common package format, is a common package tool which can create all of the necessary formats. That way, from a single source we can compile, and then run our uber-tool to generate .rpm's .deb's and slack-compatible .tgz's. With other compiles, maybe we can even get it to create setup.exe's.

That is exactly what I mean. I would be glad if I could generate .rpm's, .deb's, ... on my own computer for different target distributions. Currently I can only build rpm's for the same distribution my computer is using. I don't want to introduce a new package format. I only want to be able to build packages for all distributions, without having to install each distribution. This might eventually be possible with chroot. What do you think?
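A chroot-based build might look roughly like this (directory layout and package names are hypothetical, and populating each chroot with the target distribution's base system plus toolchain is the hard part):

```
# one chroot per target distribution, prepared in advance
cp mypackage-1.0.tar.gz /chroots/suse8/tmp/
chroot /chroots/suse8 rpmbuild -ta /tmp/mypackage-1.0.tar.gz
cp /chroots/suse8/usr/src/packages/RPMS/i586/mypackage-*.rpm dist/suse8/
```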
