Am 05.12.2010 20:40, schrieb Jeroen Ruigrok van der Werven:
-On [20101205 20:23], "Martin v. Löwis" (firstname.lastname@example.org) wrote:
>> For those of us involved in the release process, every single file is a big problem, indeed. Seriously.
> Correct me if I'm wrong, but isn't rolling up a tgz, tbz2, or txz just a matter of re-compressing a tarball you already have? Easy enough to script or put in a Makefile.
> That only leaves generating hashes/sig files, uploading, and linking them on the site, all of which can be automated to a high degree.
> So what exactly is the big problem here, then?
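For what it's worth, the compress-and-checksum step described above could be scripted roughly like this. This is only a sketch: the `Python-X.Y.Z` name is a placeholder, not the actual release layout, and the real release scripts may differ.

```shell
#!/bin/sh
# Hedged sketch: given one uncompressed tarball, emit the gz/bz2/xz
# variants plus md5 checksum files. For illustration, a tiny dummy
# tarball is built first; a real release would start from the
# already-exported source tree.
set -e

mkdir -p Python-X.Y.Z
echo "demo" > Python-X.Y.Z/README
tar cf Python-X.Y.Z.tar Python-X.Y.Z

T=Python-X.Y.Z.tar
gzip  -9 -c "$T" > "$T.gz"     # .tgz equivalent
bzip2 -9 -c "$T" > "$T.bz2"    # .tbz2 equivalent
xz    -9 -c "$T" > "$T.xz"     # .txz equivalent

for f in "$T.gz" "$T.bz2" "$T.xz"; do
    md5sum "$f" > "$f.md5"     # checksum file ready for upload
    # gpg --detach-sign "$f"   # signing would go here (needs a key)
done
```

Uploading and linking the files on the site would still need their own steps, but the archive/hash part at least is mechanical.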
One problem is that it is not automated (at least not when I do it), and neither is the creation of the release page.
The next problem is that, with so many files, it is simply not feasible to test them all. As a consequence, you risk releasing broken files. For example, when Benjamin released both 2.7.1 and 3.1.3 on the same day, I had to produce 16 files. I lost track of which files were in what state and broke some of them; I didn't test all the installers I built.
The last problem is that the release page gets cluttered, confusing users as to what file they need to download for what purpose.