[Distutils] developing & managing multiple packages with dependencies
Brad Allen
bradallen137 at gmail.com
Fri May 7 04:42:56 CEST 2010
On Thu, May 6, 2010 at 5:51 PM, Antonio Cavallo <a.cavallo at cavallinux.eu> wrote:
> Without knowing the size (small, medium or large size company),
> the platforms (win, linux, mac) and the packages (external or internal)
> is hard to make any sensible reasoning.
Well, I am looking for stories here from other organizations as a
source of lessons. However, maybe it would help if I spent more time
describing what we're trying to accomplish.
We're a medium size software product company (over 100 employees), the
supported platforms are Linux and Windows servers, and most of our
package dependencies are internal. However, we also have third party
open source dependencies.
> For what it is worth, keep a central repository for dependencies (either external or
> internally produced) and enforce it with an iron fist, or things will go pretty
> quickly out of control (project A depends on project B and project C on project B', but
> project B and project B' cannot be used at the same time).
With a DVCS it makes sense to have multiple repositories (a repo for
each package, er, I mean 'module distribution'), though we do have a
centralized workflow with a central server containing all the repos.
The approach we're moving toward is to have the 'master' branch of
each repo associated with the most current stable release, and a
'development' branch for the most current development work. For old
releases of a given package under maintenance, there would be a
development and a release branch for each major + minor version
number. For example, maintenance version 2.7 would have the branches
dev_2.7 and rel_2.7.
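As a sketch, the branch names for a maintenance line follow mechanically from the major + minor version. The helper below is hypothetical, just to illustrate the naming convention, not part of our actual tooling:

```python
def maintenance_branches(major, minor):
    """Return (development, release) branch names for a maintenance line,
    following the dev_X.Y / rel_X.Y convention described above."""
    return ("dev_%d.%d" % (major, minor), "rel_%d.%d" % (major, minor))

# For maintenance version 2.7:
# maintenance_branches(2, 7) -> ("dev_2.7", "rel_2.7")
```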
Each development and release branch of each package would have a
separate Hudson job running the tests, and building eggs, sdists, and
pip requirements files. The script for building the eggs creates a
version consisting of the version in setup.py + the tag_build from
setup.cfg + the Hudson build number. For release branches, the build
script takes the extra step of creating a Git tag of the version
(including the Hudson build number), so each build can be linked back
to a commit id in the repo.
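The version scheme above can be sketched in a few lines of Python. The function names are hypothetical, not our actual build script, and assume a tag_build suffix like '.dev' taken from setup.cfg:

```python
def egg_version(setup_py_version, tag_build, hudson_build_number):
    """Compose the full egg version: the version from setup.py, plus the
    tag_build suffix from setup.cfg, plus the Hudson build number."""
    return "%s%s%d" % (setup_py_version, tag_build, hudson_build_number)

def release_tag(full_version):
    """Git tag name for a release build, so the egg can be traced back
    to a commit id in the repo."""
    return "rel_" + full_version
```

For example, egg_version("2.7", ".dev", 1042) gives "2.7.dev1042", and the release branch's build would be tagged "rel_2.7.dev1042".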
> In my experience I avoided anything that relies on setuptools or any
> magic/clever stuff that replaces a native installing system (rpm, msi, dpkg
> or pkg).
Well, we're pretty much relying on setuptools/buildout, and I don't
see anything wrong with that, as long as we have a reasonable
migration path to distribute/distutils2 in the future. Up till now we've
built the eggs manually, put them on our package server, and
delivered buildouts to our customers using zc.sourcerelease. Now that
we've automated our build process using Hudson, the natural next step
will be to get buildout to make use of the pip requirements file created
by the Hudson jobs.
The actual release to the customer usually involves a selection of
'top level' packages, and all of them need to rely on the same version
of the core library dependencies. In the past, we hand-coded a
version.cfg file for use by buildout, and that file was put under
version control and tagged for each release. In the future, we're
considering hand-coding a pip requirements file to contain the desired
version of each package needed by a customer, and setting up a Hudson
job to run the tests for that particular set of versions. That would
define a 'KGS' (Known Good Set) of top level packages and their
dependencies, and we would keep that KGS as a release configuration
under version control.
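A hand-coded KGS requirements file might look something like this (the package names and versions are made up for illustration):

```
# kgs_customer_x.txt -- Known Good Set for one customer release,
# kept under version control and tagged alongside the release.
toplevel-app==3.1.457
other-toplevel==1.2.457
core-lib==2.7.1042
third-party-lib==0.9.8
```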
> For legal and traceability reasons, anything that attempts to
> download or "dynamically" do things is not an option.
I have no clue what you mean by that. What do you have against 'downloading'?