[Distutils] "just use debian"
Toshio Kuratomi
a.badger at gmail.com
Thu Oct 2 19:08:01 CEST 2008
David Cournapeau wrote:
> Toshio Kuratomi wrote:
>> I'm not 100% certain but I think that Josselin is speaking of glibc in
>> particular here and you're speaking of c libraries in general.
>
> Maybe, but I don't see how this changes the point: when you change the
> soname of a library, it has 0 impact on the source code of the software
> which links against it. That's not true in python.
>
<nod>. I just noticed that you guys seemed to be speaking past each
other and wanted to point it out.
>>
>> I've said before that ideally a Linux distribution only wants one
>> version of a library.
>
> And ideally, developers do not want to care about versioning issues :)
> There is a middle ground to find; up to now, I think python distutils
> and co did not care at all about this, but you can not ask to move 180
> degrees and do only as linux vendors want. I understand the reasons why
> OS vendors want to avoid distributing multiple versions as much as
> possible. But that's just not realistic in every case.
>
> Even in C-derived languages, this is sometimes a PITA. I could give you
> examples where distributions screwed up the packaging badly and hurt
> some projects I am involved with. But that would be unfair and besides
> the point (we all screw up); the point is that there are some practical
> reasons for sometimes including private copies. Because for example in
> Fortran's case, there is this huge mess of gfortran and g77 not being
> ABI compatible; there are examples in C++, and even in C. You also can't
> force every piece of software to follow a distribution's time-schedule.
>
> Reasons for single version for OS vendors are valid; but so are the ones
> to have multiple versions. I think compat modules would cover most
> needs; the problem is that python does not have a mechanism to request a
> particular version of a module. But wouldn't this help OS vendors to
> have such a mechanism (to decrease the burden of compat versions)?
>
Very true! Which is why I say single package versions are ideal for Linux
distributions. "Ideal" being one of those ever-striven-for, never-achieved
goals, and Linux distributions being the demographic whose opinion I'm
pretending to represent :-)
I can definitely understand the need to develop packages with different
versions of python packages than a system might have. (I develop
software as well as package it.)
So where the concerns intersect is when you go to distribute your
package. To have your package run in as many places as possible, you
want to guarantee the correct versions of libraries are installed in
those places. OTOH, in order to get Linux distributions interested in
packaging your project you really must not use your own private copies
of those libraries.
(BTW, Josselin seems to be saying something different is true on Debian,
but I posted this question to the cross-distro distributions list at
freedesktop.org two weeks ago after dealing with it in the banshee
package, and people seemed to agree that it was not proper packaging.
I'll have to ask for clarification here. Perhaps I phrased my question
poorly on that list :-) )
Anyhow... the problems I outlined in my mail are the reasons that
packagers have a wtf moment when they untar a source tarball and find
that there are fifteen other upstream packages included. Ways that this
can be remedied:
1) Have a source tarball that's separate from your binary distribution. The
binary distribution can contain the third-party modules while the source
tarball contains just your code. Distro packagers will love you for
this because it means the source tarball is clean and they can get right
to work packaging it.
2) If you must distribute the source tarball with third-party modules,
make sure your build scripts work with the installed system packages
instead of the modules you are including. This lets a packager build
and install just your code and ignore the rest.
3) Make sure you document how to do this. Good packagers read the
README. If you have to rm -rf THIRD_PARTY-DIR prior to building in order
to build and install just your code, mention that.
4) Make sure your package works with vanilla upstream versions of the
third-party modules. It's tempting to fix things in your local copies
of modules; if at all possible, don't. If that's not possible, make
sure upstream has incorporated the patch and make a note in the README:
"using a patched version of the Foo-x.y project. The patch is in their
svn as of DATE. The patch is located in myproject/foo-x.y/patches." Doing
this means that the distribution packager of your package can take your
patch to the packager of Foo and ask that it be incorporated there.
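Point 2 above can be sketched with a small build helper: try the system copy
of each module first, and fall back to the bundled tree only when the import
fails. The names here (the thirdparty/ directory, the module names) are
illustrative, not from any particular project:

```python
# Sketch: prefer a system-installed module over a bundled copy.
# "thirdparty" is an illustrative name for the vendored-code directory.
import os

def bundled_path_for(name, bundled_dir="thirdparty"):
    """Return None if `name` imports from the system, otherwise the
    path of the bundled fallback copy to add to the build."""
    try:
        __import__(name)          # system copy is importable; use it
        return None
    except ImportError:
        # no system copy: fall back to the bundled tree
        return os.path.join(bundled_dir, name)

print(bundled_path_for("os"))                  # stdlib: use the system copy
print(bundled_path_for("no_such_module_xyz"))  # fall back to bundled tree
```

With a build script structured like this, a packager can rm -rf the bundled
directory and the build still works against the system packages.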
>> In Fedora we're willing to have compat packages
>> that hold old API versions if we must but by and large we would rather
>> help upstream apps port their applications forward than to have
>> compatibility packages.
>
> Yes, but here again the C comparison breaks. Some people use python as a
> "tool", not so much as a "programming language". Their applications are
> scripts, or software for experiments, that are not released, because
> they can't open source it, or simply because it has no use for anyone
> else. You can't port that.
>
If complaints in Fedora are any indication, this happens with C as well ;-).
If by "you" you mean me or a distribution, you're right. Of course, the
"can't port that" doesn't apply to the person with the script.
The solution at the distro level for non-OSS software in a deployment
scenario is to not run a distribution with a limited lifespan. If you
need something to run reliably on python2.4 for years, run RHEL5,
CentOS, or Debian stable. They won't change the major components without
reason, and you'll be able to run just your changes (more recent
packages, etc.) on top.
There is no solution at the distro level for experimental, in-development
stuff. But I think that using eggs, workingenv, etc. is a fine solution
for the development case.
Scripts and small, local software are a problem. "I have a script to
download por^Wfiles from the internet. You started shipping py3k and
now urllib is busted!" Debian has solved one portion of this by
shipping different versions of python that are parallel installable.
Fedora has solved a different portion by shipping compat packages (using
setuptools for parallel install) when there's a need. If the software
is small the best answer may be to port. If the software is intricate,
the best answer may be to use workingenv or something else from the
development case.
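For the "small script, just port it" case, the urllib breakage above is
often only a couple of lines of work. A hedged sketch of the usual pattern,
assuming the py3k stdlib layout that moved urlopen into urllib.request:

```python
# Sketch: making a small download script survive the py2 -> py3k
# urllib reorganization.  Try the py3k location first, then fall
# back to python 2's urllib2.
try:
    from urllib.request import urlopen   # py3k layout
except ImportError:
    from urllib2 import urlopen          # python 2 layout

print(callable(urlopen))
```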
I think it's important to note that I see different use cases for using
distribution packages and using local solutions like workingenv. Local
solutions let you use more current (or less current) versions of things.
They let you experiment and give you the option to refuse to port your
local scripts. Supporting people overriding what's installed on their
OS distro is important for doing these things.
But distro packaging serves a very useful purpose as well. It lets
people experiment with your work who are just looking for something that
can get a specific job done. It lets developers worry about something
other than security fixes to packages which they depend on.
-Toshio