
At 10:46 PM 7/21/2006 +0200, Matthias Klose wrote:
> again, they are not mixed. I didn't see that many packages with different files between Python versions. could you give some examples?
The one that's most obvious to me is of course setuptools itself; its installation metadata changes between Python versions, to reflect the differences in each version's distutils.

The most common case of a project shipping different modules per Python version is a project that depends on features that are part of the stdlib in a newer Python, but must be bundled for older ones: e.g. the logging module, optparse/Optik, doctest, the threading.local facility, etc. I seem to recall Zope releases doing this a lot.
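(As a minimal sketch of that pattern, a setup script can vary its dependency list with the running Python; the backport project names below are hypothetical placeholders, not real PyPI projects:)

    # setup.py: sketch of version-dependent dependencies; the backport
    # names used here are placeholders.
    import sys
    from setuptools import setup

    install_requires = []
    if sys.version_info < (2, 3):
        # logging and optparse entered the stdlib in Python 2.3;
        # older Pythons need standalone copies.
        install_requires += ["logging-backport", "optik-standalone"]
    if sys.version_info < (2, 4):
        # threading.local appeared in Python 2.4
        install_requires.append("threadinglocal-backport")

    setup(
        name="Example",
        version="1.0",
        py_modules=["example"],
        install_requires=install_requires,
    )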
> - remove included third-party modules and instead depend on the modules shipped in the distribution (I removed three copies of pytz in the last months). Maybe upstream authors bundle these out of a desire to provide a complete one-click setup, given the lack of a decent package management system on other platforms, but it should be an option, not the default.
Luckily, setuptools permits improvement in this area, and I certainly agree that the dependencies *should* be made separate. That's not always possible, however, when you are dealing with the existing base of Python projects.
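(A minimal sketch of the "declare, don't bundle" approach; "Example" and the module name are placeholders:)

    # setup.py: declare pytz as a dependency instead of shipping a
    # private copy inside the package, so packaging tools can see it.
    from setuptools import setup

    setup(
        name="Example",
        version="1.0",
        py_modules=["example"],
        # resolved by easy_install, or mapped by a packager to the
        # distribution's own pytz package
        install_requires=["pytz"],
    )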
> - add exact meta-information (dependencies) that allows upgrades, even partial ones, between distribution releases. It would be nice to use upstream meta-information, without relying on a newly invented dependency tracking system which doesn't integrate well with existing packaging systems. I think the PEP trying to get this information into the meta-data was rejected as over-specified.
Actually, it was only over-specified on *syntax*. The *semantics*, on the other hand, are badly *under*-specified. The version syntax was defined so narrowly that *Python's own* version numbers wouldn't be considered conformant, yet the meaning of the names was left entirely unspecified: I could say I wanted "Foo>1.2.7", but there was no definition of what "Foo" would mean. (There's a short illustration of this at the end of this message.)

Regarding the rest of your comments (e.g. the FHS), I think it would be counterproductive for me to respond in detail, because I do understand that these things are meaningful within the Debian world-view. However, they are self-fulfilling prophecies, in the sense that these forms of "QA" work primarily to ensure the need for more of the same "QA". If this is what is valued, then by all means, feel free to go for as much of it as you like. :)

Where this leads to friction with outside work (like Andrew's stdeb) is that the Debian viewpoint doesn't encompass the idea that the proper division of Python modules is according to the projects that provide them. This impedance mismatch means that any tool designed to help get Python projects into Debian (like easy_deb or stdeb or even a bdist_deb) is never going to be both reasonably automated *and* reasonably policy-compliant, because the conceptual framework on which the policy is based cannot encompass the idea of projects.

It seems that to Debian, the only software project is Debian itself, and anything else is merely a source of components to be broken down and reassembled in Debian's own image. This may or may not be what you want, of course, and I certainly don't object to people doing what they want. :) I'm bringing this up, however, in the hope of saving more people from slaving away trying to build Python packaging tools for Debian based on the misconception that their work will ever be accepted (or even understood) within the scope of Debian's Python policy, which effectively precludes any useful mapping from PyPI to Debian.
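To make the syntax-vs-semantics point concrete, here's a minimal sketch using setuptools' own pkg_resources. The requirement string parses just fine; what was never specified is what the parsed name actually denotes:

    from pkg_resources import Requirement

    req = Requirement.parse("Foo>1.2.7")
    print(req.project_name)  # 'Foo'
    print(req.specs)         # [('>', '1.2.7')]
    # The parser is satisfied, but nothing defines which project "Foo"
    # names or who is authoritative for it; that's the under-specified
    # semantics.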