On Mon, Mar 24, 2008 at 04:57:44PM -0400, Alexander Michael wrote:
> With that preamble, here's my attempt at an explicit rationale for a database of installed packages (A.K.A. The New PEP 262):
Nice effort, thanks.
> Rationale
> =========
>
> It is often necessary during the course of managing a python installation for an administrator to determine the following things:
> 1. If the current installation state satisfies the requirements of a new package being considered for installation.
> 2. If it is safe for the administrator to upgrade or remove a package, and if so, how (e.g. use a system-level package management tool).
> 3. What files to remove in order to uninstall the package (if a system-level package management tool was not used to install it).
> 4. If the current installation was modified in-place or custom configured in any way, so that such changes can be noted before upgrading.
I agree with 1 and 2, but I think 3 and 4 are optional. They are not important for the different tools to interact, AIUI: as long as everyone only removes files they own, all is fine. By making the database extensible (e.g. X- headers in the RFC822 case), each tool could use its own extensions for recording these specifics as it needs them, so that required information doesn't end up spread around the system. Not that I think it would do much harm if they were specified by the PEP, but then I have never tried implementing one of these tools, so I may not have the best view on how much specifying this would unnecessarily restrict implementations (a rough sketch of such extension headers follows below).

Something this requirement should probably describe explicitly: does this database only concern modules, i.e. things installed in a directory that's on sys.path? My understanding is that it does. However, I'm unsure how that can be combined with supporting applications that require certain modules. Would every tool need to keep track of the apps it installed separately, to make sure that when it's used to uninstall a module it doesn't render an app that it installed useless? To tie in another part of this discussion, this could mean that module installation is forced to be a simple list of files, while still allowing a Turing-complete installer for applications.
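To make the extension-header idea above concrete, here is a minimal sketch of what one such record could look like and how a tool could read it. The field names, the X-MyTool-* header and the path are all made up for illustration, not taken from any PEP:

# A minimal sketch of one installed-package record in an RFC822-style
# format, using the stdlib email classes.  All field names, the
# X-MyTool-* header and the path are hypothetical.
from email.message import Message
from email import message_from_string

record = Message()
record['Name'] = 'example-package'
record['Version'] = '1.2'
record['Requires'] = 'some-dependency (>= 0.5)'
# A tool-specific extension header; tools that don't know it just
# carry it along, so the PEP doesn't have to specify it.
record['X-MyTool-Installed-Files'] = '/usr/lib/python2.5/site-packages/example.py'

text = record.as_string()            # what would be written to the database
parsed = message_from_string(text)   # what another tool would read back
print(parsed['Version'])
print(parsed['X-MyTool-Installed-Files'])

Whether the database ends up looking anything like this is of course up to the PEP; the point is only that unknown X- headers cost the other tools nothing.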
> Alternatives
> ============
>
> [...]
>
> Ubiquitous System Packages
> --------------------------
>
> Eliminate the need to *not* use your OS's system-level package manager by creating the necessary infrastructure so that python packages can easily be distributed in the major system-level packager formats (RPM, DEB, MSI, MPKG, etc.) for all publicly released python packages. A Windows-based developer who releases their pure-python package should be able to create RPMs, DEBs, etc. automatically (using distutils, for instance) without access to a computer running the appropriate OS (python setup.py bdist_all upload).
This is a bad, or even impossible, approach. It is very hard, if not impossible, to create a system package from a different system, and it is even harder to keep the releases of *all* Python packaging tools in sync with the current policy of a system. It is no surprise that most past efforts to create a "bdist_deb" target for setup.py have failed to gain much traction. A system might change policies halfway through its release cycle and can't afford to wait for the Python world to catch up with that policy change first. This gets even worse when the system can't really agree on a policy (which unfortunately does happen). Systems might also have different policies for their different versions (e.g. old-stable, stable and in-development), making it even harder for everyone else to play catch-up, as they would have to supply packages for each supported release.

I think it's best to let systems create their own packages by giving them a dependable "setup.py install" command (which most systems use in their packaging scripts anyway) to build on.
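To be clear about what "dependable" would mean here in practice, a minimal sketch of the only interface a packaging script would have to rely on; the package name and module are made up:

# setup.py -- a minimal, hypothetical distutils script
from distutils.core import setup

setup(
    name='example-package',
    version='1.2',
    py_modules=['example'],
)

A system packaging script then typically runs something like "python setup.py install --root=<staging dir> --prefix=/usr" and builds the actual system package from the staging directory with its own tools; all it needs from distutils is that this step keeps behaving predictably across releases.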
> Remote Hosted Installation Database
> -----------------------------------
>
> Eliminate the requirement that your installation be self-describing by maintaining in PyPI the information necessary for one to figure out what versions of what things are installed on your system (either by recipes or a table of message digest values for the directories), in addition to the dependencies and list of installed files for each version of each package that's ever been released.
The easiest way to distribute some Python stuff is to quickly write a setup.py, even if it's only to share some quick-and-dirty module with a handful of people. Not everything will want to be, or should be, in PyPI. Some machines might also not have net access. So I'm dubious about this option (it also seems technically quite hard).

Thanks for writing this up, I think it's very good and useful.

Regards,
Floris

--
Debian GNU/Linux -- The Power of Freedom
www.debian.org | www.gnu.org | www.kernel.org