[Catalog-sig] Package maintenance mechanism for Python

Clayton Brown clayton.brown at digitalrum.com
Fri Apr 29 17:36:48 CEST 2005


My 2 cents since I've been quietly watching this thread and some others
related to this.....

---- 
In a nutshell:
does PyPI intend to distribute versioned, platform-specific modules/eggs?
will core Python syntax be extended to allow auto download/installation and
varying-depth version requirement specification for imports?
will a PPM/CPAN-like service be established to satisfy Python dependencies
on a given platform?
----

Now the waffle:


Call it PyPI, CP(ython)AN, PythonArchives, PyVault, Temple of Star
Trek............ whatever............. you get what I'm talking about; PyPI
for now.
Personally: CPAN +100, PyCatalogue +50, eggcrate perhaps (pending further
reading on eggs), religious/political/Klingon/Star-Trek puns -500.

Python being able to determine and resolve its dependencies at
import/run time, with fine-grained control over versions (and lazy
loading), plus a global flag to prevent automatic resolution
(download/installation) for security purposes (raising an error and exiting
instead), would I feel be the greatest leap forward for its usability,
deployability, and plans to take over the world. All Python enthusiasts are
well aware that its syntax and power are not the problem; for satisfying
dependencies, though, Perl's PPM has us over a barrel. Call PyPI CPAN and
flatter them for what it's worth; we know they did it first and they did it
well. Their syntax is still crap or we'd be using it.

Certainly PyPI is on the right track in at least aggregating the efforts of
the Python community into a centralised archive, but before it can be
entirely useful, the way these modules are searched and distributed needs
attention; IMHO that is the most important feature of the archive. I have
not read up on PyPI functionality beyond the uploading/distutils references
I have seen in threads, as no real docs are linked from
http://python.org/pypi for anything past that. Some documentation on PyPI
covering egg/module versioning, platforms, etc. wouldn't go astray.

A while ago, out of frustration at working on several machines and keeping
their site-packages in sync, I wrote (adapted from some wxSomething or
other) a bootstrap loader that lets me import exact/approximate/latest
(default) versions of archived, versioned modules, as well as specify
Python interpreter requirements at run time within my code. This had the
hidden benefit of allowing my scripts to quietly upgrade both interpreter
and module versions to later ones as I installed/versioned them; as version
incompatibilities arose I could either pin at that version or perform
upgrades. I now keep one versioned site-packages under revision control and
use a '.pth' file to add it to each of my Python installations. Side
thought: static analysis of my code could yield a complete dependency list
for build/packaging and distribution, but that was beyond the scope of my
needs. I personally don't think __init__.py import perversions are a
solution; more a band-aid until core Python syntax can accommodate such
versioning functionality.
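The core trick can be sketched roughly like this (a simplification, not the
actual recipe linked below; the repo/module/version directory layout and the
`require` name are my assumptions):

```python
import os
import sys

def require(repo, module, version=None):
    # Minimal sketch of a versioned-import bootstrap. Assumes a layout of
    # repo/module/version/ directories; 'version' may be exact ("1.1.5"),
    # approximate ("1.1"), or None for the latest available.
    base = os.path.join(repo, module)

    def key(v):
        # Sort "1.1.5" numerically, not lexically.
        return [int(p) for p in v.split(".") if p.isdigit()]

    versions = sorted(os.listdir(base), key=key)
    if version is not None:
        versions = [v for v in versions
                    if v == version or v.startswith(version + ".")]
    if not versions:
        raise ImportError("no version of %s matching %r" % (module, version))
    # Prepend the chosen version's directory so a plain 'import module'
    # picks it up from there.
    sys.path.insert(0, os.path.join(base, versions[-1]))
```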

The loader is in ASPN:
http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/285215
My intention, now that I'm migrating towards Python 2.4 with most of my
code and have zipimports/eggs available, is to zip all my versioned imports
as a deployable unit and re-test this loader. (I guess this is what eggs
are, as I saw a reference to them being quite like jars; I haven't really
read up on them yet. Oh well, I'll get onto this in time, at least before
making the aforementioned adjustments.) The crux is that this lets me keep
one repository of site-packages, with access to multiple concurrent
versions of modules at any one time, and rsync-deploy my entire
site-packages to new machines to satisfy dependencies.

An obvious stumbling block I came across with this is storing
non-pure-Python versioned modules along with the built system binaries they
wrap, for multiple platforms etc. For the most part I just placed the
shared objects/DLLs etc. within the module's local path and, voila, presto,
it worked great (perhaps eggs cater for this problem). The only problem
beyond here was knowing how to generate a text string that would uniquely
identify a platform. Now I've seen reference to this in PEP 243:
"<os_name>-<os_version>-<platform architecture>-<python-version>". Is this
definitive, can this string be produced by an import, and are these the
only items that affect built-binary compatibility (OS/gcc/kernel/etc.)? I
imagine that for PyPI to archive non-pure-Python modules it must deal with
these exact issues in the cataloguing process. I'd be glad if someone could
point me to somewhere definitive on this topic of packaging; I'm guessing
I'll get a distutils link. Hmm, distutils 101, probably.
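As a rough answer to "can this string be produced by an import": the PEP
243-style identifier can be approximated from the standard `platform` and
`sys` modules (my own sketch; whether these fields are sufficient for binary
compatibility is exactly the open question above):

```python
import platform
import sys

def platform_tag():
    # Approximation of the PEP 243-style identifier:
    # <os_name>-<os_version>-<platform architecture>-<python-version>
    os_name = platform.system().lower()   # e.g. 'linux', 'windows', 'darwin'
    os_version = platform.release()       # kernel / OS release string
    arch = platform.machine()             # e.g. 'x86_64', 'arm64'
    py = "%d.%d" % sys.version_info[:2]   # e.g. '2.4'
    return "-".join((os_name, os_version, arch, py))
```

Note that this says nothing about the compiler or C library the extension
was built against, which is part of why the question is hard.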

I think if a CPAN-like service evolves for Python, we need to address the
issue of Python syntax for specifying loose/strict version requirements,
both for the interpreter and for imports, as I have; and for ensuring such
a dependency is met, additionally allowing it to be satisfied (by
retrieving/installing) via that CPAN-like service (OK, perhaps with a
global flag to prevent this by default for security).

The idea I have is for a CP(ython)AN PPM service to be invoked
automatically at run time to search/obtain/install when an import from the
local (versioned) site-packages fails. Perhaps this could be handled by an
actual module which one would import to ensure site-packages are sorted out
automatically during execution of the script.


from pypi import autoinstaller
import foo version = '1.2.3'  # autoinstaller catches the failed import,
                              # retrieves the module, installs it, re-runs
                              # the import, resumes...... game on.
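In present-day terms the autoinstaller could be sketched as a try/except
wrapper (a hand-wavy illustration only: the proposed `import foo version =`
syntax doesn't exist, and the installer backend here is just whatever
command-line tool is to hand):

```python
import importlib
import subprocess
import sys

def import_or_install(name, version=None, allow_install=True):
    # Sketch of the autoinstaller idea: try the local site-packages first,
    # and only on ImportError fall back to fetching/installing the module.
    # 'allow_install' plays the role of the global security flag: when
    # False, a missing module raises instead of being silently downloaded.
    try:
        return importlib.import_module(name)
    except ImportError:
        if not allow_install:
            raise
        spec = name if version is None else "%s==%s" % (name, version)
        subprocess.check_call([sys.executable, "-m", "pip", "install", spec])
        return importlib.import_module(name)
```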

Then integrate pypi within the core site-packages, along with extending the
import syntax/mechanism to allow specification of version.

import foo version 1.1.1.1.etc.etc.etc.....
This would allow varying precision in imports, so 'version 1.1' would load
1.1.5 / 1.1.6 / 1.1.7 as they became available, whereas 'version 1.1.5'
would be fixed at 1.1.5.
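That varying-precision matching amounts to a simple prefix test on the
dotted version string, e.g. (a toy sketch, ignoring alphas, betas and the
like):

```python
def satisfies(installed, required):
    # True if 'installed' matches 'required' to the precision given:
    # required "1.1" accepts 1.1.5, 1.1.6, ...; required "1.1.5" accepts
    # only 1.1.5 (and deeper, e.g. 1.1.5.2).
    inst = installed.split(".")
    req = required.split(".")
    return inst[:len(req)] == req
```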


We could add other helper methods to pypi for installing packages, etc.

import pypi
avail_modules = pypi.search("foomodule","1.2.3")


== or ==
setup.py
----------
import pypi
pypi.install("foomodule", "1.2.3", autoresolveModuleDependencies=True)
print "Game on!"
-----------
pypi.pruneLocalCatalogue(last_imported_datetimestamp < now - 1 year)
# the import mechanism could touch a file per module to record usage, perhaps
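The touch-file idea might look something like this (entirely hypothetical,
file format and names included):

```python
import json
import os
import time

def record_use(module_name, path):
    # Called by the import mechanism: stamp the module's last-used time.
    data = {}
    if os.path.exists(path):
        with open(path) as f:
            data = json.load(f)
    data[module_name] = time.time()
    with open(path, "w") as f:
        json.dump(data, f)

def stale_modules(max_age_seconds, path):
    # Prune candidates: modules not imported within max_age_seconds.
    if not os.path.exists(path):
        return []
    with open(path) as f:
        data = json.load(f)
    cutoff = time.time() - max_age_seconds
    return sorted(m for m, t in data.items() if t < cutoff)
```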


> -----Original Message-----
> From: Richard Jones [mailto:richardjones at optushome.com.au] 
> Sent: 29 April 2005 10:52
> To: catalog-sig at python.org
> Subject: Re: [Catalog-sig] Package maintenance mechanism for Python
> 
> 
> On Wed, 27 Apr 2005 08:34 am, Maurice Ling wrote:
> > PyPI has a module upload mechanism. But is there a module download 
> > mechanism.
> 
> Not yet. You are welcome to write it.
> 
> 
> > By that, I meant a mechanism which system admins can use to
> > download and install a module (library)
> 
> This, as you already note, is trivial. Python's built-in 
> urllib and xmlrpclib 
> make it so.
> 
> 
> > and keep track of what is
> > installed into site-package directory through this mechanism?
> 
> PEP 262 is an attempt to address this:
> 
>   http://www.python.org/peps/pep-0262.html
> 
> A large amount of discussion exists around whether Python 
> *should* have such a 
> database, or whether we should leave such things up to the 
> Operating System 
> packaging mechanisms (rpm, deb, etc.) I fall in the latter 
> category at the 
> moment.
> 
> 
> > The problem I see is that maintaining 3rd party libraries in 
> > site-package is a task for system admins or developers, especially 
> > when new versions of Python is installed (say from Python 2.3 to 
> > Python 2.4).
> 
> The Ubuntu people have ideas about addressing this for 
> pure-python packages, 
> but they include massive amounts of symlinks, and just not 
> bothering with 
> trying to address cross-version issues when they hit a 
> non-pure-python 
> package (ie. one that has a C extension). I think. It's all 
> still a bit 
> vague :)
> 
> 
> > All of the libraries (esp those with C modules) needs to be 
> > recompiled. Is there a mechanism for the programmers to 
> find out what 
> > is installed in say, Python2.3's site-packages and download 
> the same 
> > modules from PyPI and install them in Python2.4's site-packages?
> 
> I believe PEP 246 is intended to have this kind of 
> information. Feel free to 
> read and flesh it out. And implement it.
> 
> 
>      Richard
> _______________________________________________
> Catalog-sig mailing list
> Catalog-sig at python.org 
> http://mail.python.org/mailman/listinfo/catalog-sig
> 

