[Python-Dev] sharing stdlib across python implementations

Olemis Lang olemis at gmail.com
Wed Sep 30 17:18:09 CEST 2009


On Wed, Sep 30, 2009 at 9:28 AM, Chris Withers <chris at simplistix.co.uk> wrote:
> Frank Wierzbicki wrote:
>>
>> Talk has started up again on the stdlib-sig list about finding a core
>> stdlib + tests that can be shared by all implementations, potentially
>> living apart from CPython.
>
[...]
>
> if the
> stdlib was actually a set of separate python packages with their own version
> metadata so that packaging tools could manage them, and upgrade them
> independently of python packages when there are bug fixes. If that were the
> case, then pure python packages in the stdlib, of which there are many,
> *really* could be used across python implementations with no changes
> whatsoever...
>

Nice! That's something I really liked about Python.NET :)

BTW, is there something like that for Java? I mean, using J2[SE]E
classes from CPython?

This could also be useful for building customized distributions. I mean,
if I want to implement a Python app that will run on devices with limited
capabilities, and let's say that it only imports the `socket` module (or
a few more ;o), then it would be easier to prepare a subset of the stdlib
so that only what is needed gets deployed on such devices, saving some
space ;o).

Projects like py2exe could also use something like that to extract the
relevant stdlib modules and packages and make them available to Windows
apps distributed as exe files (e.g. Hg).
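
Something along these lines can already be approximated with the stdlib's
own modulefinder; the sketch below is only my rough idea (the function name
and the pure-Python-only restriction are mine, nothing anybody has actually
proposed), copying the stdlib files an application really imports into a
separate directory:

import os
import shutil
from modulefinder import ModuleFinder
from distutils.sysconfig import get_python_lib  # newer Pythons also have sysconfig

def collect_stdlib_subset(script, target_dir):
    """Copy only the pure-Python stdlib modules that `script` imports."""
    stdlib_dir = get_python_lib(standard_lib=True)
    finder = ModuleFinder()
    finder.run_script(script)          # follows imports transitively
    for name, module in finder.modules.items():
        path = getattr(module, '__file__', None)
        # keep pure-Python files living inside the stdlib directory;
        # C extensions and third-party packages are ignored in this sketch
        if not (path and path.endswith('.py') and path.startswith(stdlib_dir)):
            continue
        dest = os.path.join(target_dir, os.path.relpath(path, stdlib_dir))
        if not os.path.isdir(os.path.dirname(dest)):
            os.makedirs(os.path.dirname(dest))
        shutil.copy(path, dest)

# e.g. collect_stdlib_subset('myapp.py', 'deploy/lib')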

Correct me if I'm wrong; perhaps I'm just dreaming.

However ...

> The big changes I can see from here would be moving the tests to the
> packages from the central tests directory, and adding a setup.py file or
> some other form of metadata provision for each package. Not that big now
> that I've written it ;-)
>

In this case I envision the following issues if one setup.py file is
generated for every module or top-level package (which, considering the
previous message, is how you plan to do it, isn't it?); a sketch of such a
file follows the list:

  - the maintenance effort might increase
  - what about dependencies between stdlib modules?
  - many attributes would take the same values for each and every package
    (e.g. version info, issue tracker, ...), while some would be
    package-specific (e.g. maintainer, author, contact info, dependencies, ...)
  - overhead whenever a new package is included in the stdlib (a `setup.py`
    script would need to be created and maintained, and so on ...)
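
Just to make the duplication concrete, a hypothetical per-package setup.py
(say Lib/logging/setup.py) might look like this; every value below is
illustrative, not something actually proposed:

from distutils.core import setup

setup(
    # fields that would genuinely differ from package to package
    name='stdlib-logging',                # made-up naming convention
    description='Logging facility for Python',
    maintainer='Vinay Sajip',             # illustrative value
    packages=['logging'],
    # fields that would carry identical values in every such file,
    # which is the duplication and maintenance cost I am worried about
    version='2.6.3',
    url='http://www.python.org/',
    license='PSF',
)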

So here are my $0.02 (a rough sketch of the single setup.py follows the list):

  - have a single `setup.py` file (in the end it's a script, isn't it?)
  - provide an argument to select the module(s) to be included in the source
    distribution (the full stdlib if the argument is missing). In general, any
    other way of parameterizing the `setup.py` may be fine too; the goal is
    to have only one
  - use a mechanism to specify configuration options for specific packages or
    modules, and make them available to the global `setup.py`. For example:
    * Specify metadata using top-level fields in the modules themselves
      (e.g. __author__, __maintainer__, ...)
    * Specify metadata using a separate INI file for each target
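
A very rough sketch of the single, parameterized setup.py I have in mind
(every name and value below is invented, and the option handling is
deliberately naive; a real script would hook into the distutils command
machinery and be run from the Lib/ directory):

import sys
import importlib                      # or plain __import__ on older Pythons
from distutils.core import setup

# crude handling of a hypothetical --stdlib-modules=a,b,c style option
selected = None
for i, arg in enumerate(sys.argv):
    if arg.startswith('--stdlib-modules='):
        selected = arg.split('=', 1)[1].split(',')
        del sys.argv[i]
        break

ALL_STDLIB = ['logging', 'json', 'unittest']    # placeholder for the full list
modules = selected if selected else ALL_STDLIB  # full stdlib if option missing

# pull per-module metadata from top-level fields when the module defines them
maintainers = []
for name in modules:
    mod = importlib.import_module(name)
    maintainers.append(getattr(mod, '__maintainer__',
                               getattr(mod, '__author__', 'python-dev')))

setup(
    name='python-stdlib-subset',
    version='2.6.3',                   # one shared version for everything
    packages=modules,                  # glosses over module vs. package
    maintainer=', '.join(sorted(set(maintainers))),
)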

What do you think?

There may be some issues with sdist anyway
:-/

PS: Will those packages be submitted to PyPI too? I mean, if not the
sdists, at least the metadata?

-- 
Regards,

Olemis.

Blog ES: http://simelo-es.blogspot.com/
Blog EN: http://simelo-en.blogspot.com/

Featured article:
The dutest module surpasses 300 downloads -
http://feedproxy.google.com/~r/simelo-es/~3/U5rff5iTQPI/sobrepasa-las-300-descargas-el-modulo.html

