Re: [Distutils] Disposition of C extensions and packages
On Dec 20, 12:06am, M.-A. Lemburg wrote:
> Subject: Re: [Distutils] Disposition of C extensions and packages
> "A.M. Kuchling" wrote:
> > Questions:
> > 1) A general Python question about packaging style: Is mixing C extensions and Python modules in one package tree a bad idea? It makes the whole tree platform-dependent, which is probably annoying for sites maintaining Python installations for different architectures.
> I have been using that setup for two years now with all of my mx extensions, and so far it has been working great.
> If you maintain packages with C extensions for several platforms, you can simply install the packages under the platform subdirs in /usr/local/lib/python1.5 -- one copy for every platform. Disk space is no argument anymore nowadays.
Agreed, disk space is not an issue, but file duplication is not very nice. I think this is a real weakness in Python: on the one hand we have platform-independent extensions, which are great; on the other hand we have native extensions when performance is required. This is, at least for me (in scientific computing), what makes Python such a great tool. BUT Python does not provide a mechanism to split a single package into platform-dependent and platform-independent parts. We maintain Python and a number of packages for IRIX, SunOS, DEC Alpha OSF, Win32 and Linux, and having to update 5 copies of each .py file at every bug fix is a real pain.

-Michel

-----------------------------------------------------------------------
>>>> AREA CODE CHANGE <<<< we are now 858!
Michel F. Sanner, Ph.D.          The Scripps Research Institute
Assistant Professor              Department of Molecular Biology
10550 North Torrey Pines Road    Tel. (858) 784-2341
La Jolla, CA 92037               Fax. (858) 784-2860
sanner@scripps.edu               http://www.scripps.edu/sanner
-----------------------------------------------------------------------
Michel Sanner wrote:
> On Dec 20, 12:06am, M.-A. Lemburg wrote:
> > Subject: Re: [Distutils] Disposition of C extensions and packages
> > "A.M. Kuchling" wrote:
> > > Questions:
> > > 1) A general Python question about packaging style: Is mixing C extensions and Python modules in one package tree a bad idea? It makes the whole tree platform-dependent, which is probably annoying for sites maintaining Python installations for different architectures.
> > I have been using that setup for two years now with all of my mx extensions, and so far it has been working great.
> > If you maintain packages with C extensions for several platforms, you can simply install the packages under the platform subdirs in /usr/local/lib/python1.5 -- one copy for every platform. Disk space is no argument anymore nowadays.
> Agreed, disk space is not an issue, but file duplication is not very nice. I think this is a real weakness in Python: on the one hand we have platform-independent extensions, which are great; on the other hand we have native extensions when performance is required. This is, at least for me (in scientific computing), what makes Python such a great tool. BUT Python does not provide a mechanism to split a single package into platform-dependent and platform-independent parts. We maintain Python and a number of packages for IRIX, SunOS, DEC Alpha OSF, Win32 and Linux, and having to update 5 copies of each .py file at every bug fix is a real pain.
One way to solve this is by editing the __init__.py module of the package containing the C extension and tweaking the __path__ global so that the correct shared modules for the importing platform are found. I've never tried this, but it should work...

Note that at least Linux .so files and Windows .pyd files can live happily side-by-side in one directory.

With distutils in place this issue should basically disappear altogether, since then the whole building process would be automated. I'm not sure whether distutils allows having different setups for different architectures, but given that it is written in Python, it should be possible to automate all aspects of multi-platform installations.

--
Marc-Andre Lemburg
______________________________________________________________________
Y2000: 11 days left
Business:     http://www.lemburg.com/
Python Pages: http://www.lemburg.com/python/
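[The __path__ trick MAL suggests could be sketched roughly as follows. This is a hypothetical helper, not code from the thread; in particular, naming the per-platform subdirectories after sys.platform (e.g. mypackage/linux2/) is an assumption.]

```python
# Hypothetical sketch of tweaking a package's __path__ so that
# platform-specific shared modules are found first.  Assumes the
# C extensions are installed under a subdirectory named after
# sys.platform, e.g. mypackage/linux2/foo.so, mypackage/win32/foo.pyd.
import os
import sys

def extend_path_for_platform(path):
    """Prepend <package-dir>/<sys.platform> to a __path__-style list
    if that subdirectory exists on disk, and return the list."""
    platdir = os.path.join(path[0], sys.platform)
    if os.path.isdir(platdir):
        path.insert(0, platdir)
    return path

# Inside a package's __init__.py this would be used as:
#     __path__ = extend_path_for_platform(__path__)
```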
[MAL]
One way to solve this is by editing the __init__.py module of the package containing the C extension and tweaking the __path__ global so that the correct shared modules for the importing platform are found. I've never tried this, but it should work...
Note that the idea of packages tweaking their __path__ is not very future-proof; one of the things under consideration elsewhere as a distribution mechanism is to place a group of modules in a zip archive (either the standard Python library, a package, or perhaps the library plus a set of packages needed by an application).

I think it's worth looking again into the issue of where package-specific shared libs should come from when the package itself is loaded from an archive.

--Guido van Rossum (home page: http://www.python.org/~guido/)
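[The zip-archive mechanism Guido describes postdates this thread, but it did eventually land in Python as zipimport. A minimal demonstration in today's Python, with a throwaway archive and module name invented for the example:]

```python
# Build a zip archive containing one pure-Python module, put the
# archive on sys.path, and import the module from inside it.
import os
import sys
import tempfile
import zipfile

tmpdir = tempfile.mkdtemp()
archive = os.path.join(tmpdir, "lib.zip")

# Write a tiny module into the archive.
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("greet.py", "def hello():\n    return 'hello from the archive'\n")

# Archives on sys.path are searched like directories.
sys.path.insert(0, archive)
import greet
print(greet.hello())
```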
Guido van Rossum writes:
I think it's worth looking again into the issue of where package-specific shared libs should come from when the package itself is loaded from an archive.
As well as in the existing case of packages with both Python and C
components. It simply doesn't make sense for a package containing
primarily cross-platform files to be installed in platform-specific
locations simply because the infrastructure doesn't understand the
split.
This is *not* an issue of disk space; I expect packages will appear
which include data as well as code; these packages should be able to
locate their associated data files using __path__ (or something
similar). This makes a lot of sense for packages that perform
character-set recoding and the like, where a large number of
translation files may be carried along as part of the package.
-Fred
--
Fred L. Drake, Jr.
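[Fred's point about packages locating their data files through __path__ could be sketched like this. The helper, the file name, and the recoding-package layout are hypothetical, purely for illustration:]

```python
# Search the directories on a package's __path__-style list for a
# bundled data file, the way a package might locate its own data.
import os

def find_package_data(package_path, filename):
    """Return the first existing <dir>/<filename> among the given
    directories, or None if no directory contains the file."""
    for directory in package_path:
        candidate = os.path.join(directory, filename)
        if os.path.isfile(candidate):
            return candidate
    return None

# A character-set recoding package could then load a translation
# table with, e.g.:
#     table = open(find_package_data(__path__, "koi8-r.map")).read()
```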
Guido van Rossum writes:
I think it's worth looking again into the issue of where package-specific shared libs should come from when the package itself is loaded from an archive.
[Fred Drake]
As well as in the existing case of packages with both Python and C components. It simply doesn't make sense for a package containing primarily cross-platform files to be installed in platform-specific locations simply because the infrastructure doesn't understand the split. This is *not* an issue of disk space; I expect packages will appear which include data as well as code; these packages should be able to locate their associated data files using __path__ (or something similar). This makes a lot of sense for packages that perform character-set recoding and the like, where a large number of translation files may be carried along as part of the package.
I claim that it *is* an issue of disk space. Having the installation of a particular package spread out over two places is inconvenient from a management point of view, and sharing one of those places between different installations (for different platforms) of the same package just makes it a lot worse.

Note that your example hinges on "a large number of translation files". That reeks of a disk space argument! (If you were thinking of network bandwidth or download time, there are other solutions that don't affect the install locations.)

--Guido van Rossum (home page: http://www.python.org/~guido/)
Guido van Rossum writes:
Note that your example hinges on "a large number of translation files". That reeks of a disk space argument! (If you were thinking of network bandwidth or download time, there are other solutions that don't affect the install locations.)
I concede the point.
My example was bad. If I make it "stuff that might be changed at
the site", I see that the right solution is to have a file outside the
package rather than inside it.
I still don't like having to have two copies of the Python files to
support multiple platforms for a couple of package modules, but that's
very different and not clearly anything other than personal
preference. I'm sure someone else can come up with a really good
argument against it, though.
-Fred
--
Fred L. Drake, Jr.
Participants (4)
- Fred L. Drake, Jr.
- Guido van Rossum
- M.-A. Lemburg
- Michel Sanner