Hi all,
I've been aware that the distutils sig has been simmering away, but
until recently it has not been directly relevant to what I do.
I like the look of the proposed api, but have one question. Will this
support an installed system that has multiple versions of the same
package installed simultaneously? If not, then this would seem to be a
significant limitation, especially when dependencies between packages
are considered.
Assuming it does, then how will this be achieved? I am presently
managing this with a messy arrangement of symlinks. A package is
installed with its version number in its name, and a separate
directory is created for an application, with links from the
unversioned package name to the versioned one. Then I just set the
PYTHONPATH to this directory.
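To make that concrete, the effect on the application side is roughly the
following (a rough sketch only; the path is hypothetical and the code itself
is made up):

    import sys

    # Each application gets its own directory of symlinks, mapping
    # unversioned package names onto specific versioned installs.
    APP_LIB = "/qad/lib/python/1.1"     # hypothetical path
    sys.path.insert(0, APP_LIB)

    import chart      # resolves to chart_1_1 via the symlink
    import Fnorb      # resolves to Fnorb_0_8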
A sample of what the directory looks like is shown below.
I'm sure there is a better solution than this, and I'm not sure that
this would work under Windows anyway (does Windows have symlinks?).
So, has this SIG considered such versioning issues yet?
Cheers,
Tim
--------------------------------------------------------------
Tim Docker timd(a)macquarie.com.au
Quantitative Applications Division
Macquarie Bank
--------------------------------------------------------------
qad16:qad $ ls -l lib/python/
total 110
drwxr-xr-x 2 mts mts 512 Nov 11 11:23 1.1
-r--r----- 1 root mts 45172 Sep 1 1998 cdrmodule_0_7_1.so
drwxr-xr-x 2 mts mts 512 Sep 1 1998 chart_1_1
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Fnorb_0_7_1
dr-xr-x--- 3 mts mts 512 Nov 11 11:21 Fnorb_0_8
drwxr-xr-x 3 mts mts 1536 Mar 3 12:45 mts_1_1
dr-xr-x--- 7 mts mts 512 Nov 11 11:22 OpenGL_1_5_1
dr-xr-x--- 2 mts mts 1024 Nov 11 11:23 PIL_0_3
drwxr-xr-x 3 mts mts 512 Sep 1 1998 Pmw_0_7
dr-xr-x--- 2 mts mts 512 Nov 11 11:21 v3d_1_1
qad16:qad $ ls -l lib/python/1.1
total 30
lrwxrwxrwx 1 root other 29 Apr 10 10:43 _glumodule.so -> ../OpenGL_1_5_1/_glumodule.so
lrwxrwxrwx 1 root other 30 Apr 10 10:43 _glutmodule.so -> ../OpenGL_1_5_1/_glutmodule.so
lrwxrwxrwx 1 root other 22 Apr 10 10:43 _imaging.so -> ../PIL_0_3/_imaging.so
lrwxrwxrwx 1 root other 36 Apr 10 10:43 _opengl_nummodule.so -> ../OpenGL_1_5_1/_opengl_nummodule.so
lrwxrwxrwx 1 root other 27 Apr 10 10:43 _tkinter.so -> ../OpenGL_1_5_1/_tkinter.so
lrwxrwxrwx 1 mts mts 21 Apr 10 10:43 cdrmodule.so -> ../cdrmodule_0_7_1.so
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 chart -> ../chart_1_1
lrwxrwxrwx 1 root other 12 Apr 10 10:43 Fnorb -> ../Fnorb_0_8
lrwxrwxrwx 1 mts mts 12 Apr 10 10:43 mts -> ../mts_1_1
lrwxrwxrwx 1 root other 15 Apr 10 10:43 OpenGL -> ../OpenGL_1_5_1
lrwxrwxrwx 1 root other 33 Apr 10 10:43 opengltrmodule.so -> ../OpenGL_1_5_1/opengltrmodule.so
lrwxrwxrwx 1 root other 33 Apr 10 10:43 openglutil_num.so -> ../OpenGL_1_5_1/openglutil_num.so
lrwxrwxrwx 1 root other 10 Apr 10 10:43 PIL -> ../PIL_0_3
lrwxrwxrwx 1 mts mts 10 Apr 10 10:43 Pmw -> ../Pmw_0_7
lrwxrwxrwx 1 root other 10 Apr 10 10:43 v3d -> ../v3d_1_1
Hi,
I've just started with Python and I discovered that the binding to GNU
readline is really basic. So about a week ago I started working on a binding
for it, and I have most of the work done. I need to finish the bindings for
the keymap functions and some of the functions in the completion part before
the package can be considered complete.
I started to investigate ways to package the work I've done and discovered
the distutils package. I'm trying to write a setup.py file, but I need the
following things and haven't figured out how to express them (a rough sketch
of the sort of setup.py I have in mind follows the list):
- the main C file is obtained by running the m4 program on a .m4 file. How can
I specify this dependency and the rule for generating the C file?
- I need to define a C preprocessor macro that contains the version of the
readline library. The way things are set up now, a configure script determines
the version of the library. Can I do this with setup.py?
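To give an idea of what I mean, here is the kind of setup.py I would like to
be able to write. I am only guessing at the names (Extension, define_macros,
and so on), so please read it as a sketch of the requirements rather than as
working code:

    # setup.py -- sketch only; the distutils names are my guesses
    import os
    from distutils.core import setup, Extension

    # 1) The main C file is generated from the .m4 source.  For now I
    #    just run m4 by hand before calling setup(); ideally distutils
    #    would know about this dependency and rule itself.
    if not os.path.exists("readline.c"):
        os.system("m4 readline.m4 > readline.c")

    # 2) The readline version, as determined by my configure script,
    #    passed to the compiler as a preprocessor macro.
    rl_version = '"4.0"'    # placeholder; really comes from configure

    setup(name="readline",
          version="0.1",
          ext_modules=[Extension("readline",
                                 sources=["readline.c"],
                                 define_macros=[("READLINE_VERSION",
                                                 rl_version)],
                                 libraries=["readline", "termcap"])])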
Thanks,
--
Ovidiu Predescu <ovidiu(a)cup.hp.com>
http://www.geocities.com/SiliconValley/Monitor/7464/
Just van Rossum wrote:
> foo.bar is registered as a "builtin" in config.c file as
>
> {"foo.bar", initbar},
>
> (Hm, this is problematic if there is a distinct global builtin module "bar")
Or if any other package has a module "bar"!
> find_module() should then first check sys.builtin_module_names with the
> full name before doing anything else. (probably only when it is confirmed
> that "foo" is a package.)
All that would be doable, but the real problem is the name of the init
function! Only one module can define a global symbol "initbar". So the
one for foo.bar would have to be called "initfoo.bar" (or something
similar). On the other hand, when the same module is used dynamically,
the init function must be called "initbar" again (unless the current
import mechanism is changed).
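To make the clash concrete (the file path is invented; this is how Python 1.5
resolves the symbol, as far as I can tell):

    import imp

    # Dynamic case: the loader strips the package prefix and looks for
    # the C symbol "initbar" inside the shared object.
    bar = imp.load_dynamic("foo.bar",
                           "/usr/local/lib/python1.5/foo/barmodule.so")

    # Static case: config.c would contain the entry
    #     {"foo.bar", initbar},
    # so the module must also export a global C symbol "initbar",
    # and only one module in the whole binary may do that.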
Konrad.
--
-------------------------------------------------------------------------------
Konrad Hinsen | E-Mail: hinsen(a)cnrs-orleans.fr
Centre de Biophysique Moleculaire (CNRS) | Tel.: +33-2.38.25.55.69
Rue Charles Sadron | Fax: +33-2.38.63.15.17
45071 Orleans Cedex 2 | Deutsch/Esperanto/English/
France | Nederlands/Francais
-------------------------------------------------------------------------------
> M.-A. Lemburg <mal(a)lemburg.com> writes:
>[Problem with dynamic extensions in packages being platform dependent]
>
>I haven't followed the thread too closely, but was alarmed by
>the recent proposals of splitting .so files out of the "normal"
>package distribution under a separate dir tree.
Fair enough. There's the GNU configure view of life, where $prefix and
$exec_prefix are separate directories, and there is the perl view of life
($PERL_ARCHLIB is usually a subdirectory of the install directory). M.-A.
prefers the perl-ish approach. Fine with me, as long as we do it explicitly.
> (you can't have two top-level packages with the same name
>on the path: only the first one on the path will be used).
That's only because that's how it's done today. Just a matter of some
code...(and the thought and design behind it).
> [ snipped scheme for having packages do the platform specific import ]
I'd rather not burden the package writer. I think it's better to include the
batteries for this one.
> [recommendation that you just have a different install dir for each platform]
> Disk space is no argument nowadays
Ease of maintenance is the overriding argument here. The .py files are the
same for all platforms, so why do I want different copies of those files when
I have Python installed for three platforms?
> it's likely that different platforms need different Setup files anyway.
But that's a platform-dependent file which goes in $INSTALL_ARCHLIB.
In short, I think we need to get this infrastructure into Python itself to
make life easier for package authors. But then I'm probably preaching to the
choir.
-Perry
> I think in this case the only problem is getting all the right
>directories on the package's __path__; am I missing something? It
>avoids the need for the "conditional" import, but that's largely
>separate.
Fred,
Good point. Can you recommend a concise place where the import mechanism (in
all its glory) is documented?
That should solve the problem, except when freezing or making a static
Python binary (as Konrad mentioned earlier).
I was poking around in ihooks.py. It looks like it should be possible to
cook up something approximating this using ihooks. What do you think?
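Short of a full ihooks-based importer, I suppose a package could fake the
multi-directory __path__ itself in its __init__.py. Something like this rough
sketch (the package name and layout are invented):

    # mypkg/__init__.py -- extend the package's __path__ by hand
    import sys, os

    for _dir in sys.path:
        _cand = os.path.join(_dir, "mypkg", "plat-" + sys.platform)
        if os.path.isdir(_cand) and _cand not in __path__:
            # platform-specific extension modules installed here become
            # importable as submodules of mypkg
            __path__.append(_cand)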
-Perry
>> The problem here is that the package's __path__ is not being created
>> this way now; if anyone has time to work on a patch for
>> Python/import.c, I'd be glad to help test it! ;-)
>>
>I take care of this in my extension module. If I have a platform-dependent
>implementation of a module, I do:
>
>try:
>    import extension                # the platform-specific build
>except ImportError:
>    import common                   # fall back to the portable version
I don't think this is the case that's causing the problem. The problem is when
a submodule of a package is *always* platform dependent (because, for example,
it interfaces to another library).
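For that case, the best I can picture is the string/strop arrangement Fred
described: a small pure-Python module is the public name, and it pulls its
guts from whichever compiled build gets found on the package's __path__.
Roughly (all names invented):

    # mypkg/interface.py -- the public, platform-independent module.
    # The compiled half, mypkg._interface, is built separately for each
    # platform and found via the package's __path__, the same way that
    # string.py picks up its fast functions from strop.
    from _interface import *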
-Perry
On May 21, 11:11am, Fred L. Drake wrote:
> Subject: [Distutils] Re: [PSA MEMBERS] packages in Python
>
> Konrad Hinsen writes:
> > Shouldn't that be the other way round? I'd expect to be able to override
> > general modules by platform-specific modules.
>
> Greg Ward and I were talking about this stuff the other day, and I
> think we decided that there was no good way to have multiple
> implementations of a module installed such that the platform dependent
> version was sure to take precedence over a platform independent
> version; this relies on the sequence of directories in the relevant
> search path (whether it be sys.path or a package's __path__).
> The general solution seems to be that two things need to be done: a
> package's __path__ needs to include *all* the appropriate directories
> found along the search path, not just the one holding __init__.py*,
> AND the platform dependent modules should have different names from
> the platform independent modules. The platform independent module
> should be the public interface, and it can load platform dependent
> code the same way that string loads functions from strop.
> The problem here is that the package's __path__ is not being created
> this way now; if anyone has time to work on a patch for
> Python/import.c, I'd be glad to help test it! ;-)
>
I take care of this in my extension module. If I have a platform-dependent
implementation of a module, I do:
try:
    import extension                # the platform-specific build
except ImportError:
    import common                   # fall back to the portable version
This requires a minimal amount of coding. Personally I did not see the need
for this to be automatic.
-Michel
On May 21, 5:07pm, Konrad Hinsen wrote:
>
> Which makes me wonder how others develop extension modules: I always
> use a debugger at some point, and I haven't yet found one which lets
> me set breakpoints in dynamic libraries that haven't been loaded yet!
>
After you import the .so you can set a breakpoint. I do that all the time on
my SGI under dbx or cvd... no problem.
I also have about 10 extension modules, all of which use .so files!
-Michel
On May 21, 10:24am, M.-A. Lemburg wrote:
> Subject: Re: [PSA MEMBERS] packages in Python
> [Problem with dynamic extensions in packages being platform dependent]
>
> I haven't followed the thread too closely, but was alarmed by
> the recent proposals of splitting .so files out of the "normal"
> package distribution under a separate dir tree. This is really
> not such a good idea because it would cause the package information
> stored in the extension module to be lost (you can't have two
> top-level packages with the same name on the path: only the first one
> on the path will be used).
>
> Here is the scheme I would use: create a subpackage for the
> extension and have it take care of importing the correct
> shared lib for the platform Python is currently running on.
> The libs themselves could be placed in plat-<platform> subdirs
> of that subpackage and the __init__.py would then load the
> shared lib using either a sys.path+__import__() hack or
> thread safe via imp.load_dynamic().
>
> An even simpler solution is installing the whole package under
> .../python1.5/plat-<platform> separately for each supported
> platform rather than putting it under site-packages. [Disk space
> is no argument nowadays and it's likely that different platforms
> need different Setup files anyway.]
>
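Just so we are talking about the same thing, I read the __init__.py part of
that scheme as roughly the following (names and platform tag invented,
untested):

    # mypkg/_ext/__init__.py -- sketch of the subpackage-per-extension idea
    import sys, os, imp

    _here = os.path.dirname(__file__)
    _so = os.path.join(_here, "plat-" + sys.platform, "_extmodule.so")

    # imp.load_dynamic() is the thread-safe variant mentioned above;
    # the sys.path + __import__() hack would do the same job.
    _ext = imp.load_dynamic("mypkg._ext._ext", _so)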
As someone who maintains Python for several unix-based architectures, I am not
concerned with disk space but with file duplication, with the obvious risk of
the copies running out of sync.
Also, the plat-<platform> scheme is far from being able to capture the
complexity of this world. SGI alone has 3 ABIs (o32, n32, n64), multiplied by
the MIPS1, MIPS3, and MIPS4 instruction sets, times IRIX 5.x, 6.2, 6.3, 6.4,
and 6.5, and many of these combinations are incompatible!
Finally, why have a $prefix and an $exec_prefix if they are not used to split
platform-dependent stuff from platform-independent stuff?
And we should really take this to the distutils-sig :)
-Michel