RE: [Python-Dev] AIX shared library support
Bill Tutt wrote:
Is there any particular reason that dynload_aix.c doesn't use AIX's dlopen?
From what I can tell, libtool seems to use dlopen on AIX 3.xxx and AIX 4.xxx.
I was all set to integrate the \N{...} support into the Unicode-escape encoding, and dynamically load the hash table data in on demand, but dynload_aix only cares about module entry points and won't let you specify an explicit function name.
MAL wrote:
Uhm, why do you need to go that deep into the internals of the import mechanism?
Wouldn't it suffice to put the hash table into a _ucnhash module which gets imported on demand (the import mechanism would then take care of the rest)?
It would suffice, but it'd be less efficient (and more work), since I'd then have to access the data/hash functions through the normal PyObject mechanisms instead of directly as C functions. That is, I'm trying to avoid creating a PyString out of whatever is inside the braces (e.g. \N{SMILEY}). Then again, maybe I'm just being too performance sensitive.

The import mechanism got redone weirdly: all of the ways of doing dynamic loading except AIX encode the idea that the function name must be preceded by "init". dynload_xxx.c is the wrong place to put that requirement; whoever calls _PyImport_GetDynLoadFunc() should handle that issue.

Just as an FYI, the only problem I've found so far with my last patch was wrt handling Unicode characters with values in the UCS-4 code space. (Not that there are any yet, but the code must handle that case anyway, since I don't want to have to go change it later.)

Bill
Bill Tutt wrote:
MAL wrote:
Bill Tutt wrote:
Is there any particular reason that dynload_aix.c doesn't use AIX's
dlopen?
From what I can tell, libtool seems to use dlopen on AIX 3.xxx and AIX 4.xxx.
I was all set to integrate the \N{...} support into the Unicode-escape encoding, and dynamically load the hash table data in on demand, but dynload_aix only cares about module entry points and won't let you specify an explicit function name.
Uhm, why do you need to go that deep into the internals of the import mechanism?
Wouldn't it suffice to put the hash table into a _ucnhash module which gets imported on demand (the import mechanism would then take care of the rest)?
It would suffice, but it'd be less efficient (and more work), since I'd then have to access the data/hash functions through the normal PyObject mechanisms instead of directly as C functions. That is, I'm trying to avoid creating a PyString out of whatever is inside the braces (e.g. \N{SMILEY}). Then again, maybe I'm just being too performance sensitive.
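For context, the \N{...} support being discussed here did ship, backed by exactly this kind of name table; the same data is also exposed through the unicodedata module. A quick illustration in modern Python:

```python
import unicodedata

# \N{...} resolves a character name to a code point at compile time.
s = "\N{WHITE SMILING FACE}"
assert s == "\u263a"

# The same name table is reachable at runtime via unicodedata.
assert unicodedata.lookup("WHITE SMILING FACE") == s
assert unicodedata.name("\u263a") == "WHITE SMILING FACE"
```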
You could pass a pointer to the char* via a PyCObject. I usually export the whole module C API via such an object and it works great: you can rely on module import to get the magic right and still have a C API around which you can call directly from C -- without the need to wrap anything in Python objects. FYI, mxODBC and mxDateTime are tied together using this mechanism; look in the mxDateTime.h|c files for details.
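MAL's pattern of exporting a whole C API through a single Python object, so that a normal import does the linking work and C callers can then invoke the function pointers directly, is still how CPython ties extension modules together today. PyCObject was later superseded by PyCapsule, and the datetime module is a visible example of the same idea:

```python
import datetime

# The datetime module publishes its C API as a capsule attribute.
# C consumers fetch it with PyCapsule_Import("datetime.datetime_CAPI", 0)
# and then call the wrapped function pointers directly, with no
# per-call PyObject boxing -- the efficiency Bill is after.
capi = datetime.datetime_CAPI
assert type(capi).__name__ == "PyCapsule"
```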
The import mechanism got redone weirdly: all of the ways of doing dynamic loading except AIX encode the idea that the function name must be preceded by "init". dynload_xxx.c is the wrong place to put that requirement; whoever calls _PyImport_GetDynLoadFunc() should handle that issue.
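The convention Bill is objecting to can be sketched as a tiny helper; `init_symbol` is a hypothetical name for illustration, but the string it builds matches what the dynload_*.c loaders of the era looked up (Python 3 later changed the prefix to PyInit_):

```python
def init_symbol(modname):
    # Hypothetical illustration: each dynamic loader looked for an
    # exported function named "init" + the last component of the
    # module name, e.g. "initmath" for the math module.
    base = modname.rpartition(".")[2]
    return "init" + base

assert init_symbol("math") == "initmath"
assert init_symbol("package._ucnhash") == "init_ucnhash"
```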
That's to Greg's address, I guess ;-)
Just as an FYI, the only problem I've found so far with my last patch was wrt handling Unicode characters with values in the UCS-4 code space. (Not that there are any yet, but the code must handle that case anyway, since I don't want to have to go change it later.)
Why is there a problem there? Python currently uses UTF-16 as its native format. UTF-16 surrogates are not supported, though... and probably won't be for a while. I don't see much of a problem here.

--
Marc-Andre Lemburg
Business: http://www.lemburg.com/
Python Pages: http://www.lemburg.com/python/
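The UCS-4 case Bill raises is exactly where UTF-16 surrogates come in: a code point above U+FFFF cannot fit in one 16-bit unit and must be split into a high/low surrogate pair. A minimal sketch of the standard arithmetic:

```python
def to_surrogate_pair(cp):
    # Split a code point outside the BMP (> U+FFFF) into a UTF-16
    # high/low surrogate pair, per the Unicode standard.
    assert 0xFFFF < cp <= 0x10FFFF
    v = cp - 0x10000
    return 0xD800 + (v >> 10), 0xDC00 + (v & 0x3FF)

# U+10000, the first supplementary code point:
assert to_surrogate_pair(0x10000) == (0xD800, 0xDC00)
# U+1F600 (a smiley, fittingly) encodes as D83D DE00:
assert to_surrogate_pair(0x1F600) == (0xD83D, 0xDE00)
```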
participants (2)
- Bill Tutt
- M.-A. Lemburg