Re: 2.1a2 release issues; mail.python.org still down

[Pushing this to python-dev w/out M-A's permission, now that mail is starting to flow again.] M.-A. Lemburg writes:
Another issue: importing old extensions now causes a core dump due to the new slots for weak refs being written to.
I think(!) this should only really affect modules from 1.5.? and earlier; type objects compiled after tp_xxx7/tp_xxx8 were added *should not* have a problem with this. You don't give enough information for me to be sure. Please let me know more if I'm wrong (possible!). The only way I can see that there would be a problem like this is if the type object contains a positive value for the tp_weaklistoffset field (formerly tp_xxx8).
Solution: in addition to printing a warning, the _PyModule_Init() APIs should ignore all modules having an API level < 1010.
For the specific problem you mention, we could add a type flag (Py_TPFLAGS_HAVE_WEAKREFS) that could be tested; it would be set in Py_TPFLAGS_DEFAULT. On the other hand, I'd be perfectly happy to "ignore" modules with the older C API version (especially if "ignore" lets me call Py_FatalError()!). The API version changed because of the changes to the function signatures of PyCode_New() and PyFrame_New(); these both require additional parameters in API version 1010.
-Fred
-- Fred L. Drake, Jr. <fdrake at acm.org> PythonLabs at Digital Creations
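[Editor's note: the flag idea above can be sketched roughly as follows. This is a minimal mock, not CPython source: MockTypeObject stands in for the tail of PyTypeObject, and the bit value chosen for the flag is illustrative; in the real headers the flag would live in object.h and be folded into Py_TPFLAGS_DEFAULT.]

```c
/* Illustrative bit value; the real flag would be defined in object.h
   and included in Py_TPFLAGS_DEFAULT so new types get it for free. */
#define Py_TPFLAGS_HAVE_WEAKREFS (1L << 6)

/* Mock stand-in for the relevant tail of PyTypeObject. */
typedef struct {
    long tp_flags;
    long tp_weaklistoffset;  /* formerly the spare slot tp_xxx8 */
} MockTypeObject;

/* Only trust tp_weaklistoffset when the type was compiled against
   headers that actually define the weakref slot; for an old extension
   the slot may contain garbage, so treat it as "no weakref support". */
static long weaklist_offset(const MockTypeObject *tp)
{
    if (tp->tp_flags & Py_TPFLAGS_HAVE_WEAKREFS)
        return tp->tp_weaklistoffset;
    return 0;
}
```

The point of the flag test is that an old type object simply never gets its weakref slot touched, instead of the interpreter writing through a garbage offset.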

"Fred L. Drake, Jr." wrote:
I've only tested these using my mx tools compiled against 1.5 -- really old, I know, but I still actively use that version. tp_xxx7/8 were added in Python 1.5.2, I think, so writing to them causes the core dump.
That would work, but is it really worth it? The APIs have changed considerably, so the fact that I got away with a warning in Python 2.0 doesn't really mean anything -- I do have a problem now, though, since maintaining versions for 1.5, 1.5.2, 2.0 and 2.1 will be a pain :-/
Py_FatalError() is a bit too harsh, I guess. Wouldn't it suffice to raise an ImportError exception and have Py_InitModule() return NULL in case a module with an incompatible API version is encountered? BTW, what happened to the same problem on Windows? Do users still get a seg fault?
-- Marc-Andre Lemburg
______________________________________________________________________
Company: http://www.egenix.com/ Consulting: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/
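[Editor's note: the ImportError-instead-of-Py_FatalError() proposal reduces to something like the following mock sketch. init_module and last_error are hypothetical stand-ins for Py_InitModule() and the interpreter's exception state; PYTHON_API_VERSION is the real macro name, with 1010 being the value discussed in this thread.]

```c
#include <stddef.h>

#define PYTHON_API_VERSION 1010  /* the version discussed above */

/* Hypothetical stand-in for the interpreter's exception state. */
static const char *last_error = NULL;

static int module_placeholder;  /* placeholder for a real module object */

/* Sketch of the proposal: instead of Py_FatalError() or a mere
   warning, refuse the import by returning NULL and leaving an
   ImportError-style message for the import machinery to raise. */
static void *init_module(const char *name, int module_api_version)
{
    (void)name;
    if (module_api_version < PYTHON_API_VERSION) {
        last_error = "ImportError: extension compiled against an older C API";
        return NULL;
    }
    last_error = NULL;
    return &module_placeholder;
}
```

An extension compiled against 1.5-era headers would then fail its import cleanly rather than crashing later when its type objects are used.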

M.-A. Lemburg writes:
Yep. I said:
M-A replied:
That would work, but is it really worth it? The APIs have changed considerably, so the fact that I got away with a warning in Python 2.0
No, which is why I'm happy to tell you to recompile your extensions.
Unless you're using PyCode_New() or PyFrame_New(), recompiling the extension should be all you'll need -- unless you're pulling stunts like ExtensionClass does (defining a type-like object using an old definition of PyTypeObject). If any of the functions you're calling have changed signatures, you'll need to update them anyway. The weakref support will not cause you to change your code unless you want to be able to refer to your extension types via weak refs.
I suppose we could do that, but it'll take more than my agreement to make that happen. Guido seemed to think that few modules would be calling PyCode_New() and PyFrame_New() directly (pyexpat being the exception).
-Fred
-- Fred L. Drake, Jr. <fdrake at acm.org> PythonLabs at Digital Creations

"Fred L. Drake, Jr." wrote:
The problem is not recompiling the extensions, it's that I will have to keep compiled versions around for all versions I have installed on my machine.
The warnings are at least as annoying as recompiling the extensions, even more so since each and every imported extension will moan about the version difference ;-)
-- Marc-Andre Lemburg
______________________________________________________________________
Company: http://www.egenix.com/ Consulting: http://www.lemburg.com/ Python Pages: http://www.lemburg.com/python/

Hey, here's a suggestion for a solution then: change the warning-issuing code to use the new PyErr_Warn() function! Patch gratefully accepted on SourceForge. Now, note that using "python -Werror" the user can cause these warnings to be turned into errors, and since few modules test for error returns from Py_InitModule(), this will likely cause core dumps. However, note that there are other reasons why Py_InitModule() can return NULL, so it really behooves us to test for an error return anyway!
--Guido van Rossum (home page: http://www.python.org/~guido/)
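[Editor's note: Guido's suggestion can be sketched with another hedged mock. warn() stands in for PyErr_Warn(), which in the real API also takes a warning category and returns -1 when the warning has been turned into an error; the werror variable models the effect of "python -Werror"; init_module_checked is a hypothetical name.]

```c
#include <stdio.h>

static int werror = 0;  /* models the -Werror command-line flag */

/* Stand-in for PyErr_Warn(): returns -1 when the warning is promoted
   to an error, 0 when it was merely printed. */
static int warn(const char *message)
{
    if (werror)
        return -1;
    fprintf(stderr, "warning: %s\n", message);
    return 0;
}

static int checked_module;  /* placeholder for a real module object */

/* If the warning becomes an error, return NULL -- which is exactly why
   callers of Py_InitModule() should check its result even today. */
static void *init_module_checked(const char *name, int module_api_version)
{
    (void)name;
    if (module_api_version != 1010 && warn("API version mismatch") < 0)
        return NULL;
    return &checked_module;
}
```

Under the default filter the mismatch is only reported; with -Werror it becomes an import failure, so the NULL check in the caller is what keeps the promoted warning from turning into a core dump.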

participants (3)
- Fred L. Drake, Jr.
- Guido van Rossum
- M.-A. Lemburg