Hello there.

I'm sure this has come up before, but here it is again: Python exports a different API in debug mode, depending on whether PYMALLOC_DEBUG and WITH_PYMALLOC are defined. This means that any _d.pyd files used must have been compiled with a version of Python using the same settings for these macros. It is unfortunate that the _PyObject_DebugMalloc() API is exposed to external applications through macros in objimpl.h.

I would suggest two things:

1) Provide dummy or thunking versions of those functions in builds that don't have PYMALLOC_DEBUG implemented, which thunk to PyObject_Malloc() et al. (This is what we have done at CCP.)

2) Remove _PyObject_DebugMalloc() from the API. It really should be an implementation detail of the exposed PyObject_Malloc() functions whether they use debug functionality at all. _PyObject_DebugCheckAddress() and _PyObject_DebugDumpAddress() can be left in place. But exposing this functionality in macros that external modules compile in is not good at all.

The reason why this is annoying: some external software comes with proprietary .pyd bindings. When developing my own application with modified preprocessor definitions (e.g. to turn off PYMALLOC_DEBUG), we find that those externally provided libraries don't work. It takes a fair amount of detective work to find out why exactly linkage fails. The external API really shouldn't change depending on preprocessor definitions.

Cheers,
K
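P.S. For reference, the selection logic in objimpl.h is roughly the following (paraphrased from the 2.x header, not copied verbatim). Which symbol PyObject_MALLOC expands to is baked into every extension module at its compile time, which is exactly the problem:

    /* paraphrase of objimpl.h -- not the verbatim header */
    #ifdef WITH_PYMALLOC

    #ifdef PYMALLOC_DEBUG
    PyAPI_FUNC(void *) _PyObject_DebugMalloc(size_t nbytes);
    PyAPI_FUNC(void *) _PyObject_DebugRealloc(void *p, size_t nbytes);
    PyAPI_FUNC(void)   _PyObject_DebugFree(void *p);
    #define PyObject_MALLOC   _PyObject_DebugMalloc
    #define PyObject_REALLOC  _PyObject_DebugRealloc
    #define PyObject_FREE     _PyObject_DebugFree
    #else
    #define PyObject_MALLOC   PyObject_Malloc
    #define PyObject_REALLOC  PyObject_Realloc
    #define PyObject_FREE     PyObject_Free
    #endif

    #else   /* !WITH_PYMALLOC */
    #define PyObject_MALLOC   PyMem_MALLOC
    #define PyObject_REALLOC  PyMem_REALLOC
    #define PyObject_FREE     PyMem_FREE
    #endif

So an extension built with PYMALLOC_DEBUG calls _PyObject_DebugMalloc directly, and fails to load against a pythonxx_d.dll that doesn't export it.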
Some external software comes with proprietary .pyd bindings.
Can you please explain what a "proprietary .pyd binding" is? Do you mean they come with extension modules? If so, there is no chance of using them in debug mode, anyway, right? So what specifically is the problem? Regards, Martin
What I mean is that a third party software vendor supplies foobarapp.pyd and foobarapp_d.pyd DLLs that link to python2x.dll and python2x_d.dll respectively. But the latter will have been compiled to match certain settings of the objimpl.h header, which may not match whatever is being used to build the local python2x_d.dll. And thus, you get strange and hard-to-debug linker errors when trying to load external libraries.

When developing superapp.exe, which uses a custom build of python2x, perhaps even embedded, python2x_d.dll is used extensively both during the development process and the testing process. This is why foobarapp_d.pyd is necessary and why it is supplied by any sensible vendor providing opaque python extensions. But the current objimpl.h api makes it a matter of developer choice whether that foobarapp_d.pyd will successfully link with your python2x_d.dll or not.

IMHO, it is not good practice to expose an API that changes depending on preprocessor settings like this.

K
-----Original Message----- From: "Martin v. Löwis" [mailto:martin@v.loewis.de] Sent: 14. júní 2010 22:13 To: Kristján Valur Jónsson Cc: python-dev@python.org Subject: Re: [Python-Dev] debug and release python
Some external software comes with proprietary .pyd bindings.
Can you please explain what a "proprietary .pyd binding" is?
Do you mean they come with extension modules? If so, there is no chance of using them in debug mode, anyway, right? So what specifically is the problem?
Regards, Martin
On 15.06.2010 14:48, Kristján Valur Jónsson wrote:
What I mean is that a third party software vendor supplies foobarapp.pyd and foobarapp_d.pyd DLLs that link to python2x.dll and python2x_d.dll respectively. But the latter will have been compiled to match certain settings of the objimpl.h header, which may not match whatever is being used to build the local python2x_d.dll. And thus, you get strange and hard-to-debug linker errors when trying to load external libraries.
Ok. But your proposed change doesn't fix that, right? I.e. even with the change, it would *still* depend on objimpl.h (and other) settings what ABI this debug DLL exactly has. So I think this problem can't really be fixed. Instead, you have to trust that the vendor did the most sensible thing when building foobarapp.pyd, namely activating *just* the debug build. Then, if you do the same, it will interoperate just fine.
IMHO, it is not good practice to expose an API that changes depending on preprocessor settings like this.
But there are tons of ABI changes that may happen in a debug build. If you want to cope with all of them, you really need to recompile the source of all extensions. Regards, Martin
-----Original Message----- From: "Martin v. Löwis" [mailto:martin@v.loewis.de] Sent: 15. júní 2010 21:20 To: Kristján Valur Jónsson Cc: python-dev@python.org Subject: Re: [Python-Dev] debug and release python
On 15.06.2010 14:48, Kristján Valur Jónsson wrote:
What I mean is that a third party software vendor supplies foobarapp.pyd and foobarapp_d.pyd DLLs that link to python2x.dll and python2x_d.dll respectively. But the latter will have been compiled to match certain settings of the objimpl.h header, which may not match whatever is being used to build the local python2x_d.dll. And thus, you get strange and hard-to-debug linker errors when trying to load external libraries.
Ok. But your proposed change doesn't fix that, right?
I.e. even with the change, it would *still* depend on objimpl.h (and other) settings what ABI this debug DLL exactly has.
I think it does. My proposal was perhaps not clear:

For existing python versions, always export _PyObject_DebugMalloc et al., irrespective of the WITH_PYMALLOC and PYMALLOC_DEBUG settings. (PyObject_Malloc() is always exported, even for builds without WITH_PYMALLOC.)

On new python versions, remove _PyObject_DebugMalloc from the ABI. Make the switch internal to obmalloc.c, so that you can turn on the debug library by recompiling pythonxx_d.dll only (currently, you have to recompile the .pyd files too!).
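Roughly what I have in mind, as a purely illustrative sketch (this is not actual CPython or our code; the guard-byte scheme is invented just for the example):

    #include <stdlib.h>
    #include <string.h>

    /* Only PyObject_Malloc/PyObject_Free would be exported; whether they
       do any debug bookkeeping is decided when the interpreter DLL itself
       is compiled, so extension modules always bind to the same symbols. */

    #define GUARD 0xFB   /* marker byte written in the padding */
    #define PAD   8      /* bytes of padding in front of each block */

    void *
    PyObject_Malloc(size_t nbytes)
    {
    #ifdef PYMALLOC_DEBUG
        /* debug flavour: prepend guard bytes to catch buffer underruns */
        unsigned char *base = (unsigned char *)malloc(PAD + nbytes);
        if (base == NULL)
            return NULL;
        memset(base, GUARD, PAD);
        return base + PAD;
    #else
        /* release flavour: plain allocation */
        return malloc(nbytes);
    #endif
    }

    void
    PyObject_Free(void *ptr)
    {
    #ifdef PYMALLOC_DEBUG
        unsigned char *base;
        int i;
        if (ptr == NULL)
            return;
        base = (unsigned char *)ptr - PAD;
        for (i = 0; i < PAD; i++)
            if (base[i] != GUARD)
                abort();   /* something scribbled over the guard bytes */
        free(base);
    #else
        free(ptr);
    #endif
    }

The point is that the exported symbols are identical in both cases; only the function bodies change.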
But there are tons of ABI changes that may happen in a debug build. If you want to cope with all of them, you really need to recompile the source of all extensions.

Are there? Can you give me an example? I thought we were careful to keep the interface shown to pyd files constant regardless of configuration settings.
K
I.e. even with the change, it would *still* depend on objimpl.h (and other) settings what ABI this debug DLL exactly has.
I think it does. My proposal was perhaps not clear: For existing python versions, always export _PyObject_DebugMalloc et al.
Hmm. That's still not clear. What are "existing Python versions"? You can't change them anymore; any change can only affect future, as-of-yet-non-existing Python versions. Also, what do you mean by "always"? Even in release builds? Would this really help? You shouldn't be mixing PyObject_DebugMalloc and PyObject_Malloc in a single process.
On new python versions, remove the _PyObject_DebugMalloc from the ABI. Make the switch internal to obmalloc.c, so that you can turn on the debug library by recompiling pythonxx_d.dll only (currently, you have to recompile the .pyd files too!)
That sounds fine.
But there are tons of ABI changes that may happen in a debug build. If you want to cope with all of them, you really need to recompile the source of all extensions.

Are there? Can you give me an example?
If you define Py_TRACE_REFS, every object has two additional pointers, which aren't there if you don't. So extensions compiled with it are incompatible with extensions compiled without it. If you define COUNT_ALLOCS, every type object will have additional slots; again, you can't mix extensions that have a different setting here than the interpreter. Regards, Martin
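P.S. Paraphrased (not verbatim) from object.h and the PyTypeObject definition, to show what changes:

    /* With Py_TRACE_REFS every object header grows by two pointers,
       so the memory layout of *every* object differs: */
    #ifdef Py_TRACE_REFS
    #define _PyObject_HEAD_EXTRA        \
        struct _object *_ob_next;       \
        struct _object *_ob_prev;
    #else
    #define _PyObject_HEAD_EXTRA
    #endif

    #define PyObject_HEAD               \
        _PyObject_HEAD_EXTRA            \
        Py_ssize_t ob_refcnt;           \
        struct _typeobject *ob_type;

    /* With COUNT_ALLOCS the type object gains extra trailing fields
       (fragment of the tail of PyTypeObject): */
    #ifdef COUNT_ALLOCS
        Py_ssize_t tp_allocs;
        Py_ssize_t tp_frees;
        Py_ssize_t tp_maxalloc;
        struct _typeobject *tp_prev;
        struct _typeobject *tp_next;
    #endif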
2010/6/14 Kristján Valur Jónsson <kristjan@ccpgames.com>:
Hello there.
I'm sure this has come up before, but here it is again:
Python exports a different API in debug mode, depending on whether PYMALLOC_DEBUG and WITH_PYMALLOC are defined. This means that _d.pyd files that are used must have been compiled with a version of Python using the same settings for these macros. It is unfortunate that the _PyObject_DebugMalloc() API is exposed to external applications using macros in objimpl.h.
I would suggest two things:
1) Provide dummy or thunking versions of those functions in builds that don't have PYMALLOC_DEBUG implemented, which thunk to PyObject_Malloc() et al. (This is what we have done at CCP.)
2) Remove _PyObject_DebugMalloc() from the API. It really should be an implementation detail of the exposed PyObject_Malloc() functions whether they use debug functionality at all. _PyObject_DebugCheckAddress() and _PyObject_DebugDumpAddress() can be left in place. But exposing this functionality in macros that external modules compile in is not good at all.
The reason why this is annoying:
Some external software comes with proprietary .pyd bindings. When developing my own application with modified preprocessor definitions (e.g. to turn off PYMALLOC_DEBUG), we find that those externally provided libraries don't work. It takes a fair amount of detective work to find out why exactly linkage fails. The external API really shouldn't change depending on preprocessor definitions.
I remember having the same issue years ago: http://mail.python.org/pipermail/python-list/2004-July/855844.html

At the time, I solved the issue by compiling extension modules with pymalloc options turned on (which is fortunately the default, so this applies to the supplied proprietary .pyd), and I added a (plain) definition for functions like _PyObject_DebugMalloc, even when PYMALLOC_DEBUG is undefined. Since the python_d.dll is a custom build anyway, adding the code is not too much pain.

-- Amaury Forgeot d'Arc
-----Original Message----- From: Amaury Forgeot d'Arc [mailto:amauryfa@gmail.com] Sent: 15. júní 2010 21:24 To: Kristján Valur Jónsson Cc: python-dev@python.org Subject: Re: [Python-Dev] debug and release python
I remember having the same issue years ago: http://mail.python.org/pipermail/python-list/2004-July/855844.html
At the time, I solved the issue by compiling extension modules with pymalloc options turned on (which is fortunately the default, so this applies to the supplied proprietary .pyd), and I added a (plain) definition for functions like _PyObject_DebugMalloc, even when PYMALLOC_DEBUG is undefined.
Since the python_d.dll is a custom build anyway, adding the code is not too much pain.
It is not too much pain, once you realize the problem, no. But I just got bitten by this and spent the best part of a weekend trying to solve the problem. On Windows, you get an import failure on the .pyd file with the message: "Procedure entry point not found". I had come across this previously, some three years ago perhaps, and forgotten all about it, so I was sufficiently annoyed to post to python-dev.

We use python27_d.dll a lot and typically have WITH_PYMALLOC disabled in debug builds for the benefit of using the debug malloc libraries present on Windows. I've solved the issue now by making sure that obmalloc.c always exports _PyObject_DebugMalloc(), much as it always exports PyObject_Malloc() whether WITH_PYMALLOC is defined or not.

My suggestion for python core would be the same: expose these always for existing python versions, and remove them from the API in new python versions.

K
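P.S. What the fix amounts to in obmalloc.c, as a rough sketch (our actual code differs in detail):

    /* Define the debug entry points unconditionally, thunking to the
       plain allocator when PYMALLOC_DEBUG is off, so the set of symbols
       exported by pythonxx_d.dll no longer depends on the macro settings. */
    #ifndef PYMALLOC_DEBUG

    void *
    _PyObject_DebugMalloc(size_t nbytes)
    {
        /* no debug bookkeeping in this build: just forward */
        return PyObject_Malloc(nbytes);
    }

    void *
    _PyObject_DebugRealloc(void *p, size_t nbytes)
    {
        return PyObject_Realloc(p, nbytes);
    }

    void
    _PyObject_DebugFree(void *p)
    {
        PyObject_Free(p);
    }

    #endif /* !PYMALLOC_DEBUG */

With that in place, a foobarapp_d.pyd built with the default debug settings still finds the entry points it was linked against.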
participants (3):
- "Martin v. Löwis"
- Amaury Forgeot d'Arc
- Kristján Valur Jónsson