
On 1 October 2014 17:44, David Genest <david.genest@ubisoft.com> wrote:
- If you run python setup.py bdist_wheel, the dlls specified in the scripts parameter end up in the wheel archive and does what is needed for our setup. (the dlls are copied to the scripts directory which is on PATH for the activated environment).
It sounds like you're using an old version of wheel. The --skip-scripts argument was removed (and skipping scripts made the default) in 0.23.0.
- On the other hand, if you run pip wheel on the same package, the dlls are not placed in the archive. Is this a pip bug?
No, this is not a pip bug. Scripts are omitted from wheels and generated on install from the metadata. DLLs aren't scripts, and putting them into the scripts list in setup.py will cause them to be treated inappropriately (as you see).
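To make the quoted setup concrete, here is a sketch of the (discouraged) arrangement being described: listing DLLs in setup()'s scripts argument so that bdist_wheel copies them into the scripts directory. All names here are hypothetical; this illustrates the misuse, not a recommendation.

```python
# setup.py -- sketch of the discouraged arrangement described above.
# Package and file names are made up for illustration.
from setuptools import setup

setup(
    name="mypkg",
    version="1.0",
    packages=["mypkg"],
    # DLLs are not scripts. Entries here are treated as scripts by the
    # packaging tools (pip regenerates scripts from metadata on install),
    # so this only "works" with bdist_wheel by accident.
    scripts=["libs/shared_core.dll"],
)
```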
Knowing that there are alternatives on the way (in metadata 2.0?) and workarounds, we will go with our current wheel solution using setup.py bdist_wheel.
You're likely to hit issues pretty soon, I suspect. You're using undefined (and generally strongly discouraged) behaviour, I'm afraid. But there *is* an intention to allow wheels to specify more possible locations for files to be installed (along the lines of the autoconf directory classes), so "the appropriate binary directory" should be a location you can specify in a supported manner in the longer term.
If the bdist_wheel command ever loses the "package binary files in the scripts dir" behaviour, we have alternatives (listed in order of dependability):
1) add the dependent dlls to every package that needs them (Steve's answer https://mail.python.org/pipermail/distutils-sig/2014-September/024982.html concurs that the dependent dll would be loaded only once)
2) modify PATH in the package's first __init__.py (https://mail.python.org/pipermail/distutils-sig/2014-September/024962.html)
3) statically link the dependent library: not very good for sharing code, as each extension module would carry its own copy of the library, its state, etc.
Does the community still think this is an "I would not design my solution like yours" use case? Extension modules are a really good way to accelerate Python, so they are bound to be built with other dependent libraries. It is not only an sdist world :-), particularly on Windows.
I certainly wouldn't recommend using undefined behaviour like you are.

Personally, I'd probably have designed my system around a single interface package that contained the DLL alongside a Python extension wrapping it. Other packages in your system could then depend on that one, and the DLLs would only be stored in one place. Other packages access the DLL via the extension (extensions can publish a C API for that purpose).

But that's with hindsight, and learning the lessons from the issues you're having, so I wouldn't expect you to have known that in advance!

Paul
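A runnable sketch of the "single interface package" idea: one package owns the shared library and loads it exactly once, and everything else goes through that package. In a real system the package would ship the DLL and a compiled extension publishing a C API; here ctypes and libm stand in for both, purely so the sketch runs anywhere.

```python
# Hypothetical "interface package" (say, corelib/__init__.py): the one
# place in the system that owns and loads the shared library. libm is
# a stand-in for the real DLL so the example is runnable.
import ctypes
import ctypes.util

_libname = ctypes.util.find_library("m") or "libm.so.6"  # stand-in DLL
_handle = None

def get_library():
    """Load the shared library once; every caller gets the same handle."""
    global _handle
    if _handle is None:
        _handle = ctypes.CDLL(_libname)
    return _handle
```

Dependent packages would then import this package and call get_library() (or, with a real extension, consume its published C API) instead of bundling their own copy, so the library and its state exist once per process.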