Thank you all for the valuable info.
Here are my observations:
- We are merely writing extension modules whose third-party dependent code is packaged in a DLL. In my mind, this use case is not the exception, and does not necessarily warrant a full-blown solution like conda. Our deployed environments are self-contained.
- If you run python setup.py bdist_wheel, the DLLs specified in the scripts parameter end up in the wheel archive, which does what our setup needs: on install, the DLLs are copied to the Scripts directory, which is on PATH for the activated environment.
- On the other hand, if you run pip wheel on the same package, the DLLs are not placed in the archive. Is this a pip bug?
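To illustrate the setup described above, here is a minimal sketch of such a setup.py. The package name, the extension sources, and libs/foo.dll are placeholders, not our actual layout; the point is only that files listed under scripts land in the environment's Scripts directory on install:

```python
# setup.py -- minimal sketch; "mypkg", src/core.c and libs/foo.dll
# are hypothetical placeholders.
from setuptools import setup, Extension

setup(
    name="mypkg",
    version="1.0",
    ext_modules=[Extension("mypkg._core", sources=["src/core.c"])],
    # Files listed under "scripts" are copied into the Scripts
    # directory of the target environment. That directory is on PATH
    # once the environment is activated, so the extension module can
    # find its dependent DLL at import time.
    scripts=["libs/foo.dll"],
)
```

Building with "python setup.py bdist_wheel" then places the DLL in the wheel's data/scripts area, whereas (as noted above) "pip wheel" does not.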
In an ideal world, the scripts directory would be called bin, like its Unix counterpart, and any dependency, whether startup scripts or DLLs, could be installed in that bin/ "environment-global space". This path would be added during the Python startup sequence (so as not to rely on the env's activate script).
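The "added during the Python startup sequence" part could be approximated today with a sitecustomize.py dropped into site-packages; this is only a sketch, and the function name is mine:

```python
# sitecustomize.py -- sketch: make the env's Scripts/ directory visible
# for DLL loading at interpreter startup, without relying on activate.
import os
import sys


def add_scripts_dir_to_path(prefix=None):
    """Prepend <prefix>/Scripts (Windows venv layout) to PATH if present."""
    prefix = prefix or sys.prefix
    scripts = os.path.join(prefix, "Scripts")
    current = os.environ.get("PATH", "")
    # Only prepend when the directory exists and is not already on PATH.
    if os.path.isdir(scripts) and scripts not in current.split(os.pathsep):
        os.environ["PATH"] = scripts + os.pathsep + current
    return scripts


add_scripts_dir_to_path()
```

Since sitecustomize is imported automatically at startup, any process using this interpreter would find DLLs in Scripts/ even when the environment was never activated.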
I feel that the current state of affairs is not far from that, because setup.py bdist_wheel already works this way.
Knowing that there are alternatives on the way (in Metadata 2.0?) and workarounds available, we will go with our current wheel solution using setup.py bdist_wheel.
If the bdist_wheel command ever loses the "package binary files in the scripts dir" behavior, we have alternatives (listed in order of dependability):
1) add the dependent DLLs to every package that needs them (Steve's answer at https://mail.python.org/pipermail/distutils-sig/2014-September/024982.html
concurs that the dependent DLL would be loaded only once)