On Wed, Oct 1, 2014 at 9:44 AM, David Genest firstname.lastname@example.org wrote:
- We are merely writing extension modules with third party dependent code
packaged in a dll. In my mind, this use case is not the exception, and would not necessarily warrant the use of a full blown solution like conda.
Agreed -- it is not rare, so yes, it would be nice if the core Python (PyPA) systems addressed it. But as David said, Windows makes this really hard...
- If you run python setup.py bdist_wheel, the dlls specified in the
scripts parameter end up in the wheel archive, which does what is needed for our setup (the dlls are copied to the Scripts directory, which is on PATH for the activated environment).
If this is the PATH only for that environment, then this is probably fine. But one of the biggest sources of "dll hell" is that the same PATH is used for executables and dlls, and that dlls placed next to executables will be found. This means that any old app could find any old dll on the PATH, and that there are a lot of dlls on the PATH.
So putting dlls into the python "Scripts" or "bin" dir is a bad idea in general -- who knows what apps may find them?
Couple this with the (absolutely incomprehensible to me) habit folks have of using short (still 8.3) names for dlls, without much version info, and you really have a mess.
So if you do put your dlls into the Scripts dir -- please do give them nice long descriptive names!
But isn't there a library (or similar) directory, where the other python dlls live, that could be used instead? Then you could still get clashes between python extensions, but at least they wouldn't clash with anything else on the system.
In an ideal world, the scripts directory would be called bin, like the unix counter-part,
why does the name matter at all?
and any dependency, be it startup scripts or dlls, could be installed in the bin/ "environment global space". This path would be added to the python startup sequence (so as not to rely on the env's activate).
Ouch -- no: dlls and top-level scripts don't belong in the same place, period.
Another option is to make a python package that has little in it other than that dll; then your packages list it as a dependency, and I _THINK_ there is some relative-path magic you can do so that your other extensions can find it.
Anyone know what Anaconda does on Windows?
- add the dependent dlls to every package that needs it (Steve's answer
https://mail.python.org/pipermail/distutils-sig/2014-September/024982.html concurs that the dependent dll would be loaded only once)
If Steve is correct, which he probably is, this is a great way to go. Alternatively, alter my suggestion above a bit: have your "dll package" include a tiny extension that does nothing but link the dll in. Then everything that depends on that dll will have an "import the_funny_dll_package" line at the top -- and this ends up looking just like a regular old python dependency.
Again, make sure to use a descriptive enough name for the dll, so that it doesn't clash with other packages (not yours) that may use the same (or a similar) dll.
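For illustration, here is a ctypes-based stand-in for that tiny stub extension (the package and library names are invented; a real compiled stub would simply link against the dll rather than load it by hand):

```python
# the_funny_dll_package/__init__.py -- sketch using ctypes in place of
# a compiled stub extension; everything here is hypothetical.
import ctypes
import ctypes.util
import os

# A long, descriptive name, to avoid clashing with other packages'
# copies of the same (or a similar) library.
_LIB_NAME = "yourproj_somelib_1_2_3"

def load_shared_lib(name=_LIB_NAME):
    """Load the bundled shared library; the OS loader caches it, so
    repeated imports map the same copy into the process only once."""
    here = os.path.dirname(os.path.abspath(__file__))
    ext = ".dll" if os.name == "nt" else ".so"
    bundled = os.path.join(here, name + ext)
    if os.path.exists(bundled):
        return ctypes.CDLL(bundled)
    # No bundled copy next to this file: fall back to the system
    # resolver, and return None if nothing is found.
    found = ctypes.util.find_library(name)
    return ctypes.CDLL(found) if found else None
```

Every extension that needs the dll then just does "import the_funny_dll_package" first, so the dependency reads like any ordinary Python one.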
Does the community still think this is an "I would not design my solution
like yours" use-case? Extension modules are a really good way to accelerate python, so they are bound to be built with other dependent libraries. It is not only an sdist world :-), particularly on Windows.
This is a common problem that we'd all love to solve! (and conda does help....)
And an sdist doesn't help anyway -- folks still need to build and install it somehow.