Wheels and dependent third party dlls on Windows

Hi,

I was wondering what the recommended approach is to bundling runtime dll dependencies when using wheels. We are migrating from eggs to wheels for the installation of environments and of various Python dependencies. Some of those have extension modules, and some have extension modules that depend on the presence of a third party dll (in our situation, libzmq-v100-mt-4_0_3.dll).

Up to now, these dlls have been installed via the scripts parameter of the setup() call in setup.py, but https://mail.python.org/pipermail/distutils-sig/2014-July/024554.html points to this as not being a good idea. But the only way to get a dependent dll found on Windows is to have it on PATH, and the scripts directory is on PATH when a virtualenv is activated.

I have observed two situations:

1) If we use pip wheel to build the wheel, the scripts parameter is ignored and the dlls do not even get into the archive.

2) If we use setup.py bdist_wheel, the dll gets into the archive, but this relies on the undocumented behaviour of packaging dlls as if they were scripts.

For concreteness, the relevant part of our setup.py looks roughly like this (a sketch; the package, extension and import library names are illustrative):
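    # setup.py -- sketch of the approach described above; the package,
    # extension and import library names are illustrative.
    from setuptools import setup, Extension

    setup(
        name='ourpkg',
        version='1.0',
        packages=['ourpkg'],
        ext_modules=[
            Extension('ourpkg._core',
                      sources=['src/core.c'],
                      libraries=['libzmq-v100-mt-4_0_3'],  # import library for the dll
                      library_dirs=['libs']),
        ],
        # The runtime dll, shipped as a "script" so that it lands in the
        # environment's Scripts directory, which is on PATH when the
        # virtualenv is activated.
        scripts=['libs/libzmq-v100-mt-4_0_3.dll'],
    )

What is the correct approach at this time?

Thanks,
David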

On 30 September 2014 14:32, David Genest <david.genest@ubisoft.com> wrote:
But the only way to get a dependent dll found on Windows is to have it on PATH, and the scripts directory is on PATH when a virtualenv is activated.
This is not true. Python loads DLLs with LOAD_WITH_ALTERED_SEARCH_PATH, to allow them to be located alongside the pyd file. You should therefore be able to ship the dependent dll in the package directory (which wheels support fine).
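Something like this in setup.py should be all that is needed (a sketch; the package and file names are placeholders):

    # Sketch: ship the dll inside the package, next to the .pyd, so that
    # the loader's altered search path finds it. Names are placeholders;
    # the dll must sit in the ourpkg/ source directory for this to work.
    from setuptools import setup, Extension

    setup(
        name='ourpkg',
        packages=['ourpkg'],
        ext_modules=[Extension('ourpkg._core', sources=['src/core.c'])],
        # Installed into site-packages/ourpkg/, alongside _core.pyd
        package_data={'ourpkg': ['libzmq-v100-mt-4_0_3.dll']},
    )

Paul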

This is not true. Python loads DLLs with LOAD_WITH_ALTERED_SEARCH_PATH, to allow them to be located alongside the pyd file. You should therefore be able to ship the dependent dll in the package directory (which wheels support fine).
Paul
Ok, so what if the dll is shared in a given environment (multiple extensions use it)? Should the shared dll be copied to every package? Won't that cause multiple loads by the system? Thanks for your response, D.

On 30 September 2014 15:31, David Genest <david.genest@ubisoft.com> wrote:
Ok, so what if the dll is shared in a given environment (multiple extensions use it)? Should the shared dll be copied to every package? Won't that cause multiple loads by the system?
I honestly don't know in that case, sorry. You might get a better answer on python-list for that, if no-one here can help. Presumably the usage is all within one distribution, otherwise the question would have to be, which distribution ships the DLL? But that question ends up leading onto the sort of discussion that starts "well, I wouldn't design your system the way you have", which isn't likely to be of much help to you :-( Sorry I can't offer any more help. Paul

On 1 October 2014 00:37, Paul Moore <p.f.moore@gmail.com> wrote:
On 30 September 2014 15:31, David Genest <david.genest@ubisoft.com> wrote:
Ok, so what if the dll is shared in a given environment (multiple extensions use it)? Should the shared dll be copied to every package? Won't that cause multiple loads by the system?
I honestly don't know in that case, sorry. You might get a better answer on python-list for that, if no-one here can help.
Presumably the usage is all within one distribution, otherwise the question would have to be, which distribution ships the DLL? But that question ends up leading onto the sort of discussion that starts "well, I wouldn't design your system the way you have", which isn't likely to be of much help to you :-(
Sorry I can't offer any more help.
Note that this is the external binary dependency problem that the scientific folks are currently using conda to address. It's basically the point where you cross the line from "language specific packaging system" to "multi-language cross-platform platform".

That said, pip/wheel *may* get some capabilities along these lines in the future, it just isn't a high priority at this point.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

Or you could just create a Python package that only contains the dll, and depend on it from your others.

On Tue, Sep 30, 2014 at 10:44 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
On 1 October 2014 00:37, Paul Moore <p.f.moore@gmail.com> wrote:
On 30 September 2014 15:31, David Genest <david.genest@ubisoft.com> wrote:
Ok, so what if the dll is shared in a given environment (multiple extensions use it)? Should the shared dll be copied to every package? Won't that cause multiple loads by the system?
I honestly don't know in that case, sorry. You might get a better answer on python-list for that, if no-one here can help.
Presumably the usage is all within one distribution, otherwise the question would have to be, which distribution ships the DLL? But that question ends up leading onto the sort of discussion that starts "well, I wouldn't design your system the way you have", which isn't likely to be of much help to you :-(
Sorry I can't offer any more help.
Note that this is the external binary dependency problem that the scientific folks are currently using conda to address. It's basically the point where you cross the line from "language specific packaging system" to "multi-language cross-platform platform".
That said, pip/wheel *may* get some capabilities along these lines in the future, it just isn't a high priority at this point.
Cheers, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

On 30 September 2014 15:45, Daniel Holth <dholth@gmail.com> wrote:
Or you could just create a Python package that only contains the dll, and depend on it from your others.
The problem is getting the DLL on PATH. What you could do is distribute a package containing:

1. The dll.
2. An __init__.py that adds the package directory (where the DLL is) to os.environ['PATH'].

If you import this package before any of the ones that depend on the DLL (or even in the __init__ of those packages) then you should have PATH set up correctly, and things will work as you need.
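As a sketch, assuming a package named, say, zmq_dll that ships the dll next to its __init__.py:

    # zmq_dll/__init__.py -- sketch; the package name is hypothetical.
    # Prepend this package's directory to PATH so that the dll shipped
    # alongside this file can be found by dependent extension modules.
    import os

    _pkg_dir = os.path.dirname(os.path.abspath(__file__))
    os.environ['PATH'] = _pkg_dir + os.pathsep + os.environ.get('PATH', '')

I think this works, although I will admit it feels like a bit of a hack to me.

Paul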

On Tue, Sep 30, 2014 at 7:45 AM, Daniel Holth <dholth@gmail.com> wrote:
Or you could just create a Python package that only contains the dll, and depend on it from your others.
But it won't be on the dll search path.

Paul Moore wrote:
What you could do is distribute a package containing:

1. The dll.
2. An __init__.py that adds the package directory (where the DLL is) to os.environ['PATH'].

If you import this package before any of the ones that depend on the DLL (or even in the __init__ of those packages) then you should have PATH set up correctly, and things will work as you need. I think this works, although I will admit it feels like a bit of a hack to me.
I'm not sure it will -- I tried something similar, where I compiled some code into an extension, hoping that importing that extension would make that code available to other extensions -- works fine on OS X and Linux, but no dice on Windows. So we build a dll of the code we need to share, and link all the extensions that need it to it. In our case, all the extensions are part of the same package, so we can put the dll in with the extensions and we're set.

It seems the "right" thing to do here is to put the dll in with the dlls provided by Python (can't remember that path right now -- no Windows box running) -- but I don't know that you can do that with wheel, and it would make it easy for different packages to stomp on each other.

I actually think the thing to do here is either statically link it to each extension that needs it, or deliver it with each of them, in the package dir. Or, if you don't want duplicates, then use conda -- it's designed just for this.

I'd also look at what Christoph Gohlke does with his installers: http://www.lfd.uci.edu/~gohlke/pythonlibs/ -- maybe he's doing something with MSI that you can't do with wheels, but it's worth a look.
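The "dll in with the extensions" setup looks roughly like this in setup.py (a sketch; all names are illustrative):

    # Sketch: several extensions in one package, all linking against a
    # shared dll that ships in the package directory. Names illustrative.
    from setuptools import setup, Extension

    common = dict(libraries=['ourshared'], library_dirs=['libs'])

    setup(
        name='ourpkg',
        packages=['ourpkg'],
        ext_modules=[
            Extension('ourpkg._a', sources=['src/a.c'], **common),
            Extension('ourpkg._b', sources=['src/b.c'], **common),
        ],
        # The shared dll is installed next to the .pyd files that use it.
        package_data={'ourpkg': ['ourshared.dll']},
    )

-Chris

Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959 voice
7600 Sand Point Way NE   (206) 526-6329 fax
Seattle, WA 98115        (206) 526-6317 main reception

Chris.Barker@noaa.gov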

On Tue, Sep 30, 2014 at 3:44 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
On 1 October 2014 00:37, Paul Moore <p.f.moore@gmail.com> wrote:
On 30 September 2014 15:31, David Genest <david.genest@ubisoft.com> wrote:
Ok, so what if the dll is shared in a given environment (multiple extensions use it)? Should the shared dll be copied to every package? Won't that cause multiple loads by the system?
I honestly don't know in that case, sorry. You might get a better answer on python-list for that, if no-one here can help.
Presumably the usage is all within one distribution, otherwise the question would have to be, which distribution ships the DLL? But that question ends up leading onto the sort of discussion that starts "well, I wouldn't design your system the way you have", which isn't likely to be of much help to you :-(
Sorry I can't offer any more help.
Note that this is the external binary dependency problem that the scientific folks are currently using conda to address. It's basically the point where you cross the line from "language specific packaging system" to "multi-language cross-platform platform".
Conda is one such solution, not the solution ;) I don't know of any "sumo" distribution which solves this problem correctly ATM, and Windows makes this rather difficult to solve. David
That said, pip/wheel *may* get some capabilities along these lines in the future, it just isn't a high priority at this point.
Cheers, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

Thank you all for the precious info. Here are my observations:

- We are merely writing extension modules with third party dependent code packaged in a dll. In my mind, this use case is not the exception, and would not necessarily warrant the use of a full blown solution like conda. Our deployed environments are self-contained.

- If you run python setup.py bdist_wheel, the dlls specified in the scripts parameter end up in the wheel archive and do what is needed for our setup (the dlls are copied to the scripts directory, which is on PATH for the activated environment).

- On the other hand, if you run pip wheel on the same package, the dlls are not placed in the archive. Is this a pip bug?

In an ideal world, the scripts directory would be called bin, like its Unix counterpart, and any dependency, whether startup scripts or dlls, could be installed in the bin/ "environment global space". This path would be added to the Python startup sequence (in order not to rely on the env's activate). I feel that the current state of affairs is not that far off, because setup.py bdist_wheel works now.

Knowing that there are alternatives on the way (in metadata 2.0?) and workarounds, we will go with our current wheel solution using setup.py bdist_wheel. If the bdist_wheel command ever loses the "package binary files in scripts dir" behaviour we have alternatives (listed in order of dependability):

1) add the dependent dlls to every package that needs them (Steve's answer https://mail.python.org/pipermail/distutils-sig/2014-September/024982.html concurs that the dependent dll would be loaded only once)

2) modify PATH in the first __init__.py (https://mail.python.org/pipermail/distutils-sig/2014-September/024962.html)

3) statically link the dependent library: not very good for sharing code, since it gives multiple copies, with separate state, in different extension modules

Does the community still think this is an "I would not design my solution like yours" use-case? Extension modules are a really good way to accelerate Python, so they are bound to be built against other dependent libraries. It is not only an sdist world :-), particularly on Windows.

Once again thanks,

D.

David Genest wrote:
1) add the dependent dlls to every package that needs them (Steve's answer https://mail.python.org/pipermail/distutils-sig/2014-September/024982.html concurs that the dependent dll would be loaded only once)
This is the best approach regardless of what else works/doesn't work. Doing this will ensure that your extension module loads the correct dependencies regardless of what other packages or versions are installed in the same environment.

The Scripts folder is not necessarily on the path for all users, and nor is the Python directory itself. Depending on how you launch Python, you may get DLLs loaded from the Python directory or the Scripts directory, but either way you should _always_ get DLLs loaded from the same directory as the extension module. Neither directory is intended to be a dumping ground for all the dependencies used by packages.

I see no reason 'pip wheel' should produce different wheels to bdist_wheel (but perhaps there is one?), so I would consider this a bug.

Cheers,
Steve

On 1 October 2014 17:44, David Genest <david.genest@ubisoft.com> wrote:
- If you run python setup.py bdist_wheel, the dlls specified in the scripts parameter end up in the wheel archive and do what is needed for our setup (the dlls are copied to the scripts directory, which is on PATH for the activated environment).
It sounds like you're using an old version of wheel. The --skip-scripts argument was removed (and skipping scripts made the default) in 0.23.0.
- On the other hand, if you run pip wheel on the same package, the dlls are not placed in the archive. Is this a pip bug?
No, this is not a pip bug. Scripts are omitted from wheels and generated on install from the metadata. DLLs aren't scripts, and putting them into the scripts list in setup.py will cause them to be treated inappropriately (as you see).
Knowing that there are alternatives on the way (in metadata 2.0?) and workarounds, we will go with our current wheel solution using setup.py bdist_wheel.
You're likely to hit issues pretty soon, I suspect. You're using undefined (and generally strongly discouraged) behaviour, I'm afraid. But there *is* an intention to allow wheels to specify more possible locations for files to be installed (along the lines of the autoconf directory classes), so "the appropriate binary directory" should be a location you can specify in a supported manner in the longer term.
If the bdist_wheel command ever loses the "package binary files in scripts dir" behaviour we have alternatives (listed in order of dependability):
1) add the dependent dlls to every package that needs them (Steve's answer https://mail.python.org/pipermail/distutils-sig/2014-September/024982.html concurs that the dependent dll would be loaded only once)

2) modify PATH in the first __init__.py (https://mail.python.org/pipermail/distutils-sig/2014-September/024962.html)

3) statically link the dependent library: not very good for sharing code, since it gives multiple copies, with separate state, in different extension modules
Does the community still think this is an "I would not design my solution like yours" use-case? Extension modules are a really good way to accelerate Python, so they are bound to be built against other dependent libraries. It is not only an sdist world :-), particularly on Windows.
I certainly wouldn't recommend using undefined behaviour like you are. Personally, I'd probably have designed my system around a single interface package that contained the DLL alongside a Python extension wrapping it. Other packages in your system could then depend on that one, and the DLLs would only be stored in one place. Other packages access the DLL via the extension (extensions can publish a C API for that purpose).

But that's with hindsight, and learning the lessons from the issues you're having, so I wouldn't expect you to have known that in advance!

Paul

On Wed, Oct 1, 2014 at 1:35 PM, Paul Moore <p.f.moore@gmail.com> wrote:
On 1 October 2014 17:44, David Genest <david.genest@ubisoft.com> wrote:
- If you run python setup.py bdist_wheel, the dlls specified in the scripts parameter end up in the wheel archive and do what is needed for our setup (the dlls are copied to the scripts directory, which is on PATH for the activated environment).
It sounds like you're using an old version of wheel. The --skip-scripts argument was removed (and skipping scripts made the default) in 0.23.0.
- On the other hand, if you run pip wheel on the same package, the dlls are not placed in the archive. Is this a pip bug?
No, this is not a pip bug. Scripts are omitted from wheels and generated on install from the metadata. DLLs aren't scripts, and putting them into the scripts list in setup.py will cause them to be treated inappropriately (as you see).
You are confusing generated entry_points script wrappers with the setup(scripts=...) scripts. The scripts=... scripts should never be skipped; even with --skip-scripts, they should work the same as they always have.

On 1 October 2014 21:06, Daniel Holth <dholth@gmail.com> wrote:
You are confusing generated entry_points script wrappers with the setup(scripts=...) scripts. The scripts=... scripts should never be skipped; even with --skip-scripts, they should work the same as they always have.
Sorry, you're right. But the legacy (non entry-point) scripts are certainly fragile, and I'd recommend avoiding them. Even for actual scripts, and *certainly* as a hack to get things in the "Scripts" directory... Paul

On 2 Oct 2014 06:12, "Paul Moore" <p.f.moore@gmail.com> wrote:
On 1 October 2014 21:06, Daniel Holth <dholth@gmail.com> wrote:
You are confusing generated entry_points script wrappers with the setup(scripts=...) scripts. The scripts=... scripts should never be skipped; even with --skip-scripts, they should work the same as they always have.
Sorry, you're right. But the legacy (non entry-point) scripts are certainly fragile, and I'd recommend avoiding them. Even for actual scripts, and *certainly* as a hack to get things in the "Scripts" directory...
Note that PEP 459 currently proposes preserving this capability as "python.commands.prebuilt", so I personally consider it reasonable as a way of packaging arbitrary executables and non-entry-point based scripts.

The main problem with using it for DLLs is the potential for "DLL hell" that you and others have mentioned, as version management on DLLs installed into shared directories can get very messy.

Cheers,
Nick.

On 1 October 2014 23:10, Nick Coghlan <ncoghlan@gmail.com> wrote:
Sorry, you're right. But the legacy (non entry-point) scripts are certainly fragile, and I'd recommend avoiding them. Even for actual scripts, and *certainly* as a hack to get things in the "Scripts" directory...
Note that PEP 459 currently proposes preserving this capability as "python.commands.prebuilt", so I personally consider it reasonable as a way of packaging arbitrary executables and non-entry-point based scripts.
Existing code tends to try to rewrite shebang lines on scripts, and maybe on Unix add the executable bit. The wheel spec (installing a wheel, the "Spread" step, item c) actually requires that script rewriting is done, so in theory that makes the scripts subdirectory in a wheel unsuitable. In practice, of course, DLLs will be fine as they don't start with #!python.

The "python.commands.prebuilt" spec doesn't say whether #! rewrites are done. If they are, they have the same issue as scripts. If they aren't, the current wheel spec doesn't offer an appropriate area to put them. Basically, it needs a wheel spec update to have an area which holds files that are copied unchanged to the destination Scripts folder.

But this is entirely standards-style nitpicking. The existing scripts support is fine for DLLs, and will work as expected for every practical purpose. That's why I called it a "hack" - it works, but it's unofficial.

Paul

Note that PEP 459 currently proposes preserving this capability as "python.commands.prebuilt", so I personally consider it reasonable as a way of packaging arbitrary executables and non-entry-point based scripts.
Yes, this will prove valuable (for other things than dlls, admittedly).
The main problem with using it for DLLs is the potential for "DLL hell" that you and others have mentioned, as version management on DLLs installed into shared directories can get very messy.
We control our environment and package only what is needed in it. This makes a micro system in which everything is controlled and isolated, even the dlls global to the virtualenv that I wanted to install: they become accessible only to the activated environment. I don't see how it can become DLL hell in this situation.

But hearing many fine and useful comments on this thread made me change my mind, and I will package the dependency near the extension module. I still find that something like PEP 459 will be really useful.

To the list: thanks for all your input.

D.

On Wed, Oct 1, 2014 at 5:44 PM, David Genest <david.genest@ubisoft.com> wrote:
We control our environment and package only what is needed in it. This makes a micro system in which everything is controlled and isolated, even the dlls global to the virtualenv that I wanted to install.
If that is your use case, you may want to take a good look at conda -- that is exactly what it is for -- why re-invent the wheel? (sorry). Note that while conda is the package manager for Anaconda, it can also be used to build your own distribution; you wouldn't need to adopt Anaconda as a platform.

-Chris

On Wed, Oct 1, 2014 at 5:44 PM, David Genest <david.genest@ubisoft.com> wrote:
Thank you all for the precious info.
Here are my observations:
- We are merely writing extension modules with third party dependent code packaged in a dll. In my mind, this use case is not the exception, and would not necessarily warrant the use of a full blown solution like conda. Our deployed environments are self-contained.
- If you run python setup.py bdist_wheel, the dlls specified in the scripts parameter end up in the wheel archive and do what is needed for our setup (the dlls are copied to the scripts directory, which is on PATH for the activated environment).
One issue with PATH is that it is a global setting: if you copy your dll there, any other program the user runs may pick it up, and you can get into trouble. This is mostly an issue for well known 3rd party DLLs, though.
- On the other hand, if you run pip wheel on the same package, the dlls are not placed in the archive. Is this a pip bug?
In an ideal world, the scripts directory would be called bin, like its Unix counterpart, and any dependency, whether startup scripts or dlls, could be installed in the bin/ "environment global space". This path would be added to the Python startup sequence (in order not to rely on the env's activate).
I feel that the current state of affairs is not that far off, because setup.py bdist_wheel works now.
Knowing that there are alternatives on the way (in metadata 2.0?) and workarounds, we will go with our current wheel solution using setup.py bdist_wheel.
If the bdist_wheel command ever loses the "package binary files in scripts dir" behaviour we have alternatives (listed in order of dependability):
1) add the dependent dlls to every package that needs them (Steve's answer https://mail.python.org/pipermail/distutils-sig/2014-September/024982.html concurs that the dependent dll would be loaded only once)
Note Steve's observation regarding manifests. It is not an uncommon issue when integrating with 3rd party libs in my experience (where renaming the dll is often impractical). David

On Wed, Oct 1, 2014 at 9:44 AM, David Genest <david.genest@ubisoft.com> wrote:
- We are merely writing extension modules with third party dependent code packaged in a dll. In my mind, this use case is not the exception, and would not necessarily warrant the use of a full blown solution like conda.
Agreed -- it is not rare, so yes, it would be nice if the core python (pypa) systems addressed it. But like David said, Windows makes this really hard...
- If you run python setup.py bdist_wheel, the dlls specified in the scripts parameter end up in the wheel archive and do what is needed for our setup (the dlls are copied to the scripts directory, which is on PATH for the activated environment).
If this is the PATH only for that environment, then this is probably fine. But one of the biggest sources of "dll hell" is that the same PATH is used for executables and dlls, and that dlls placed next to executables will be found. This means that any old app could find any old dll on the PATH, and that there are a lot of dlls on the PATH. So putting dlls into the python "scripts" or "bin" dir is a bad idea in general -- who knows what apps may find them?

Couple this with the (absolutely incomprehensible to me) habit some folks have of using short (still 8.3) names for dlls, without much version info, and you really have a mess. So if you do put your dlls into the Scripts dir, do please give them nice long descriptive names!

But isn't there a library or something directory where other python dlls are that could be used instead? Then you could get clashes between python extensions, but it wouldn't clash with anything else on the system.
In an ideal world, the scripts directory would be called bin, like its Unix counterpart,
why does the name matter at all?
and any dependency, whether startup scripts or dlls, could be installed in the bin/ "environment global space". This path would be added to the Python startup sequence (in order not to rely on the env's activate).
Ouch -- no, dlls and top level scripts don't belong in the same place, period.

Another option is to make a python package that has little in it other than that dll; then your packages list it as a dependency, and I _THINK_ there is some relative path magic you can do so that your other extensions can find it. Anyone know what Anaconda does on Windows?
1) add the dependent dlls to every package that needs them (Steve's answer https://mail.python.org/pipermail/distutils-sig/2014-September/024982.html concurs that the dependent dll would be loaded only once)
If Steve is correct, which he probably is -- this is a great way to go. Alternatively, alter my suggestion above a bit: have your "dll package" contain a tiny extension that does nothing but link the dll in. Then everything that depends on that dll will have an "import the_funny_dll_package" line at the top -- and this ends up looking just like a regular old python dependency. Again, make sure to use a descriptive enough name for the dll so that it doesn't clash with other packages (not yours) that may use the same (or a similar) dll.

Does the community still think this is an "I would not design my solution like yours" use-case? Extension modules are a really good way to accelerate Python, so they are bound to be built against other dependent libraries. It is not only an sdist world :-), particularly on Windows.

This is a common problem we'd all love to be able to solve! (And conda does help....) And sdist doesn't help anyway -- folks need to build and install it somehow in any case.

-Chris

David Genest wrote:
Subject: Re: [Distutils] Wheels and dependent third party dlls on windows
This is not true. Python loads DLLs with LOAD_WITH_ALTERED_SEARCH_PATH, to allow them to be located alongside the pyd file. You should therefore be able to ship the dependent dll in the package directory (which wheels support fine).
Paul
Ok, so what if the dll is shared in a given environment (multiple extensions use it)? Should the shared dll be copied to every package? Won't that cause multiple loads by the system?
A DLL can only be loaded once per process (python.exe, in this case), and it will be loaded based on its file name (not the full path). Whoever loads first will win every future load for the same filename. If you're loading it directly, it's fairly easy to rename a DLL to something likely to be unique to your project (or at least to put a version number in it) so that other packages won't use it. There are more complicated approaches using manifests and activation contexts (this is how different .pyd files with the same name can be correctly loaded), but ensuring a unique name is much easier.

If the DLL is loaded implicitly by a .pyd, then as Paul says it should be loaded correctly if it is alongside the .pyd.

Dependency Walker from www.dependencywalker.com is a great tool for checking what DLLs will be loaded by an executable or DLL. I recommend enabling profiling of your python.exe process when you try and import your packages, to see where it is looking for its dependencies.
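As a quick check from Python itself, ctypes can tell you whether a dll resolves by bare name through the standard search path (a sketch; the full path in the last line is illustrative):

    # Sketch: check whether Windows can resolve the dll by name alone
    # (i.e. via the standard search path, including PATH).
    import ctypes
    import ctypes.util

    # Prints the resolved path, or None if the dll is not on the search path.
    print(ctypes.util.find_library('libzmq-v100-mt-4_0_3'))

    # Loading by full path always works if the file exists (path illustrative):
    dll = ctypes.CDLL(r'C:\some\dir\libzmq-v100-mt-4_0_3.dll')

Hope that helps,
Steve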

On Tue, Sep 30, 2014 at 3:31 PM, David Genest <david.genest@ubisoft.com> wrote:
This is not true. Python loads DLLs with LOAD_WITH_ALTERED_SEARCH_PATH, to allow them to be located alongside the pyd file. You should therefore be able to ship the dependent dll in the package directory (which wheels support fine).
Paul
Ok, so what if the dll is shared in a given environment (multiple extensions use it)? Should the shared dll be copied to every package? Won't that cause multiple loads by the system?
Yes it will, and it is indeed problematic. There are no great solutions:

- bundle it in your package
- have a separate wheel and then put it in PATH
- have a separate wheel but use preload tricks (e.g. using ctypes) to avoid using PATH

There are better solutions if one can patch Python itself, though that's obviously not practical in most cases.
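The preload trick looks roughly like this (a sketch; the package layout and dll name are assumptions):

    # zmq_dll/__init__.py -- sketch of the ctypes preload trick; the
    # package name is hypothetical. Loading the dll by full path before
    # any dependent extension is imported means later loads of the same
    # dll name resolve to the copy already in the process, without
    # touching PATH.
    import ctypes
    import os

    _here = os.path.dirname(os.path.abspath(__file__))
    _libzmq = ctypes.CDLL(os.path.join(_here, 'libzmq-v100-mt-4_0_3.dll'))

David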
Participants (7): Chris Barker, Daniel Holth, David Cournapeau, David Genest, Nick Coghlan, Paul Moore, Steve Dower