On 10 September 2016 at 23:23, Ronny Pfannschmidt wrote:
Hi,
On 10.09.2016 12:37, Nick Coghlan wrote:
On 9 September 2016 at 16:46, Ronny Pfannschmidt wrote:
Hi,
While trying to vendor packages that depend on six, I noticed a problematic implication: it is impossible to do this without patching those modules, which adds a fragile maintenance burden.
It's already possible to do this without patching by writing an "install aliases" function that looks something like:

    import sys
    import myapp._vendor.six

    def install_module_aliases():
        sys.modules["six"] = myapp._vendor.six
That specifically breaks the very purpose of the vendoring, which is to decouple from the globally installed version.
Right, I'd missed the context that you were wanting to do this in py.test while leaving the global module namespace freely available for the code under test.
For example, if we vendor six that way in pytest, we literally break testing of anything that depends on a different version of six.
As long as the software under test is running in the same process as the pytest infrastructure, you're always going to have that problem of potential side effects via the interpreter's global state. The most you can achieve is to put tight scope control on your global modifications, such that you do:

- modify sys.modules to point to the vendored libraries
- force load everything that needs to see the vendored libraries
- revert the changes to sys.modules

This is a pretty standard problem for testing infrastructure that runs in the same process as the testing target.
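The modify/load/revert pattern above can be sketched as a context manager. This is an illustrative sketch, not code from the thread; the function name `vendored_modules` is made up for the example:

```python
import sys
from contextlib import contextmanager

@contextmanager
def vendored_modules(aliases):
    """Temporarily alias module names in sys.modules.

    `aliases` maps public names (e.g. "six") to already-imported
    vendored modules (e.g. myapp._vendor.six).
    """
    # Remember what each name pointed to before (None if absent).
    saved = {name: sys.modules.get(name) for name in aliases}
    sys.modules.update(aliases)
    try:
        # Force-load anything that must see the vendored copies here,
        # inside the `with` block, e.g. via importlib.import_module().
        yield
    finally:
        # Revert sys.modules to its previous state.
        for name, module in saved.items():
            if module is None:
                sys.modules.pop(name, None)
            else:
                sys.modules[name] = module
```

Any import performed inside the `with` block resolves the aliased names to the vendored modules via the module cache; once the block exits, the global namespace is back to what it was.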
The same goes for other libraries.
Run that early in your app startup (before you import anything else), and "import six" will get your vendored version from the module cache rather than looking for the normal one.
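That module-cache behaviour can be demonstrated with a stand-in module (this is a hedged sketch; the `vendored` object below is a hypothetical placeholder for something like `myapp._vendor.six`):

```python
import sys
import types

# Stand-in for a vendored module; in real code this would be an
# actual package such as myapp._vendor.six.
vendored = types.ModuleType("six")
vendored.__version__ = "1.10.0-vendored"

# Installing the alias early means any later `import six` hits the
# module cache and returns the vendored copy, without consulting
# sys.path at all.
sys.modules["six"] = vendored

import six  # served straight from sys.modules
print(six.__version__)  # 1.10.0-vendored
```

The import system always checks sys.modules before searching the filesystem, which is why the alias takes effect regardless of what is installed globally.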
For app level bundling, though, it's really better to find a way to deploy a full venv if possible, rather than vendoring things (for library level work, aim to avoid vendoring in general and use dependency version specifiers instead).
For my use case (making an infrastructure package robust while having it reuse well-tested code), just pinning global dependencies is basically the worst solution, since code that depends on different versions will be impossible to test.
It seems impossible to avoid vendoring when one wants to use library-specific versions of dependencies without literally poisoning the PYTHONPATH.
Aye, py.test is in a tough place here, where most of the conventional porting guidelines don't work for you, and the approach Michael Foord took for unittest (i.e. separate Python 2 and 3 source trees) isn't going to be something you want to do either. pip does successfully vendor a number of other projects, but they tend to be ones that were themselves designed to use vendoring to minimise bootstrapping problems.
The Python module system and sys.modules are the main crippling factors in that area, and my understanding so far is that python-dev will not ever remove that limitation.
Right, even pkg_resources.require() still restricts you to one version of a given package per process; it just lets multiple versions co-exist in the same filesystem tree.
(And it's not a good idea to remove such a limitation without having actual working encapsulation.)
Exactly - allowing multiple versions of the same library into the process namespace is problematic enough that we consider it a bug when the interpreter allows that to happen inadvertently (not always a fixable bug, but a bug nonetheless).

We don't have the internal sandboxing needed to make that a supportable behaviour, and beyond a few ideas around ironing out the various problems with the subinterpreter support, there's no particular desire to add that either (since you're pretty much reinventing operating system process encapsulation at that point, along with the never-ending security arms race that goes with it).

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia