[proposal] module dependency specification and overriding for packages
Hi,

while trying to vendor packages that depend on six, I noticed a problematic implication: it is impossible to do this without patching those modules, which adds a fragile maintenance burden.

In order to alleviate that, I propose creating a mechanism that allows physically or virtually mapping sets of interdependent libraries (that are prepared for it) into a sub-package of the package vendoring them. This would alleviate a few issues:

a) removing the need to patch
b) making de-vendoring of exact versions trivial (just unpacking wheels and adding them to the sub-package's __path__)
c) easing updates, as vendoring turns into a process of fetching the wheels and unpacking them into the package folder

I do have some implementation ideas, but I would like to hear the opinions of others before proposing an exact implementation.

-- Ronny
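To illustrate point (b), here is a minimal, self-contained sketch of the __path__ mechanism the proposal relies on. The names (vendored_packages, sixish) and the temp-directory layout are made up for the demo and are not an actual pytest layout; the point is only that grafting a directory of unpacked wheels onto a sub-package's __path__ makes its modules importable as submodules without patching anything:

```python
import importlib
import os
import sys
import tempfile

# Build a throwaway package on disk (stands in for the vendoring package).
root = tempfile.mkdtemp()
pkg_dir = os.path.join(root, "vendored_packages")
os.mkdir(pkg_dir)
open(os.path.join(pkg_dir, "__init__.py"), "w").close()

# A second directory stands in for a tree of unpacked wheels.
extra = tempfile.mkdtemp()
with open(os.path.join(extra, "sixish.py"), "w") as f:
    f.write("VERSION = '1.0'\n")

sys.path.insert(0, root)
importlib.invalidate_caches()
pkg = importlib.import_module("vendored_packages")

# De-vendoring step: graft the "unpacked wheel" directory into the
# sub-package's __path__ so its modules import as submodules of it.
pkg.__path__.append(extra)

mod = importlib.import_module("vendored_packages.sixish")
assert mod.VERSION == '1.0'
```

A distribution could perform the __path__.append step in the sub-package's __init__.py, pointing at system-provided wheel trees instead of bundled copies.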
On Fri, Sep 9, 2016 at 8:46 AM, Ronny Pfannschmidt
while trying to vendor packages that depend on six, I noticed a problematic implication:
it is impossible to do this without patching those modules, which adds a fragile maintenance burden.
Ronny: I grok the general issue, but would you have a concrete example?

-- Cordially, Philippe Ombredanne
On 09.09.2016 09:37, Philippe Ombredanne wrote:
On Fri, Sep 9, 2016 at 8:46 AM, Ronny Pfannschmidt
wrote: while trying to vendor packages that depend on six, I noticed a problematic implication:
it is impossible to do this without patching those modules, which adds a fragile maintenance burden.
Ronny: I grok the general issue, but would you have a concrete example?

A basic example would be vendoring https://github.com/kyrus/python-junit-xml and six into pytest. Following that, we would face two problems:

a) python-junit-xml imports plain six instead of _pytest.vendored_packages.six, meaning we have to patch it when including it
b) debian/fedora will want to remove six and python-junit-xml from their pytest package, instead linking to python-six-wheel-... and python-junit-xml-wheel... packages as part of their deduplication effort, meaning they will patch around in the pytest package (which tends to break the world)
So I realise this isn't the specific thing you're dealing with, but
you could use https://pypi.python.org/pypi/junitxml which is mature,
has no external dependencies, and as a LGPL module should be license
compatible with py.test.
-Rob
On 9 September 2016 at 16:46, Ronny Pfannschmidt
Hi,
while trying to vendor packages that depend on six, I noticed a problematic implication:
it is impossible to do this without patching those modules, which adds a fragile maintenance burden.
It's already possible to do this without patching by writing an "install aliases" function that looks something like:

    import sys
    import myapp._vendor.six

    def install_module_aliases():
        sys.modules["six"] = myapp._vendor.six

Run that early in your app startup (before you import anything else), and "import six" will get your vendored version from the module cache rather than looking for the normal one.

For app level bundling, though, it's really better to find a way to deploy a full venv if possible, rather than vendoring things (for library level work, aim to avoid vendoring in general and use dependency version specifiers instead).

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On 9 September 2016 at 16:46, Ronny Pfannschmidt
wrote: Hi,
while trying to vendor packages that depend on six, I noticed a problematic implication:
it is impossible to do this without patching those modules, which adds a fragile maintenance burden.

It's already possible to do this without patching by writing an "install aliases" function that looks something like:

    import sys
    import myapp._vendor.six

    def install_module_aliases():
        sys.modules["six"] = myapp._vendor.six
Hi,

On 10.09.2016 12:37, Nick Coghlan wrote:

That specifically breaks the very purpose the vendoring happened for, which is to decouple from the globally installed version. For example, if we vendor six in that way in pytest, we literally break testing anything that depends on a different version of six. The same goes for other libraries.
Run that early in your app startup (before you import anything else), and "import six" will get your vendored version from the module cache rather than looking for the normal one.
For app level bundling, though, it's really better to find a way to deploy a full venv if possible, rather than vendoring things (for library level work, aim to avoid vendoring in general and use dependency version specifiers instead).

For my use-case (which is making an infrastructure package robust and having it reuse well-tested code) it's basically the worst solution to just pin global dependencies, since code that depends on different versions will be impossible to test.
It seems impossible to avoid vendoring when one wants to use library-specific versions of dependencies without literally poisoning the PYTHONPATH. The Python module system and sys.modules are the main crippling factors in that area, and my understanding so far is that python-dev will not ever remove that limitation (and it's not a good idea to remove such a limitation without having actual working encapsulation).

-- Ronny
On 10 September 2016 at 23:23, Ronny Pfannschmidt
Hi,
On 10.09.2016 12:37, Nick Coghlan wrote:
On 9 September 2016 at 16:46, Ronny Pfannschmidt
wrote: Hi,
while trying to vendor packages that depend on six, I noticed a problematic implication:
it is impossible to do this without patching those modules, which adds a fragile maintenance burden.
It's already possible to do this without patching by writing an "install aliases" function that looks something like:
    import sys
    import myapp._vendor.six

    def install_module_aliases():
        sys.modules["six"] = myapp._vendor.six
That specifically breaks the very purpose the vendoring happened for, which is to decouple from the globally installed version.
Right, I'd missed the context that you were wanting to do this in py.test while leaving the global module namespace freely available for the code under test.
For example, if we vendor six in that way in pytest, we literally break testing anything that depends on a different version of six.
As long as the software under test is running in the same process as the pytest infrastructure, you're always going to have that problem of potential side effects via the interpreter's global state. The most you can achieve is to put tight scope control on your global modifications, such that you can do:

- modify sys.modules to point to the vendored libraries
- force load everything that needs to see the vendored libraries
- revert the changes to sys.modules

This is a pretty standard problem for testing infrastructure that runs in the same process as the testing target.
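The modify/load/revert sequence above can be wrapped in a context manager. This is only a sketch of the scope-control idea, not an actual pytest API; scoped_module_aliases is a hypothetical helper, and json again stands in for a vendored module:

```python
import sys
from contextlib import contextmanager

@contextmanager
def scoped_module_aliases(aliases):
    """Temporarily point public module names at vendored copies,
    restoring sys.modules on exit (hypothetical helper)."""
    saved = {name: sys.modules.get(name) for name in aliases}
    sys.modules.update(aliases)
    try:
        yield
    finally:
        for name, module in saved.items():
            if module is None:
                sys.modules.pop(name, None)
            else:
                sys.modules[name] = module

# Demo with a stdlib stand-in for a vendored module:
import json as vendored_stub
with scoped_module_aliases({"sixdemo": vendored_stub}):
    import sixdemo  # served from the module cache while the scope is active
    assert sixdemo is vendored_stub
assert "sixdemo" not in sys.modules  # the alias was reverted on exit
```

Note the caveat from the thread still applies: any module force-loaded inside the scope keeps its references to the vendored copies after the revert, so the scoping controls future imports, not already-bound names.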
the same goes for other libraries
Run that early in your app startup (before you import anything else), and "import six" will get your vendored version from the module cache rather than looking for the normal one.
For app level bundling, though, it's really better to find a way to deploy a full venv if possible, rather than vendoring things (for library level work, aim to avoid vendoring in general and use dependency version specifiers instead).
For my use-case (which is making an infrastructure package robust and having it reuse well-tested code) it's basically the worst solution to just pin global dependencies, since code that depends on different versions will be impossible to test.
It seems impossible to avoid vendoring when one wants to use library-specific versions of dependencies without literally poisoning the PYTHONPATH.
Aye, py.test is in a tough place here, where most of the conventional porting guidelines don't work for you, and the approach Michael Foord took for unittest (i.e. separate Python 2 and 3 source trees) isn't going to be something you want to do either. pip does successfully vendor a number of other projects, but they tend to be ones that were themselves designed to use vendoring to minimise bootstrapping problems.
The Python module system and sys.modules are the main crippling factors in that area, and my understanding so far is that python-dev will not ever remove that limitation.
Right, even pkg_resources.requires() still restricts you to one version of a given package per process, it just lets multiple versions co-exist in the same filesystem tree.
(and it's not a good idea to remove such a limitation without having actual working encapsulation)
Exactly - allowing multiple versions of the same library into the process namespace is problematic enough that we consider it a bug when the interpreter allows that to happen inadvertently (not always a fixable bug, but a bug nonetheless).

We don't have the internal sandboxing needed to make that a supportable behaviour, and beyond a few ideas around ironing out the various problems with the subinterpreter support, no particular desire to add that either (since you're pretty much reinventing operating system process encapsulation at that point, and the never ending security arms race that goes with it).

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
participants (4)
- Nick Coghlan
- Philippe Ombredanne
- Robert Collins
- Ronny Pfannschmidt