name of the dependency problem
After again finding that pip doesn't have a correct dependency resolution solution, a colleague and I discussed the nature of the problem. We examined the script capture of our install and it seems as though, when presented with

level 0: A    A level 1: 1.4 <= C
level 0: B    B level 1: 1.6 <= C < 1.7

pip manages to download version 1.8 of C (Django) using A's requirement, but never even warns us that B's requirement on C was violated. Surely, even in the absence of a resolution, pip could raise a warning at the end.

Anyhow, after some discussion I realize I don't even know the name of the problem that pip should try to solve; is there some tree / graph problem that corresponds? Searching on "dependency" seems to lead to topological sorts of one kind or another, but here we seem to have nodes with discrete values attached, so in the above example we might have (assuming only singleton A & B)

R --> A    R --> B
A --> C-1.4  A --> C-1.6  A --> C-1.6.11  A --> C-1.7  A --> C-1.8
B --> C-1.6  B --> C-1.6.11

so looking at C-equivalent nodes seems to allow a solution set. Are there any real problem descriptions / solutions to this kind of problem?
--
Robin Becker
First of all, I'm surprised that pip doesn't warn or error in this case. I think this is certainly a bug that should be fixed. The problem can come up in much more subtle cases too that are very hard for the user to understand.

The good news is that this is a known problem that happens when doing dependency resolution, and it has a solution. The solution, which is referred to as backtracking dependency resolution, basically boils down to saving the state of the dependency resolver whenever you have multiple choices to resolve a dependency. Then, if you reach a later point where there is a conflict, you can backtrack to the point where you made a choice and see if another option would resolve the conflict.

I have some of the gory details in Chapter 3.8.5 of my dissertation ( http://isis.poly.edu/~jcappos/papers/cappos_stork_dissertation_08.pdf ). There is also working Python code out there that shows how this should behave. (I implemented this as part of Stork, a package manager that was used for years in a large academic testbed.)

Thanks,
Justin

On Wed, Apr 15, 2015 at 7:09 AM, Robin Becker <robin@reportlab.com> wrote:
After again finding that pip doesn't have a correct dependency resolution solution, a colleague and I discussed the nature of the problem. [...]
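The "save state, try another candidate on conflict" idea can be sketched in a few lines. This is a hypothetical toy model, not Stork's or pip's actual code: versions are plain tuples (real resolvers would parse PEP 440 version strings), and the `resolve` / `candidates` names are made up for the example.

```python
# Toy backtracking resolver: candidates are tried preferred-first, and on a
# downstream conflict we backtrack to the last choice point and try the next.

def resolve(requirements, candidates, chosen=None):
    """requirements: list of (name, spec) where spec is a predicate on versions.
    candidates: dict mapping name -> list of versions, preferred first.
    Returns a satisfying {name: version} assignment, or None."""
    chosen = dict(chosen or {})
    if not requirements:
        return chosen
    name, spec = requirements[0]
    rest = requirements[1:]
    if name in chosen:
        # Already pinned: either the pin satisfies this spec or we must fail
        # upward so the caller can try a different candidate.
        return resolve(rest, candidates, chosen) if spec(chosen[name]) else None
    for version in candidates[name]:
        if not spec(version):
            continue
        chosen[name] = version
        result = resolve(rest, candidates, chosen)
        if result is not None:
            return result
        del chosen[name]  # backtrack and try the next candidate
    return None

# Robin's example: A needs C >= 1.4, B needs 1.6 <= C < 1.7.
candidates = {"C": [(1, 8), (1, 7), (1, 6, 11), (1, 6), (1, 4)]}
reqs = [("C", lambda v: v >= (1, 4)),           # A's requirement
        ("C", lambda v: (1, 6) <= v < (1, 7))]  # B's requirement
print(resolve(reqs, candidates))  # {'C': (1, 6, 11)}, not C-1.8
```

A "first found, wins" strategy would commit to C-1.8 for A and fail B; the backtracking loop retreats past 1.8 and 1.7 and lands on 1.6.11.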
See also http://en.wikipedia.org/wiki/ZYpp

On Wed, Apr 15, 2015 at 7:43 AM, Justin Cappos <jcappos@nyu.edu> wrote:
First of all, I'm surprised that pip doesn't warn or error in this case. [...]
Yes, it's another way to solve the problem. Both backtracking dependency resolution and ZYpp will always find a solution. The tradeoff is really in how they function. ZYpp is faster if there are a lot of dependencies that conflict. With the backtracking dependency resolution used in Stork, it is much easier for the user to understand why it chose what it did.

An aside: I'm not necessarily convinced that you need to solve this problem automatically, instead of just raising an error when it occurs. It should be quite rare in practice and as such may not be worth the complexity to have an automatic solution for the problem.

Thanks,
Justin

On Wed, Apr 15, 2015 at 8:55 AM, Daniel Holth <dholth@gmail.com> wrote:
See also http://en.wikipedia.org/wiki/ZYpp
On 4/15/15 9:28 AM, Justin Cappos wrote:
Yes, it's another way to solve the problem. [...]

ZYpp seems to assume that dependency resolution is an NP-complete problem (http://www.watzmann.net/blog/2005/11/package-installation-is-np-complete.html). I agree that we need not solve the problem just yet. It may be worthwhile to inspect packages on PyPI to see which packages are unsatisfiable, but I am led to understand that this is difficult to do because most package metadata is in setup.py (runtime information).

Where did you run into this problem, Robin?
On Wed, Apr 15, 2015 at 9:34 AM, Trishank Karthik Kuppusamy <tk47@nyu.edu> wrote:
It may be worthwhile to inspect packages on PyPI to see which packages are unsatisfiable, but I am led to understand that this is difficult to do because most package metadata is in setup.py (runtime information). [...]

This is indeed the case. If you want to solve dependencies in a way that works well, you want an index that describes all your available package versions.

While solving dependencies is indeed NP-complete, it can be fairly fast in practice because of various specifics: each rule generally involves only a few variables, and the rules have a specific form that allows for a more efficient rule representation (e.g. "at most one of these variables", etc.). In my experience, it is not more difficult than using graph-based algorithms.

FWIW, at Enthought, we are working on a pure Python SAT solver for resolving dependencies, to solve some of those issues. I am actually hacking on it right now at PyCon; we hope to have a first working version by the end of Q2, at which point it will be OSS and reintegrated into my older project depsolver (https://github.com/enthought/depsolver).

David
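The shape of that encoding can be illustrated with Robin's example. This is a hypothetical sketch, not the depsolver/enstaller encoding: each installable package version becomes a boolean variable, the rules become short CNF clauses, and a brute-force scan over assignments stands in for the real SAT solver.

```python
from itertools import product

# One boolean variable per installable package version (toy universe).
variables = ["A", "C-1.6.11", "C-1.8"]

# Rules in conjunctive normal form: each clause is a list of
# (variable, polarity) literals, at least one of which must hold.
clauses = [
    [("A", True)],                                        # A is requested
    [("A", False), ("C-1.6.11", True), ("C-1.8", True)],  # A implies some C
    [("C-1.6.11", False), ("C-1.8", False)],              # at most one C version
    [("C-1.8", False)],                                   # B forbids C >= 1.7
]

def satisfying_assignments(variables, clauses):
    """Enumerate assignments satisfying every clause (exponential stand-in
    for a real SAT solver, fine for three variables)."""
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(any(assignment[v] == want for v, want in clause)
               for clause in clauses):
            yield assignment

for solution in satisfying_assignments(variables, clauses):
    print(sorted(v for v, on in solution.items() if on))  # ['A', 'C-1.6.11']
```

Note the clause forms David mentions: the "at most one version" rule is a single two-literal clause, which is exactly the kind of structure real solvers exploit.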
On 15 April 2015 at 11:15, David Cournapeau <cournape@gmail.com> wrote:
FWIW, at Enthought, we are working on a pure python SAT solver for resolving dependencies, to solve some of those issues. [...]

Awesome! Then pip could use that in the near future :)
On Wed, Apr 15, 2015 at 11:32 AM, Trishank Karthik Kuppusamy <trishank@nyu.edu> wrote:
Awesome! Then pip could use that in the near future :)

That's the goal. For various reasons, it ended up easier to develop the solver within our own package manager, enstaller, but once done, extracting it as a separate library should not be too hard. It is, for example, designed to support various versioning schemes (for legacy reasons we can't use PEP 440 just yet).

Regarding speed, initial experiments showed that even for relatively deep graphs, the running time is spent outside the SAT solver (e.g. to generate the rules, you need to parse the version of every package you want to consider, and parsing 1000s of PEP 440 versions is slow :) ).

David
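Since the same version strings recur across thousands of candidates, one obvious mitigation for the parsing cost is to parse each distinct string once and cache it. A toy sketch (not enstaller's code; the simplistic parser here stands in for the full PEP 440 parsing that is the actual slow part):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def parse_version(text):
    """Toy version parser: '1.6.11' -> (1, 6, 11). A real resolver would do
    full PEP 440 parsing here, which is exactly what makes caching pay off."""
    return tuple(int(part) for part in text.split("."))

# 5000 lookups, but only three distinct strings ever hit the parser.
versions = ["1.8", "1.6.11", "1.8", "1.6.11", "1.4"] * 1000
parsed = [parse_version(v) for v in versions]
print(parse_version.cache_info())  # 4997 hits, 3 misses
```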
On 16 April 2015 at 04:52, David Cournapeau <cournape@gmail.com> wrote:
Regarding speed, initial experiments showed that even for relatively deep graphs, the running time is taken outside the SAT solver [...]

My intent was to use a simple backtracking enhancement to the existing pip code, because:
- there is no index of dependencies today
- many packages have no wheels, so we have to run arbitrary code in a new process to produce dependency metadata
- so only considering the versions we have to process seems preferable.

Are you working on integrating your thing into pip? My current work queue is roughly:
- declarative setup_requires and install_requires/extras_require support in pip and setuptools
- handling conflicts with already installed packages (https://github.com/pypa/pip/issues/2687)
- then teach pip how to backtrack

If you're actively working on integrating it into pip, cool.
-Rob
--
Robert Collins <rbtcollins@hp.com>
Distinguished Technologist
HP Converged Cloud
I guess I should provide the code for what I've done in this space also. Yes, the problem is NP-hard, so in both cases the system may run really slowly. In practice, you don't tend to have a lot of conflicts you need to resolve during a package install though, so I'd argue that optimizing for the case where you have a huge number of conflicts doesn't really matter.

I've put the code for Stork up at: https://github.com/JustinCappos/stork

It almost certainly still runs, but hasn't been used in production for about 4 years. The code which does the backtracking dependency resolution is in the satisfy function here: https://github.com/JustinCappos/stork/blob/master/python/storkdependency.py#...

(FYI: Stork was the very first Python program I wrote after completing the Python tutorial, so apologies for the style and mess. I can clean up the code if needed, but I don't imagine Stork's code would be useful as more than a reference.)

However, I'd just like to reiterate that it would be good to check that pip really needs a solution to automatically resolve conflicts, whether with a SAT solver or backtracking. (Stork did something atypical with the way trust and conflicts were specified, so I think it made more sense for our user base.) Perhaps it's worth measuring the scope and severity of the problem first.

In the meantime, is someone planning to work on a patch to fix the conflict detection issue in pip?

Thanks,
Justin

On Wed, Apr 15, 2015 at 11:15 AM, David Cournapeau <cournape@gmail.com> wrote:
FWIW, at Enthought, we are working on a pure python SAT solver for resolving dependencies, to solve some of those issues. [...]
On 15/04/2015 14:34, Trishank Karthik Kuppusamy wrote:
On 4/15/15 9:28 AM, Justin Cappos wrote: .........
ZYpp seems to assume that dependency resolution is an NP-complete problem (http://www.watzmann.net/blog/2005/11/package-installation-is-np-complete.html).
yes, I deduced that this must be some kind of satisfiability problem, although what the mapping is escapes me right now. Various hints in the Stork work seem to imply having the version info (and presumably other metadata, including requirements) stored in some fast-access form so that the computations can be done quickly.
I agree that we need not solve the problem just yet. It may be worthwhile to inspect packages on PyPI to see which package is unsatisfiable, but I am led to understand that this is difficult to do because most package metadata is in setup.py (runtime information).
Where did you run into this problem, Robin?
this is the reportlab website repository, where we have a top-level requirements file that, for whatever reason, contains a git+https package from github, django-mailer 0.2a1, which has the open-ended requirement 1.4<=Django; later in the top-level requirements there's another package that wants 1.6<=Django<1.7, and this package, coming after, never gets a look in so far as Django is concerned.

After looking at the trace and discovering the problem, the fix is to move the open-ended requirement after the closed one. It would have been easier to discover the problem if pip had flagged the unsatisfied requirement, e.g.

!!!!!!!!!!! Django 1.8 loaded because of django-mailer-0.2a1 conflicts with 1.6<=Django<1.7 required by docengine-0.6.29

but how hard that is I don't know; certainly pip must inspect the requirements, so it could record the facts and possibly spit out the first n conflicts.
--
Robin Becker
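The warning Robin asks for only requires recording every constraint seen during installation and re-checking the finally selected versions against all of them at the end. A rough sketch of such a post-resolution check (hypothetical, not pip's code: `check_conflicts` and the tuple-based versions are inventions for illustration; a real implementation would compare PEP 440 versions):

```python
# Hypothetical post-install conflict check: no resolution, just detection.

def check_conflicts(selected, constraints):
    """selected: dict mapping name -> chosen version (as a tuple).
    constraints: list of (requirer, name, spec_text, predicate) records
    accumulated while processing requirements.
    Returns a human-readable warning for every violated constraint."""
    warnings = []
    for requirer, name, spec_text, predicate in constraints:
        version = selected.get(name)
        if version is not None and not predicate(version):
            warnings.append(
                "%s %s loaded conflicts with %s required by %s"
                % (name, ".".join(map(str, version)), spec_text, requirer))
    return warnings

# Robin's scenario: pip picked Django 1.8 via the open-ended requirement.
selected = {"Django": (1, 8)}
constraints = [
    ("django-mailer-0.2a1", "Django", "1.4<=Django",
     lambda v: (1, 4) <= v),
    ("docengine-0.6.29", "Django", "1.6<=Django<1.7",
     lambda v: (1, 6) <= v < (1, 7)),
]
for warning in check_conflicts(selected, constraints):
    print("!!!", warning)
```

Run on the scenario above, this flags the docengine constraint while leaving django-mailer's satisfied requirement silent, which is exactly the warning-only behavior discussed in the thread.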
Thanks to Justin & Daniel for some insights.

On 15/04/2015 14:28, Justin Cappos wrote:
An aside: I'm not necessarily convinced that you need to solve this problem automatically, instead of just raising an error when it occurs. It should be quite rare in practice and as such may not be worth the complexity to have an automatic solution for the problem.

I think you are probably right. Not sure how deep the dependency graph can be, but there could be quite a few candidates to check, and presumably they have to be downloaded and the dependency information obtained for each unique branch of the decision tree.
--
Robin Becker
level 0 A A level 1 1.4<= C
level 0 B B level 1 1.6<= C <1.7
pip manages to download version 1.8 of C(Django) using A's requirement, but never even warns us that the B requirement of C was violated. Surely even in the absence of a resolution pip could raise a warning at the end.
agreed on the warning, but there is a documented workaround for this, which is to put the desired constraint for C at level 0 (i.e. in the install args or in your requirements file); see case #2 here: https://pip.pypa.io/en/latest/user_guide.html#requirements-files
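For instance, a hypothetical top-level requirements file for Robin's case (package names taken from the thread; the exact lines are illustrative, not his actual file) would state the Django constraint at level 0 so that "first found, wins" finds it before any dependency's open-ended requirement:

```
# Hypothetical requirements.txt: the level-0 constraint on Django wins
# over django-mailer's open-ended 1.4<=Django requirement, regardless
# of the order in which the dependencies below are processed.
Django>=1.6,<1.7
django-mailer==0.2a1
docengine==0.6.29
```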
On 15/04/2015 16:49, Marcus Smith wrote:
agreed on the warning, but there is a documented workaround for this, that is to put the desired constraint for C at level 0 [...]

indeed, pushing all the requirements to level 0 is a way I could solve the dependency problem myself. Without knowledge that a problem existed, I didn't do this the first time around, so unless I check all the dependencies of the installed packages I won't know if another one should be pushed to level 0. With a warning I am at least alerted to the issue; without one I depend on bugs happening and then need to figure out the problems myself.
--
Robin Becker
On Wed, Apr 15, 2015 at 8:56 AM, Robin Becker <robin@reportlab.com> wrote:
indeed, pushing all the requirements to level 0 is a way I could solve the dependency problem myself. [...]

well, not all, just the cases where an override to the default "first found, wins" routine is needed. Again, I agree a warning would be great; I'm just noting that people can work around this (once they realize there's a problem) without having to manually order things (as you mentioned in your last post).
On 2015-04-15 12:09:04 +0100 (+0100), Robin Becker wrote:
After again finding that pip doesn't have a correct dependency resolution solution a colleage and I discussed the nature of the problem. [...]
Before the discussion of possible solutions heads too far afield, it's worth noting that this was identified years ago and has a pending feature request looking for people to pitch in on implementation. It's perhaps better discussed at https://github.com/pypa/pip/issues/988 so as to avoid too much repetition.
--
Jeremy Stanley
Okay, I tried to summarize the discussion and most of my thoughts on that issue: https://github.com/pypa/pip/issues/988

I'll post anything further I have to say there. I hope to get a student to measure the extent of the problem...

Thanks,
Justin

On Wed, Apr 15, 2015 at 2:44 PM, Jeremy Stanley <fungi@yuggoth.org> wrote:
Before the discussion of possible solutions heads too far afield, it's worth noting that this was identified years ago and has a pending feature request [...]
IIRC, Conda (BSD License) takes a SAT-solving (e.g. Sudoku) approach: http://continuum.io/blog/new-advances-in-conda (such as installing "pycosat" (MIT License) when I install conda).

Some links to the source:
* https://github.com/conda/conda/blob/master/conda/logic.py
* https://github.com/conda/conda/blob/master/tests/test_logic.py
* https://github.com/conda/conda/blob/master/conda/resolve.py
* https://github.com/conda/conda/blob/master/tests/test_resolve.py
* https://github.com/conda/conda/blob/master/conda/toposort.py

On Apr 15, 2015 6:14 AM, "Robin Becker" <robin@reportlab.com> wrote:
After again finding that pip doesn't have a correct dependency resolution solution, a colleague and I discussed the nature of the problem. [...]
participants (10)
- Daniel Holth
- David Cournapeau
- Jeremy Stanley
- Justin Cappos
- Marcus Smith
- Robert Collins
- Robin Becker
- Trishank Karthik Kuppusamy
- Trishank Karthik Kuppusamy
- Wes Turner