On Jul 3, 2016 1:45 PM, "Paul Moore" <p.f.moore@gmail.com> wrote:
> [...]
> Furthermore, pip/setuptools are just getting to the point of allowing for dependencies conditional on Python version. If independent stdlib releases were introduced, we'd need to implement dependencies based on stdlib version as well - consider depending on a backport of a new module if the user has an older stdlib version that doesn't include it.
Regarding this particular point: right now, yeah, there's an annoying wrinkle where you have to know that a dependency on a stdlib/backported library X has to be written as 'X >= 1.0; python_version <= "3.4"' or whatever (there's a concrete sketch in the P.S. below), and every package with this dependency has to encode some complicated indirect knowledge of which versions of X ship with which versions of Python. (And life is even more complicated if you want to support PyPy/Jython/..., which generally ship manually maintained stdlib forks, and whose nominal "Python version equivalent" is only an approximation.)

In the extreme, one can imagine a module like typing still being distributed as part of the standard Python download, but not as part of the stdlib proper: instead it would be a "preinstalled package" in site-packages/ that could then be upgraded normally after install. In addition to whatever maintenance advantages this might (or might not) have, it would be a huge improvement with regard to Paul's concerns: if a package needs typing 1.3 or whatever, it could just declare that, without having to know a priori which versions of Python shipped which version of typing. (Note that Linux distributions already split the stdlib into pieces, and you're not guaranteed to have all of it available.)

Or, if we want to be less aggressive and keep the stdlib monolithic, it would still be great if there were some .dist-info metadata somewhere that said "this version of the stdlib provides typing 1.3, asyncio 1.4, ...". I haven't thought through all the details of how this would work and how pip could best take advantage of it, though.

-n
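
P.S. To make the marker problem concrete, here's roughly what the status quo looks like in a setup.py, using a PEP 508-style environment marker (the package and version numbers are just illustrative, and this spelling needs a reasonably recent setuptools):

    from setuptools import setup

    setup(
        name="myproject",
        version="1.0",
        install_requires=[
            # The typing backport is only wanted on Pythons whose stdlib
            # predates it, so every project has to hard-code the 3.5
            # cutoff itself:
            'typing >= 3.5.2; python_version < "3.5"',
        ],
    )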
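
In the hypothetical preinstalled-package world, the same declaration collapses to a plain, unconditional constraint, and pip just upgrades the preinstalled copy when it's too old:

    from setuptools import setup

    setup(
        name="myproject",
        version="1.0",
        # No python_version marker needed; the requirement stands alone:
        install_requires=["typing >= 1.3"],
    )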
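
And for the less-aggressive variant: if the stdlib registered .dist-info metadata for the pieces it provides, existing introspection machinery would pick it up. A sketch using pkg_resources (a real setuptools API today; the stdlib registering itself this way is the hypothetical part):

    import pkg_resources

    # If the stdlib's typing module shipped .dist-info metadata, this
    # lookup would just work, and a resolver could use the same logic
    # to decide whether "typing >= 1.3" is already satisfied:
    try:
        dist = pkg_resources.get_distribution("typing")
        print(dist.project_name, dist.version)
    except pkg_resources.DistributionNotFound:
        print("no typing metadata found")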