Re: Update PEP 508 to allow version specifiers
On 15 Mar 2019, at 21:47, Simon Ruggier <simon80@gmail.com> wrote:
The packages have to be available online for dependency links to work, yes, but they're not public: one needs to authenticate with an SSH key to clone each repository.
Both --find-links and --index-url (or --extra-index-url) can secure the download with authentication (HTTP Basic Auth).
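For example, the credentials can be embedded directly in the index URL (the host and credentials here are placeholders):

    pip install --index-url https://user:secret@pypi.example.com/simple/ "B >= 3"
    # or, keeping PyPI as the primary index:
    pip install --extra-index-url https://user:secret@pypi.example.com/simple/ "B >= 3"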
On March 13, 2019 2:09:03 PM UTC, Tzu-ping Chung <uranusjr@gmail.com> wrote:
On 6 Mar 2019, at 03:53, Simon <simon80@gmail.com> wrote:
I hope it's not an issue that I'm replying to a month-old thread. I reviewed the previous discussion to try to avoid duplicating any of it.
When using pip with PyPI, running pip a second time is much quicker than the first, because it verifies that the requirements, including version constraints, are already satisfied in the target environment and doesn't needlessly reinstall anything.
Dependency links allowed the same behaviour for private packages that depend on other private repositories: given a requirement B >= 3 and a dependency link where B was available, pip could check whether the environment already included a new enough version of B, and use the dependency link only as a fallback if the requirement wasn't already satisfied.
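For reference, a minimal sketch of how that was expressed in a setup.py (the names and URL are made up; later pre-19.0 versions of pip also required passing --process-dependency-links):

    from setuptools import setup

    setup(
        name="A",
        version="1.0",
        install_requires=["B >= 3"],
        # Consulted only as a fallback when B >= 3 isn't already
        # installed and can't be found on the configured index:
        dependency_links=[
            "git+ssh://git@git.example.com/team/B.git@v3.0#egg=B-3.0",
        ],
    )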
URL specifiers aren't useful for providing a fallback location to get a package from, because using one prevents the package from also specifying a version constraint, as was possible with dependency links or with ordinary requirements available from PyPI. Curiously, discussion of version constraints in this thread has focused on how nonsensical it would be to compare a constraint against the specifying URL, ignoring the possibility of comparing it against the target environment.
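Concretely, in a requirements file (the URL is hypothetical):

    # A normal requirement can carry a constraint:
    B >= 3
    # A PEP 508 direct reference can only pin a single source:
    B @ git+ssh://git@git.example.com/team/B.git
    # There is no way to write "B >= 3, and here is where to get
    # it if the environment doesn't already satisfy that".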
The loss of this functionality means that anyone who was previously using pip to automatically install private packages with private dependencies now has to forgo automatic dependency management (a large part of why one would use a package manager to begin with) in favour of recursively specified requirements files, publish their private packages somewhere pip can find them, or stick with pip 18.1 for now.
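The requirements-file workaround looks roughly like this (the paths and URL are hypothetical):

    # A's requirements.txt spells out its private dependency by URL:
    git+ssh://git@git.example.com/team/B.git@v3.0#egg=B
    # and every consumer of A has to include A's requirements recursively:
    -r path/to/A/requirements.txt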
Wouldn’t you still need to “publish the private packages somewhere” for dependency links to work? `setup.py sdist` with `pip --find-links` can get you very far; the only differences IMO are that you have to provide a proper package and write a simple HTML file to point to it.
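Roughly (the server name is a placeholder):

    python setup.py sdist
    scp dist/B-3.0.tar.gz pkgs.example.com:/var/www/pkgs/
    # The index page only needs plain anchor tags, e.g.
    #   <a href="B-3.0.tar.gz">B-3.0.tar.gz</a>
    pip install --find-links https://pkgs.example.com/pkgs/ "B >= 3"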
Admittedly, HTTP Basic Auth is better than nothing, but it's hard to get excited about setting that up and either managing an extra set of credentials or hooking it up to a single sign-on system, when SSH-based authentication is already set up to control access to the source repositories.
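For comparison, installing straight from the repository reuses the SSH keys that are already deployed (the host is hypothetical):

    pip install "git+ssh://git@git.example.com/team/B.git@v3.0#egg=B"

but, as noted above, a URL requirement like this can't be combined with a constraint such as B >= 3.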