On Wed, 20 Feb 2019 at 22:40, Alex Walters wrote:
I have two main concerns about PEP 582 that might just be me misunderstanding the PEP.
My first concern is the use of the CWD, and prepending ./__pypackages__ for scripts. For example, suppose you are in a directory with a __pypackages__ subdirectory, and have installed the module "super.important.module" into it. My understanding is that any script you run will have "super.important.module" available before the system site-packages directory. Now say you also run "/usr/bin/an_apt-ly_named_python_script" that uses "super.important.module" (and there is no __pypackages__ subdirectory in /usr/bin). You would be shadowing "super.important.module".
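The shadowing concern above can be illustrated with a minimal sketch. Nothing here is the PEP's actual implementation; it just simulates an interpreter prepending a local package directory to sys.path, which is the mechanism by which a local module would win over site-packages. The module name `shadow_demo` is purely illustrative:

```python
import os
import sys
import tempfile

# Create a local "__pypackages__"-style directory containing a module
# that would shadow a same-named installed module.
workdir = tempfile.mkdtemp()
pkg_dir = os.path.join(workdir, "__pypackages__")
os.makedirs(pkg_dir)
with open(os.path.join(pkg_dir, "shadow_demo.py"), "w") as f:
    f.write("ORIGIN = 'local __pypackages__'\n")

# Simulate the interpreter prepending the local package directory:
# from now on, "shadow_demo" resolves here before site-packages.
sys.path.insert(0, pkg_dir)

import shadow_demo
print(shadow_demo.ORIGIN)  # -> local __pypackages__
```

The key point is the `sys.path.insert(0, ...)`: anything earlier on sys.path wins, regardless of what is installed system-wide.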
From https://www.python.org/dev/peps/pep-0582/#specification: "In case of Python scripts, Python will try to find __pypackages__ in the same directory as the script." That behaviour is then covered again in https://www.python.org/dev/peps/pep-0582/#security-considerations. The initial paragraphs in the specification section only cover the cases where sys.path[0] is already getting set to the current working directory (e.g. the interactive prompt).
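A simplified sketch of the lookup rule quoted from the spec may help: for a script, __pypackages__ is searched for next to the script file, not in the current working directory, so /usr/bin scripts are unaffected by a __pypackages__ in your CWD. This is not CPython's actual implementation, and the per-version subdirectory layer the PEP's examples show is omitted here for brevity:

```python
import os

def pypackages_dir(script_path=None):
    """Return the __pypackages__ directory to prepend to sys.path, or None."""
    if script_path:
        # Scripts anchor the lookup on their own directory...
        base = os.path.dirname(os.path.abspath(script_path))
    else:
        # ...while the interactive prompt falls back to the CWD.
        base = os.getcwd()
    candidate = os.path.join(base, "__pypackages__")
    return candidate if os.path.isdir(candidate) else None
```

Under this rule, running `/usr/bin/an_apt-ly_named_python_script` only ever consults `/usr/bin/__pypackages__`, which addresses the shadowing scenario described earlier.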
My second concern is a little more... political.
This PEP does not attempt to cover all the use cases of virtualenvs, which is understandable. However, it also means that we have to teach new users *both* mechanisms right away in order to get them up and running: the complexities of each, and when to use one over the other. Instead of making things easier for the new user, this PEP makes them harder. It also couldn't have come at a worse time, given the growing use of pipenv, which provides a fully third way of thinking about application dependencies (yes, pipenv uses virtualenvs under the hood, but from a user standpoint it is a functionally different theory of operation than traditional pip/virtualenv or this PEP).
Is it really a good idea to do this PEP at this time?
One of the potential amendments to the PEP is to have it simply be a more visible alternative naming scheme for the existing `.venv` convention (which `pipenv` supports via the PIPENV_VENV_IN_PROJECT environment setting). If that option gets pursued, then the key new behaviours would be:

1. Interpreters auto-activating the __pypackages__ venv on startup (without requiring `pipenv run` or a functional equivalent)
2. Installers auto-targeting the __pypackages__ venv at install time (without requiring `pipenv install` or a functional equivalent)

That way, for folks that already understand virtual environments, the explanation is exactly that: "It's like .venv, but interpreters activate it automatically, and installers target it automatically."

Whereas for folks that *don't* already understand virtual environments, the benefits are those described in the PEP: we can just tell people "When you have a __pypackages__ subdirectory in the current directory, anything you install will be installed there by default, and then be available for import when running Python from that directory."

The other potential new behaviour, which the PEP implies in its examples but doesn't currently spell out in the text, is that it could be defined in a way that better supports multiple interpreter versions sharing the same subdirectory (i.e. rather than being associated with a specific Python interpreter installation the way .venv is, the __pypackages__ directory may instead have separate subdirectories for each X.Y Python version). That would still be similar in spirit to `.venv`, it would just be natively version-aware. The benefit of doing that is better handling of situations where the required dependencies vary based on the Python version in use, while the downside is that you potentially end up with multiple copies of the dependencies, even when they could have been shared without any problems.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
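The version-aware layout mooted above can be sketched as follows. This is purely illustrative of the idea, not anything specified normatively by the PEP; the `version_aware_lib` helper and the exact directory layout are assumptions:

```python
import os
import sys

def version_aware_lib(base):
    """Pick the __pypackages__ lib dir matching the running interpreter's
    X.Y version, e.g. __pypackages__/3.8/lib under Python 3.8."""
    version = "{}.{}".format(*sys.version_info[:2])
    return os.path.join(base, "__pypackages__", version, "lib")

# Different interpreter versions resolve to sibling subdirectories, so
# one project directory can hold dependency sets for several Pythons:
#   __pypackages__/3.7/lib
#   __pypackages__/3.8/lib
```

This is where the space trade-off mentioned above comes from: each X.Y subdirectory carries its own full copy of the installed dependencies, even when they would have been identical across versions.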