
(I originally posted this to python-ideas, where I was told none of this PEP's authors subscribe, so probably no one will see it there. I'm posting it here to raise the issue where it can get seen and hopefully discussed.)

While the PEP does show the version number as part of the path to the actual packages, implying support for multiple versions, this doesn't seem to be spelled out in the actual text. Presumably __pypackages__/3.8/ might sit beside __pypackages__/3.9/, etc., to keep future versions capable of installing packages for each version, the way virtualenv today is bound to one version of Python.

I'd like to raise a potential edge case that might be a problem, and likely an increasingly common one: users with multiple installations of the *same* version of Python. This is actually a common setup for Windows users who use WSL, Microsoft's Linux-on-Windows solution, as you could have both the Windows and Linux builds of a given Python version installed on the same machine. The currently implied support for multiple versions would not be able to separate these, and could create problems if users pip install a Windows binary package through PowerShell and then try to run a script in Bash from the same directory, causing the Linux version of Python to try to use Windows Python packages.

I'm not actually sure what the solution here is. Mostly I wanted to raise the concern, because I'm very keen on WSL being a great entry path for new developers and I want to make that a better experience, not a more confusing one. Maybe that version number could include some other unique identifier, maybe based on Python's own executable. A hash maybe? I don't know if anything like that already exists to uniquely identify a Python build or installation.

--
Calvin Spealman
Senior Quality Engineer
cspealma@redhat.com
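For illustration only, one way to build such an identifier from values the interpreter already exposes is to combine the implementation name, version, and platform string. This is just a sketch of the idea raised above, not anything the PEP itself proposes; the function name is made up:

```python
import sys
import sysconfig

def interpreter_tag():
    # Distinguish builds, not just versions: a Windows build and a
    # WSL/Linux build of the same Python 3.8 produce different
    # platform strings, so they would get separate directories.
    return "%s-%d.%d-%s" % (
        sys.implementation.name,   # e.g. 'cpython'
        sys.version_info.major,
        sys.version_info.minor,
        sysconfig.get_platform(),  # e.g. 'win32' or 'linux-x86_64'
    )

print(interpreter_tag())  # e.g. 'cpython-3.8-linux-x86_64'
```

This still would not separate two distinct installations of the same build on the same platform, which is why a hash of the executable comes up later in the thread.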

On 02Apr2019 0817, Calvin Spealman wrote:
> (I originally posted this to python-ideas, where I was told none of this PEP's authors subscribe so probably no one will see it there, so I'm posting it here to raise the issue where it can get seen and hopefully discussed)
Correct, thanks for posting. (I thought we had a "discussions-to" tag with distutils-sig on it, but apparently not.)
> While the PEP does show the version number as part of the path to the actual packages, implying support for multiple versions, this doesn't seem to be spelled out in the actual text. Presumably __pypackages__/3.8/ might sit beside __pypackages__/3.9/, etc. to keep future versions capable of installing packages for each version, the way virtualenv today is bound to one version of Python.
> I'd like to raise a potential edge case that might be a problem, and likely an increasingly common one: users with multiple installations of the *same* version of Python. This is actually a common setup for Windows users who use WSL, Microsoft's Linux-on-Windows solution, as you could have both the Windows and Linux builds of a given Python version installed on the same machine. The currently implied support for multiple versions would not be able to separate these and could create problems if users pip install a Windows binary package through PowerShell and then try to run a script in Bash from the same directory, causing the Linux version of Python to try to use Windows Python packages.
> I'm not actually sure what the solution here is. Mostly I wanted to raise the concern, because I'm very keen on WSL being a great entry path for new developers and I want to make that a better experience, not a more confusing one. Maybe that version number could include some other unique identifier, maybe based on Python's own executable. A hash maybe? I don't know if anything like that already exists to uniquely identify a Python build or installation.
Yes, this is a situation we're aware of, and it's caught in the conflict of "who is this feature meant to support".

Since all platforms have a unique extension module suffix (e.g. "module.cp38-win32.pyd"), it would be possible to support this with "fat" packages that include all binaries (or some clever way of merging wheels for multiple platforms). And since this suffix is already in CPython itself, it leads to about the only reasonable solution: instead of "3.8", use the extension module suffix "cp38-win32". (Wheel tags are not in core CPython, so we can't use those.)

But while this seems obvious, it also reintroduces problems that this PEP has the potential to fix: suddenly, just like installing into your global environment, your packages are not project-specific anymore but are Python-specific. Which is one of the major confusions people run into ("I pip installed X but now can't import it in python").

So the main points of discussion right now are "whose problem does this solve" and "when do we tell people they need a full venv". And that discussion is mostly happening at https://discuss.python.org/t/pep-582-python-local-packages-directory/963/

Cheers,
Steve
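The per-build suffix Steve refers to can be inspected from a running interpreter. A quick way to see what it looks like on your own build (the values shown in comments are examples and vary by platform):

```python
import importlib.machinery
import sysconfig

# The build-specific suffix used for compiled extension modules,
# e.g. '.cp38-win32.pyd' on 32-bit Windows or
# '.cpython-38-x86_64-linux-gnu.so' on Linux.
print(sysconfig.get_config_var("EXT_SUFFIX"))

# Every suffix this interpreter will consider when importing
# an extension module, most specific first.
print(importlib.machinery.EXTENSION_SUFFIXES)
```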

On 02.04.19 18:10, Steve Dower wrote:
> On 02Apr2019 0817, Calvin Spealman wrote:
>> (I originally posted this to python-ideas, where I was told none of this PEP's authors subscribe so probably no one will see it there, so I'm posting it here to raise the issue where it can get seen and hopefully discussed)
> Correct, thanks for posting. (I thought we had a "discussions-to" tag with distutils-sig on it, but apparently not.)
>> While the PEP does show the version number as part of the path to the actual packages, implying support for multiple versions, this doesn't seem to be spelled out in the actual text. Presumably __pypackages__/3.8/ might sit beside __pypackages__/3.9/, etc. to keep future versions capable of installing packages for each version, the way virtualenv today is bound to one version of Python.
>> I'd like to raise a potential edge case that might be a problem, and likely an increasingly common one: users with multiple installations of the *same* version of Python. This is actually a common setup for Windows users who use WSL, Microsoft's Linux-on-Windows solution, as you could have both the Windows and Linux builds of a given Python version installed on the same machine. The currently implied support for multiple versions would not be able to separate these and could create problems if users pip install a Windows binary package through PowerShell and then try to run a script in Bash from the same directory, causing the Linux version of Python to try to use Windows Python packages.
>> I'm not actually sure what the solution here is. Mostly I wanted to raise the concern, because I'm very keen on WSL being a great entry path for new developers and I want to make that a better experience, not a more confusing one. Maybe that version number could include some other unique identifier, maybe based on Python's own executable. A hash maybe? I don't know if anything like that already exists to uniquely identify a Python build or installation.
> Yes, this is a situation we're aware of, and it's caught in the conflict of "who is this feature meant to support".
This smells the same as mixing system-installed Python packages (deb/rpm) with ones managed by pip, and pip touching system-installed packages.
> Since all platforms have a unique extension module suffix (e.g. "module.cp38-win32.pyd"), it would be possible to support this with "fat" packages that include all binaries (or some clever way of merging wheels for multiple platforms).
Unfortunately not. The Android developers opted out of that, reverting that change. Also, how would you differentiate win32 builds for different architectures? But maybe this is already done.
> And since this is already in CPython itself, it leads to about the only reasonable solution - instead of "3.8", use the extension module suffix "cp38-win32". (Wheel tags are not in core CPython, so we can't use those.)
> But while this seems obvious, it also reintroduces problems that this has the potential to fix - suddenly, just like installing into your global environment, your packages are not project-specific anymore but are Python-specific. Which is one of the major confusions people run into ("I pip installed X but now can't import it in python").
> So the main points of discussion right now are "whose problem does this solve" and "when do we tell people they need a full venv". And that discussion is mostly happening at https://discuss.python.org/t/pep-582-python-local-packages-directory/963/
> Cheers,
> Steve

> I'd like to raise a potential edge case that might be a problem, and likely an increasingly common one: users with multiple installations of the *same* version of Python.
I would suggest that that use case is best addressed by a system that isolates the entire Python environment, such as conda.
> This is actually a common setup for Windows users who use WSL, Microsoft's Linux-on-Windows solution, as you could have both the Windows and Linux builds of a given Python version installed on the same machine.
Sure, but isn't the WSL subsystem pretty isolated already? Would native Windows and WSL users be running in the same dir?

That being said, I'm pretty skeptical of the PEP. I understand the motivation: I make a point of avoiding virtual environments in my intro classes, but at some point folks will need to learn them. I've had students think that virtualenv was a part of (or required by) e.g. flask, because the tutorials include it in the setup.

But I think environments really need to be more distinct, not less; I'm quite concerned about mingling them in one place. Maybe I'm reading it wrong, but it seems that this could create serious clashes with other "environment" systems, such as conda. I suppose one could say "don't do that", i.e. don't create a __pypackages__ dir if you are going to use conda, but many folks want the same source to be runnable in multiple "styles" of Python.

Also, I see a major benefit for teaching, but it does go a bit against my philosophy of not hiding important details from newbies; that is, don't teach using an approach that is not suitable for production. And newbies could be really confused by the fact that pip installs stuff differently depending on what dir they are in and what is in that dir.

The PEP is listed as a draft; anyone know what's going on with it?

-CHB

Sorry, somehow I missed Steve Dower's post: that discussion is mostly happening at https://discuss.python.org/t/pep-582-python-local-packages-directory/963/ so I'll go there to comment.

-CHB
--
Christopher Barker, Ph.D.
Oceanographer

Emergency Response Division
NOAA/NOS/OR&R            (206) 526-6959   voice
7600 Sand Point Way NE   (206) 526-6329   fax
Seattle, WA  98115       (206) 526-6317   main reception

Chris.Barker@noaa.gov

Hi,

On Tue, 2 Apr 2019 at 17:20, Calvin Spealman <cspealma@redhat.com> wrote:
> While the PEP does show the version number as part of the path to the actual packages, implying support for multiple versions, this doesn't seem to be spelled out in the actual text. Presumably __pypackages__/3.8/ might sit beside __pypackages__/3.9/, etc. to keep future versions capable of installing packages for each version, the way virtualenv today is bound to one version of Python.
> I'd like to raise a potential edge case that might be a problem, and likely an increasingly common one: users with multiple installations of the *same* version of Python.
Hum, I don't know if it's relevant to support multiple Python binaries of the same Python version, but just in case, let me share my experience with that in the pyperformance project.

The pyperformance project uses a virtual environment for two binaries of the exact same Python version (and usually the same path!): one unpatched "reference" binary and one "patched" binary, to experiment with an optimization. I needed a way to build a short text identifier to still be able to get a "cached" virtual environment per Python binary. I wrote a short piece of code to generate the identifier using:

* the pyperformance version
* requirements.txt
* sys.executable
* sys.version
* sys.version_info
* sys.implementation.name or platform.python_implementation()

The script builds a long string from this info, hashes it with SHA1, and takes the first 12 characters of the hexadecimal form of the hash.

Script:
---
import hashlib
import platform
import sys

performance_version = sys.argv[1]
requirements = sys.argv[2]

data = performance_version + sys.executable + sys.version
pyver = sys.version_info

if hasattr(sys, 'implementation'):
    # PEP 421, Python 3.3
    implementation = sys.implementation.name
else:
    implementation = platform.python_implementation()
implementation = implementation.lower()

if not isinstance(data, bytes):
    data = data.encode('utf-8')
with open(requirements, 'rb') as fp:
    data += fp.read()
sha1 = hashlib.sha1(data).hexdigest()

name = ('%s%s.%s-%s'
        % (implementation, pyver.major, pyver.minor, sha1[:12]))
print(name)
---

Examples:

$ touch requirements.txt  # empty file

$ python3.7 x.py version requirements.txt
cpython3.7-502d35b8e005

$ python3.6 x.py version requirements.txt
cpython3.6-7f4febbec0be

$ python3 x.py version requirements.txt
cpython3.7-59ab636dfacb

$ file /usr/bin/python3
/usr/bin/python3: symbolic link to python3.7

Hum, python3 and python3.7 produce different hashes, whereas it's the same binary.
Maybe os.path.realpath() should be called on sys.executable :-)

Victor

--
Night gathers, and now my watch begins. It shall not end until my death.
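A minimal sketch of that realpath() fix, keeping the same SHA1-prefix scheme as the pyperformance script above (env_id is a made-up name for illustration):

```python
import hashlib
import os
import sys

def env_id(extra=b""):
    # realpath() collapses symlinks, so /usr/bin/python3 and the
    # /usr/bin/python3.7 it points to hash to the same identifier.
    # `extra` stands in for additional inputs such as requirements.txt.
    exe = os.path.realpath(sys.executable)
    data = exe.encode("utf-8") + sys.version.encode("utf-8") + extra
    return hashlib.sha1(data).hexdigest()[:12]

print(env_id())
```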
participants (6)
- Calvin Spealman
- Chris Barker
- Chris Barker - NOAA Federal
- Matthias Klose
- Steve Dower
- Victor Stinner