Three clarification questions about PEP 425 and PyPy3
I noticed that for PyPy3, the tag triples considered compatible were (roughly; trimmed out the long list of macOS versions):

[('pp360', 'pypy3_60', 'macosx_10_13_x86_64'), ('pp360', 'none', 'macosx_10_13_x86_64'), ('py3', 'none', 'macosx_10_13_x86_64'), ('pp360', 'none', 'any'), ('pp3', 'none', 'any'), ('py360', 'none', 'any'), ('py3', 'none', 'any')]

Now the first question I have is about ('pp3', 'none', 'any'). Is this meant to be a generic thing for any interpreter of that implementation and major version, or is this special to CPython and PyPy3?

Question two is why isn't there a ('py35', 'none', 'any') or ('py34', 'none', 'any') and older down to py30 after py3, like there is for CPython? Seems like if the wheels are just source then they should be as compatible as they are for CPython.

Question three is why isn't there a ('py35', 'none', 'macosx_10_13_x86_64') for PyPy3 or CPython 3.7? I can't figure out why a Python- and platform-specific wheel that's agnostic about the ABI wouldn't ever work.

And I'm assuming ('py360', 'none', 'any') isn't legitimate since that makes no sense. ;)
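As a rough illustration of where these triples come from: PEP 425/427 define the wheel filename as {name}-{version}(-{build})?-{python tag}-{abi tag}-{platform tag}.whl, where each tag field can be a '.'-separated compressed set. A minimal sketch of pulling the triples out of a filename (the helper name is made up, and real tools handle more edge cases than this):

    import itertools

    def wheel_tag_triples(filename):
        """Return an iterator over every (python, abi, platform) tag triple a
        wheel filename claims.  The last three dash-separated fields before
        '.whl' are the tag triple; each may be a '.'-separated compressed set.
        """
        stem = filename[:-len(".whl")]
        *_, python, abi, platform = stem.split("-")
        return itertools.product(python.split("."), abi.split("."), platform.split("."))

    # A pure-Python wheel tagged for both Python 2 and 3 expands to two triples:
    print(list(wheel_tag_triples("six-1.11.0-py2.py3-none-any.whl")))
    # [('py2', 'none', 'any'), ('py3', 'none', 'any')]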
And to help in getting a reply, here are the trimmed-down results for CPython 3.7 to compare against:

[('cp37', 'cp37m', 'macosx_10_13_x86_64'), … ('cp37', 'abi3', 'macosx_10_13_x86_64'), … ('cp37', 'none', 'macosx_10_13_x86_64'), … ('cp36', 'abi3', 'macosx_10_13_x86_64'), … ('cp35', 'abi3', 'macosx_10_13_x86_64'), … ('cp34', 'abi3', 'macosx_10_13_x86_64'), … ('cp33', 'abi3', 'macosx_10_13_x86_64'), … ('cp32', 'abi3', 'macosx_10_13_x86_64'), … ('py3', 'none', 'macosx_10_13_x86_64'), … ('cp37', 'none', 'any'), ('cp3', 'none', 'any'), ('py37', 'none', 'any'), ('py3', 'none', 'any'), ('py36', 'none', 'any'), ('py35', 'none', 'any'), ('py34', 'none', 'any'), ('py33', 'none', 'any'), ('py32', 'none', 'any'), ('py31', 'none', 'any'), ('py30', 'none', 'any')]

So, to re-iterate the questions:

1. What is ('pp3', 'none', 'any') supposed to represent for PyPy3? Since the version of the interpreter is PyPy3 6.0, the lack of a full version number seems like a bug more than a purposeful interpreter version (and there's only a single project -- cliquet <https://pypi.org/project/cliquet/3.1.1/#files> -- that has a wheel compatible with that tag triple, and it's not even for their latest release).

2. Why does CPython have (*, 'none', 'any') from the version of the interpreter down to Python 3.0, plus generically Python 3, while PyPy3 only gets generic Python 3?

3. Why isn't (*, 'none', platform) listed from Python 3.7 to 3.0 for either CPython or PyPy3? I understand not iterating through all versions when an ABI is involved (without knowing exactly which versions are compatible, like abi3), but this triple seems just as safe to iterate through as a fallback as (*, 'none', 'any'). Maybe because it's too ambiguous to know how important such a fallback would be between e.g. ('py36', 'none', 'macosx_10_13_x86_64') and ('py37', 'none', 'any'), and so why bother when the older version triples are there just as a safety net to have at least some chance of a match?

4. I still think ('py360', 'none', 'any') is a bug. ;)

P.S.: The ('py3', 'none', 'macosx_10_13_x86_64') triple sitting between e.g. ('pp360', 'none', 'macosx_10_13_x86_64') and ('pp360', 'none', 'any') is really messing with my head and making the code to generate supported triples a bit less elegant. ;)
I think the answer to all of these questions is "well, no-one's ever really looked that closely".

There's a theory behind the tags; they're supposed to be a reasonably expressive language for talking about Python dialect compatibility, Python C ABI compatibility, and platform ABI compatibility, respectively. But in practice so far only a small fixed set of tag combinations actually gets used, so there's plenty of room for weird stuff to accumulate in the corners where no-one looks.

I've never been able to figure out a use case for the interpreter tags in the first field ("cp36", "pp3", etc.). IIUC, the theory is that they're supposed to mean "the Python code in this package is not actually portable Python, but uses an interpreter-specific dialect of the language". (This is very different from the second field, where tags like "cp36m" tell you about the required C ABI -- that one is obviously super useful.) I guess if you had a package that, like, absolutely depended on GC being refcount-based, you could use a cp3 tag to indicate that, or if you had a pure-Python, Python 2 package that required 'dict' to be ordered, maybe that's 'pp2-none-any'? But this never seems to actually happen in practice. It seems like an idea that sounded plausible early on, and then never got fleshed out or revisited.

The distutils folks have never sat down to seriously think about non-CPython implementations, where the language version and the implementation version are separate things. The pypy folks have never sat down to seriously think about API/ABI stability. Generally at the Python dialect level they try to match a given version of (C)Python, and at the ABI level every new release is a new ABI.

My guess is you shouldn't spend too much effort on trying to slavishly reproduce pip's logic, and that if you wanted to go clean up pip's logic (and maybe extract it into a reusable library?) then the devs would be perfectly happy that someone was doing it...

-n
-- Nathaniel J. Smith -- https://vorpus.org
That's right. No one writes a 2to3-style translator for Python 3.6 -> 3.7; the JS people do that sort of thing, and if we got into it we could have wheel tags for it. In practice only a few classes of tags are used.
On Mon, 27 Aug 2018 at 19:05 Nathaniel Smith <njs@pobox.com> wrote:
I think the answer to all of these questions is "well, no-one's ever really looked that closely".
I figured, but I just needed someone to verify that hunch was correct. :)
There's a theory behind the tags; they're supposed to be a reasonably expressive language for talking about Python dialect compatibility, Python C ABI compatibility, and platform ABI compatibility, respectively. But in practice so far only a small fixed set of tag combinations actually gets used, so there's plenty of room for weird stuff to accumulate in the corners where no-one looks.
Yep. There's also the side-effect that the pep425tags code that the various tools have embedded makes some very broad assumptions that all interpreters follow a tagging/versioning style similar to what is used for CPython, hence the odd PyPy3 results where the "3" part of the interpreter _name_ gets used as if it's a major version.
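A rough sketch of the version mangling being described here (this approximates the embedded pep425tags-style logic rather than reproducing it exactly; the PyPy-specific piece is sys.pypy_version_info):

    import sys

    def interpreter_tag():
        """Approximate how a pep425tags-style routine builds the interpreter tag."""
        if hasattr(sys, "pypy_version_info"):
            # The CPython-centric assumption: glue the language's major version
            # onto the implementation's version, so PyPy3 6.0 becomes "pp360".
            return "pp{}{}{}".format(sys.version_info[0],
                                     sys.pypy_version_info[0],
                                     sys.pypy_version_info[1])
        # CPython (other implementations omitted from this sketch).
        return "cp{}{}".format(sys.version_info[0], sys.version_info[1])

    print(interpreter_tag())  # "pp360" on PyPy3 6.0, "cp37" on CPython 3.7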
I've never been able to figure out a use case for the interpreter tags in the first field ("cp36", "pp3", etc).
To help answer that, here are some statistics on the number of projects that have a certain wheel tag, based on download counts on 2018-08-24.

- cp36
  - %cp36-none-any.whl: 7 (example <https://pypi.org/project/google-python-cloud-debugger/2.8/#files>)
  - %cp36-none-%.whl: 70 (example <https://pypi.org/project/qucumber/#files>)
  - cp36-none-%.whl but not cp36-none-any.whl: 65 (example <https://pypi.org/project/numpy/1.15.1/#files> that Nathaniel knows very well ;)
- cp3
  - %cp3-none-any.whl: 2 (example <https://pypi.org/project/kinto/#files>)
  - %cp3-none-%.whl: 3 (example <https://pypi.org/project/billiard/3.3.0.23/#files>)
  - cp3-%: 3 (same as cp3-none-%)
- py36
  - py36-none-% but not py36-none-any: 2 (example <https://pypi.org/project/pytrack-analysis/0.0.3/#files>)
- py3
  - py3-none-% but not py3-none-any: 142 (example <https://pypi.org/project/mypy/0.511/#files>)
- pp3
  - %pp3-%: 1 (example <https://pypi.org/project/cliquet/3.1.1/#files>)
- pp360
  - %pp360-%: 6 (example <https://pypi.org/project/Pillow/5.2.0/#files>)

To put this into perspective, there are currently 150,410 projects on PyPI. Plus a lot of those odd examples for broadly stated interpreter versions consistently come from the kinto and cliquet projects (although not for their latest releases).

In the end I think you can view the interpreter tag as representing a namespace for the ABI tag. Otherwise it technically isn't necessary, as you could just use either an interpreter version or Python version as a form of ABI tag and drop the interpreter tag.
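(The '%' in the patterns above reads like SQL LIKE wildcards. For anyone wanting to reproduce this sort of breakdown locally, here is a hedged sketch using fnmatch over a made-up list of filenames -- the real counts came from PyPI download data, not from anything like this:)

    from fnmatch import fnmatch

    # Hypothetical filenames purely to show the classification; none of the
    # numbers quoted above come from this list.
    filenames = [
        "pkg_a-1.0-cp36-none-any.whl",
        "pkg_b-2.1-cp36-none-macosx_10_13_x86_64.whl",
        "pkg_c-0.3-py3-none-macosx_10_13_x86_64.whl",
        "pkg_d-3.1.1-pp3-none-any.whl",
    ]

    cp36_none_any = [n for n in filenames if fnmatch(n, "*-cp36-none-any.whl")]
    # The "cp36-none-% but not cp36-none-any" style bucket:
    cp36_none_plat = [n for n in filenames
                      if fnmatch(n, "*-cp36-none-*.whl")
                      and not fnmatch(n, "*-cp36-none-any.whl")]

    print(len(cp36_none_any), len(cp36_none_plat))  # 1 1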
IIUC, the theory is that they're supposed to mean "the Python code in this package is not actually portable Python, but uses an interpreter-specific dialect of the language". (This is very different from the second field, where tags like "cp36m" tell you about the required C ABI -- that one is obviously super useful.) I guess if you had a package that like, absolutely depended on GC being refcount-based, you could use a cp3 tag to indicate that, or if you had a pure-Python, python 2 package, that required 'dict' to be ordered, maybe that's 'pp2-none-any'? But this never seems to actually happen in practice. It seems like an idea that sounded plausible early on, and then never got fleshed out or revisited.
Yeah, and it actually isn't expressive _enough_ to be self-contained and cover all use-cases for these tags, because the "Python version" tag -- which I consider the interpreter tag -- doesn't necessarily cover Python version compatibility for the interpreter that's been specified. This becomes a need when you want to figure out which wheel is the best fit, Python-version-wise, for a certain interpreter (i.e. if I were a cloud provider and stated what Python was supported by tag triple, that actually wouldn't be enough to download appropriate wheels without also knowing the Python version as a side-channel bit of information). Not a huge deal, but something I noticed.
The distutils folks have never sat down to seriously think about non-CPython implementations, where the language version and the implementation version are separate things.
Guess what I've started doing? ;)
The pypy folks have never sat down to seriously think about API/ABI stability. Generally at the Python dialect level they try to match a given version of (C)Python, and at the ABI level every new release is a new ABI.
My guess is you shouldn't spend too much effort on trying to slavishly reproduce pip's logic, and that if you wanted to go clean up pip's logic (and maybe extract it into a reusable library?) then the devs would be perfectly happy that someone was doing it...
That's exactly what I'm in the process of doing. :) My goal is to have a library that tools will drop their internal copies of pep425tags for so there's a standardized PEP 425 implementation. I just wanted to make sure that before I write any more code that I knew what needed to be handled for backwards-compatibility versus what is a historical accident or was a guess at what the future might need when the PEP was written. Anyway, I will give this a think and try to come up with a reasonable algorithm for generating the sequence of supported tags based on a specific tag and Python version and then code that up into a library (at least I will definitely have something to work on at the dev sprints :) . -Brett
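Purely as a hypothetical illustration of the kind of API such a library could expose (none of these names exist anywhere; the supported-tag list is hard-coded just to show the matching step):

    from typing import Dict, List, NamedTuple, Set

    class Tag(NamedTuple):
        interpreter: str
        abi: str
        platform: str

    def compatible_tags() -> List[Tag]:
        """Supported tag triples for the running interpreter, most specific
        first.  A real implementation would introspect the interpreter;
        hard-coded here purely for illustration."""
        return [
            Tag("cp37", "cp37m", "macosx_10_13_x86_64"),
            Tag("cp37", "abi3", "macosx_10_13_x86_64"),
            Tag("py3", "none", "macosx_10_13_x86_64"),
            Tag("py3", "none", "any"),
        ]

    def best_wheel(candidates: Dict[str, Set[Tag]]) -> str:
        """Pick the candidate whose tags appear earliest in the supported
        ordering (candidates with no supported tag just sort last here;
        a real tool would reject them)."""
        ranking = {tag: index for index, tag in enumerate(compatible_tags())}
        worst = len(ranking)
        return min(candidates, key=lambda name: min(ranking.get(tag, worst)
                                                    for tag in candidates[name]))

    print(best_wheel({
        "demo-1.0-py3-none-any.whl": {Tag("py3", "none", "any")},
        "demo-1.0-cp37-cp37m-macosx_10_13_x86_64.whl":
            {Tag("cp37", "cp37m", "macosx_10_13_x86_64")},
    }))
    # -> the cp37m wheel, since it ranks ahead of py3-none-any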
On Tue, Aug 28, 2018 at 11:46 AM, Brett Cannon <brett@python.org> wrote:
- cp36
  - %cp36-none-any.whl: 7 (example)
  - %cp36-none-%.whl: 70 (example)
  - cp36-none-%.whl but not cp36-none-any.whl: 65 (example that Nathaniel knows very well ;)
Yeah, that's an old hack that never got removed, and causes problems: https://github.com/numpy/numpy/issues/11508 Actually I wouldn't be surprised if most of those 65 are from projects using 'multibuild' that inherited that hack. -n -- Nathaniel J. Smith -- https://vorpus.org
On Tue, Aug 28, 2018 at 11:46 AM, Brett Cannon <brett@python.org> wrote:
py36
py36-none-% but not py36-none-any: 2 (example)
py3
py3-none-% but not py3-none-any: 142 (example)
Oh right, and these ones are totally sensible: this is the correct tag for a project that ships some vendored shared libraries, and accesses them using cffi's ABI mode, or through ctypes: it cares about the CPU/OS ABI, but doesn't use the Python C ABI.
In the end I think you can view the interpreter tag as representing a namespace for the ABI tag.
The ABI tags are all designed to be unique though, without namespacing. Also, in theory the semantics are slightly different, because cp36 means "3.6 or higher", while cp36m means "exactly 3.6, with --enable-pymalloc but without --enable-debug". A "cp35-cp36m" wheel is technically possible, though of course not very useful in practice...
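As a concrete illustration of what a tag like cp36m encodes, here is a sketch of deriving it from the interpreter's SOABI build variable (roughly what the embedded pep425tags-style code does when SOABI is available; some builds, notably on Windows, don't define it):

    import sysconfig

    def cpython_abi_tag():
        """Turn e.g. 'cpython-36m-x86_64-linux-gnu' into 'cp36m'.

        The trailing letters carry the build options being discussed
        ('m' for pymalloc, 'd' for debug builds).  Returns None when the
        build doesn't expose SOABI."""
        soabi = sysconfig.get_config_var("SOABI")
        if not soabi or not soabi.startswith("cpython-"):
            return None
        return "cp" + soabi.split("-")[1]

    print(cpython_abi_tag())  # e.g. 'cp36m' on a typical CPython 3.6 build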
That's exactly what I'm in the process of doing. :) My goal is to have a library that tools will drop their internal copies of pep425tags for so there's a standardized PEP 425 implementation. I just wanted to make sure that before I write any more code that I knew what needed to be handled for backwards-compatibility versus what is a historical accident or was a guess at what the future might need when the PEP was written.
Anyway, I will give this a think and try to come up with a reasonable algorithm for generating the sequence of supported tags based on a specific tag and Python version and then code that up into a library (at least I will definitely have something to work on at the dev sprints :) .
Cool, see you there :-) -n -- Nathaniel J. Smith -- https://vorpus.org
On Wed, 29 Aug 2018 at 01:56 Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Aug 28, 2018 at 11:46 AM, Brett Cannon <brett@python.org> wrote:
py36
py36-none-% but not py36-none-any: 2 (example)
py3
py3-none-% but not py3-none-any: 142 (example)
Oh right, and these ones are totally sensible: this is the correct tag for a project that ships some vendored shared libraries, and accesses them using cffi's ABI mode, or through ctypes: it cares about the CPU/OS ABI, but doesn't use the Python C ABI.
Yep. I was just surprised that py37-none-% wasn't being emitted as acceptable since that technically makes sense.
In the end I think you can view the interpreter tag as representing a namespace for the ABI tag.
The ABI tags are all designed to be unique though, without namespacing. Also, in theory the semantics are slightly different, because cp36 means "3.6 or higher",
Do you happen to know where that's specified? I knew cp36m was locked to 3.6, but I don't think I was aware of cp36 meaning it was forwards-compatible from 3.6 (I just think of abi3 as having that kind of forwards-compatibility, plus pip isn't even checking for cp37-cp37-%). Most of my knowledge of what the ABI tags mean has come from code introspection, as I don't know if the exact meaning for CPython is written down anywhere. And there's a single project that matches cp36-cp36-%.whl. :)
while cp36m means "exactly 3.6, with --enable-pymalloc but without --enable-debug". A "cp35-cp36m" wheel is technically possible, though of course not very useful in practice...
I think figuring out what makes sense in terms of compatibility will be the toughest bit. E.g. for Python 3.7, pip will check for py37-none-any down to py30-none-any as well as py3-none-any. With python_requires in the metadata as well as the py3 interpreter tag, I'm not sure it still makes sense to enumerate all the way down to py30, especially when Python doesn't follow strict semver. Maybe for Python 3.7, py37, py3, and py36 make the most sense, by assuming code is warning-free in Python 3.6 and so should be relatively safe to use in 3.7 with warnings? Otherwise I wouldn't expect e.g. 3.5 code to work in 3.7 since there are new keywords that old code might break on.
That's exactly what I'm in the process of doing. :) My goal is to have a library that tools will drop their internal copies of pep425tags for so there's a standardized PEP 425 implementation. I just wanted to make sure that before I write any more code that I knew what needed to be handled for backwards-compatibility versus what is a historical accident or was a guess at what the future might need when the PEP was written.
Anyway, I will give this a think and try to come up with a reasonable algorithm for generating the sequence of supported tags based on a specific tag and Python version and then code that up into a library (at least I will definitely have something to work on at the dev sprints :) .
Cool, see you there :-)
Yes, a week and a half away! -Brett
-n
-- Nathaniel J. Smith -- https://vorpus.org
On Wed, Aug 29, 2018 at 10:25 AM, Brett Cannon <brett@python.org> wrote:
On Wed, 29 Aug 2018 at 01:56 Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Aug 28, 2018 at 11:46 AM, Brett Cannon <brett@python.org> wrote:
py36
py36-none-% but not py36-none-any: 2 (example)
py3
py3-none-% but not py3-none-any: 142 (example)
Oh right, and these ones are totally sensible: this is the correct tag for a project that ships some vendored shared libraries, and accesses them using cffi's ABI mode, or through ctypes: it cares about the CPU/OS ABI, but doesn't use the Python C ABI.
Yep. I was just surprised that py37-none-% wasn't being emitted as acceptable since that technically makes sense.
Setuptools never creates such wheels, so I guess it's not well tested. The main reason they exist at all is that Armin Ronacher jumped through a bunch of hoops to make it happen in his milksnake [1] project, and it's not even a year old. [1] https://github.com/getsentry/milksnake
I think figuring out what makes sense in terms of compatibility will be the toughest bit. E.g. for Python 3.7, pip will check for py37-none-any down to py30-none-any as well as py3-none-any. With python_requires in the metadata as well as the py3 interpreter tag, I'm not sure it still makes sense to enumerate all the way down to py30, especially when Python doesn't follow strict semver. Maybe for Python 3.7, py37, py3, and py36 make the most sense, by assuming code is warning-free in Python 3.6 and so should be relatively safe to use in 3.7 with warnings? Otherwise I wouldn't expect e.g. 3.5 code to work in 3.7 since there are new keywords that old code might break on.
This is a tricky decision. Any time a new Python comes out, some existing wheels will continue to work fine, and some will be broken. One goal is to avoid installing broken wheels. But, there's also another consideration: if we're too conservative, then with every release we create a bunch of make-work as projects have to re-roll old wheels that would have worked fine, and some percentage of projects won't do this (e.g. b/c they're abandoned), and we lose them forever. Also, for the py3x tags in particular, if the wheel fails on py3(x+1), then the sdist probably will too, so it's not like we have any useful fallback. So, it's arguably better to be optimistic and assume that all py3x wheels will work on py3(x+k), even if it's sometimes wrong, because when we're wrong the failure modes are more acceptable. -n -- Nathaniel J. Smith -- https://vorpus.org
On Wed, 29 Aug 2018 at 15:54 Nathaniel Smith <njs@pobox.com> wrote:
On Wed, Aug 29, 2018 at 10:25 AM, Brett Cannon <brett@python.org> wrote:
On Wed, 29 Aug 2018 at 01:56 Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Aug 28, 2018 at 11:46 AM, Brett Cannon <brett@python.org> wrote:
py36
py36-none-% but not py36-none-any: 2 (example)
py3
py3-none-% but not py3-none-any: 142 (example)
Oh right, and these ones are totally sensible: this is the correct tag for a project that ships some vendored shared libraries, and accesses them using cffi's ABI mode, or through ctypes: it cares about the CPU/OS ABI, but doesn't use the Python C ABI.
What you say below would also suggest that Python 3.7 should support down to py30-none-% as well.
Yep. I was just surprised that py37-none-% wasn't being emitted as acceptable since that technically makes sense.
Setuptools never creates such wheels, so I guess it's not well tested. The main reason they exist at all is that Armin Ronacher jumped through a bunch of hoops to make it happen in his milksnake [1] project, and it's not even a year old.
[1] https://github.com/getsentry/milksnake
I think figuring out what makes sense in terms of compatibility will be the toughest bit. E.g. for Python 3.7, pip will check for py37-none-any down to py30-none-any as well as py3-none-any. With python_requires in the metadata as well as the py3 interpreter tag, I'm not sure it still makes sense to enumerate all the way down to py30, especially when Python doesn't follow strict semver. Maybe for Python 3.7, py37, py3, and py36 make the most sense, by assuming code is warning-free in Python 3.6 and so should be relatively safe to use in 3.7 with warnings? Otherwise I wouldn't expect e.g. 3.5 code to work in 3.7 since there are new keywords that old code might break on.
This is a tricky decision. Any time a new Python comes out, some existing wheels will continue to work fine, and some will be broken. One goal is to avoid installing broken wheels. But, there's also another consideration: if we're too conservative, then with every release we create a bunch of make-work as projects have to re-roll old wheels that would have worked fine, and some percentage of projects won't do this (e.g. b/c they're abandoned), and we lose them forever. Also, for the py3x tags in particular, if the wheel fails on py3(x+1), then the sdist probably will too, so it's not like we have any useful fallback.
Right, but isn't that what the py3-none-any tag is meant to represent? If someone doesn't use that tag then I would take that as there is some version-specific stuff in that wheel.
So, it's arguably better to be optimistic and assume that all py3x wheels will work on py3(x+k), even if it's sometimes wrong, because when we're wrong the failure modes are more acceptable.
Quite possibly, but at this point I don't want to take anything for certain. :) I mean in the end it's just a string coming from a generator so it's cheap to include, but I just want to make sure we appropriately justify its inclusion when it's inferred versus specified.
On Thu, 30 Aug 2018 at 09:58, Brett Cannon <brett@python.org> wrote:
On Wed, 29 Aug 2018 at 15:54 Nathaniel Smith <njs@pobox.com> wrote:
This is a tricky decision. Any time a new Python comes out, some existing wheels will continue to work fine, and some will be broken. One goal is to avoid installing broken wheels. But, there's also another consideration: if we're too conservative, then with every release we create a bunch of make-work as projects have to re-roll old wheels that would have worked fine, and some percentage of projects won't do this (e.g. b/c they're abandoned), and we lose them forever. Also, for the py3x tags in particular, if the wheel fails on py3(x+1), then the sdist probably will too, so it's not like we have any useful fallback.
Right, but isn't that what the py3-none-any tag is meant to represent? If someone doesn't use that tag then I would take that as there is some version-specific stuff in that wheel.
The problem is that "py3-none-any" doesn't specify a *minimum* version, so if a project starts using a new feature like f-strings, they *have* to declare "py36-...". So even though it isn't what PEP 425 actually says, in practice it's turned out to be more useful to interpret the Python version tag as being "version X.Y or later", and only interpret the ABI tag strictly. That philosophy also makes the "abi3" ABI tag more coherent, since it means that the "pyXY" part also specifies the minimum required ABI version. The marker for "exact version required" could then be to nominate a specific Python implementation, rather than using the "py" prefix - so a hypothetical wheel builder could use "cp36-none-any" for a bytecode-only wheel archive that *only* ran on CPython 3.6, and wouldn't be portable to other versions or implementations. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
It's not an intuitive system. We have wheel tags to choose the best alternative wheel or fall back to the sdist. So py3-none-any is fine for f-strings if no other candidate wheel (out of the list of all available wheels for the same version number of a package) has been compiled to not require f-strings. The tag only has to tell you which wheel is most likely to work.

No sdist or wheel is ever guaranteed to work, for any number of reasons.
So based on all of this, here is my proposal of what the compatible tags should become (in priority order from most to least strict). In the list below yellow means the value changed compared to the previous tag, blue means it's something I'm proposing to add, and red is something I'm proposing to remove (using what pip considers compatible tags as the base list of tags). I have left out all of the platform variances of macOS for brevity as there are no questions regarding those.

For PyPy3 6.0.0 (and any other non-CPython interpreter that reports Python 3.5 from sys.version_info, i.e. this represents the default logic for an interpreter that has no special handling):

- ('pp360', 'pypy3_60', 'macosx_10_13_x86_64')
- ('pp360', 'none', 'macosx_10_13_x86_64')
- ('py35', 'none', 'macosx_10_13_x86_64')
- ('py3', 'none', 'macosx_10_13_x86_64')
- ('py34', 'none', 'macosx_10_13_x86_64')
- ('py33', 'none', 'macosx_10_13_x86_64')
- ('py32', 'none', 'macosx_10_13_x86_64')
- ('py31', 'none', 'macosx_10_13_x86_64')
- ('py30', 'none', 'macosx_10_13_x86_64')
- ('pp360', 'none', 'any')
- ('pp3', 'none', 'any')
- ('py360', 'none', 'any')
- ('py35', 'none', 'any')
- ('py3', 'none', 'any')
- ('py34', 'none', 'any')
- ('py33', 'none', 'any')
- ('py32', 'none', 'any')
- ('py31', 'none', 'any')
- ('py30', 'none', 'any')

For CPython 3.7.0 (whose logic will be unique to the CPython interpreter in the library, but other interpreters could have their own custom logic as well when it makes sense; there will be some API to just say "give me what makes sense based on this tag" so users don't have to know any of this if they don't want to):

- ('cp37', 'cp37m', 'macosx_10_13_x86_64')
- ('cp37', 'abi3', 'macosx_10_13_x86_64')
- ('cp37', 'none', 'macosx_10_13_x86_64')
- ('cp36', 'abi3', 'macosx_10_13_x86_64')
- ('cp35', 'abi3', 'macosx_10_13_x86_64')
- ('cp34', 'abi3', 'macosx_10_13_x86_64')
- ('cp33', 'abi3', 'macosx_10_13_x86_64')
- ('cp32', 'abi3', 'macosx_10_13_x86_64')
- ('py37', 'none', 'macosx_10_13_x86_64')
- ('py3', 'none', 'macosx_10_13_x86_64')
- ('py36', 'none', 'macosx_10_13_x86_64')
- ('py35', 'none', 'macosx_10_13_x86_64')
- ('py34', 'none', 'macosx_10_13_x86_64')
- ('py33', 'none', 'macosx_10_13_x86_64')
- ('py32', 'none', 'macosx_10_13_x86_64')
- ('py31', 'none', 'macosx_10_13_x86_64')
- ('py30', 'none', 'macosx_10_13_x86_64')
- ('cp37', 'none', 'any')
- ('cp3', 'none', 'any')
- ('py37', 'none', 'any')
- ('py3', 'none', 'any')
- ('py36', 'none', 'any')
- ('py35', 'none', 'any')
- ('py34', 'none', 'any')
- ('py33', 'none', 'any')
- ('py32', 'none', 'any')
- ('py31', 'none', 'any')
- ('py30', 'none', 'any')
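A sketch of how the generic (non-CPython) ordering above could be generated; the interpreter tag, ABI, Python version, and platform are passed in rather than introspected, and the ('pp3', …) and ('py360', …) entries whose fate the colour-coded list was meant to settle are left out:

    def default_compatible_tags(interp="pp360", abi="pypy3_60",
                                python_version=(3, 5),
                                platform="macosx_10_13_x86_64"):
        """Proposed default ordering for an interpreter with no special-cased
        logic, most specific first (a sketch of the proposal, not final)."""
        major, minor = python_version
        versioned = ["py{}{}".format(major, m) for m in range(minor, -1, -1)]
        # Current version first, then generic 'py3', then the older versions.
        py_order = versioned[:1] + ["py{}".format(major)] + versioned[1:]

        tags = [(interp, abi, platform), (interp, "none", platform)]
        tags += [(py, "none", platform) for py in py_order]
        tags += [(interp, "none", "any")]
        tags += [(py, "none", "any") for py in py_order]
        return tags

    for tag in default_compatible_tags():
        print(tag)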
If we're going to rethink this, then I would really like to move away from assigning special meaning to specific combinations of tags. The thing where if you use cp36m as your Python ABI tag then you're forced to use cp36 as your Python dialect tag doesn't make sense. And the thing where if you have a wheel with an extension module tagged as cp36-abi3, then that works fine on 3.7, but if you *remove* the extension module then it *stops* working on 3.7? That's just bizarre...

So my suggestions:

* Make the 3 tag categories totally independent. Compute a separate set for each, and then take the full cross product.
* Since the stable ABI actually changes over time, we should define new tags abi35, abi36, etc. that mean "requires the stable ABI as defined by this version of CPython or higher", instead of relying on abi3 + a dialect tag. (Imagine if PyPy started implementing the stable ABI -- we'd have to start allowing cpXY tags to match PyPy.)
* Plan to move away from the pyXY and cpXY tags over time; they're confusing and not useful. Of course this will have to be a gradual process, but if pip stops requiring them now then in a year or two we could make setuptools stop generating them.
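A sketch of the "three independent sets, full cross product" idea; the example sets are hand-picked from the discussion rather than computed from a real interpreter:

    import itertools

    # Each list would be computed independently for the running interpreter;
    # earlier entries are preferred, and the product preserves that ordering.
    python_tags = ["cp37", "py37", "py3"]
    abi_tags = ["cp37m", "abi3", "none"]
    platform_tags = ["macosx_10_13_x86_64", "any"]

    supported = list(itertools.product(python_tags, abi_tags, platform_tags))
    for triple in supported:
        print(triple)

One consequence is that combinations such as ('py3', 'cp37m', 'any') show up as well; accepting those rather than special-casing them away is exactly what making the categories independent implies.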
On Thu, 30 Aug 2018 at 11:21 Nathaniel Smith <njs@pobox.com> wrote:
If we're going to rethink this,
Well, I didn't want to "rethink" so much as "fill in". :)
then I would really like to move away from assigning special meaning to specific combinations of tags. The thing where if you use cp36m as your Python ABI tag then you're forced to use cp36 as your Python dialect tag doesn't make sense. And the thing where if you have a wheel with an extension module tagged as cp36-abi3, then that works fine on 3.7,
I wouldn't expect that to work if the stable ABI was expanded in Python 3.7 and you used the expanded part.
but if you *remove* the extension module then it *stops* working on 3.7? That's just bizarre...
I don't quite follow what you mean by "remove the extension module then it stops". What stops by removing the extension module(s)?
So my suggestions:
* Make the 3 tag categories totally independent. Compute a separate set for each, and then take the full cross product.
I think Paul was hinting at this as part of his "wildcard" idea (and I honestly thought of this initially as well, as it greatly simplifies things). So what would cp36-cp36m-plat expand to?

- cp36: cp3N where N is any positive digit(s)? And then toss in py3 and py3N? Prefer exact, then generic '3', then older, and finally newer?
- cp36m: that, abi3, and then 'none'? Do we care if someone has some crazy ABI that no one understands like 'b' (and if so, should it only be applied to 'py' interpreter versions, which would break your nice cross-product simplicity)?
- plat: depends on platform.

And the match preference goes to platform, interpreter version, and then ABI? I *think* that would work in terms of ignoring C ABIs that have very little chance of linking while being the broadest in terms of accepting a wheel that has some semblance of a chance of running.

And how would that apply to PyPy3? Same ranging on 'pp36N' and drop the 'abi3' insertion?
* Since the stable ABI actually changes over time, we should define new tags abi35, abi36, etc. that mean "requires the stable ABI as defined by this version of cpython or higher", instead of relying on abi3 + a dialect tag. (Imagine if pypy started implementing the stable ABI –we'd have to start allowing cpXY tags to match PyPy.)
* Plan to move away from the pyXY and cpXY tags over time; they're confusing and not useful. Of course this will have to be a gradual process, but if pip stops requiring them now then in a year or two we could make setuptools stop generating them.
What would you do then about preferred match order for pure Python wheels? E.g. how do you preferably match against Python 3.7 wheels over 3.6 when running a Python 3.7 interpreter? Or are you suggesting equivalent ABI tags to make up for the pyXY tags and the interpreter tag simply gets ignored? -Brett
On Thu, Aug 30, 2018, 09:26 Brett Cannon <brett@python.org> wrote:
So based on all of this, here is my proposal of what the compatible tags should become (in priority order from most to least strict). In the list below yellow means the value changed compared to the previous tag, blue means it's something I'm proposing to add, and red is something I'm proposing to remove (using what pip considers compatible tags as the base list of tags). I have left out all of the platform variances of macOS for brevity as there's no questions regarding those.
For PyPy3 6.0.0 (and any other non-CPython interpreter that reports Python 3.5 from sys.version_info, i.e. this represents the default logic for an interpreter that has no special handling):
- ('pp360', 'pypy3_60', 'macosx_10_13_x86_64')
- ('pp360', 'none', 'macosx_10_13_x86_64')
- ('py35', 'none', 'macosx_10_13_x86_64')
- ('py3', 'none', 'macosx_10_13_x86_64')
- ('py34', 'none', 'macosx_10_13_x86_64')
- ('py33', 'none', 'macosx_10_13_x86_64')
- ('py32', 'none', 'macosx_10_13_x86_64')
- ('py31', 'none', 'macosx_10_13_x86_64')
- ('py30', 'none', 'macosx_10_13_x86_64')
- ('pp360', 'none', 'any')
- ('pp3', 'none', 'any')
- ('py360', 'none', 'any')
- ('py35', 'none', 'any')
- ('py3', 'none', 'any')
- ('py34', 'none', 'any')
- ('py33', 'none', 'any')
- ('py32', 'none', 'any')
- ('py31', 'none', 'any')
- ('py30', 'none', 'any')
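For concreteness, a rough sketch of the default logic this list implies for an interpreter with no special handling; the function and its arguments are illustrative only, and it deliberately leaves out the ('pp3', 'none', 'any') and ('py360', 'none', 'any') oddities questioned earlier in the thread:

```
def generic_tags(impl, impl_version, interp_abi, py_major, py_minor, platform):
    """E.g. impl='pp3', impl_version='60', interp_abi='pypy3_60',
    py_major=3, py_minor=5 for PyPy3 6.0."""
    tags = [
        (f"{impl}{impl_version}", interp_abi, platform),
        (f"{impl}{impl_version}", "none", platform),
    ]
    # Platform-specific pure Python wheels: exact version, generic 'py3',
    # then older versions as a safety net.
    tags.append((f"py{py_major}{py_minor}", "none", platform))
    tags.append((f"py{py_major}", "none", platform))
    for older in range(py_minor - 1, -1, -1):
        tags.append((f"py{py_major}{older}", "none", platform))
    # Platform-agnostic fallbacks in the same order.
    tags.append((f"{impl}{impl_version}", "none", "any"))
    tags.append((f"py{py_major}{py_minor}", "none", "any"))
    tags.append((f"py{py_major}", "none", "any"))
    for older in range(py_minor - 1, -1, -1):
        tags.append((f"py{py_major}{older}", "none", "any"))
    return tags
```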
For CPython 3.7.0 (whose logic will be unique to the CPython interpreter in the library, but other interpreters could have their own custom logic as well when it makes sense; there will be some API to just say "give me what makes sense based on this tag" so users don't have to know any of this if they don't want to):
- ('cp37', 'cp37m', 'macosx_10_13_x86_64')
- ('cp37', 'abi3', 'macosx_10_13_x86_64')
- ('cp37', 'none', 'macosx_10_13_x86_64')
- ('cp36', 'abi3', 'macosx_10_13_x86_64')
- ('cp35', 'abi3', 'macosx_10_13_x86_64')
- ('cp34', 'abi3', 'macosx_10_13_x86_64')
- ('cp33', 'abi3', 'macosx_10_13_x86_64')
- ('cp32', 'abi3', 'macosx_10_13_x86_64')
- ('py37', 'none', 'macosx_10_13_x86_64')
- ('py3', 'none', 'macosx_10_13_x86_64')
- ('py36', 'none', 'macosx_10_13_x86_64')
- ('py35', 'none', 'macosx_10_13_x86_64')
- ('py34', 'none', 'macosx_10_13_x86_64')
- ('py33', 'none', 'macosx_10_13_x86_64')
- ('py32', 'none', 'macosx_10_13_x86_64')
- ('py31', 'none', 'macosx_10_13_x86_64')
- ('py30', 'none', 'macosx_10_13_x86_64')
- ('cp37', 'none', 'any')
- ('cp3', 'none', 'any')
- ('py37', 'none', 'any')
- ('py3', 'none', 'any')
- ('py36', 'none', 'any')
- ('py35', 'none', 'any')
- ('py34', 'none', 'any')
- ('py33', 'none', 'any')
- ('py32', 'none', 'any')
- ('py31', 'none', 'any')
- ('py30', 'none', 'any')
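The CPython-specific part of this list is mostly the abi3 fallback chain; a sketch of the kind of helper that could produce it (illustrative only, and it ignores where the 'none' ABI entries slot in):

```
def cpython_abi3_tags(minor, platform):
    # Stable-ABI fallbacks from the running 3.x version down to 3.2,
    # where the stable ABI was introduced.
    return [(f"cp3{n}", "abi3", platform) for n in range(minor, 1, -1)]

# cpython_abi3_tags(7, "macosx_10_13_x86_64") ->
#   [('cp37', 'abi3', ...), ('cp36', 'abi3', ...), ..., ('cp32', 'abi3', ...)]
```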
On Thu, 30 Aug 2018 at 09:03 Daniel Holth <dholth@gmail.com> wrote:
It's not an intuitive system. We have wheel tags to choose the best alternative wheel or fall back to sdist. So py3-none-any is fine for f-strings if no other candidate wheel (a list of all available wheels for the same version number of a package) has been compiled to not require f-strings. The tag only has to tell you which wheel is most likely to work.
No sdist or wheel is ever guaranteed to work, for any number of reasons.
On Aug 30, 2018 11:25, "Nick Coghlan" <ncoghlan@gmail.com> wrote:
On Thu, 30 Aug 2018 at 09:58, Brett Cannon <brett@python.org> wrote:
On Wed, 29 Aug 2018 at 15:54 Nathaniel Smith <njs@pobox.com> wrote:
This is a tricky decision. Any time a new Python comes out, some existing wheels will continue to work fine, and some will be broken. One goal is to avoid installing broken wheels. But, there's also another consideration: if we're too conservative, then with every release we create a bunch of make-work as projects have to re-roll old wheels that would have worked fine, and some percentage of projects won't do this (e.g. b/c they're abandoned), and we lose them forever. Also, for the py3x tags in particular, if the wheel fails on py3(x+1), then the sdist probably will too, so it's not like we have any useful fallback.
Right, but isn't that what the py3-none-any tag is meant to represent? If someone doesn't use that tag then I would take that as there is some version-specific stuff in that wheel.
The problem is that "py3-none-any" doesn't specify a *minimum* version, so if a project starts using a new feature like f-strings, they *have* to declare "py36-...".
So even though it isn't what PEP 425 actually says, in practice it's turned out to be more useful to interpret the Python version tag as being "version X.Y or later", and only interpret the ABI tag strictly. That philosophy also makes the "abi3" ABI tag more coherent, since it means that the "pyXY" part also specifies the minimum required ABI version.
The marker for "exact version required" could then be to nominate a specific Python implementation, rather than using the "py" prefix - so a hypothetical wheel builder could use "cp36-none-any" for a bytecode-only wheel archive that *only* ran on CPython 3.6, and wouldn't be portable to other versions or implementations.
Cheers, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On Thu, Aug 30, 2018 at 6:52 PM, Brett Cannon <brett@python.org> wrote:
On Thu, 30 Aug 2018 at 11:21 Nathaniel Smith <njs@pobox.com> wrote:
If we're going to rethink this,
Well, I didn't want to "rethink" so much as "fill in". :)
then I would really like to move away from assigning special meaning to specific combinations of tags. The thing where if you use cp36m as your Python ABI tag then you're forced to use cp36 as your python dialect tag doesn't make sense. And the thing where if you have a wheel with an extension module tagged as cp36-abi3, then that works fine on 3.7,
I wouldn't expect that to work if the stable ABI was expanded in Python 3.7 and you used the expanded part.
In my example, you have a wheel using the 3.6 ABI, running on 3.7. That's the case that's supposed to work :-).
but if you *remove* the extension module then it *stops* working on 3.7? That's just bizarre...
I don't quite follow what you mean by "remove the extension module then it stops". What stops by removing the extension module(s)?
If I'm reading your proposal right, it says that when running on Python 3.7, pip should be willing to install wheels tagged cp36-abi3, but not wheels tagged cp36-none. But conceptually, if you take the extension modules out of a cp36-abi3 wheel, then you're left with a cp36-none wheel. So this is weird. Anywhere we're willing to install a cp36-abi3 wheel, we should also be willing to install a cp36-none wheel.
So my suggestions:
* Make the 3 tag categories totally independent. Compute a separate set for each, and then take the full cross product.
I think Paul was hinting at this as part of his "wildcard" idea (and I honestly thought of this initially as well as it greatly simplifies things). So what would cp36-cp36m-plat expand to?
I don't know what it means to "expand" a wheel tag. Are you punning the tag as also describing a Python installation ("CPython 3.6, with a certain soabi tag and platform"), and then asking to find all the wheel tags that could go in that installation? You can't actually do this in general – for example, determining whether a target interpreter is compatible with manylinux wheels requires running some special sniffing code on that interpreter.
cp36: cp3N where N is any positive digit(s)? And then toss in py3 and py3N? Prefer exact, then generic '3', then older, and finally newer?
Given that we don't have any real use cases for cpXY and pyXY, I'd rather not expand the options – just preserve what we do now... so for CPython 3.6, I'd say py3, py3N for N <= 6, cp3N for N <= 6, and I don't really care about the exact order – some sort of more-specific-before-less-specific makes sense, and whatever we do now is probably fine. If someone goes wild and starts distributing py35 and py36 wheels for the same package (via a "36to35" tool, I guess?) then that's probably what you want? But I don't imagine this will ever be an important use case. We tried it with 2to3, and everyone decided they'd rather write in the subset language for a decade instead.
cp36m: that, abi3, and then 'none'? Do we care if someone has some crazy ABI that no one understands like 'b' (and if so should it only be applied to 'py' interpreter versions which break your nice cross product simplicity)? plat: depends on platform.
Do you mean, what happens if we find ourselves running on an interpreter we don't recognize (not cpython/pypy/jython/...), and that interpreter returns something wacky from sysconfig.get_config_var("SOABI")? I think there's a reasonable argument that in that case we should only accept 'none' as the tag. (And then maybe whoever invented this new interpreter gets the job of adding sysconfig.get_supported_abi_tags() as a standard stdlib feature :-).)
And the match preference goes to platform, interpreter version, and then ABI? I think that would work in terms of ignoring C ABIs that have very little chance of linking while being the broadest in terms of accepting a wheel that has some semblance of a chance of running.
I don't think the preference order matters that much as long as it's well-defined. The only cases where it would affect things are like... Suppose in the future we add support for platform tags based on different ISA levels, like x86_64_avx2 vs x86_64_sse3 vs x86_64. Then you could find yourself in a situation where examplepkg v1.2.3 has the following wheels available:

cp36-cp36m-manylinux1_x86_64
cp36-abi3-manylinux1_x86_64_avx2

Our environment is CPython 3.6, running on a manylinux1-compatible OS and we do have AVX2 support available, so either of these wheels could work. The first wheel has a "better" ABI (cp36m > abi3), but a "worse" platform (x86_64 < x86_64_avx2). So which one should we pick? I have zero intuition here. Whoever compiled these wheels is clearly perverse, and they should stop doing silly things like this.
And how would that apply to PyPy3? Same ranging on 'pp36N` and drop the 'abi3' insertion?
* Since the stable ABI actually changes over time, we should define new tags abi35, abi36, etc. that mean "requires the stable ABI as defined by this version of cpython or higher", instead of relying on abi3 + a dialect tag. (Imagine if pypy started implementing the stable ABI –we'd have to start allowing cpXY tags to match PyPy.)
* Plan to move away from the pyXY and cpXY tags over time; they're confusing and not useful. Of course this will have to be a gradual process, but if pip stops requiring them now then in a year or two we could make setuptools stop generating them.
What would you do then about preferred match order for pure Python wheels? E.g. how do you preferably match against Python 3.7 wheels over 3.6 when running a Python 3.7 interpreter? Or are you suggesting equivalent ABI tags to make up for the pyXY tags and the interpreter tag simply gets ignored?
I don't understand your questions, sorry :-(.

I'm just saying: we should make the stable-ABI tags self-contained, so that 'py3-abi36' would be sufficient to express "uses the stable abi, as it existed on python 3.6". This would match the same set of interpreters that 'cp36-abi3' is currently used to target. And also, we should try to gradually decrease the usage of pyXY and cpXY tags (e.g. eventually setuptools should stop generating them by default).

-n

-- Nathaniel J. Smith -- https://vorpus.org
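A sketch of what matching such a self-contained stable-ABI tag could look like, assuming 'abiNM' means "stable ABI as of Python N.M or later"; the helper is hypothetical, not an existing API:

```
import re

def stable_abi_matches(abi_tag, running_version):
    """abi_tag like 'abi36'; running_version like (3, 7)."""
    m = re.fullmatch(r"abi(\d)(\d+)", abi_tag)
    if not m:
        return False  # not one of the proposed versioned stable-ABI tags
    required = (int(m.group(1)), int(m.group(2)))
    return running_version >= required

# stable_abi_matches("abi36", (3, 7)) -> True   (a py3-abi36 wheel works on 3.7)
# stable_abi_matches("abi38", (3, 7)) -> False  (stable ABI newer than the interpreter)
```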
On Fri, 31 Aug 2018 at 07:15, Nathaniel Smith <njs@pobox.com> wrote:
On Thu, Aug 30, 2018 at 6:52 PM, Brett Cannon <brett@python.org> wrote:
On Thu, 30 Aug 2018 at 11:21 Nathaniel Smith <njs@pobox.com> wrote:
* Make the 3 tag categories totally independent. Compute a separate set for each, and then take the full cross product.
I think Paul was hinting at this as part of his "wildcard" idea (and I honestly thought of this initially as well as it greatly simplifies things). So what would cp36-cp36m-plat expand to?
I don't know what it means to "expand" a wheel tag.
To me, the confusion really lies in the interaction of "what tags do producers set" and "what tags do consumers accept". Nearly all of the odd corner cases of tags on wheels that we see in the wild don't come from bdist_wheel without manual project intervention, and there's no validation on what they do in that case. So when we're trying to say "what should we accept" we're faced with a dilemma, because we don't really know what the options are.

As a starting point, I think it would be useful to focus on what tools are allowed to *generate*. The current situation seems to me to be as follows (mostly from the PEP):

1. Python tag - can be absolutely anything (--python-tag doesn't appear to validate the value at all). In practice, py2, py3, and cpXY are the most common values AFAIK. I'd consider most other values as having had "limited testing" :-)
2. ABI tag - I don't know. But I'm pretty sure there are variations in how this works between Windows and Unix, that should be considered (IIRC, for C extensions older versions of Python on Windows produced wheels with a "none" ABI tag, but newer ones produce "cpXYm", which was a change to match Linux behaviour).
3. Platform tag - distutils.util.get_platform() (but what about manylinux, how does that end up in there?) Is there a list of valid values that get_platform() can produce?

IMO, before worrying about what consumers should match, we should tie down the valid values *producers* are allowed to generate, and from that consider whether PyPI should enforce a particular set of allowed tags. Obviously, projects can do what they like in terms of randomly renaming wheels, and users can do "pip install any-old-junk.whl", but we need a baseline of what counts as "sensible" tagging schemes if we're going to enumerate acceptable tags in a consumer. Essentially this is a variation on the principle of "be strict in what you produce, and lenient in what you consume", focused on the "produce" side of the equation.

My comment about wildcarding is basically a forlorn hope that if we can't be sure what we're getting, maybe the best we can do is say "well, py3-*-* sounds like it might work, and there's nothing better, so let's give it a go". But if we can tie down a definition of what constitutes a "reasonable" set of tags, enumerating what we'll accept seems more likely to succeed.

Paul
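For anyone who wants to see the raw ingredients producers start from on their own machine, the standard library already exposes them; note that these values still need normalizing (e.g. 'macosx-10.13-x86_64' becomes 'macosx_10_13_x86_64') before they can appear in a wheel tag:

```
import sys
import sysconfig
from distutils import util

print("implementation:", sys.implementation.name)    # e.g. 'cpython'
print("python version:", sys.version_info[:2])        # e.g. (3, 7)
print("SOABI:", sysconfig.get_config_var("SOABI"))    # e.g. 'cpython-37m-darwin'; None on Windows
print("platform:", util.get_platform())               # e.g. 'macosx-10.13-x86_64'
```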
On Fri, 31 Aug 2018 at 01:41 Paul Moore <p.f.moore@gmail.com> wrote:
On Fri, 31 Aug 2018 at 07:15, Nathaniel Smith <njs@pobox.com> wrote:
On Thu, Aug 30, 2018 at 6:52 PM, Brett Cannon <brett@python.org> wrote:
On Thu, 30 Aug 2018 at 11:21 Nathaniel Smith <njs@pobox.com> wrote:
* Make the 3 tag categories totally independent. Compute a separate set for each, and then take the full cross product.
I think Paul was hinting at this as part of his "wildcard" idea (and I honestly thought of this initially as well as it greatly simplifies things). So what would cp36-cp36m-plat expand to?
I don't know what it means to "expand" a wheel tag.
To me, the confusion really lies in the interaction of "what tags do producers set" and "what tags do consumers accept".
Yes, it's the fact that some tags seem to have very strict requirements for compatibility while others are fairly lenient (e.g. cp36m versus abi3).
Nearly all of the odd corner cases of tags on wheels that we see in the wild don't come from bdist_wheel without manual project intervention, and there's no validation on what they do in that case. So when we're trying to say "what should we accept" we're faced with a dilemma, because we don't really know what the options are.
As a starting point, I think it would be useful to focus on what tools are allowed to *generate*.
The current situation seems to me to be as follows (mostly from the PEP):
1. Python tag - can be absolutely anything (--python-tag doesn't appear to validate the value at all). In practice, py2, py3, and cpXY are the most common values AFAIK. I'd consider most other values as having had "limited testing" :-)
Let's find out how common. :) Using the following query (and adjusting as appropriate):

```
SELECT COUNT(DISTINCT file.project) as projects
FROM [the-psf:pypi.downloads20180824]
WHERE file.filename LIKE '%-py3-%.whl'
```

You get the following results:

- *: 120,079
- cp36: 1,088
- cp3: 2
- py36: 79
- py3: 15,639

So yes, there is a definite skew towards cp36 and py3 for at least Python 3.6. :) I also talked with Barry Warsaw about this and he mentioned how Debian shares pure Python installs across environments and while there was initial worry about e.g. py36 versus py35 wheels needing to be taken into account, the problem never really came up.
2. ABI tag - I don't know. But I'm pretty sure there are variations in how this works between Windows and Unix, that should be considered (IIRC, for C extensions older versions of Python on Windows produced wheels with a "none" ABI tag, but newer ones produce "cpXYm" which was a change to match Linux behaviour).
I believe this is correct. The other tricky bit is abi3 as that compatibility is tied to the python tag. I think this is where Nathaniel's idea of moving the Python tag to just be py3 and then being tighter on the ABI tag comes in.
3. Platform tag - distutils.util.get_platform() (but what about manylinux, how does that end up in there?)
If you mean how does 'wheel' decide to put it in there, I don't know. For checking if a wheel works it comes from a check of system details.
Is there a list of valid values that get_platform() can produce?
Not that I'm specifically aware of. The tricky bit is really when a platform is compatible with more than a single platform tag like Linux and macOS are.
IMO, before worrying about what consumers should match, we should tie down the valid values *producers* are allowed to generate, and from that consider whether PyPI should enforce a particular set of allowed tags. Obviously, projects can do what they like in terms of randomly renaming wheels, and users can do "pip install any-old-junk.whl", but we need a baseline of what counts as "sensible" tagging schemes if we're going to enumerate acceptable tags in a consumer.
Essentially this is a variation on the principle of "be strict in what you produce, and lenient in what you consume", focused on the "produce" side of the equation.
My comment about wildcarding is basically a forlorn hope that if we can't be sure what we're getting, maybe the best we can do is say "well, py3-*-* sounds like it might work, and there's nothing better, so let's give it a go". But if we can tie down a definition of what constitutes a "reasonable" set of tags, enumerating what we'll accept seems more likely to succeed.
OK, so let's look at what we're trying to support.

If we have pure Python code there's very likely going to be a bottom Python version that's supported and then forward-compatibility is assumed. This is specified today through python-requires, so having a specific Python version in the wheel itself isn't totally critical. So 'py3-none-any' combined with python-requires takes care of this the vast majority of the time.

Then you have extension modules. At a CPython level there's something like cp36m today (and its variants which are not compatible with each other), and then there's abi3 whose compatibility is determined by the Python tag as well. That currently means you can't read the ABI tag in isolation but have to treat the Python-ABI tag pair as a single unit. For other interpreters like PyPy3 I don't think there's been any need or thought about this so it's just a single tag like 'pypy3_60'. This is where Nathaniel is suggesting the Python tag just be 'py3' or 'py2' and more information gets embedded in the ABI tag, e.g. abi36 (which makes sense since the stable ABI should be forwards-compatible), cp36m, and then 'none' for pure Python.

As for platforms, that's a per-platform thing unfortunately. My Mac, for instance, is compatible with 'macosx_10_13_x86_64', 'macosx_10_13_intel', 'macosx_10_13_fat64', 'macosx_10_13_fat32', and 'macosx_10_13_universal' (according to pip). Linux will have the platform plus maybe manylinux1. I have not dived into what Windows 10 would emit. I'm not sure how people would want to tidy this up.

IOW I like Nathaniel's proposal:

1. Python tag: 'py2', 'py3'
2. ABI tag: abiXY (with assumed forwards-compatibility, and thus the one special case where the match isn't strict compared to other tags), 'none', and then a wildcard acceptance for things like pypy3_60, cp36m, or whatever any other interpreter might need to specify
3. Platform tag: I don't know of any specific shift for this and it's probably stuck being platform-specific

If we agree to this I can try to code a library that expects something like this to be the future but have backwards-compatibility for today with what I proposed while we transition. What do people think about that?
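As a very rough sketch of the shape such a library might produce for CPython 3.6 on macOS under that proposal; nothing here is an agreed spec, and the function, ordering, and ABI spellings are illustrative only:

```
def proposed_tags(py_major, py_minor, interpreter_abi, platform):
    """E.g. proposed_tags(3, 6, "cp36m", "macosx_10_13_x86_64")."""
    tags = [
        # Interpreter-specific ABI (cp36m, pypy3_60, ...) on the concrete platform.
        (f"py{py_major}", interpreter_abi, platform),
    ]
    # Versioned stable-ABI wheels built against this or any older version;
    # forward compatibility is the one deliberate special case.
    for older in range(py_minor, 1, -1):
        tags.append((f"py{py_major}", f"abi{py_major}{older}", platform))
    # Pure Python: platform-specific, then platform-agnostic.
    tags.append((f"py{py_major}", "none", platform))
    tags.append((f"py{py_major}", "none", "any"))
    return tags
```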
Le ven. 31 août 2018 à 9:25 PM, Brett Cannon <brett@python.org> a écrit :
OK, so let's look at what we're trying to support. If we have pure Python code there's very likely going to be a bottom Python version that's supported and then forward-compatibility is assumed. This is specified today through python-requires, so having a specific Python version in the wheel itself isn't totally critical. So 'py3-none-any' combined with python-requires takes care of this the vast majority of the time.
Hello,

while python-requires is indeed nice, it is only usable by pip to select a wheel file with a fully compliant PEP 503 repository. Otherwise, like in a local wheelhouse (wheel files in a directory), it is invisible to pip until pip tries to install the wheel so a more precise python tag can still be somewhat useful... (I agree that pip could access the wheel metadata to check the python-requires without needing to unzip the whole file).

Xavier
On Fri, 31 Aug 2018 at 13:33 Xavier Fernandez <xav.fernandez@gmail.com> wrote:
Le ven. 31 août 2018 à 9:25 PM, Brett Cannon <brett@python.org> a écrit :
OK, so let's look at what we're trying to support. If we have pure Python code there's very likely going to be a bottom Python version that's supported and then forward-compatibility is assumed. This is specified today through python-requires, so having a specific Python version in the wheel itself isn't totally critical. So 'py3-none-any' combined with python-requires takes care of this the vast majority of the time.
Hello,
while python-requires is indeed nice, it is only usable by pip to select a wheel file with a fully compliant PEP 503 repository. Otherwise, like in a local wheelhouse (wheel files in a directory), it is invisible to pip until pip tries to install the wheel so a more precise python tag can still be somewhat useful... (I agree that pip could access the wheel metadata to check the python-requires without needing to unzip the whole file).
You can make your pure Python wheel have a py3-cp36m-* wheel name and you get the version specification you want; there is nothing saying a wheel must have extension modules if it specifies an ABI tag. But more importantly, pip still has to resolve what version of a package to use, so regardless of what's cached in your wheelhouse there will be a check against python-requires somewhere, else you're no worse off than just installing the wheel directly. IOW your wheelhouse isn't going to short-circuit pip's resolver, it's just going to save you from having to download some bits from somewhere else (unless I misunderstand how pip utilizes the wheelhouse on disk).
On Fri, 31 Aug 2018 at 22:03, Brett Cannon <brett@python.org> wrote:
You can make your pure Python wheel have a py3-cp36m-* wheel name and you get the version specification you want; there is nothing saying a wheel must have extension modules if it specifies an ABI tag.
print("\n".join([f"{p}-{b}-{a}" for p,b,a in pep425tags.get_supported()])) cp37-cp37m-win_amd64 cp37-none-win_amd64
Currently, that won't work, as pip doesn't consider combinations with py* as the version and an ABI that isn't none as valid: py3-none-win_amd64 cp37-none-any cp3-none-any py37-none-any py3-none-any py36-none-any py35-none-any py34-none-any py33-none-any py32-none-any py31-none-any py30-none-any If we're talking about what could be made to work, then yes, I guess this is possible. Although it feels like a hack to me - I consider the ABI to be about the C API, and therefore pure-python code *should* have "none" as the ABI. From PEP 425: "The ABI tag indicates which Python ABI is required by any included extension modules". The semantics of the 3 tags is important, and like it or not they are currently "(interpreter) version", "(extension) ABI" and "(platform) architecture". I'm not saying we couldn't alter the meanings, but doing so would need an update to the PEP (which is just admin, so not that significant) and a user re-education exercise (which is the non-trivial aspect). On Fri, 31 Aug 2018 at 13:33 Xavier Fernandez <xav.fernandez@gmail.com> wrote:
while python-requires is indeed nice, it is only usable by pip to select a wheel file with a fully compliant PEP 503 repository. Otherwise, like in a local wheelhouse (wheel files in a directory), it is invisible to pip until pip tries to install the wheel so a more precise python tag can still be somewhat useful... (I agree that pip could access the wheel metadata to check the python-requires without needing to unzip the whole file).
The point of this is that PEP 503 allows the index to expose Requires-Python as a `data-requires-python` attribute on a file link. So if accessing an index, it's possible to rely on requires-python when selecting what wheels to download or consider further (and pip does that). But when using --find-links to point to a directory full of files (i.e., without a PEP 503 HTML index page) you can't access Requires-Python without opening up the wheel and accessing the metadata - so Requires-Python can't be used to trim the list of potential candidate wheels (or at least, pip doesn't, it is of course theoretically possible).
But more importantly, pip still has to resolve what version of a package to use, so regardless of what's cached in your wheelhouse there will be a check against python-requires somewhere, else you're no worse off than just installing the wheel directly. IOW your wheelhouse isn't going to short-circuit pip's resolver, it's just going to save you from having to download some bits from somewhere else (unless I misunderstand how pip utilizes the wheelhouse on disk).
What you're talking about here is the cache, which is different (and which *is* just about saving downloads).

The details of pip's resolution sequence are not defined by any standard, and they aren't documented in detail by pip either, so we're very much talking about implementation-defined behaviour here. And I haven't checked the source code in detail, so I'm going off memory and assumptions. But basically, pip picks which wheel it's going to install by scanning the filenames (and the data-requires-python tag if it's looking at a PEP 503 index). It doesn't open the wheel and look at the metadata until it's committed to a particular wheel. I *think* that if the chosen package has an incompatible Requires-Python at that point, the resolver simply errors out and doesn't look for alternatives.

The *finder* picks which wheel we're going to use, and does this solely based on the wheel filename (and the data-requires-python tag, for files from a PEP 503 compatible index that supplies it). Once that decision is made, there's no going back - any incompatibilities in metadata after that point simply cause the install to fail. Or in other words, using Requires-Python is not a reliable way to direct pip to choose a particular wheel. That decision is deliberate, for performance reasons. The principle is that installers are able to pick the correct file to install *without* opening up the wheel or otherwise downloading it to check its validity.

Paul.
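A small sketch of that finder-side behaviour: the decision is made purely from the filename's tag triple against an ordered list of supported tags, without opening the wheel. It is deliberately simplified (no build tags, no compressed tag sets like 'py2.py3'), and the helper is illustrative rather than pip's actual code:

```
def wheel_rank(filename, supported):
    """Return the index of the best-matching supported triple, or None."""
    stem = filename[:-len(".whl")]
    *_, pytag, abitag, plattag = stem.split("-")  # naive filename split
    triple = (pytag, abitag, plattag)
    return supported.index(triple) if triple in supported else None

supported = [
    ("cp37", "cp37m", "win_amd64"),
    ("cp37", "none", "win_amd64"),
    ("py3", "none", "win_amd64"),
    ("py37", "none", "any"),
    ("py3", "none", "any"),
]
print(wheel_rank("example-1.0-py3-none-any.whl", supported))   # 4
print(wheel_rank("example-1.0-py3-cp36m-any.whl", supported))  # None -> fall back to the sdist
```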
On Tue, 4 Sep 2018 at 02:15 Paul Moore <p.f.moore@gmail.com> wrote:
On Fri, 31 Aug 2018 at 22:03, Brett Cannon <brett@python.org> wrote:
You can make your pure Python wheel have a py3-cp36m-* wheel name and you get the version specification you want; there is nothing saying a wheel must have extension modules if it specifies an ABI tag.
Currently, that won't work, as pip doesn't consider combinations with py* as the version and an ABI that isn't none as valid:
print("\n".join([f"{p}-{b}-{a}" for p,b,a in pep425tags.get_supported()])) cp37-cp37m-win_amd64 cp37-none-win_amd64 py3-none-win_amd64 cp37-none-any cp3-none-any py37-none-any py3-none-any py36-none-any py35-none-any py34-none-any py33-none-any py32-none-any py31-none-any py30-none-any
If we're talking about what could be made to work, then yes, I guess this is possible. Although it feels like a hack to me
It's a total hack. :) I was just trying to alleviate Xavier's worry about having a directory of wheels where he couldn't choose a specific wheel that was pure Python and tied to a Python version without the interpreter version being specific. Since PyPI doesn't really have projects doing this, moving away from e.g. py35 and to just py3 would still work as you could still say cp35-none-* for the same effect without abusing the ABI tag, but you would have to potentially duplicate the same wheel under different interpreter names if you were supporting both PyPy3 6.0.0 and CPython 3.5.x.
- I consider the ABI to be about the C API, and therefore pure-python code *should* have "none" as the ABI. From PEP 425: "The ABI tag indicates which Python ABI is required by any included extension modules". The semantics of the 3 tags is important, and like it or not they are currently "(interpreter) version", "(extension) ABI" and "(platform) architecture".
The real annoyance is tags that have a sliding scale of support like abi3, and the fact that what Python version an interpreter supports can't be known from a single tag set/triple/whatever-you-call-all-three-tags-together (which comes into play if you tell me you want to download wheels for your cloud host and you want to make it simple for them to tell you what their Python interpreter supports, but I'm slowly giving up on that dream).
I'm not saying we couldn't alter the meanings, but doing so would need an update to the PEP (which is just admin, so not that significant) and a user re-education exercise (which is the non-trivial aspect).
I think the real question is whether you like Nathaniel's clarification suggestion on the tags or if it isn't worth it? -Brett
On Fri, 31 Aug 2018 at 13:33 Xavier Fernandez <xav.fernandez@gmail.com> wrote:
while python-requires is indeed nice, it is only usable by pip to select a wheel file with a fully compliant PEP 503 repository. Otherwise, like in a local wheelhouse (wheel files in a directory), it is invisible to pip until pip tries to install the wheel so a more precise python tag can still be somewhat useful... (I agree that pip could access the wheel metadata to check the python-requires without needing to unzip the whole file).
The point of this is that PEP 503 allows the index to expose Requires-Python as a `data-requires-python` attribute on a file link. So if accessing an index, it's possible to rely on requires-python when selecting what wheels to download or consider further (and pip does that). But when using --find-links to point to a directory full of files (i.e., without a PEP 503 HTML index page) you can't access Requires-Python without opening up the wheel and accessing the metadata - so Requires-Python can't be used to trim the list of potential candidate wheels (or at least, pip doesn't, it is of course theoretically possible).
But more importantly, pip still has to resolve what version of a package to use, so regardless of what's cached in your wheelhouse there will be a check against python-requires somewhere, else you're no worse off than just installing the wheel directly. IOW your wheelhouse isn't going to short-circuit pip's resolver, it's just going to save you from having to download some bits from somewhere else (unless I misunderstand how pip utilizes the wheelhouse on disk).
What you're talking about here is the cache, which is different (and which *is* just about saving downloads).
The details of pip's resolution sequence are not defined by any standard, and they aren't documented in detail by pip either, so we're very much talking about implementation-defined behaviour here. And I haven't checked the source code in detail, so I'm going off memory and assumptions. But basically, pip picks which wheel it's going to install by scanning the filenames (and the data-requires-python tag if it's looking at a PEP 503 index). It doesn't open the wheel and look at the metadata until it's committed to a particular wheel. I *think* that if the chosen package has an incompatible Requires-Python at that point, the resolver simply errors out and doesn't look for alternatives.
The *finder* picks which wheel we're going to use, and does this solely based on the wheel filename (and the data-requires-python tag, for files from a PEP 503 compatible index that supplies it). Once that decision is made, there's no going back - any incompatibilities in metadata after that point simply cause the install to fail. Or in other words, using Requires-Python is not a reliable way to direct pip to choose a particular wheel. That decision is deliberate, for performance reasons. The principle is that installers are able to pick the correct file to install *without* opening up the wheel or otherwise downloading it to check its validity.
Paul.
On Fri, 31 Aug 2018 at 09:41, Paul Moore <p.f.moore@gmail.com> wrote:
My comment about wildcarding is basically a forlorn hope that if we can't be sure what we're getting, maybe the best we can do is say "well, py3-*-* sounds like it might work, and there's nothing better, so let's give it a go". But if we can tie down a definition of what constitutes a "reasonable" set of tags, enumerating what we'll accept seems more likely to succeed.
To clarify, by wildcards, I really only mean "none or any match anything, and pyX matches any of pyXY". So (for example) pip doesn't need to say it supports cp37-none-win_amd64 because any wheel declaring it's for cp37-none-win_amd64 will match pip's supported tagset cp37-cp37m-win_amd64. But getting that sort of wildcard matching to work with the priority ordering we want (py37>py3>py36, for instance) is not straightforward, and I never thought it all through. Paul
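A sketch of that limited wildcarding rule, purely to illustrate the matching logic rather than pip's actual behaviour (the function and its simplifications, e.g. a bare 'py3' only matching 'py3N' tags, are assumptions):

```
def tag_matches(wheel_tag, supported_tag):
    """True if a wheel's (python, abi, platform) triple matches a supported triple."""
    w_py, w_abi, w_plat = wheel_tag
    s_py, s_abi, s_plat = supported_tag
    py_ok = w_py == s_py or (w_py in ("py2", "py3") and s_py.startswith(w_py))
    abi_ok = w_abi == "none" or s_abi == "none" or w_abi == s_abi
    plat_ok = w_plat == "any" or s_plat == "any" or w_plat == s_plat
    return py_ok and abi_ok and plat_ok

# Paul's example: a cp37-none-win_amd64 wheel matches the supported
# cp37-cp37m-win_amd64 triple, so pip needn't list the former explicitly.
print(tag_matches(("cp37", "none", "win_amd64"), ("cp37", "cp37m", "win_amd64")))  # True
```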
On Thu, 30 Aug 2018 at 23:13 Nathaniel Smith <njs@pobox.com> wrote:
On Thu, Aug 30, 2018 at 6:52 PM, Brett Cannon <brett@python.org> wrote:
On Thu, 30 Aug 2018 at 11:21 Nathaniel Smith <njs@pobox.com> wrote:
If we're going to rethink this,
Well, I didn't want to "rethink" so much as "fill in". :)
then I would really like to move away from assigning special meaning to specific combinations of tags. The thing where if you use cp36m as your Python ABI tag then you're forced to use cp36 as your python dialect tag doesn't make sense. And the thing where if you have a wheel with an extension module tagged as cp36-abi3, then that works fine on 3.7,
I wouldn't expect that to work if the stable ABI was expanded in Python 3.7 and you used the expanded part.
In my example, you have a wheel using the 3.6 ABI, running on 3.7. That's the case that's supposed to work :-).
but if you *remove* the extension module then it *stops* working on 3.7? That's just bizarre...
I don't quite follow what you mean by "remove the extension module then it stops". What stops by removing the extension module(s)?
If I'm reading your proposal right, it says that when running on Python 3.7, pip should be willing to install wheels tagged cp36-abi3, but not wheels tagged cp36-none. But conceptually, if you take the extension modules out of a cp36-abi3 wheel, then you're left with a cp36-none wheel. So this is weird. Anywhere we're willing to install a cp36-abi3 wheel, we should also be willing to install a cp36-none wheel.
Sure. I took it out because it complicated the logic in my head and only 7 projects have such a wheel.
So my suggestions:
* Make the 3 tag categories totally independent. Compute a separate set for each, and then take the full cross product.
I think Paul was hinting at this as part of his "wildcard" idea (and I honestly thought of this initially as well as it greatly simplifies things). So what would cp36-cp36m-plat expand to?
I don't know what it means to "expand" a wheel tag. Are you punning the tag as also describing a Python installation ("CPython 3.6, with a certain soabi tag and platform"), and then asking to find all the wheel tags that could go in that installation?
Potentially because this comes up if you're trying to create a multi-platform lock file for dependencies or you want to download the dependencies for your cloud deployment which differs from your dev machine.
You can't actually do this in general – for example, determining whether a target interpreter is compatible with manylinux wheels requires running some special sniffing code on that interpreter.
Maybe not entirely in general, but you can come pretty close.
cp36: cp3N where N is any positive digit(s)? And then toss in py3 and py3N? Prefer exact, then generic '3', then older, and finally newer?
Given that we don't have any real use cases for cpXY and pyXY, I'd rather not expand the options – just preserve what we do now... so for CPython 3.6, I'd say py3, py3N for N <= 6, cp3N for N <= 6, and I don't really care about the exact order – some sort of more-specific-before-less-specific makes sense, and whatever we do now is probably fine. If someone goes wild and starts distributing py35 and py36 wheels for the same package (via a "36to35" tool, I guess?) then that's probably what you want? But I don't imagine this will ever be an important use case. We tried it with 2to3, and everyone decided they'd rather write in the subset language for a decade instead.
cp36m: that, abi3, and then 'none'? Do we care if someone has some crazy ABI that no one understands like 'b' (and if so should it only be applied to 'py' interpreter versions which break your nice cross product simplicity)? plat: depends on platform.
Do you mean, what happens if we find ourselves running on an interpreter we don't recognize (not cpython/pypy/jython/...), and that interpreter returns something wacky from sysconfig.get_config_var("SOABI")?
That and people who have weird ABIs for CPython like 'b' as I found on PyPI. It's hard to get a clear signal of what people want because people seem to be signalling a desire to be fairly permissive, but then people bring up examples where they don't. ;)
I think there's a reasonable argument that in that case we should only accept 'none' as the tag. (And then maybe whoever invented this new interpreter gets the job of adding sysconfig.get_supported_abi_tags() as a standard stdlib feature :-).)
And the match preference goes to platform, interpreter version, and then ABI? I think that would work in terms of ignoring C ABIs that have very little chance of linking while being the broadest in terms of accepting a wheel that has some semblance of a chance of running.
I don't think the preference order matters that much as long as it's well-defined. The only cases where it would affect things are like...
Suppose in the future we add support for platform tags based on different ISA levels, like x86_64_avx2 vs x86_64_sse3 vs x86_64. Then you could find yourself in a situation where examplepkg v1.2.3 has the following wheels available:
cp36-cp36m-manylinux1_x86_64 cp36-abi3-manylinux1_x86_64_avx2
Our environment is CPython 3.6, running on a manylinux1-compatible OS and we do have AVX2 support available, so either of these wheels could work. The first wheel has a "better" ABI (cp36m > abi3), but a "worse" platform (x86_64 < x86_64_avx2). So which one should we pick?
I have zero intuition here. Whoever compiled these wheels is clearly perverse, and they should stop doing silly things like this.
And how would that apply to PyPy3? Same ranging on 'pp36N` and drop the 'abi3' insertion?
* Since the stable ABI actually changes over time, we should define new tags abi35, abi36, etc. that mean "requires the stable ABI as defined by this version of cpython or higher", instead of relying on abi3 + a dialect tag. (Imagine if pypy started implementing the stable ABI – we'd have to start allowing cpXY tags to match PyPy.)
* Plan to move away from the pyXY and cpXY tags over time; they're confusing and not useful. Of course this will have to be a gradual process, but if pip stops requiring them now then in a year or two we could make setuptools stop generating them.
What would you do then about preferred match order for pure Python wheels? E.g. how do you preferably match against Python 3.7 wheels over 3.6 when running a Python 3.7 interpreter? Or are you suggesting equivalent ABI tags to make up for the pyXY tags and the interpreter tag simply gets ignored?
I don't understand your questions, sorry :-(.
I'm just saying: we should make the stable-ABI tags self-contained, so that 'py3-abi36' would be sufficient to express "uses the stable abi, as it existed on python 3.6". This would match the same set of interpreters that 'cp36-abi3' is currently used to target. And also, we should try to gradually decrease the usage of pyXY and cpXY tags (e.g. eventually setuptools should stop generating them by default).
Your comment said "move away from pyXY and cpXY", which suggested to me that you were wanting to drop the first part of the tag triple, not simply downgrade it to 'py3', so just ignore my questions based on confusion. :) -Brett
-n
-- Nathaniel J. Smith -- https://vorpus.org
On Thu, Aug 30, 2018, 08:23 Nick Coghlan <ncoghlan@gmail.com> wrote:
On Thu, 30 Aug 2018 at 09:58, Brett Cannon <brett@python.org> wrote:
On Wed, 29 Aug 2018 at 15:54 Nathaniel Smith <njs@pobox.com> wrote:
This is a tricky decision. Any time a new Python comes out, some existing wheels will continue to work fine, and some will be broken. One goal is to avoid installing broken wheels. But, there's also another consideration: if we're too conservative, then with every release we create a bunch of make-work as projects have to re-roll old wheels that would have worked fine, and some percentage of projects won't do this (e.g. b/c they're abandoned), and we lose them forever. Also, for the py3x tags in particular, if the wheel fails on py3(x+1), then the sdist probably will too, so it's not like we have any useful fallback.
Right, but isn't that what the py3-none-any tag is meant to represent? If someone doesn't use that tag then I would take that as there is some version-specific stuff in that wheel.
The problem is that "py3-none-any" doesn't specify a *minimum* version, so if a project starts using a new feature like f-strings, they *have* to declare "py36-...".
That's the theory, but I think these tags are useless in practice. If you're on py35 and pip sees a wheel with py36 as the tag, then it falls back to building from the sdist. For ABI-related tags this makes sense, because given an sdist and an appropriate compiler, you have a good chance of being able to generate wheels for some arbitrary platform, even one that the original authors never heard of. But... the python dialect tags are different. If your wheel uses f-strings, then your sdist probably does too, so all the tag does is move around the error to happen somewhere else.

To avoid this, you have to put a Python-Requires header in your metadata. It's the only thing that works for sdists. And it also works for wheels. And it's strictly more expressive than the wheel tag version (you can write arbitrary restrictions like ">= 3.5.2, != 3.6.1". Note that 3.5.2 actually is a common minimum version for lots of async libraries, because it had a breaking change in the core async/await protocols).

So I don't think there's any case where the pyXY tags are actually useful. You're always better off using Python-Requires.

-n
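The extra expressiveness Nathaniel mentions is what the 'packaging' project's specifier support already provides; a quick illustration (assumes the packaging distribution is installed):

```
from packaging.specifiers import SpecifierSet

requires_python = SpecifierSet(">=3.5.2, !=3.6.1")
print("3.5.1" in requires_python)  # False - below the minimum
print("3.6.1" in requires_python)  # False - explicitly excluded
print("3.7.0" in requires_python)  # True
```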
As far as I know no one has released a compiler that turns f-strings into not-f-strings. If something like that existed, then you could have two wheels for the same software, one that had been passed through the compiler to remove f-strings and gain compatibility with older Python. If you were doing something like that then the pyXY tag becomes useful. Python-Requires is more useful for the ordinary case.

The arch-only tags are an example of something we had to add because a previously theoretical case became real.

It is a little silly but harmless to go all the way down to py30. Python 3.2 or 2.6 were perhaps the oldest Pythons bdist_wheel could ever run on (2012).

On Thu, Aug 30, 2018 at 2:01 PM Nathaniel Smith <njs@pobox.com> wrote:
On Thu, Aug 30, 2018, 08:23 Nick Coghlan <ncoghlan@gmail.com> wrote:
On Thu, 30 Aug 2018 at 09:58, Brett Cannon <brett@python.org> wrote:
On Wed, 29 Aug 2018 at 15:54 Nathaniel Smith <njs@pobox.com> wrote:
This is a tricky decision. Any time a new Python comes out, some existing wheels will continue to work fine, and some will be broken. One goal is to avoid installing broken wheels. But, there's also another consideration: if we're too conservative, then with every release we create a bunch of make-work as projects have to re-roll old wheels that would have worked fine, and some percentage of projects won't do this (e.g. b/c they're abandoned), and we lose them forever. Also, for the py3x tags in particular, if the wheel fails on py3(x+1), then the sdist probably will too, so it's not like we have any useful fallback.
Right, but isn't that what the py3-none-any tag is meant to represent? If someone doesn't use that tag then I would take that as there is some version-specific stuff in that wheel.
The problem is that "py3-none-any" doesn't specify a *minimum* version, so if a project starts using a new feature like f-strings, they *have* to declare "py36-...".
That's the theory, but I think these tags are useless in practice.
If you're on py35 and pip sees a wheel with py36 as the tag, then it falls back to building from the sdist. For ABI-related tags this makes sense, because given an sdist and an appropriate compiler, you have a good chance of being able to generate wheels for some arbitrary platform, even one that the original authors never heard of. But... the python dialect tags are different. If your wheel uses f-strings, then your sdist probably does too, so all the tag does is move around the error to happen somewhere else.
To avoid this, you have to put a Python-Requires header in your metadata. It's the only thing that works for sdists. And it also works for wheels. And it's strictly more expressive than the wheel tag version (you can write arbitrary restrictions like ">= 3.5.2, != 3.6.1". Note that 3.5.2 actually is a common minimum version for lots of async libraries, because it had a breaking change in the core async/await protocols).
So I don't think there's any case where the pyXY tags are actually useful. You're always better off using Python-Requires.
-n
On Aug 30, 2018, at 1:59 PM, Nathaniel Smith <njs@pobox.com> wrote:
That's the theory, but I think these tags are useless in practice.
If you're on py35 and pip sees a wheel with py36 as the tag, then it falls back to building from the sdist. For ABI-related tags this makes sense, because given an sdist and an appropriate compiler, you have a good chance of being able to generate wheels for some arbitrary platform, even one that the original authors never heard of. But... the python dialect tags are different. If your wheel uses f-strings, then your sdist probably does too, so all the tag does is move around the error to happen somewhere else.
To avoid this, you have to put a Python-Requires header in your metadata. It's the only thing that works for sdists. And it also works for wheels. And it's strictly more expressive than the wheel tag version (you can write arbitrary restrictions like ">= 3.5.2, != 3.6.1". Note that 3.5.2 actually is a common minimum version for lots of async libraries, because it had a breaking change in the core async/await protocols).
So I don't think there's any case where the pyXY tags are actually useful. You're always better off using Python-Requires.
I haven’t been following this discussion very well, but forgive me for jumping in here.

I find it helpful to generally not think of compatibility tags as hard “this wheel is supported on this platform”, but more along the lines of “if I built a wheel in the specified environment, I would get the same results”. Those results may or may not work. So for example, if you have a pure Python wheel that *only* works on Python 3.5+, but it produces the same output when building the wheel on Python 3.3 (even if it doesn’t ultimately work), then a “py3” wheel is the right tag to use. Arguably you’d even use py2.py3 since you’d produce the same output (but in practice most don’t since that’s extra work). Likewise if you have a sdist that uses 2to3 to produce different outputs on py2 and py3, then you’d obviously use only py2 and py3. However if you had something that, say, included different files in the wheel in 3.5+ (maybe PEP 484 .pyi files?) then it would be appropriate to use py35 for that wheel, and py30 or even py3 for the other wheel that didn’t include those files.

I’m not sure I follow Brett’s last email about the diff he’s thinking about, but that may be because I’m still waking up.
On Thu, 30 Aug 2018 at 19:30, Donald Stufft <donald@stufft.io> wrote:
I find it helpful to generally not think of compatibility tags as hard “this wheel is supported on this platform”, but more along the lines of “if I built a wheel in the specified environment, I would get the same results”. Those results may or may not work. So for example, if you have a pure Python wheel that *only* works on Python 3.5+, but it produces the same output when building the wheel on Python 3.3 (even if it doesn’t ultimately work), then a “py3” wheel is the right tag to use. Arguably you’d even use py2.py3 since you’d produce the same output (but in practice most don’t since that’s extra work).
I also quite like Daniel's description of the list of supported tags as "try these, in this order - the first one you match is the best chance you have of getting something that works". It may still not work, but there's nothing that the package builder declared as being any better.

The thing I most dislike about the tag system is that if pip (or any installer) misses out a particular tag combination, it gets totally ignored. I sort of wish that there was some level of wildcard matching, so that weird combinations get picked up at least *somewhere* in the priority list. But I never managed to design a workable scheme for doing that, so it may not even be possible.

Paul
Broad question: is there a sense in which the same “resolver” logic could be used for both choosing the best tag, and choosing what versions of packages to install from abstract requirements, or are these fundamentally different paradigms that shouldn’t be thought of along the same lines? —Chris On Thu, Aug 30, 2018 at 11:42 AM Paul Moore <p.f.moore@gmail.com> wrote:
On Thu, 30 Aug 2018 at 19:30, Donald Stufft <donald@stufft.io> wrote:
I find it helpful to generally not think of compatibility tags as hard “this wheel is supported on this platform”, but more along the lines of “if I built a wheel in the specified environment, I would get the same results”. Those results may or may not work. So for example, if you have a pure Python wheel that *only* works on Python 3.5+, but it produces the same output when building the wheel on Python 3.3 (even if it doesn’t ultimately work), then a “py3” wheel is the right tag to use. Arguably you’d even use py2.py3 since you’d produce the same output (but in practice most don’t since that’s extra work).
I also quite like Daniel's description of the list of supported tags as "try these, in this order - the first one you match is the best chance you have of getting something that works". It may still not work, but there's nothing that the package builder declared as being any better.
The thing I most dislike about the tag system is that if pip (or any installer) misses out a particular tag combination, it gets totally ignored. I sort of wish that there was some level of wildcard matching, so that weird combinations get picked up at least *somewhere* in the priority list. But I never managed to design a workable scheme for doing that, so it may not even be possible.
Paul
Package version resolution is a completely different and NP-complete problem. https://en.wikipedia.org/wiki/Boolean_satisfiability_problem

On Thu, Aug 30, 2018 at 2:51 PM Chris Jerdonek <chris.jerdonek@gmail.com> wrote:
Broad question: is there a sense in which the same “resolver” logic could be used for both choosing the best tag, and choosing what versions of packages to install from abstract requirements, or are these fundamentally different paradigms that shouldn’t be thought of along the same lines?
—Chris
On Thu, Aug 30, 2018 at 11:42 AM Paul Moore <p.f.moore@gmail.com> wrote:
On Thu, 30 Aug 2018 at 19:30, Donald Stufft <donald@stufft.io> wrote:
I find it helpful to generally not think of compatibility tags as hard “this wheel is supported on this platform”, but more along the lines of “if I built a wheel in the specified environment, I would get the same results”. Those results may or may not work. So for example, if you have a pure Python wheel that *only* works on Python 3.5+, but it produces the same output when building the wheel on Python 3.3 (even if it doesn’t ultimately work), then a “py3” wheel is the right tag to use. Arguably you’d even use py2.py3 since you’d produce the same output (but in practice most don’t since that’s extra work).
I also quite like Daniel's description of the list of supported tags as "try these, in this order - the first one you match is the best chance you have of getting something that works". It may still not work, but there's nothing that the package builder declared as being any better.
The thing I most dislike about the tag system is that if pip (or any installer) misses out a particular tag combination, it gets totally ignored. I sort of wish that there was some level of wildcard matching, so that weird combinations get picked up at least *somewhere* in the priority list. But I never managed to design a workable scheme for doing that, so it may not even be possible.
Paul
On Aug 30, 2018, at 3:07 PM, Daniel Holth <dholth@gmail.com> wrote:
Package version resolution is a completely different and NP-complete problem. https://en.wikipedia.org/wiki/Boolean_satisfiability_problem
They are related in the sense that any resolver will have to be able to treat distinct wheels as distinct candidates for installation, since two different wheels can have different dependencies— but yea like Daniel said, otherwise they are different problems.
On Thu, 30 Aug 2018 at 11:41 Paul Moore <p.f.moore@gmail.com> wrote:
On Thu, 30 Aug 2018 at 19:30, Donald Stufft <donald@stufft.io> wrote:
I find it helpful to generally not think of compatibility tags as hard “this wheel is supported on this platform”, but more along the lines of “if I built a wheel in the specified environment, I would get the same results”. Those results may or may not work. So for example, if you have a pure Python wheel that *only* works on Python 3.5+, but it produces the same output when building the wheel on Python 3.3 (even if it doesn’t ultimately work), then a “py3” wheel is the right tag to use. Arguably you’d even use py2.py3 since you’d produce the same output (but in practice most don’t since that’s extra work).
I also quite like Daniel's description of the list of supported tags as "try these, in this order - the first one you match is the best chance you have of getting something that works". It may still not work, but there's nothing that the package builder declared as being any better.
The thing I most dislike about the tag system is that if pip (or any installer) misses out a particular tag combination, it gets totally ignored. I sort of wish that there was some level of wildcard matching, so that weird combinations get picked up at least *somewhere* in the priority list.
So basically unless the python_requires is set to something that definitely won't work for a release, you almost want to try anything that has a remote chance of working?
But I never managed to design a workable scheme for doing that, so it may not even be possible.
So you could get that with the 'py' interpreter tag. If you said that for, e.g., Python 3.7 you first want py37-none, then py3-none, then any py3N-none where 0 <= N < 7, then py3M-none where M > 7, you have basically given a wildcard for pure Python wheels based on version. And if you extend this to ignoring the ABI tag after trying 'none', then you really make it hard to not match *some* pure Python wheel that works with some version of Python 3. (BTW, the only ABI tags other than 'none' paired with 'py3' are cp27m, release, b, none1, py2, and sf, across 11 projects on PyPI; there are no 'py36' wheels that don't have 'none' as the ABI.) Is that what you're after? Wildcarding on C ABIs is probably too much trouble beyond abi3 and checking older Python versions.
-Brett
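A rough sketch of that ordering for Python 3.7, just to illustrate the idea (the helper name and the cut-off for "newer" minor versions are made up, and this is not what pip currently does):

    # Pure-Python ('py') tag ordering as described above: exact version, then
    # the bare major version, then older minors, then newer minors.
    def pure_python_tags(major=3, minor=7, newest_known_minor=8):
        tags = [(f'py{major}{minor}', 'none', 'any'),
                (f'py{major}', 'none', 'any')]
        for n in range(minor - 1, -1, -1):                   # py3N where 0 <= N < 7
            tags.append((f'py{major}{n}', 'none', 'any'))
        for m in range(minor + 1, newest_known_minor + 1):   # py3M where M > 7
            tags.append((f'py{major}{m}', 'none', 'any'))
        return tags

    print(pure_python_tags())
    # [('py37', 'none', 'any'), ('py3', 'none', 'any'), ('py36', 'none', 'any'),
    #  ..., ('py30', 'none', 'any'), ('py38', 'none', 'any')]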
On Thu, 30 Aug 2018 at 11:29 Donald Stufft <donald@stufft.io> wrote:
On Aug 30, 2018, at 1:59 PM, Nathaniel Smith <njs@pobox.com> wrote:
That's the theory, but I think these tags are useless in practice.
If you're on py35 and pip sees a wheel with py36 as the tag, then it falls back to building from the sdist. For ABI-related tags this makes sense, because given an sdist and an appropriate compiler, you have a good chance of being able to generate wheels for some arbitrary platform, even one that the original authors never heard of. But... the Python dialect tags are different. If your wheel uses f-strings, then your sdist probably does too, so all the tag does is move the error somewhere else.
To avoid this, you have to put a Python-Requires header in your metadata. It's the only thing that works for sdists. And it also works for wheels. And it's strictly more expressive than the wheel tag version (you can write arbitrary restrictions like ">= 3.5.2, != 3.6.1"). Note that 3.5.2 actually is a common minimum version for lots of async libraries, because it had a breaking change in the core async/await protocols.
So I don't think there's any case where the pyXY tags are actually useful. You're always better off using Python-Requires.
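For reference, the way a project declares this with setuptools is the python_requires keyword, which is written out as the Requires-Python field in the package metadata; the project name below is just a placeholder.

    # setup.py -- minimal sketch; "example" is a placeholder project name.
    from setuptools import setup

    setup(
        name="example",
        version="1.0",
        # Emitted as the Requires-Python metadata field; installers that
        # honour it will skip this release on interpreters outside the range.
        python_requires=">= 3.5.2, != 3.6.1",
    )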
I haven’t been following this discussion very well, but forgive me for jumping in here.
I find it helpful to generally not think of compatibility tags as hard “this wheel is supported on this platform”, but more along the lines of “if I built a wheel in the specified environment, I would get the same results”. Those results may or may not work. So for example, if you have a pure Python wheel that *only* works on Python 3.5+, but it produces the same output when building the wheel on Python 3.3 (even if it doesn’t ultimately work), then a “py3” wheel is the right tag to use. Arguably you’d even use py2.py3 since you’d produce the same output (but in practice most don’t since that’s extra work).
Yep.
Likewise if you have a sdist that uses 2to3 to produce different outputs on py2 and py3, then you’d obviously use only py2 and py3. However if you had something that say, included different files in the wheel in 3.5+ (maybe PEP 484 .pyi files?) then it would be appropriate to use py35 for that wheel, and py30 or even py3 for the other wheel that didn’t include those files.
Agreed.
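As a concrete illustration of that scenario, the two wheels would differ only in their Python tag, which is encoded directly in the wheel filename (PEP 427); the project name and filenames below are invented.

    # Minimal sketch: split a wheel filename into its name, version and
    # (python, abi, platform) tag triple. Assumes no build tag and no hyphens
    # left in the normalized project name.
    def wheel_tags(filename):
        name, version, python, abi, platform = filename[:-len(".whl")].split("-")
        return name, version, (python, abi, platform)

    # One release, two pure-Python wheels: a py35 one that ships the extra
    # .pyi files, and a plain py3 one for older Python 3 versions.
    print(wheel_tags("example-1.0-py35-none-any.whl"))
    print(wheel_tags("example-1.0-py3-none-any.whl"))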
I’m not sure I follow Brett’s last email about the diff he’s thinking about, but that may be because I’m still waking up.
Basically I'm trying to figure out what tags pip and various tools should be matching when determining what wheel to download from PyPI (so that diff is what pip matches against now in two scenarios and how I think what is matched against should change). The hope is this can be centralized into a library instead of being internalized in each tool like it is now.
On Aug 30, 2018, at 9:05 PM, Brett Cannon <brett@python.org> wrote:
Basically I'm trying to figure out what tags pip and various tools should be matching when determining what wheel to download from PyPI (so that diff is what pip matches against now in two scenarios and how I think what is matched against should change). The hope is this can be centralized into a library instead of being internalized in each tool like it is now.
That would be great. I had planned on adding this to pypa/packaging, but never found the time to figure out a good API for it or the exact semantics we want. So kudos for doing that work. Feel free to make it part of pypa/packaging if you want, or a distinct library is fine too!
On Thu, 30 Aug 2018 at 18:08 Donald Stufft <donald@stufft.io> wrote:
On Aug 30, 2018, at 9:05 PM, Brett Cannon <brett@python.org> wrote:
Basically I'm trying to figure out what tags pip and various tools should be matching when determining what wheel to download from PyPI (so that diff is what pip matches against now in two scenarios and how I think what is matched against should change). The hope is this can be centralized into a library instead of being internalized in each tool like it is now.
That would be great. I had planned on adding this to pypa/packaging, but never found the time to figure out a good API for it or the exact semantics we want.
Same here, which is why I'm trying to lock down the semantics. :)
So kudos for doing that work. Feel free to make it part of pypa/packaging if you want, or a distinct library is fine too!
I initially thought 'packaging', but I also don't want to care about Python 2. :) So I haven't decided yet. At worst this just leads to a clearer understanding of how tools should do wheel compatibility matching.
On Aug 30, 2018, at 9:54 PM, Brett Cannon <brett@python.org> wrote:
I initially thought 'packaging', but I also don't want to care about Python 2. :) So I haven't decided yet. At worst this just leads to a clearer understanding of how tools should do wheel compatibility matching.
If you want to get rid of all of the bundled pep425tags, you’re going to have to care about Python 2 anyways, since I think all of those projects still support it. But if the only output is a better understanding of what the semantics should be, I still think that is super valuable.
participants (8)
- Brett Cannon
- Chris Jerdonek
- Daniel Holth
- Donald Stufft
- Nathaniel Smith
- Nick Coghlan
- Paul Moore
- Xavier Fernandez