As a new Twine maintainer I've been running into questions like:
* Now that Warehouse doesn't use "register" anymore, can we deprecate it from distutils, setuptools, and twine? Are any other package indexes or upload tools using it? https://github.com/pypa/twine/issues/311
* It would be nice if Twine could depend on a package index providing an HTTP 201 response in response to a successful upload, and fail on 200 (a response some non-package-index servers will give to an arbitrary POST request).
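The second point could be sketched as a small status-interpretation helper on the client side. This is only an illustration of the idea (the function name and return values are hypothetical, not existing Twine API):

```python
def interpret_upload_status(code):
    """Classify a server's HTTP response to an upload POST.

    Hypothetical sketch: only 201 Created confirms that a package index
    accepted the upload; 200 is treated as ambiguous, because arbitrary
    non-index web servers will often answer 200 to any POST.
    """
    if code == 201:
        return "created"
    if code == 200:
        return "ambiguous"
    return "error"
```

With a rule like this, an upload tool could fail loudly on 200 instead of silently assuming success.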
I do not see specifications to guide me here, e.g., in the official guidance on hosting one's own package index https://packaging.python.org/guides/hosting-your-own-index/ . PEP 301 is old enough that it's due for an update, and PEP 503 only covers browsing and download, not upload.
I suggest that I write a PEP specifying an API for uploading to a Python package index. This PEP would partially supersede PEP 301 and would document the Warehouse reference implementation. I would write it in collaboration with the Warehouse maintainers who will develop the reference implementation per pypa/warehouse/issues/284 and maybe add a header referring to compliance with this new standard. And I would consult with the maintainers of packaging and distribution tools such as zest.releaser, flit, poetry, devpi, pypiserver, etc.
Per Nick Coghlan's formulation, my specific goal here would be close to:
> Documenting what the current upload API between twine & warehouse actually is, similar to the way PEP 503 focused on describing the status quo, without making any changes to it. That way, other servers (like devpi) and other upload clients have the info they need to help ensure interoperability.
Since Warehouse is trying to redo its various APIs in the next several months, I think it might be more useful to document and work with the new upload API, but I'm open to feedback on this.
After a little conversation here on distutils-sig, I believe my steps would be:
1. start a very early PEP draft with lots of To Be Determined blanks, submit as a PR to the python/peps repo, and share it with distutils-sig
2. ping maintainers of related tools
3. discuss with others at the packaging sprints https://wiki.python.org/psf/PackagingSprints next week
4. revise and get consensus, preferably mostly on this list
5. finalize PEP and get PEP accepted by BDFL-Delegate
6. coordinate with PyPA, maintainers of `distutils`, maintainers of packaging and distribution tools, and documentation maintainers to implement PEP compliance
Thoughts are welcome. I originally posted this at https://github.com/pypa/packaging-problems/issues/128 .
--
Sumana Harihareswara
Changeset Consulting
https://changeset.nyc
Hi all,
The manylinux1 -> manylinux2010 transition has turned out to be very
difficult. Timeline so far:
March 2017: CentOS 5 went EOL
April 2018: PEP 571 (manylinux2010) accepted
May 2018: support for manylinux2010 lands in warehouse
November 2018: support lands in auditwheel, and pip master
December 2018: 21 months after CentOS 5 EOL, we still don't have an
official build environment, or support in a pip release
We'll get through this, but it's been super painful and maybe we can change
things somehow so it will suck less next time.
We don't have anything like this pain on Windows or macOS. We never have to
update pip, warehouse, etc., after those OSes hit EOLs. Why not?
On Windows, we have just two tags: "win32" and "win_amd64". These are
defined to mean something like "this wheel will run on any recent-ish
Windows system". So the meaning of the tag actually changes over time: it
used to be that if a wheel said it ran on win32, then that meant it would
work on winxp, but since winxp hit EOL people started uploading "win32"
wheels that don't work on winxp, and that's worked fine.
On macOS, the tags look like "macosx_10_9_x86_64". So here we have the OS
version embedded in the tag. This means that we do occasionally switch
which tags we're using, kind of like how manylinux1 -> manylinux2010 is
intended to work. But, unlike for the manylinux tags, defining a new macosx
tag is totally trivial: every time a new OS version is released, the tag
springs into existence without any human intervention. Warehouse already
accepts uploads with this tag; pip already knows which systems can install
wheels with this tag, etc.
Can we take any inspiration from this for manylinux?
We could do the Windows thing, and have a plain "manylinux" tag that means
"any recent-ish glibc-based Linux". Today it would be defined to be "any
distro newer than CentOS 6". When CentOS 6 goes out of service, we could
tweak the definition to be "any distro newer than CentOS 7". Most parts of
the toolchain wouldn't need to be updated, though, because the tag wouldn't
change, and by assumption, enforcement wouldn't really be needed, because
the only people who could break would be ones running on unsupported
platforms. Just like happens on Windows.
We could do the macOS thing, and have a "manylinux_${glibc version}" tag
that means "this package works on any Linux using glibc newer than ${glibc
version}". We're already using this as our heuristic to handle the current
manylinux profiles, so e.g. manylinux1 is effectively equivalent to
manylinux_2_5, and manylinux2010 will be equivalent to manylinux_2_12. That
way we'd define the manylinux tags once, get support into pip and warehouse
and auditwheel once, and then in the future the only thing that would have
to change to support new distro releases or new architectures would be to
set up a proper build environment.
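The glibc-version scheme could be checked generically, so new tags need no tool updates. A rough sketch (the function name, default glibc version, and exact tag grammar here are illustrative, not real pip internals):

```python
import re

def glibc_tag_compatible(tag, system_glibc=(2, 17)):
    """Check whether a hypothetical manylinux_${major}_${minor}_${arch}
    tag is satisfied by the running system's glibc version.

    Any future manylinux_X_Y tag works without code changes: the
    comparison is just (X, Y) <= (system major, system minor).
    """
    m = re.match(r"^manylinux_(\d+)_(\d+)(?:_\w+)?$", tag)
    if not m:
        return False
    required = (int(m.group(1)), int(m.group(2)))
    return required <= system_glibc
```

Under this mapping, manylinux1 corresponds to manylinux_2_5 and manylinux2010 to manylinux_2_12, and a new distro release never requires touching pip or warehouse.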
What do y'all think?
-n
PEP 440 contains a section that claims:
> Summary of differences from pkg_resources.parse_version [https://www.python.org/dev/peps/pep-0440/#id63]
>
> * Local versions sort differently, this PEP requires that they sort as greater than the same version without a local version, whereas pkg_resources.parse_version considers it a pre-release marker.
I haven't been able to find any mention of this in the setuptools changelogs, but this no longer seems to be the case:
>>> from pkg_resources import parse_version
>>> parse_version('1.0.0+dev') > parse_version('1.0.0')
True
Thus, it might be good to update that passage of the PEP accordingly (ideally, someone can figure out in which version of setuptools parse_version became compatible with PEP 440).
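The PEP 440 rule in question can be illustrated with a toy sort key (this is a deliberately minimal sketch for plain X.Y.Z[+local] versions, not a full PEP 440 parser):

```python
def pep440_key(version):
    """Toy sort key illustrating the PEP 440 local-version rule:
    a local version sorts *greater* than the same public version.

    Only handles simple X.Y.Z[+local] strings; real tools should use
    a proper PEP 440 implementation instead.
    """
    public, _, local = version.partition("+")
    release = tuple(int(p) for p in public.split("."))
    # An absent local segment sorts first: () < ("dev",) < ("dev", "2"), etc.
    return (release, tuple(local.split(".")) if local else ())
```

With this key, "1.0.0+dev" sorts above "1.0.0" but below "1.0.1", which matches both PEP 440 and the current parse_version behavior shown above.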
Best,
Michael Goerz
Hello,
I am currently developing an application called "Outbreak" and would
like to release a package to Pypi. However, there already exists a
package with that name <https://pypi.org/project/outbreak/>. It has no
description, has no functionality, and hasn't been updated in four months.
According to PEP 541, such a project constitutes name squatting and is thus considered invalid. May I please request that the project be
removed from the Package Index? If you would like, I can contact the
author myself.
Thank you,
Elliot Paton-Simpson
Dotlock <https://pypi.org/project/dotlock/> is a package management tool
similar to pipenv, but with a different philosophy: instead of acting as a
wrapper around pip, dotlock handles package resolution natively. This
should allow for more flexibility, better performance, and a smaller
surface area for bugs, but at the cost of the wide platform and package
support the pip developers have put so much work into.
Dotlock is still very much beta software, and currently the only users are
myself and my company. It works well for our use case (deploying web
applications to Linux servers), but has some gaps in support I know about
(including packages that use setup_requires or pyproject.toml), and
probably many I do not. Thanks to everyone on this list who I discussed
dotlock and general package issues with at PyCon, and to anyone who tries
it out!
Sincerely,
Alex Becker
It's been eight months since the release of Warehouse[0] and the sunsetting of legacy PyPI[1]. Following up from our meeting at PyCon in May[2], Changeset Consulting is back on board for another round of project management to facilitate next steps! For the next 3-6 months this work will be spearheaded by myself (Sumana) assisted by Jenny Ryan (https://jennyryan.net ).
The goal over these upcoming months is to create, steward and facilitate internal and public-facing communications to aid the folks within PyPA.
What this means is that we'll be focused on the following:
* Facilitating regular meetings of and for maintainers and contributors;
* Stewarding communications with various PyPA stakeholders, including funders and users;
* Organizing, labelling, prioritizing, and responding to GitHub issues;
* Coordinating public communications, such as announcements, sprints, and calls for participation;
* Maintaining and improving documentation, meeting notes and development roadmaps for PyPA projects.
Feedback from and participation by the Python packaging developer community is obviously part and parcel of this project, so you may see some new "here's what I think is up with this issue, is that right?" questions on old unresolved discussions. And we'll be asking questions on this & other lists and on GitHub and in IRC to collect ideas, concerns, and other productive input regarding the tools roadmaps.
You'll be seeing more details in mid-January to properly kick off this next chapter of levelling up PyPI and the PyPA -- just wanted to give y'all a heads-up.
But of course, if you were already planning on using the next few weeks to do issue triage and roadmap-writing and PyCon planning, please don't wait for us -- that'll make this work all the easier.
Thanks,
Sumana Harihareswara
[0] https://blog.python.org/2018/04/new-pypi-launched-legacy-pypi-shutting.html
[1] https://mail.python.org/archives/list/distutils-sig@python.org/thread/YRE...
[2] https://mail.python.org/archives/list/distutils-sig@python.org/thread/CCO...
--
Sumana Harihareswara
Changeset Consulting
https://changeset.nyc
I recently had to rebuild a server, and I find that pip 18.1 is apparently unable to install at least some older packages, e.g.:
> $ bin/pip install fcrypt
> Collecting fcrypt
> Could not find a version that satisfies the requirement fcrypt (from versions: )
> No matching distribution found for fcrypt
The version I needed is in fact the last released, 1.3.1 (from 2004), and it was installed by an earlier pip. I tried being more explicit:
> $ bin/pip install fcrypt==1.3.1
> Collecting fcrypt==1.3.1
> Could not find a version that satisfies the requirement fcrypt==1.3.1 (from versions: )
> No matching distribution found for fcrypt==1.3.1
I assume that the latest pip needs information from the package / PyPI data that is not available. Luckily, installing from a PyPI download works.
Is there any legacy mode in pip? It seems wrong to cause these older packages to become unusable.
--
Robin Becker