On 15 Feb 2017 23:40, "Nathaniel Smith" <njs@pobox.com> wrote:
On Feb 15, 2017 07:41, "Nick Coghlan" <ncoghlan@gmail.com> wrote:


Ah-hah, this does make sense as a problem, thanks!

However, your solution seems very odd to me :-).

If the goal is to put an "are you sure/yes I'm sure" UX barrier between users and certain version settings, then why make a distinction that every piece of downstream software has to be aware of and ignore? PyPI seems like a funny place in the stack to be implementing this. It would be much simpler to implement this feature at the build system level: e.g. setuptools could require that dependencies you think are overly strict be specified in an install_requires_yes_i_really_mean_it= field, without requiring any metadata changes.
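(To be clear, that field name is invented for the sake of argument. The build-time lint it would gate could be as simple as detecting exact pins in install_requires; a minimal sketch, with an invented function name:)

```python
def find_exact_pins(install_requires):
    """Return the requirement strings that pin an exact version.

    Among PEP 440 version operators, only '==' and '===' contain a
    double equals sign ('>=', '<=', '!=', and '~=' all use a single
    '='), so a substring check suffices for well-formed requirements.
    """
    return [req for req in install_requires if "==" in req]
```

A build tool could then refuse to proceed (or demand the opt-in field) whenever this returns a non-empty list.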

If you're publishing to a *private* index server then version pinning should be allowed by default and you shouldn't get a warning.

It's only when publishing to PyPI as a *public* index server that overly restrictive dependencies become a UX problem.

The simplest way of modelling this that I've come up with is a boolean "allow pinned dependencies" flag - without the flag, "==" and "===" would emit warnings or errors when releasing to a public index server, with it they wouldn't trigger any complaints.
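(Sketching that model in code, with invented names for the flag and the check, the logic would be roughly:)

```python
def check_release(requirements, index_is_public, allow_pinned=False):
    """Return warnings for exact pins, per the proposed boolean model.

    Pins are always fine on a private index, and fine on a public
    index once the publisher has explicitly opted in via the
    "allow pinned dependencies" flag.
    """
    if not index_is_public or allow_pinned:
        return []
    return [f"exact pin in public release: {req}"
            for req in requirements if "==" in req]
```

So `check_release(["foo==1.0"], index_is_public=True)` would complain, while the same call with `allow_pinned=True`, or against a private index, would not.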

Basically it sounds like you're saying you want to extend the metadata so that it can represent both broken and non-broken packages, so that both can be created, passed around, and checked for. And I'm saying, how about instead we do that checking when creating the package in the first place.

Build time isn't right, due to this being a perfectly acceptable thing to do when building solely for private use. It's only when you make the "I'm going to publish this for the entire community to use" decision that the intent needs to be clarified (as at that point you're switching from "I'm solving my own problems" to "My problems may be shared by other people, and I'd like to help them out if I can").

(Of course I can't see any way to do any of this that won't break existing sdists, but I guess you've already decided you're OK with that. I guess I should say that I'm a bit dubious that this is so important in the first place; I feel like there are lots of legitimate use cases for == dependencies and lots of kinds of linting we might want to apply to try and improve the level of packaging quality.)

Either way, PyPI will believe your answer, it's just refusing the
temptation to guess that using "==" or "===" in the requires section
is sufficient to indicate that you're deliberately publishing a
pre-integrated project.

> There's certainly a distinction to be made between the abstract
> dependencies and the exact locked dependencies, but to me the natural
> way to model that distinction is by re-using the distinction we
> already have between source packages and binary packages. The build
> process for this placeholder wheel is to "compile down" the abstract
> dependencies into concrete dependencies, and the resulting wheel
> encodes the result of this compilation. Again, no new concepts needed.

Source vs binary isn't where the distinction applies, though. For
example, it's legitimate for PyObjC to have pinned dependencies even
when distributed in source form, as it's a metapackage used solely to
integrate the various PyObjC subprojects into a single "release".

?? So that means that some packages have a loosely specified source that compiles down to a more strictly specified binary, and some have a more strictly specified source that compiles down to an equally strictly specified binary. That's... an argument in favor of my way of thinking about it, isn't it? That it can naturally express both situations?

My point is that *for the cases where there's an important distinction between Pipfile and Pipfile.lock*, we already have a way to think about that distinction without introducing new concepts.
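(The "compile down" step being reused here can be sketched as a toy lock function; real resolvers are far more involved, and the names below are illustrative only:)

```python
def lock_requirements(abstract_requirements, resolved_versions):
    """'Compile down' loose requirements to exact pins.

    Takes abstract requirements (Pipfile-style) plus the versions a
    resolver chose, and emits the concrete, fully pinned form
    (Pipfile.lock-style).
    """
    locked = []
    for req in abstract_requirements:
        # Crude name extraction: strip any version specifier suffix.
        name = req.split(">=")[0].split("==")[0].strip()
        locked.append(f"{name}=={resolved_versions[name]}")
    return locked
```

The point being that the input and output of this step are both expressible in the existing requirements syntax, with no new metadata concepts.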

-n