<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On 14 March 2017 at 09:41, Nathaniel Smith <span dir="ltr"><<a href="mailto:njs@pobox.com" target="_blank">njs@pobox.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><span class="gmail-">On Fri, Mar 10, 2017 at 7:55 AM, Nick Coghlan <<a href="mailto:ncoghlan@gmail.com">ncoghlan@gmail.com</a>> wrote:<br>
> On 11 March 2017 at 00:52, Nathaniel Smith <<a href="mailto:njs@pobox.com">njs@pobox.com</a>> wrote:<br>>> We have lots of metadata files in the wild that already claim to be<br>
>> version 2.0. If you're reviving this I think you might need to change<br>
>> the version number?<br>><br>
> They're mostly in metadata.json files, though. That said, version numbers<br>
> are cheap, so I'm happy to skip straight to 3.0 if folks think it makes more<br>
> sense.<br>
<br>
</span>AFAICT bdist_wheel produces METADATA files with Metadata-Version: 2.0<br>
by default, and has for some time. Certainly this one I just<br>
spot-checked does that.<br></blockquote><div><br></div><div>We could always retroactively declare "2.0" to just mean 1.3 + Provides-Extra + (optionally) Description-Content-Type (once that has been defined in a way that makes sense for PyPI).<br><br></div><div>Either way, I'm convinced that the JSON-based format should start out at 3.0.<br></div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
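</blockquote>
<div><br></div><div>As a quick illustration (a sketch, not anything normative): METADATA uses an email-header style layout, so the stdlib email parser is enough to check which Metadata-Version a wheel actually declares, and to see Provides-Extra in action. The field values below are made up for illustration:<br></div>

```python
# Sketch: METADATA files use an RFC 822 / email-header style layout,
# so the stdlib email parser can read them. Field values are invented
# purely for illustration.
from email.parser import Parser

METADATA = """\
Metadata-Version: 2.0
Name: example
Version: 1.0
Provides-Extra: test
"""

msg = Parser().parsestr(METADATA)
print(msg["Metadata-Version"])        # -> 2.0
print(msg.get_all("Provides-Extra"))  # -> ['test']
```

<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">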
<span class="gmail-">
> There's a lot to be said for treating the file as immutable, and instead<br>
> adding *other* metadata files as a component moves through the distribution<br>
> process. If so, then it may actually be more appropriate to call the<br>
> rendered file "pysdist.json", since it contains the sdist metadata<br>
> specifically, rather than arbitrary distribution metadata.<br>
<br>
</span>I guess there are three possible kinds of build dependencies:<br>
- those that are known statically<br>
- those that are determined by running some code at sdist creation time<br>
- those that are determined by running some code at build time<br>
<br>
But all the examples I can think of fall into either bucket A (which<br>
pyproject.toml handles), or bucket C (which pydist.json can't handle).<br>
So it seems like the metadata here is either going to be redundant or<br>
wrong?<br></blockquote><div><br></div><div>pyproject.toml only handles the bootstrapping dependencies for the build system itself; it *doesn't* necessarily include all the build dependencies, which may live in tool-specific files (like setup_requires in setup.py) or otherwise be added by the build system without any record of them in pyproject.toml. The build system knows the latter when it generates the sdist, which means PyPI can extract and republish them without having to actually invoke the build system.<br><br></div><div>For dynamic dependencies where the environment marker system isn't flexible enough to express the installation conditions (so they can't be generated at sdist creation time), that will be something for the publishers of a particular project to resolve with the folks that want to do builds in environments that are isolated from the internet, and hence can't download arbitrary additional dependencies at build time.<br></div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
I'm not sure I understand the motivation for wanting wheels to have a<br>
file which says "here's the metadata describing the sdist that you<br>
would have, if you had an sdist (which you don't)"? I guess it doesn't<br>
hurt anything, but it seems odd.<br></blockquote><div><br></div><div>Wheels still have a corresponding source artifact, even if it hasn't been published anywhere using the Python-specific sdist format. Accordingly, I don't think you should be able to tell, just from looking at a wheel file, whether the generation process was:<br><br></div><div>* tree -> sdist -> wheel; or<br></div><div>* tree -> wheel<br></div><div><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<span class="gmail-">
> I'd also be fairly strongly opposed to converting extras from an optional<br>
> dependency management system to a "let multiple PyPI packages target the<br>
> same site-packages subdirectory" because we already know that's a nightmare<br>
> from the Linux distro experience (having a clear "main" package that owns<br>
> the parent directory with optional subpackages solves *some* of the<br>
> problems, but my main reaction is still "Run awaaay").<br>
<br>
</span>The "let multiple PyPI packages target the same site-packages<br>
directory" problem is orthogonal to the reified extras proposal. I<br>
actually think we can't avoid handling the same site-packages<br>
directory problem, but the solution is namespace packages and/or<br>
better Conflicts: metadata.<br>
<br>
Example illustrating why the site-packages conflict problem arises<br>
independently of reified extras: people want to distribute numpy built<br>
against different BLAS backends, especially MKL (which is good but<br>
zero-cost proprietary) versus OpenBLAS (which is not as good but is<br>
free). Right now that's possible by distributing 'numpy' and<br>
'numpy-mkl' packages, but of course ugly stuff happens if you try to<br>
install both; some sort of Conflicts: metadata would help. If we<br>
instead have the packages be named 'numpy' and 'numpy[mkl]', then<br>
they're in exactly the same position with respect to conflicts. The<br>
very significant advantage is that we know that 'numpy[mkl]' "belongs<br>
to" the numpy project, so 'numpy[mkl]' can say 'Provides-Dist: numpy'<br>
without all the security issues that Provides-Dist otherwise runs<br>
into.<br></blockquote><div><br></div><div>Do other components need to be rebuilt or relinked if the NumPy BLAS backend changes?<br><br></div><div>If the answer is yes, then this is something I'd strongly prefer to leave to conda and other package management systems like Nix that better support parallel installation of multiple versions of C/C++ dependencies.<br><br></div><div>If the answer is no, then it seems like a better solution might be to allow for rich dependencies, where numpy could depend on "_numpy_backends.openblas or _numpy_backends.mkl" and figure out the details of exactly what's available and which one it's going to use at import time.<br><br></div><div>Either way, contorting the Extras system to try to cover such a significantly different set of needs doesn't seem like a good idea.<br></div> <blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
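</blockquote>
<div><br></div><div>For the "answer is no" case, the import-time resolution idea might look something like the sketch below. The _numpy_backends.* module names come straight from the hypothetical in this thread and don't correspond to any real packages:<br></div>

```python
# Hypothetical sketch of the "pick a backend at import time" idea.
# The _numpy_backends.* names are invented for this discussion and do
# not exist as real distributions.
import importlib

_CANDIDATES = ("_numpy_backends.mkl", "_numpy_backends.openblas")

def load_blas_backend():
    """Return the first importable backend module, trying MKL first."""
    for name in _CANDIDATES:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    raise ImportError(
        "no BLAS backend installed; expected one of: " + ", ".join(_CANDIDATES)
    )
```

<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">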
Example illustrating why reified extras are useful totally<br>
independently of site-packages conflicts: it would be REALLY NICE if<br>
numpy could say 'Provides-Dist: numpy[abi=7]' and then packages could<br>
depend on 'numpy[abi=7]' and have that do something sensible. This<br>
would be a pure virtual package.<br></blockquote><div><br></div><div>PEP 459 has a whole separate "python.constraints" extension rather than trying to cover environmental constraints within the existing Extras system: <a href="https://www.python.org/dev/peps/pep-0459/#the-python-constraints-extension">https://www.python.org/dev/peps/pep-0459/#the-python-constraints-extension</a><br></div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<span class="gmail-">
> It especially isn't needed just to solve the "pip forgets what extras it<br>
> installed" problem - that technically doesn't even need a PEP to resolve, it<br>
> just needs pip to drop a pip specific file into the PEP 376 dist-info<br>
> directory that says what extras to request when doing future upgrades.<br>
<br>
</span>But that breaks if people use a package manager other than pip, which<br>
is something we want to support, right? And in any case it requires a<br>
bunch more redundant special-case logic inside pip, to basically make<br>
extras act like virtual packages.<br></blockquote><br></div><div class="gmail_quote">OK, it would still need a PEP to make the file name and format standardised across tools. Either way, it's an "installed packages database" problem, not a software publication problem.<br><br></div><div class="gmail_quote">Cheers,<br></div><div class="gmail_quote">Nick.<br></div><br>-- <br><div class="gmail_signature">Nick Coghlan | <a href="mailto:ncoghlan@gmail.com" target="_blank">ncoghlan@gmail.com</a> | Brisbane, Australia</div>
</div></div>