[Distutils] PEP for dependencies on libraries like BLAS (was: Re: Working toward Linux wheel support)

Wes Turner wes.turner at gmail.com
Fri Aug 21 21:34:51 CEST 2015


Something like this for packages that work with a default install would be
helpful:

From metadata:

* https://python3wos.appspot.com/
* http://djangowos.com/
* http://pythonwheels.com/

Additional package build scripts (which manage e.g. BLAS and numpy ABI
compatibility):

* http://docs.continuum.io/anaconda/pkg-docs
* https://www.enthought.com/products/canopy/package-index/


Additional attributes that would be useful in setup.py and JSON:

* [ ] Add conda and canopy as pkgmgrstrs:
  { "apt": [...], ..., "conda": [...], "canopy": [...] }
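As a rough sketch (the keys and package names below are illustrative; no
setuptools or PyPI feature recognizes such a structure today), a
per-package-manager mapping could be serialized to plain JSON so that
non-Python tools can consume it:

```python
import json

# Hypothetical mapping of external (non-Python) dependencies keyed by
# package manager; purely a sketch of the proposal, not an existing API.
pkgmgr_deps = {
    "apt": ["libblas-dev", "liblapack-dev"],
    "yum": ["blas-devel", "lapack-devel"],
    "conda": ["mkl"],
    "canopy": ["mkl"],
}

# Plain JSON keeps the metadata readable by downstream tools
# (pyp2rpm, conda skeleton, Debian packaging helpers, ...).
print(json.dumps(pkgmgr_deps, indent=2, sort_keys=True))
```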


On Fri, Aug 21, 2015 at 2:27 PM, Wes Turner <wes.turner at gmail.com> wrote:

>
>
> On Fri, Aug 21, 2015 at 1:30 PM, Wes Turner <wes.turner at gmail.com> wrote:
>
>>
>> On Aug 21, 2015 12:41 PM, "Brett Cannon" <brett at python.org> wrote:
>> >
>> >
>> >
>> > On Thu, 20 Aug 2015 at 10:16 Wes Turner <wes.turner at gmail.com> wrote:
>> >>
>> >>
>> >> On Aug 20, 2015 5:05 AM, "Nick Coghlan" <ncoghlan at gmail.com> wrote:
>> >> >
>> >> > [Catching up on distutils-sig after travel]
>> >> >
>> >> > On 13 August 2015 at 16:08, Nathaniel Smith <njs at pobox.com> wrote:
>> >> > > It seems like a reasonable effort at solving this problem, and I
>> guess
>> >> > > there are probably some people somewhere that have this problem,
>> but
>> >> > > my concern is that I don't actually know any of those people. The
>> >> > > developers I know instead have the problem of, they want to be
>> able to
>> >> > > provide a small finite number of binaries (ideally six binaries per
>> >> > > Python version: {32 bit, 64 bit} * {windows, osx, linux}) that
>> >> > > together will Just Work on 99% of end-user systems. And that's the
>> >> > > problem that Enthought, Continuum, etc., have been solving for
>> years,
>> >> > > and which wheels already mostly solve on windows and osx, so it
>> seems
>> >> > > like a reasonable goal to aim for. But I don't see how this PEP
>> gets
>> >> > > us any closer to that.
>> >> >
>> >> > The key benefit from my perspective is that tools like pyp2rpm, conda
>> >> > skeleton, the Debian Python packaging tools, etc, will be able to
>> >> > automatically generate full dependency sets automatically from
>> >> > upstream Python metadata.
>> >> >
>> >> > At the moment that's manual work which needs to be handled
>> >> > independently for each binary ecosystem, but there's no reason it has
>> >> > to be that way - we can do a better job of defining the source
>> >> > dependencies, and then hook into release-monitoring.org to
>> >> > automatically rebuild the downstream binaries (including adding new
>> >> > external dependencies if needed) whenever new upstream releases are
>> >> > published.
>> >>
>> >> JSON (JSON-LD) would likely be most platform compatible (and designed
>> for interoperable graph nodes and edges with attributes).
>> >>
>> >> JSON-LD does not require a specific library iff the @context is not
>> necessary.
>> >>
>> >> Notes about JSON-LD and interoperable software package metadata:
>> https://mail.python.org/pipermail/distutils-sig/2015-April/026108.html
>> >
>> >  What does JSON-LD have to do with this conversation, Wes? No
>> discussion of implementation has even begun, let alone worrying about data
>> formats. This is purely a discussion of problem space and what needs to be
>> solved and is not grounded in concrete design yet.
>>
>> Really?
>> Why wouldn't a language designed for graphs be appropriate for
>> expressing graphs and constraints?
>>
>> The problem is: setuptools packages cannot declare dependency edges to
>> things that are not Python packages; nor can they express attributes
>> like Portage ebuild USE flags.
>>
>> Now, as ever, would be a good time to learn to write a JSON-LD @context
>> for the existing fragmented packaging-system standards, which often
>> require code execution to read or generate JSON metadata (because such
>> metadata is not statically decidable).
>>
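For the @context idea above, a minimal sketch might look like the
following (the "pkg" vocabulary URL is made up for illustration; only the
JSON-LD keywords `@context`, `@id`, and `@container` are standard):

```python
import json

# Minimal illustrative JSON-LD document for package metadata.
# The vocabulary URL is hypothetical, not a published schema.
doc = {
    "@context": {
        "pkg": "https://example.org/pkg-vocab#",
        "name": "pkg:name",
        "install_requires": {"@id": "pkg:installRequires",
                             "@container": "@list"},
    },
    "name": "example-pkg",
    "install_requires": ["numpy", "scipy"],
}

# Without a JSON-LD library, the document is still valid plain JSON.
print(json.dumps(doc, indent=2))
```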
>
> I guess what I'm trying to say is:
>
> * Why is this packaging metadata split?
> * Shouldn't this all be in setup.py?
>
>   * Couldn't we generate a proper RDF graph
>     from each of the latest JSON-LD serializations
>     (e.g. https://pypi.python.org/pypi/<pkg>/jsonld)?
>
>     * What is missing?
>        * install_requires_pkgs = {
>            "apt": [...], "yum": [...], "dnf": [...], "aur": [...] }
>        * extras_require_pkgs = {
>            "extraname": { "apt": [...], "yum": [...], "dnf": [...],
>                           "aur": [...] },
>            "otherextraname": { "apt": [...], "yum": [...], "dnf": [...],
>                                "aur": [...] } }
>        * build flag schema:
>          buildflags_schema = {"numpy17": "bool"}
>          buildflags = {"numpy17": True}
>
> Because, from a maintainer/reproducibility standpoint,
> how do I know that all of these packages (1) do exist and (2) could exist?
>
> * (3) SHOULD exist: tox, Docker builds
>
>
> So, is JSON-LD necessary to add extra attributes to setup.py? Nope.
>
> Would it be a good time to agree on a JSON-LD @context and determine how
> to specify package groupings, like [salt grains] but without Salt, in
> setuptools? Or just be declarative (e.g. with Portage and Conda (and
> Pip/Peep, inevitably))?
>
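The attributes in the quoted proposal, plus the kind of helper a
downstream tool might use, could be sketched as follows (all keyword
names are hypothetical; setuptools does not recognize any of them today,
and the package names are only examples):

```python
# Sketch only: install_requires_pkgs and extras_require_pkgs are NOT real
# setuptools keywords; this models the shape of the quoted proposal.
metadata = {
    "name": "example-pkg",
    "install_requires": ["numpy"],
    "install_requires_pkgs": {"apt": ["libblas-dev"],
                              "yum": ["blas-devel"]},
    "extras_require_pkgs": {
        "fast": {"apt": ["libatlas-base-dev"], "yum": ["atlas-devel"]},
    },
    "buildflags_schema": {"numpy17": "bool"},
}

def external_deps(meta, mgr):
    """Collect base plus extras external deps for one package manager."""
    deps = list(meta.get("install_requires_pkgs", {}).get(mgr, []))
    for extra in meta.get("extras_require_pkgs", {}).values():
        deps.extend(extra.get(mgr, []))
    return deps

print(external_deps(metadata, "apt"))  # ['libblas-dev', 'libatlas-base-dev']
```

A tool like pyp2rpm or conda skeleton could then call such a helper with
its own package-manager key instead of maintaining the mapping by hand.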

