Draft PEP for JSON-based metadata published

After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org

PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
PEP 440 (versioning): http://www.python.org/dev/peps/pep-0440/

With the rationale and commentary, they're over 3000 lines between them, so I'm not attaching them here. The rationale for many of the changes is at the end of each PEP, along with some comments on features that I have either rejected or deliberately chosen to defer to the next revision of the metadata (at the earliest).

Those with BitBucket accounts may also comment inline on the drafts here:

PEP 426: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a...
PEP 440: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a...

Cheers, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

On 27 May, 2013, at 13:36, Nick Coghlan <ncoghlan@gmail.com> wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
Could platform_release be added to the set of environment markers? On OSX platform.release() returns a kernel version number, while platform.version() returns a string that cannot easily be used as a marker. It might also be nice to have markers for the marketing name and version of the OS next to the kernel name and release (for example Solaris 8 vs. SunOS 5.8, or darwin 12.3.0 vs. OSX 10.8.3), but that would probably need a PEP of its own to add the functionality to the stdlib before it is used in metadata.

BTW. Why can marker expressions use only a limited subset of operators? In particular <, <=, > and >= are not allowed, and they would be nice to have for specifying packages that need a minimal platform version (parts of PyObjC have a dependency on the OSX release because the wrapped library is only available in a subset of releases). Listing all OS releases would be fairly useless; platform.release() is too specific to use with '==', as new patch releases of OSX can introduce new micro versions of the kernel.
PEP 440 (versioning): http://www.python.org/dev/peps/pep-0440/
The versioning spec mentions that distribution tools may refuse to publish distributions that pin the versions of dependencies. I understand why this is needed, and agree in general, but have a use case that I don't know how to express without pinning.

In particular, PyObjC consists of a number of distributions (pyobjc-core, pyobjc-framework-Cocoa, ...) and an umbrella package (pyobjc) which depends on the various distributions to make it easier to install all of PyObjC. The umbrella package currently pins the versions of subpackages to ensure that "pip install pyobjc==2.5.1" installs exactly that version of the entire project. If I used the "compatible release" specifier I could no longer easily ensure that users can install an exact version of the entire project, other than by gaming the system: specifying a compatible version with an additional level that isn't used by the project (for example ~=2.5.2.0). What is the correct way to create an umbrella project without getting yelled at by distribution tools?

Ronald
With the rationale and commentary, they're over 3000 lines between them, so I'm not attaching them here.
The rationale for many of the changes is at the end of each PEP, along with some comments on features that I have either rejected or deliberately chosen to defer to the next revision of the metadata (at the earliest).
Those with BitBucket accounts may also comment inline on the drafts here:
PEP 426: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a... PEP 440: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a...
Cheers, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia _______________________________________________ Distutils-SIG maillist - Distutils-SIG@python.org http://mail.python.org/mailman/listinfo/distutils-sig

On May 27, 2013, at 10:44 AM, Ronald Oussoren <ronaldoussoren@mac.com> wrote:
On 27 May, 2013, at 13:36, Nick Coghlan <ncoghlan@gmail.com> wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
Could platform_release be added to the set of environment markers? On OSX platform.release() returns a kernel version number, while platform.version() returns a string that cannot easily be used as a marker. It might also be nice to have markers for the marketing name and version of the OS next to the kernel name and release (for example Solaris 8 vs. SunOS 5.8, or darwin 12.3.0 vs. OSX 10.8.3), but that would probably need a PEP of its own to add the functionality to the stdlib before it is used in metadata.
BTW. Why can marker expressions use only a limited subset of operators? In particular <, <=, > and >= are not allowed, and they would be nice to have for specifying packages that need a minimal platform version (parts of PyObjC have a dependency on the OSX release because the wrapped library is only available in a subset of releases). Listing all OS releases would be fairly useless; platform.release() is too specific to use with '==', as new patch releases of OSX can introduce new micro versions of the kernel.
PEP 440 (versioning): http://www.python.org/dev/peps/pep-0440/
The versioning spec mentions that distribution tools may refuse to publish distributions that pin the versions of dependencies. I understand why this is needed, and agree in general, but have a use case that I don't know how to express without pinning.
In particular, PyObjC consists of a number of distributions (pyobjc-core, pyobjc-framework-Cocoa, ...) and an umbrella package (pyobjc) which depends on the various distributions to make it easier to install all of PyObjC. The umbrella package currently pins the versions of subpackages to ensure that "pip install pyobjc==2.5.1" installs exactly that version of the entire project. If I used the "compatible release" specifier I could no longer easily ensure that users can install an exact version of the entire project, other than by gaming the system: specifying a compatible version with an additional level that isn't used by the project (for example ~=2.5.2.0). What is the correct way to create an umbrella project without getting yelled at by distribution tools?
It's unlikely PyPI will do more than warn for ``==``; `is` comparisons might be disallowed? Not sure.
Ronald
With the rationale and commentary, they're over 3000 lines between them, so I'm not attaching them here.
The rationale for many of the changes is at the end of each PEP, along with some comments on features that I have either rejected or deliberately chosen to defer to the next revision of the metadata (at the earliest).
Those with BitBucket accounts may also comment inline on the drafts here:
PEP 426: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a... PEP 440: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a...
Cheers, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
----------------- Donald Stufft PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA

On Tue, May 28, 2013 at 7:28 AM, Donald Stufft <donald@stufft.io> wrote:
On May 27, 2013, at 10:44 AM, Ronald Oussoren <ronaldoussoren@mac.com> wrote:

The versioning spec mentions that distribution tools may refuse to publish distributions that pin the versions of dependencies. I understand why this is needed, and agree in general, but have a use case that I don't know how to express without pinning.

In particular, PyObjC consists of a number of distributions (pyobjc-core, pyobjc-framework-Cocoa, ...) and an umbrella package (pyobjc) which depends on the various distributions to make it easier to install all of PyObjC. The umbrella package currently pins the versions of subpackages to ensure that "pip install pyobjc==2.5.1" installs exactly that version of the entire project. If I used the "compatible release" specifier I could no longer easily ensure that users can install an exact version of the entire project, other than by gaming the system: specifying a compatible version with an additional level that isn't used by the project (for example ~=2.5.2.0). What is the correct way to create an umbrella project without getting yelled at by distribution tools?
It's unlikely PyPI will do more than warn for ``==``; `is` comparisons might be disallowed? Not sure.
I think Ronald's example of publishing metadistributions that pin particular versions of subdistributions is a valid one (I do exactly the same thing myself with RPM, it just didn't occur to me as a use case while updating the PEPs), so I need to reconsider some of the index server restrictions currently proposed in the PEPs. However, I'd also still like to not-so-gently steer users away from overly restrictive dependencies in the general case.

This is a case where in a *technical* sense there's no difference between "We are making these distributions we maintain easier to install all at once" and "Our distribution needs a compatible version of this other distribution in order to work", but *semantically* they're two quite different operations.

So, what do people think of the idea of a new top-level "distributes" field? Syntax identical to "requires", but *semantically* distinguished in that version pinning in "distributes" would be not only allowed, but encouraged. A metapackage like PyObjC would then have entries only in the "distributes" field, and no direct dependencies of its own.

Thoughts?

Cheers, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
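[Editorial sketch: the field name "distributes" comes from the email above; the surrounding field names and the requirement string format are assumptions, mirroring the "requires" syntax discussed in the draft PEP.] The PyObjC metapackage's metadata might then look something like:

```json
{
    "name": "pyobjc",
    "version": "2.5.1",
    "distributes": [
        "pyobjc-core (== 2.5.1)",
        "pyobjc-framework-Cocoa (== 2.5.1)"
    ]
}
```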

On Tue, May 28, 2013 at 12:44 AM, Ronald Oussoren <ronaldoussoren@mac.com> wrote:
On 27 May, 2013, at 13:36, Nick Coghlan <ncoghlan@gmail.com> wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
Could platform_release be added to the set of environment markers? On OSX platform.release() returns a kernel version number, while platform.version() returns a string that cannot easily be used as a marker.
Heh, I just checked Linux and it's the same - I don't know where the original list came from, but adding platform_release sounds good to me.
It might also be nice to have markers for the marketing name and version of the OS next to the kernel name and release (for example Solaris 8 vs. SunOS 5.8, or darwin 12.3.0 vs. OSX 10.8.3), but that would probably need a PEP of its own to add the functionality to the stdlib before it is used in metadata.
BTW. Why can marker expressions use only a limited subset of operators? In particular <, <=, > and >= are not allowed, and they would be nice to have for specifying packages that need a minimal platform version (parts of PyObjC have a dependency on the OSX release because the wrapped library is only available in a subset of releases). Listing all OS releases would be fairly useless; platform.release() is too specific to use with '==', as new patch releases of OSX can introduce new micro versions of the kernel.
Simple string comparison works pretty well with Python versions, too. I think it makes sense to allow them.
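[Editorial sketch of the values under discussion: the platform module calls below are standard library, but the exact strings vary by OS and release, so none are asserted here.]

```python
import platform

# The values the proposed platform_release marker would expose, next to
# the existing markers, inspected directly from the stdlib:
print(platform.system())      # e.g. "Darwin" on OSX, "Linux" on Linux
print(platform.release())     # e.g. "12.3.0" (the Darwin kernel version)
print(platform.version())     # a long descriptive string, awkward as a marker
print(platform.mac_ver()[0])  # e.g. "10.8.3" (OSX marketing version; "" elsewhere)
```

Note the caveat implicit in the thread: plain string comparison of release numbers misorders multi-digit components ("10.9" > "10.10" lexically), which is part of why ordered comparisons on these fields need care.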
PEP 440 (versioning): http://www.python.org/dev/peps/pep-0440/
The versioning spec mentions that distribution tools may refuse to publish distributions that pin the versions of dependencies. I understand why this is needed, and agree in general, but have a use case that I don't know how to express without pinning.
In particular, PyObjC consists of a number of distributions (pyobjc-core, pyobjc-framework-Cocoa, ...) and an umbrella package (pyobjc) which depends on the various distributions to make it easier to install all of PyObjC. The umbrella package currently pins the versions of subpackages to ensure that "pip install pyobjc==2.5.1" installs exactly that version of the entire project. If I used the "compatible release" specifier I could no longer easily ensure that users can install an exact version of the entire project, other than by gaming the system: specifying a compatible version with an additional level that isn't used by the project (for example ~=2.5.2.0). What is the correct way to create an umbrella project without getting yelled at by distribution tools?
As noted in a previous email, I've added a specific field for this. It also highlighted some issues with the way version pinning was documented in general, so it led to some other changes (like renaming build labels and build references to source labels and source references).

The latest drafts should be up on python.org shortly (I just pushed them to both the PEPs repo and bitbucket).

-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

On Mon, May 27, 2013 at 7:36 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/ PEP 440 (versioning): http://www.python.org/dev/peps/pep-0440/
With the rationale and commentary, they're over 3000 lines between them, so I'm not attaching them here.
The rationale for many of the changes is at the end of each PEP, along with some comments on features that I have either rejected or deliberately chosen to defer to the next revision of the metadata (at the earliest).
Those with BitBucket accounts may also comment inline on the drafts here:
PEP 426: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a... PEP 440: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a...
This is looking fantastic so far--thanks to Nick, Daniel, and Donald for their continued work on this. For now I just have a handful of minor notes on the latest draft of PEP 426:

Typos:

Under "Essential dependency resolution metadata" the "may_require" and related metadata keywords are spelled with hyphens instead of underscores.
Under "Metabuild system" in the first example I think "some_test_harness.metabuild_hook" was meant to read "some_test_harness:metabuild_hook"
Under "Development, build and deployment dependencies": "allow" -> "allows"
Under "Support for metabuild hooks": "by allows projects" -> "by allowing projects"

Comment:

I'm not sure if this PEP is the best place for this, but I wonder if the description of the "Keywords" format could provide some clarification on how that field should be formatted in older metadata versions (specifically when including version 1.x metadata for backwards compatibility). In the past its format has never been specified. Some tools treat it as a space-separated field. Others have treated it as a comma-separated field. Sometimes one or the other depending on whether commas are present. It's a very annoying field.

Erik

On Tue, May 28, 2013 at 2:07 PM, Erik Bray <erik.m.bray@gmail.com> wrote:
On Mon, May 27, 2013 at 7:36 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/ PEP 440 (versioning): http://www.python.org/dev/peps/pep-0440/
With the rationale and commentary, they're over 3000 lines between them, so I'm not attaching them here.
The rationale for many of the changes is at the end of each PEP, along with some comments on features that I have either rejected or deliberately chosen to defer to the next revision of the metadata (at the earliest).
Those with BitBucket accounts may also comment inline on the drafts here:
PEP 426: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a... PEP 440: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a...
This is looking fantastic so far--thanks to Nick, Daniel, and Donald for their continued work on this. For now I just have a handful of minor notes on the latest draft of PEP 426:
Typos:
Under "Essential dependency resolution metadata" the "may_require" and related metadata keywords are spelled with hyphens instead of underscores.
Under "Metabuild system" in the first example I think "some_test_harness.metabuild_hook" was meant to read "some_test_harness:metabuild_hook"
Under "Development, build and deployment dependencies": "allow" -> "allows"
Under "Support for metabuild hooks": "by allows projects" -> "by allowing projects"
Comment:
I'm not sure if this PEP is the best place for this, but I wonder if the description of the "Keywords" format could provide some clarification on how that field should be formatted in older metadata versions (specifically when including version 1.x metadata for backwards compatibility). In the past its format has never been specified. Some tools treat it as a space-separated field. Others have treated it as a comma-separated field. Sometimes one or the other depending on whether commas are present. It's a very annoying field.
I suggest treating it as a space-separated field for converting from 2.0 to 1.0. To convert from 1.0 to 2.0 you should just split on "not a letter" or if you are feeling ambitious "not some larger set of characters, probably resembling the identifier or package name rules".
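[Editorial sketch of the conversion suggested above: the function names are mine, and the character class is one possible reading of "not some larger set of characters, probably resembling the identifier or package name rules".]

```python
import re

def keywords_from_legacy(raw):
    # 1.x "Keywords" strings may be space- or comma-separated (or both);
    # split on anything outside a conservative identifier-ish character set.
    return [k for k in re.split(r"[^A-Za-z0-9._-]+", raw) if k]

def keywords_to_legacy(keywords):
    # For 2.0 -> 1.0 conversion, space-separated output as suggested above.
    return " ".join(keywords)
```

This handles the annoying mixed case ("web, framework http") uniformly, at the cost of forbidding keywords that contain spaces or commas.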

On Mon, May 27, 2013 at 9:36 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/ PEP 440 (versioning): http://www.python.org/dev/peps/pep-0440/
Based on some offline feedback from Daniel, I'm going to change the current "type" field in the contact metadata to "role". The name of the default role will change from "individual" to "contributor", and projects will be given freedom to define their own roles beyond the predefined ones. (We're actually stealing this from the way contact metadata works in PHP's composer). Cheers, Nick.

On May 28, 2013, at 9:00 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
On Mon, May 27, 2013 at 9:36 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/ PEP 440 (versioning): http://www.python.org/dev/peps/pep-0440/
Based on some offline feedback from Daniel, I'm going to change the current "type" field in the contact metadata to "role". The name of the default role will change from "individual" to "contributor", and projects will be given freedom to define their own roles beyond the predefined ones. (We're actually stealing this from the way contact metadata works in PHP's composer).
Cheers, Nick.
Please define what the valid values for the role field are when you include it.

On Wed, May 29, 2013 at 11:04 AM, Donald Stufft <donald@stufft.io> wrote:
On May 28, 2013, at 9:00 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
On Mon, May 27, 2013 at 9:36 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/ PEP 440 (versioning): http://www.python.org/dev/peps/pep-0440/
Based on some offline feedback from Daniel, I'm going to change the current "type" field in the contact metadata to "role". The name of the default role will change from "individual" to "contributor", and projects will be given freedom to define their own roles beyond the predefined ones. (We're actually stealing this from the way contact metadata works in PHP's composer).
Hmm, I may actually drop the extensibility idea - it makes the tooling harder without providing a significant benefit. So just the name changes for the field and the default value. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
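[Editorial sketch of a contacts entry after the proposed rename; "role" and its default come from the emails above, while the other key names are assumptions based on the draft metadata format.]

```json
{
    "contacts": [
        {"name": "Nick Coghlan", "email": "ncoghlan@gmail.com", "role": "author"},
        {"name": "Some Contributor", "email": "contributor@example.com", "role": "contributor"}
    ]
}
```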

A couple of significant upcoming changes:

"build label" will be renamed to "source label" (since it refers to the common unbuilt source rather than a specific build)
"version URL" will be renamed to "source URL" (same rationale)

The current names are ambiguous as to whether they refer to the source code for the version or can be used to refer to built versions. Since they're specifically for source references (you need to add at least PEP 425 compatibility tags to construct a built reference), it makes sense to change the names.

A more minor change is that the "organization" type/role for contacts will go away. Organizations will be able to have any of the defined roles (author, maintainer, contributor), and if we later decide we need a programmatic means to distinguish abstract organisations from flesh-and-blood humans we can consider adding a new mechanism.

Cheers, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

On Tue, May 28, 2013 at 11:12 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
A couple of significant upcoming changes:
"build label" will be renamed as "source label" (since it refers to the common unbuilt source rather than a specific build) "version URL" will be renamed as "source URL" (same rationale)
The current names are ambiguous as to whether they refer to the source code for the version or can be used to refer to built versions. Since they're specifically for source references (you need to add at least PEP 425 compatibility tags to construct a built reference), it makes sense to change the names.
A more minor change is that the "organization" type/role for contacts will go away. Organizations will be able to have any of the defined roles (author, maintainer, contributor), and if we later decide we need a programmatic means to distinguish abstract organisations from flesh-and-blood humans we can consider adding a new mechanism.
Cheers, Nick.
" Request the test extra to also install test_requires test_may_require " If test requirements are not actually extras then I would prefer having no special-cased extra names at all, or a special extra-like syntax used for pip-install'ing different categories of dependencies. Is there a function to get the metadata keys from the category name? It would be something like: f('test') -> 'test_requires', 'test_may_require' def f(category): if category == 'install': category = '' conditional = [category, 'may_require'] unconditional = [category, '_requires'] return ('_'.join(unconditional).strip('_'), '_'.join(conditional).strip('_'))

On Thu, May 30, 2013 at 12:14 AM, Daniel Holth <dholth@gmail.com> wrote:
" Request the test extra to also install test_requires / test_may_require "
If test requirements are not actually extras then I would prefer having no special-cased extra names at all, or a special extra-like syntax used for pip-install'ing different categories of dependencies.
No, I still like the idea of using the existing extras syntax to request their installation when you do want them, rather than needing a special mechanism in the command line tools. The declaration just got broken out to a separate field as otherwise it wasn't clear whether test dependencies should be listed in may_require, dev_requires or dev_may_require.
Is there a function to get the metadata keys from the category name? It would be something like:
f('test') -> 'test_requires', 'test_may_require'
def f(category):
    if category == 'install':
        category = ''
    conditional = [category, 'may_require']
    unconditional = [category, 'requires']
    return ('_'.join(unconditional).strip('_'),
            '_'.join(conditional).strip('_'))
Possibly, but API details aren't a question for this PEP. If you look at the deployment scenarios in the PEP, that's not actually the way the mapping works anyway (development mode gets everything, not just the dev dependencies). Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

On Wed, May 29, 2013 at 10:25 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
On Thu, May 30, 2013 at 12:14 AM, Daniel Holth <dholth@gmail.com> wrote:
" Request the test extra to also install test_requires / test_may_require "
If test requirements are not actually extras then I would prefer having no special-cased extra names at all, or a special extra-like syntax used for pip-install'ing different categories of dependencies.
No, I still like the idea of using the existing extras syntax to request their installation when you do want them, rather than needing a special mechanism in the command line tools.
The declaration just got broken out to a separate field as otherwise it wasn't clear whether test dependencies should be listed in may_require, dev_requires or dev_may_require.
I just mean something like pip install package[:test:], using any character that is not allowed in extra names to avoid a collision.
Is there a function to get the metadata keys from the category name? It would be something like:
f('test') -> 'test_requires', 'test_may_require'
def f(category):
    if category == 'install':
        category = ''
    conditional = [category, 'may_require']
    unconditional = [category, 'requires']
    return ('_'.join(unconditional).strip('_'),
            '_'.join(conditional).strip('_'))
Possibly, but API details aren't a question for this PEP. If you look at the deployment scenarios in the PEP, that's not actually the way the mapping works anyway (development mode gets everything, not just the dev dependencies).
I think the naming of the two keys is slightly inelegant, with the 'may' in the middle and the requires / require mismatch, but there aren't any short English words that would look good simply appended to the end ('test_requires', 'test_requires_conditional').

I support using :test, or :test: or whatever to mark it as a "special" name. -- Donald Stufft donald@stufft.io

On Thu, May 30, 2013 at 12:32 AM, Daniel Holth <dholth@gmail.com> wrote:
On Wed, May 29, 2013 at 10:25 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
On Thu, May 30, 2013 at 12:14 AM, Daniel Holth <dholth@gmail.com> wrote:
" Request the test extra to also install test_requires / test_may_require "
If test requirements are not actually extras then I would prefer having no special-cased extra names at all, or a special extra-like syntax used for pip-install'ing different categories of dependencies.
No, I still like the idea of using the existing extras syntax to request their installation when you do want them, rather than needing a special mechanism in the command line tools.
The declaration just got broken out to a separate field as otherwise it wasn't clear whether test dependencies should be listed in may_require, dev_requires or dev_may_require.
I just mean something like pip install package[:test:], using any character that is not allowed in extra names to avoid a collision.
Hmm, I'll have to think about that one. It doesn't cause any data modelling issues any more (since the dedicated fields have been broken out to store the test dependencies), and you're right, lifting the "any name except test" restriction on the extras field itself would be nice.
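[Editorial sketch of the idea being discussed: since ':' is not a legal character in extra names, a tool could separate special categories from ordinary extras unambiguously. The syntax and function name are hypothetical, not anything settled by this thread.]

```python
import re

_REQ = re.compile(r"^(?P<name>[A-Za-z0-9._-]+)(?:\[(?P<extras>[^\]]*)\])?$")

def split_extras(requirement):
    # Separate ordinary extras from ':'-wrapped special categories,
    # e.g. "pyobjc[:test:]" names the test dependency category, while
    # "pyobjc[quartz]" names a real extra.
    m = _REQ.match(requirement)
    raw = m.group("extras")
    extras = [e.strip() for e in raw.split(",")] if raw else []
    special = [e.strip(":") for e in extras if e.startswith(":")]
    normal = [e for e in extras if not e.startswith(":")]
    return m.group("name"), normal, special
```

Because ':' can never collide with a user-defined extra name, the "any name except test" restriction on extras could then be lifted entirely.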
Is there a function to get the metadata keys from the category name? It would be something like:
f('test') -> 'test_requires', 'test_may_require'
def f(category):
    if category == 'install':
        category = ''
    conditional = [category, 'may_require']
    unconditional = [category, 'requires']
    return ('_'.join(unconditional).strip('_'),
            '_'.join(conditional).strip('_'))
Possibly, but API details aren't a question for this PEP. If you look at the deployment scenarios in the PEP, that's not actually the way the mapping works anyway (development mode gets everything, not just the dev dependencies).
I think the naming of the two keys is slightly inelegant, with the 'may' in the middle and the requires / require mismatch, but there aren't any short English words that would look good simply appended to the end ('test_requires', 'test_requires_conditional').
Yeah, I picked the syntax I did for readability. I've seen similar APIs along the lines of "requires" and "requires_maybe" and they're ugly.

In terms of how you deal with it programmatically, you would likely want to have your mandatory and conditional handling separated out anyway, so iterators are a natural fit:

def iter_dependencies(metadata, field, prefixes):
    for prefix in prefixes:
        if prefix:
            field_name = prefix + "_" + field
        else:
            field_name = field
        data = metadata.get(field_name)
        if data is None:
            continue
        yield from data

def iter_requires(metadata, prefixes):
    yield from iter_dependencies(metadata, "requires", prefixes)

def iter_may_require(metadata, prefixes):
    yield from iter_dependencies(metadata, "may_require", prefixes)

Cheers, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia

I feel like we are still conflating some names and use cases here with source_label, source_url, source reference, and the is operator.

source_label // source_url - References to the source that is associated with this release. Must not point to anything besides source (no whls, no other built or platform-specific artifact).

is // source reference - This is where my real issue comes into play.

I almost think that it doesn't make any sense to allow an is comparison to a source label. In Python `is` is an identity function, and a source label is not a directly installable thing. It must be built first (even if building involves simply copying .py files into a whl).

I don't like using the word "source" here because when pointing at a URL it could be a whl file, it could be an egg, it could be a source archive.

I'm not sure what should be done about the non-URL form of ``is``. The URL form makes sense to me because a URL is (theoretically) an identity pointer to an exact installable. However a source_label is not; it could have any number of installable files associated with it, and the one you install may depend on which OS you're using or which Python version you're using.

-- Donald Stufft donald@stufft.io

In 440, < 2.0 means "< 2.0, != 2.0.*" to avoid dev releases. Would it be equivalent for < 2.0 to expand to "less than the smallest possible release in the 2.0 series", i.e. < 2.0.dev0?

On 30 May 2013 01:47, "Daniel Holth" <dholth@gmail.com> wrote:
In 440, < 2.0 means "< 2.0, != 2.0.*" to avoid dev releases. Would it be equivalent for < 2.0 to expand to "less than the smallest possible release in the 2.0 series", i.e. < 2.0.dev0?
Yes, but that approach doesn't work to exclude post and maintenance releases in the > 2.0 case. The PEP uses the expansion it does so that the greater-than and less-than operations use similar definitions. Cheers, Nick.

On 30 May 2013 01:23, "Donald Stufft" <donald@stufft.io> wrote:
I feel like we are still conflating some names and use cases here with source_label, source_url, source reference, and the is operator.
source_label // source_url - References to the source that is associated with this release. Must not point to anything besides source (no whls, no other built or platform specific artifact).
is // source reference - This is where my real issue comes into play.
I almost think that it doesn't make any sense to allow an is comparison to a source label. In Python `is` is an identity comparison, and a source label is not a directly installable thing. It must be built first (even if building involves simply copying .py files into a whl).
I don't like using the word "source" here because when pointing at a URL it could be a whl file, it could be an egg, it could be a source archive.
I'm not sure what should be done about the non-URL form of ``is``. The URL form makes sense to me because a URL is (theoretically) an identity pointer to an exact installable. However, a source_label is not: it could have any number of installable files associated with it, and the one you install may depend on what OS or Python version you're using.
The whole of the metadata operates at the source identification level - it's up to the installation tools to work out how best to satisfy that given the installation context. I only allowed build URLs at all because it seemed silly to force installation tools to come up with a distinct format for them. However, if allowing both appears to be too confusing, that's the half that will go away, leaving it up to the integration tools to say how to nominate particular binaries. Source references themselves are in danger of going away entirely as being not worth the hassle. Cheers, Nick.
-- Donald Stufft donald@stufft.io

On Wed, May 29, 2013, at 11:48 AM, Nick Coghlan wrote: I only allowed build URLs at all because it seemed silly to force installation tools to come up with a distinct format for them. However, if allowing both appears to be too confusing, that's the half that will go away, leaving it up to the integration tools to say how to nominate particular binaries. Source references themselves are in danger of going away entirely as being not worth the hassle.

I understand it's supposed to operate at the release level (vs the file level), but I think it needs to operate at both. The common case is going to be "I'm referencing the thing I want to install by name and version". However, I think it should be a supported operation to say "I want to install exactly this file. Nothing else will do". ``foo (is https://example.com/foo-1.0.tar.gz)`` solves that. However, that doesn't jibe with ``foo (is <source_label>)``. But if files had a unique per-project build label, then ``foo (is <build_label>)`` would provide that.

I don't see much value in being able to install by referencing the source_label (to me it's more of an identifier to allow tools like RPM to trace a PyPI release back to its origin). I do, however, see immense value in being able to single out a particular file you want to install, either via URL or a unique per-project build tag/label.
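To make the two direct-reference forms under discussion concrete, here is a hedged sketch of how a tool might distinguish the URL form of ``is`` from a label form. The ``name (is <ref>)`` syntax is taken from the draft discussion above and was never finalized; the regex and the returned dict shape are illustrative assumptions:

```python
import re

# "name (is <url-or-label>)" per the draft discussion; not a settled spec.
DIRECT_REF = re.compile(r"^(?P<name>[\w.-]+)\s*\(is\s+(?P<ref>[^)]+)\)$")

def parse_direct_ref(req):
    """Return a dict describing a direct reference, or None if req
    is not in the hypothetical "is" form at all."""
    m = DIRECT_REF.match(req)
    if not m:
        return None
    ref = m.group("ref").strip()
    # A URL identifies one exact installable; anything else is treated as
    # the kind of per-project build label Donald proposes.
    kind = "url" if "://" in ref else "label"
    return {"name": m.group("name"), "kind": kind, "ref": ref}
```

For example, `parse_direct_ref("foo (is https://example.com/foo-1.0.tar.gz)")` yields a `url` reference, while `parse_direct_ref("foo (is build-1234)")` yields a `label` one, and an ordinary version constraint like `"foo (>= 1.0)"` is rejected.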

Hi Nick, On Mon, May 27, 2013 at 21:36 +1000, Nick Coghlan wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
After a first quick read i am wondering if i missed something with respect to test tools. There are some fields which specify dependencies required for running tests, but there is nothing that would tell which test runner to use, which test command to invoke, or am i missing something? Basically i am wondering how PEP426 could be useful/used by tox. best, holger

On Wed, May 29, 2013 at 2:36 PM, holger krekel <holger@merlinux.eu> wrote:
Hi Nick,
On Mon, May 27, 2013 at 21:36 +1000, Nick Coghlan wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
After a first quick read i am wondering if i missed something with respect to test tools. There are some fields which specify dependencies required for running tests, but there is nothing that would tell which test runner to use, which test command to invoke, or am i missing something?
Basically i am wondering how PEP426 could be useful/used by tox. best,
The first thing we might do is to have setuptools expose its test_suite argument as "extensions" : { "setuptools": { "test_suite": "callable.name" } }. I think we need the next version of the metadata or sdist 2.0 to really do a better job than just running "setup.py test"; right now the tests usually only make sense in the context of an unpacked sdist. Input appreciated.
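Daniel's `extensions` suggestion can be sketched as follows. The metadata fragment below is an assumption modelled on the draft PEP 426 shape and his proposed `setuptools`/`test_suite` keys; `example-dist` and `example.tests.suite` are made-up names:

```python
import json

# Hypothetical PEP 426 metadata carrying the setuptools test_suite value
# as an extension, per Daniel's suggestion. Layout is illustrative only.
metadata = json.loads("""
{
  "metadata_version": "2.0",
  "name": "example-dist",
  "version": "1.0",
  "extensions": {
    "setuptools": {"test_suite": "example.tests.suite"}
  }
}
""")

# A test tool would look its extension up defensively, since extensions
# are optional and tool-specific.
test_suite = metadata.get("extensions", {}).get("setuptools", {}).get("test_suite")
```

A consumer that finds `test_suite` set could then import and run that suite with the stdlib unittest machinery; one that finds nothing would fall back to whatever its own configuration says.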

On Wed, May 29, 2013 at 15:15 -0400, Daniel Holth wrote:
On Wed, May 29, 2013 at 2:36 PM, holger krekel <holger@merlinux.eu> wrote:
Hi Nick,
On Mon, May 27, 2013 at 21:36 +1000, Nick Coghlan wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
After a first quick read i am wondering if i missed something with respect to test tools. There are some fields which specify dependencies required for running tests, but there is nothing that would tell which test runner to use, which test command to invoke, or am i missing something?
Basically i am wondering how PEP426 could be useful/used by tox. best,
The first thing we might do is to have setuptools expose its test_suite argument as "extensions" : { "setuptools": { "test_suite": "callable.name" } }.
The way tox specifies testing is to allow arbitrary test commands not just unittest test suites. People use py.test, make and whatnot to run tests.
I think we need the next version of the metadata or sdist 2.0 to really do a better job than just running "setup.py test"; right now the tests usually only make sense in the context of an unpacked sdist. Input appreciated.
We certainly don't want to advocate using "setup.py" for running tests as we want to get away from the necessity for this file to exist (correct me if i am wrong with this presumption). The metadata should be rich enough to support tools like tox to perform the testing, much like pip for installations. holger

On Wed, May 29, 2013 at 3:33 PM, holger krekel <holger@merlinux.eu> wrote:
On Wed, May 29, 2013 at 15:15 -0400, Daniel Holth wrote:
On Wed, May 29, 2013 at 2:36 PM, holger krekel <holger@merlinux.eu> wrote:
Hi Nick,
On Mon, May 27, 2013 at 21:36 +1000, Nick Coghlan wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
After a first quick read i am wondering if i missed something with respect to test tools. There are some fields which specify dependencies required for running tests, but there is nothing that would tell which test runner to use, which test command to invoke, or am i missing something?
Basically i am wondering how PEP426 could be useful/used by tox. best,
The first thing we might do is to have setuptools expose its test_suite argument as "extensions" : { "setuptools": { "test_suite": "callable.name" } }.
The way tox specifies testing is to allow arbitrary test commands not just unittest test suites. People use py.test, make and whatnot to run tests.
tox could shove a JSON version of its config into the metadata as an extension. If "something wrapped in a unittest suite" isn't the interface for running tests (in the most boring way possible, with better runners for the developers not the users of the package), then what should it be?
I think we need the next version of the metadata or sdist 2.0 to really do a better job than just running "setup.py test"; right now the tests usually only make sense in the context of an unpacked sdist. Input appreciated.
We certainly don't want to advocate using "setup.py" for running tests as we want to get away from the necessity for this file to exist (correct me if i am wrong with this presumption). The metadata should be rich enough to support tools like tox to perform the testing, much like pip for installations.
The first priority is to get rid of "setup.py install" since that is where most of the trouble comes from. It's not that big of a task to relegate setup.py to its proper place as a build script. At the same time it will eventually become possible to use non-distutils-derived build tools without having to emulate a distutils setup.py. Wouldn't we continue to type tox, or nose, or py.test, with the standard hook being mainly for automated tools that just want to verify that something still works before building / installing?
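Daniel's earlier suggestion that "tox could shove a JSON version of its config into the metadata as an extension" might look something like the following. This is purely hypothetical: neither the `tox` extension name nor these keys exist in any spec; they simply mirror familiar tox.ini sections:

```python
import json

# Hypothetical "tox" extension block inside PEP 426 metadata, mirroring
# the envlist/testenv structure of a tox.ini. Illustrative only.
metadata_fragment = {
    "extensions": {
        "tox": {
            "envlist": ["py26", "py27", "py33"],
            "testenv": {
                "deps": ["pytest"],
                "commands": ["py.test tests/"],
            },
        }
    }
}

# Since PEP 426 metadata is JSON, the fragment round-trips losslessly.
serialized = json.dumps(metadata_fragment, sort_keys=True)
```

The attraction of this approach is that tox itself stays the authority on what the keys mean, while the metadata merely transports them; the open question in the thread is whether a generic "run the tests" hook should exist alongside such tool-specific blocks.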

On Wed, May 29, 2013 at 16:26 -0400, Daniel Holth wrote:
On Wed, May 29, 2013 at 3:33 PM, holger krekel <holger@merlinux.eu> wrote:
On Wed, May 29, 2013 at 15:15 -0400, Daniel Holth wrote:
On Wed, May 29, 2013 at 2:36 PM, holger krekel <holger@merlinux.eu> wrote:
Hi Nick,
On Mon, May 27, 2013 at 21:36 +1000, Nick Coghlan wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
After a first quick read i am wondering if i missed something with respect to test tools. There are some fields which specify dependencies required for running tests, but there is nothing that would tell which test runner to use, which test command to invoke, or am i missing something?
Basically i am wondering how PEP426 could be useful/used by tox. best,
The first thing we might do is to have setuptools expose its test_suite argument as "extensions" : { "setuptools": { "test_suite": "callable.name" } }.
The way tox specifies testing is to allow arbitrary test commands not just unittest test suites. People use py.test, make and whatnot to run tests.
tox could shove a JSON version of its config into the metadata as an extension.
If "something wrapped in a unittest suite" isn't the interface for running tests (in the most boring way possible, with better runners for the developers not the users of the package), then what should it be?
An entry point or a way to execute a command line script, either coming from a test_requires dependency or a script contained in the package. Note that py.test can run unittest test suites, but pytest test suites are not modeled as unittest test suites. The same would be true if you use a test script contained in a package, or "make" for that matter. IOW, i think there should be a field denoting which command line tool or script to run for the tests. If it's not defined and a test suite is defined, it's of course fine to invoke unittest's machinery on it.
I think we need the next version of the metadata or sdist 2.0 to really do a better job than just running "setup.py test"; right now the tests usually only make sense in the context of an unpacked sdist. Input appreciated.
We certainly don't want to advocate using "setup.py" for running tests as we want to get way from the neccessity for this file to exist (correct me if i am wrong with this presumption). The metadata should be rich enough to support tools like tox to perform the testing, much like pip for installations.
The first priority is to get rid of "setup.py install" since that is where most of the trouble comes from. It's not that big of a task to relegate setup.py to its proper place as a build script. At the same time it will eventually become possible to use non-distutils-derived build tools without having to emulate a distutils setup.py.
Wouldn't we continue to type tox, or nose, or py.test, with the standard hook being mainly for automated tools that just want to verify that something still works before building / installing?
Typically projects depend on the test runner/framework for executing their tests, be it at development or pre-installation time. holger
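The field holger is asking for, a command line denoting how to run the tests, might be consumed like this. Both the `test` extension name and the `command` key are invented for illustration; nothing like them exists in the draft PEP:

```python
import shlex

# Hypothetical extension naming the test command, as holger proposes.
# Neither the "test" extension nor the "command" key is in the draft spec.
extension = {"test": {"command": "py.test --maxfail=2 tests/"}}

# A test tool would split the command shell-style and run it inside the
# unpacked sdist, e.g. subprocess.run(argv, cwd=sdist_dir).
argv = shlex.split(extension["test"]["command"])
```

Using `shlex.split` rather than `str.split` keeps quoted arguments intact, which matters once test commands carry paths or options with spaces.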

On 29 May, 2013, at 20:36, holger krekel <holger@merlinux.eu> wrote:
Hi Nick,
On Mon, May 27, 2013 at 21:36 +1000, Nick Coghlan wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
After a first quick read i am wondering if i missed something with respect to test tools. There are some fields which specify dependencies required for running tests, but there is nothing that would tell which test runner to use, which test command to invoke, or am i missing something?
Basically i am wondering how PEP426 could be useful/used by tox. best,
There is something about testing in the metabuild hooks section, although that appears to be targeted towards running tests for an installed distribution. I'm not sure yet how useful that hook would be for my packages, as the test suite isn't actually installed (it is in the sdist, but not in the output of bdist_*). Ronald

On Thu, May 30, 2013 at 10:41 +0200, Ronald Oussoren wrote:
On 29 May, 2013, at 20:36, holger krekel <holger@merlinux.eu> wrote:
Hi Nick,
On Mon, May 27, 2013 at 21:36 +1000, Nick Coghlan wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/
After a first quick read i am wondering if i missed something with respect to test tools. There are some fields which specify dependencies required for running tests, but there is nothing that would tell which test runner to use, which test command to invoke, or am i missing something?
Basically i am wondering how PEP426 could be useful/used by tox. best,
There is something about testing in the metabuild hooks section, although that appears to be targeted towards running tests for an installed distribution.
I'm not sure yet how useful that hook would be for my packages, as the test suite isn't actually installed (it is in the sdist, but not in the output of bdist_*).
That's indeed another issue. In order to run tests you sometimes need to install a test package for it. I'd like to add this to tox but have refrained so far because of the involved packaging activities. Ideally, build+package+upload should be able to create both the actual package and the test package in case someone wants to run the tests. That wouldn't easily work for layouts where tests are put into subdirectories, unless maybe the new python3 namespace packages are used. best, holger
Ronald

Hi Nick, I am actually missing a "goals" section in this PEP. Who can/should use the PEP's new formats and definitions? Is it meant to fit well for python2.6, PyPy, Jython, etc.? Also, what is the intention/roadmap for adoption, and how can existing and new tools deal with or distinguish old-style and PEP 426 packaging semantics? I think such "process" aspects would provide very valuable context for the many players in the field. best, holger On Mon, May 27, 2013 at 21:36 +1000, Nick Coghlan wrote:
After preliminary reviews by Donald and Daniel, I have now pushed the first complete draft of the JSON-based metadata 2.0 proposal to python.org
PEP 426 (metadata 2.0): http://www.python.org/dev/peps/pep-0426/ PEP 440 (versioning): http://www.python.org/dev/peps/pep-0440/
With the rationale and commentary, they're over 3000 lines between them, so I'm not attaching them here.
The rationale for many of the changes is at the end of each PEP, along with some comments on features that I have either rejected or deliberately chosen to defer to the next revision of the metadata (at the earliest).
Those with BitBucket accounts may also comment inline on the drafts here:
PEP 426: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a... PEP 440: https://bitbucket.org/ncoghlan/misc/src/05d3586464b10d6a04a35409468269d7c89a...
Cheers, Nick.
-- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
_______________________________________________
Distutils-SIG maillist - Distutils-SIG@python.org
http://mail.python.org/mailman/listinfo/distutils-sig

On Thu, May 30, 2013 at 2:28 PM, holger krekel <holger@merlinux.eu> wrote:
Hi Nick,
I am actually missing a "goals" section in this PEP. Who can/should use the PEP's new formats and definitions? Is it meant to fit well for python2.6, PyPy, Jython, etc.? Also, what is the intention/roadmap for adoption, and how can existing and new tools deal with or distinguish old-style and PEP 426 packaging semantics? I think such "process" aspects would provide very valuable context for the many players in the field.
best,
holger
I agree that this PEP says a lot about how it differs from previous metadata but doesn't seem to have a clear single purpose. We hope to accomplish a lot of things, but this PEP is just concerned with defining an interchange format for the data we already have in [setuptools-based] setup.py, and some additional things we think we will need to get where we want to go. In particular, this PEP does not say anything about entirely replacing setup.py.

What Metadata 2.0 does help to do is to represent a lot more of the information that's locked up in setup.py (or obfuscated by coming out different ways depending on the system) so that other tools can deal with it instead of necessarily having setup.py take charge of everything from development to installation. Along with wheel and the filesystem layout (.dist-info directories), we aim to have a clean, well-documented, static interface between development/build and install, making it much easier to work on one half of the problem or the other.

A future meta-build system will define an API or interface for the things that setup.py does. Existing setup.py scripts will be only the first available implementation. The format is designed to work with all current versions and implementations of Python. The end user shouldn't really notice anything different until some number of packages stop using setup.py. The key insight this time is that "setup.py install" is more harmful than other uses of setup.py and is the first thing we should get rid of.

No one actually parses PKG-INFO, so I expect the Metadata 2.0 transition to be relatively painless. The semantics are intended to be as backwards compatible as possible. A future version of setuptools will be able to produce and consume the new and old formats, and distlib provides an alternate consumer.

Daniel Holth

On Thu, May 30, 2013 at 22:26 -0400, Daniel Holth wrote:
On Thu, May 30, 2013 at 2:28 PM, holger krekel <holger@merlinux.eu> wrote: No one actually parses PKG-INFO so I expect the Metadata 2.0 transition to be relatively painless.
I am actually intending to do exactly that in devpi-server. It's the only method to obtain metadata from a tar file without running setup.py, AFAIK. That being said, when a new format and file comes around i am happy to detect that instead, i guess. holger
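For context on what "parsing PKG-INFO" involves: the legacy metadata format is a block of RFC 822-style headers, so the stdlib email parser handles it. The fields below are a minimal made-up example, not taken from any real sdist:

```python
from email.parser import Parser

# A minimal, invented PKG-INFO body in the legacy key: value format.
PKG_INFO = """\
Metadata-Version: 1.1
Name: example-dist
Version: 1.0
Summary: An example distribution
"""

# The legacy metadata is RFC 822-style headers, so email.parser reads it
# directly; a server like devpi would extract this member from the tarball
# first, without ever executing setup.py.
msg = Parser().parsestr(PKG_INFO)
```

Field access is then just mapping lookup, e.g. `msg["Name"]` and `msg["Version"]`, which is exactly why no setup.py execution is needed.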

On Thu, Jun 6, 2013 at 3:31 PM, holger krekel <holger@merlinux.eu> wrote:
On Thu, May 30, 2013 at 22:26 -0400, Daniel Holth wrote:
On Thu, May 30, 2013 at 2:28 PM, holger krekel <holger@merlinux.eu> wrote: No one actually parses PKG-INFO so I expect the Metadata 2.0 transition to be relatively painless.
I am actually intending to do exactly that in devpi-server. It's the only method to obtain metadata from a tar file without running setup.py, AFAIK.
That being said, when a new format and file comes around i am happy to detect that instead, i guess.
Unfortunately if you author a pypi server I count you as "no one" for this exercise. :-P And the most useful requirements metadata is in a setuptools .txt file anyway. Once you start seeing sdists with a .dist-info directory instead of an .egg-info or nothing you'll just look for pymeta.json. Wheel has code to convert from .egg-info to the newer formats.
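Daniel's advice, look for pymeta.json in a .dist-info directory and fall back to the legacy files, might be sketched like this. Note that "pymeta.json" is the draft-era filename mentioned above and may well change; the function works on a plain list of archive member names so it stays independent of the tar/zip details:

```python
# Hedged sketch of sdist metadata detection per Daniel's suggestion.
# "pymeta.json" is the draft PEP 426 filename from this thread, not final.
def find_metadata_member(names):
    """Given archive member names, return (member, format) preferring
    new-style metadata, or (None, None) if nothing recognizable exists."""
    # Prefer the new format even if a legacy PKG-INFO is also present.
    for name in names:
        if name.endswith(".dist-info/pymeta.json"):
            return name, "pep426"
    for name in names:
        if name.endswith("/PKG-INFO"):
            return name, "legacy"
    return None, None
```

The two-pass structure matters: an sdist in transition may ship both files, and a consumer should treat the new metadata as authoritative rather than whichever member happens to come first in the archive.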
participants (6)
- Daniel Holth
- Donald Stufft
- Erik Bray
- holger krekel
- Nick Coghlan
- Ronald Oussoren