At 03:16 PM 6/11/2009 +0100, Paul Moore wrote:
2009/6/11 P.J. Eby pje@telecommunity.com:
PyPI uploads aren't a suitable basis for analyzing "dev" use cases, since the whole point of having a "dev" tag is for *non-released* versions. (E.g., in-progress development via SVN.)
If it's non-released, I've yet to see a clear explanation of why the PEP is relevant. Who is going to use an API from the PEP to parse your "version number", and why?
Dev tags are so that while you're doing development, your locally-installed versions can be distinguished from one another.
Distinguished by what? What code (that you didn't write yourself, purely for internal use) needs to parse your dev tag?
Distinguished by setuptools when processing version requirements of scripts, require() statements in code, and installation requirements of newly-installed code.
For example, if I'm working on two projects that are distributed via SVN and one depends on the other, if I update one, it may require an update of the other; the failure of the .dev#### version requirement in the first one will inform me of the need to "svn up" the second project and rerun "setup.py develop" on it.
This is a routine circumstance in at least my development cycle; I would expect that it's the case in other open source development workflows as well as proprietary ones.
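As a minimal sketch of the check being described (assuming setuptools is installed; "otherproj" and the version numbers are stand-ins, and the exact spelling of the dev tag is precisely what this thread is debating):

    # Sketch only: "otherproj" and the versions are invented. A failed
    # requirement check is the cue to "svn up" the other checkout and
    # rerun "setup.py develop" there.
    import pkg_resources

    try:
        pkg_resources.require("otherproj>=1.2.3.dev456")
    except pkg_resources.VersionConflict as exc:
        print("stale checkout of otherproj:", exc)
    except pkg_resources.DistributionNotFound as exc:
        print("otherproj is not installed at all:", exc)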
On Thu, Jun 11, 2009 at 10:53 AM, P.J. Eby pje@telecommunity.com wrote:
At 03:16 PM 6/11/2009 +0100, Paul Moore wrote:
2009/6/11 P.J. Eby pje@telecommunity.com:
PyPI uploads aren't a suitable basis for analyzing "dev" use cases, since the whole point of having a "dev" tag is for *non-released* versions. (E.g., in-progress development via SVN.)
If it's non-released, I've yet to see a clear explanation of why the PEP is relevant. Who is going to use an API from the PEP to parse your "version number", and why?
Dev tags are so that while you're doing development, your locally-installed versions can be distinguished from one another.
Distinguished by what? What code (that you didn't write yourself, purely for internal use) needs to parse your dev tag?
Distinguished by setuptools when processing version requirements of scripts, require() statements in code, and installation requirements of newly-installed code.
For example, if I'm working on two projects that are distributed via SVN and one depends on the other, if I update one, it may require an update of the other; the failure of the .dev#### version requirement in the first one will inform me of the need to "svn up" the second project and rerun "setup.py develop" on it.
This is a routine circumstance in at least my development cycle; I would expect that it's the case in other open source development workflows as well as proprietary ones.
Agreed, I do this all the time. Pylons dev versions also regularly rely on other packages with a dev version, and people regularly use these non-released versions, with dependencies detected and installed via dependency_links.
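For readers unfamiliar with the pattern Ian mentions, a rough sketch of a setup.py that pulls a dev dependency via dependency_links might look like this (project names, versions, and the URL are invented for illustration):

    # Illustrative only: names, versions, and the URL are hypothetical.
    from setuptools import setup

    setup(
        name="MyApp",
        version="0.4dev-r789",                       # unreleased working copy
        install_requires=["OtherLib>=1.3dev-r456"],  # depends on another dev snapshot
        dependency_links=[
            # hypothetical page listing dev snapshots of OtherLib
            "http://example.com/snapshots/",
        ],
    )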
On Thu, Jun 11, 2009 at 9:03 AM, Ian Bicking ianb@colorstudy.com wrote:
On Thu, Jun 11, 2009 at 10:53 AM, P.J. Eby pje@telecommunity.com wrote:
For example, if I'm working on two projects that are distributed via SVN and one depends on the other, if I update one, it may require an update of the other; the failure of the .dev#### version requirement in the first one will inform me of the need to "svn up" the second project and rerun "setup.py develop" on it.
This is a routine circumstance in at least my development cycle; I would expect that it's the case in other open source development workflows as well as proprietary ones.
Agreed, I do this all the time. Pylons dev versions also regularly rely on other packages with a dev version, and people regularly use these non-released versions, with dependencies detected and installed via dependency_links.
If there were a setup.py metadata field (called "dev_revision" or "build_number" or something), separate from the "version" field, that was used to hold the value for sorting/distinguishing unreleased versions... could that work?
I.e., have the shorter "N.N.N[(a|b|c)N]" scheme for "version" be used for "released" packages, and have a separate field (or fields) for use in dependency handling of unreleased versions? Putting the two together results in package uploads to PyPI (foo-1.2.3.dev-r456.tar.gz) that I think were never intended in the design.
A weird thing about defining a sort order for:
1.2.3.dev-r450
1.2.3          # this is the released version (svn rev was r454)
1.2.3-r456
is that it is comparing apples and oranges. The "1.2.3" released version *had* a VCS revision number (or release date, or "number of patches").
Trent
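For reference, the ordering being debated looks like this under the pre-PEP pkg_resources.parse_version in setuptools at the time, which sorts a "dev" tag before the matching final release; stricter tools may reject these spellings outright:

    # Illustration of the intended sort order only; relies on the old
    # setuptools parse_version behaviour, not on anything the PEP defines.
    from pkg_resources import parse_version

    versions = ["1.2.3-r456", "1.2.3", "1.2.3.dev-r450"]
    for v in sorted(versions, key=parse_version):
        print(v)
    # expected (oldest first):
    #   1.2.3.dev-r450
    #   1.2.3
    #   1.2.3-r456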
On Thu, Jun 11, 2009 at 8:05 PM, Trent Mick trentm@gmail.com wrote:
On Thu, Jun 11, 2009 at 9:03 AM, Ian Bicking ianb@colorstudy.com wrote:
On Thu, Jun 11, 2009 at 10:53 AM, P.J. Eby pje@telecommunity.com wrote:
For example, if I'm working on two projects that are distributed via SVN and one depends on the other, if I update one, it may require an update of the other; the failure of the .dev#### version requirement in the first one will inform me of the need to "svn up" the second project and rerun "setup.py develop" on it.
This is a routine circumstance in at least my development cycle; I would expect that it's the case in other open source development workflows as well as proprietary ones.
Agreed, I do this all the time. Pylons dev versions also regularly rely on other packages with a dev version, and people regularly use these non-released versions, with dependencies detected and installed via dependency_links.
Me too; dev versions have to be included in the comparison tool.
If there were a setup.py metadata field (called "dev_revision" or "build_number" or something), separate from the "version" field, that was used to hold the value for sorting/distinguishing unreleased versions... could that work?
I.e., have the shorter "N.N.N[(a|b|c)N]" scheme for "version" be used for "released" packages, and have a separate field (or fields) for use in dependency handling of unreleased versions? Putting the two together results in package uploads to PyPI
What would be the difference, then, from the initial proposal? You would end up merging the "short" version with the dev field to be able to sort different versions of the same distribution.
If we have dev versions, we have to include them in the scheme.
Tarek
I.e., have the shorter "N.N.N[(a|b|c)N]" scheme for "version" be used for "released" packages, and have a separate field (or fields) for use in dependency handling of unreleased versions? Putting the two together results in package uploads to PyPI
What would be the difference, then, from the initial proposal? You would end up merging the "short" version with the dev field to be able to sort different versions of the same distribution.
If we have dev versions, we have to include them in the scheme.
I've been thinking from the p.o.v. of what releases get up on PyPI.... and I gather that those releases are the ones that lead to potential packaging in RPM and .deb repositories.
Say "version" and "build_number" (or whatever name for the latter) are separate fields. Only "version" is used for putting in package names (sdist, bdist_*). However the setup() fields for dependency info can specify checks against both version and build_number.
The difference with the initial proposal (if I'm not missing something) is that:
- packages looking like "foo-1.2.3.dev-r456.tar.gz" don't get uploaded to PyPI (yeah!)
- the metadata in my released version can still state what SCC revision (in the build_number field) it was built against
- when I specify a dependency against a particular build_number of a package, I don't care whether that build_number happened to be a released version or a dev version
Trent
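A rough sketch of the shape Trent is suggesting might look like the following; "build_number" is entirely hypothetical, not an existing distutils or setuptools field, and no current tool checks dependencies against it:

    # Hypothetical sketch only: "build_number" is not a real setup() keyword
    # (setuptools would just warn about an unknown option); names and numbers
    # are invented.
    from setuptools import setup

    setup(
        name="foo",
        version="1.2.3",          # released scheme: N.N.N[(a|b|c)N], used in file names
        build_number="r456",      # hypothetical: SCC revision, kept out of file names
        install_requires=["bar>=2.0"],
    )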
Trent Mick wrote:
I've been thinking from the p.o.v. of what releases get up on PyPI.... and I gather that those releases are the ones that lead to potential packaging in RPM and .deb repositories.
I think that is not necessarily true -- I seem to recall some discussion in the PyCon sessions about OS distro folks sometimes having to package dev versions ... hopefully there are some distro packagers here who can chime in on this ...
Steve
On Thu, Jun 11, 2009 at 04:51:24PM -0400, Stephen Waterbury wrote:
Trent Mick wrote:
I've been thinking from the p.o.v. of what releases get up on PyPI.... and I gather that those releases are the ones that lead to potential packaging in RPM and .deb repositories.
I think that is not necessarily true -- I seem to recall some discussion in the PyCon sessions about OS distro folks sometimes having to package dev versions ... hopefully there are some distro packagers here who can chime in on this ...
True, some packages have versions like 1.2.3+hg20090612-1, in which case there didn't even need to be a development release. Of course 1.2.3.dev456-1 would be fine too (the last "-1" is Debian's package revision; ignore that).
Regards,
Floris
Ian Bicking ianb@colorstudy.com writes:
Agreed, I do this all the time. Pylons dev versions also regularly rely on other packages with a dev version, and people regularly use these non-released versions, with dependencies detected and installed via dependency_links.
I don't see how any of this argues for special-cased tokens in a version string. Isn't this all possible anyway with a version comparison algorithm with no special-cased tokens?
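For context on what the special-casing buys: the rule at issue is that a dev tag sorts before the corresponding final release, and an algorithm that treats every dotted component uniformly will not produce that ordering. A toy illustration (the helper is hypothetical, not any library's algorithm):

    # Toy comparison with no special-cased tokens: dotted components are
    # compared left to right, numbers numerically, anything else as text.
    def naive_key(version):
        return [int(part) if part.isdigit() else part
                for part in version.split(".")]

    print(sorted(["1.2.3.dev456", "1.2.3"], key=naive_key))
    # ['1.2.3', '1.2.3.dev456'] -- the dev snapshot sorts *after* the final
    # release, the opposite of what a dev tag is meant to express.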
2009/6/11 P.J. Eby pje@telecommunity.com:
Dev tags are so that while you're doing development, your locally-installed versions can be distinguished from one another.
Distinguished by what? What code (that you didn't write yourself, purely for internal use) needs to parse your dev tag?
Distinguished by setuptools when processing version requirements of scripts, require() statements in code, and installation requirements of newly-installed code.
So will setuptools be modified to use the new code? If not, the PEP is directly competing with setuptools. And the new PEP will probably not be adopted by anyone currently using setuptools (after all, switching is all cost and no benefit).
If setuptools *will* be changing, then setuptools users have to choose whether to conform to the new version of setuptools (and hence the spec) or to remain on an old/forked version of setuptools. That's a genuine choice - although maybe not one that people will like being given.
My understanding was that setuptools was a success at least in part due to its policy of catering for as many variations on existing practice as possible, effectively refusing to enforce policy and instead adapting to existing use. If the new PEP isn't designed precisely to limit variation, and to enforce policy even if that means excluding certain current usages, then I don't see why the PEP doesn't just say "adopt the setuptools version code" [1].
So I guess I'm missing the point of the PEP, given the existence of setuptools.
For example, if I'm working on two projects that are distributed via SVN and one depends on the other, if I update one, it may require an update of the other; the failure of the .dev#### version requirement in the first one will inform me of the need to "svn up" the second project and rerun "setup.py develop" on it.
This is a routine circumstance in at least my development cycle; I would expect that it's the case in other open source development workflows as well as proprietary ones.
But setup.py develop is a setuptools extension. The PEP doesn't say anything about setuptools. Are you saying that setuptools *will* be modified to use the PEP version rules? And hence, what are you saying about the relationship between the principle I stated above (assuming I didn't misinterpret things) and the (implicit) goals of the PEP process?
Paul.
[1] Actually, if I was being cynical, I could say that the reason is to avoid alienating people who don't like / use setuptools. Either way, it may be that the reason I'm making such a fuss is that I'm (unconsciously) reacting against the same aspects of the PEP that I dislike in setuptools.