Automating the maintenance release pipeline (was Re: Request for CPython 3.5.3 release)
On 4 July 2016 at 00:39, Guido van Rossum <guido@python.org> wrote:
Another thought recently occurred to me. Do releases really have to be such big productions? A recent ACM article by Tom Limoncelli[1] reminded me that we're doing releases the old-fashioned way -- infrequently, and with lots of manual labor. Maybe we could (eventually) try to strive for a lighter-weight, more automated release process? It would be less work, and it would reduce stress for authors of stdlib modules and packages -- there's always the next release. I would think this wouldn't obviate the need for carefully planned and timed "big deal" feature releases, but it could make the bug fix releases *less* of a deal, for everyone.
Yes, getting the maintenance releases to the point of being largely automated would be beneficial. However, I don't think the problem is a lack of desire for that outcome; it's that maintaining the release toolchain pretty much becomes a job at that point, as you really want to be producing nightly builds (since the creation of those nightlies in effect becomes the regression test suite for the release toolchain), and you also need to guard more strictly against even temporary regressions in the maintenance branches.

There are some variants we could pursue around that model (e.g. automating Python-only updates without automating updates that require rebuilding the core interpreter binaries for Windows and Mac OS X), but none of it is the kind of thing likely to make anyone say "I want to work on improving this in my free time". Even for commercial redistributors, it isn't easy to make the business case for assigning someone to work on it, since we're generally working from the source trees rather than the upstream binary releases.

I do think it's worth putting this into our bucket of "ongoing activities we could potentially propose to the PSF for funding", though. I know Ewa (Jodlowska, the PSF's Director of Operations) is interested in better supporting the Python development community directly (hence https://donate.pypi.io/ ), in addition to the more indirect community building efforts like PyCon US and the grants program, so I've been trying to build up a mental list of CPython development pain points where funded activities could potentially improve the contributor experience for volunteers. So far I have:

- issue triage (including better acknowledging folks who help out with triage efforts)
- patch review (currently "wait and see", pending the impact of the GitHub migration)
- nightly pre-release builds (for ease of contribution without first becoming a de facto C developer, and to help make life easier for release managers)

That last one is a new addition to my list based on this thread, and I think it's particularly interesting in that it would involve a much smaller set of target users than the first two (with the primary stakeholders being the release managers and the folks preparing the binary installers), but also a far more concrete set of deliverables (i.e. nightly binary builds being available for active development and maintenance branches for at least Windows and Mac OS X, and potentially for the manylinux1 baseline API defined in PEP 513).

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
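To make the nightly-builds idea above a little more concrete, here is a minimal sketch of the kind of driver such automation implies: loop over the active branches, refresh a checkout, and invoke a per-platform installer build. The branch list, build commands, repository URL and paths are placeholders for illustration, not the actual python.org release tooling.

"""Minimal sketch of a nightly installer build driver.

The branch names, per-platform build commands and paths below are
hypothetical placeholders, not the real release process.
"""
import datetime
import subprocess
from pathlib import Path

# Hypothetical: active development and maintenance branches to build nightly.
BRANCHES = ["master", "3.5", "2.7"]

# Hypothetical per-platform build commands; the real installer builds are
# driven by scripts in the CPython tree, but these invocations are placeholders.
PLATFORM_BUILDS = {
    "windows": ["./build-windows-installer"],
    "macosx": ["./build-mac-installer"],
    "manylinux1": ["./build-manylinux1-archive"],
}


def build_nightlies(repo_url, work_dir):
    stamp = datetime.date.today().isoformat()
    work_dir = Path(work_dir)
    for branch in BRANCHES:
        checkout = work_dir / ("cpython-" + branch)
        if checkout.exists():
            # Refresh an existing checkout of this branch.
            subprocess.run(["git", "-C", str(checkout), "pull", "--ff-only"],
                           check=True)
        else:
            subprocess.run(["git", "clone", "--branch", branch, repo_url,
                            str(checkout)], check=True)
        for platform, command in PLATFORM_BUILDS.items():
            # Each placeholder build command is assumed to leave its
            # artifacts somewhere a later upload step can find them.
            subprocess.run(command, cwd=str(checkout), check=True)
            print("built {} nightly for {} ({})".format(platform, branch, stamp))


if __name__ == "__main__":
    build_nightlies("https://github.com/python/cpython", "/srv/nightlies")

The sketch deliberately omits installer signing and the uploads to python.org, which are the genuinely hard parts discussed later in this thread.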
The bots Mozilla runs around both Rust and Servo should make a lot of this much lower overhead if they can be repurposed (as I believe several other communities have already done).

Homu, the build manager tool, runs CI (including buildbots, Travis, etc.), is integrated with GitHub PRs so maintainers can trigger it with a comment there, and can also roll up a bunch of changes into one (handy to pull together e.g. a bunch of small documentation changes like typo fixes): https://github.com/barosl/homu

That seems to keep the pain level of having an always-building-and-passing-tests nightly version much lower.

Aside: I don't want to flood these discussions with "yay Rust!" stuff, so this will probably be my last such response unless something else really jumps out. ;-) Thanks for the work you're all doing here.

Regards,
Chris Krycho
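For readers unfamiliar with the model Homu (like Rust's bors) enforces, the core idea is the "not rocket science" rule: an approved change is first merged into a staging branch, the full test suite runs against that merged state, and the main branch only advances to commits that have already passed. A minimal sketch of that model in Python, assuming plain git and a placeholder test command (an illustration of the idea, not Homu's actual implementation):

import subprocess


def git(*args):
    """Run a git command, raising if it fails (e.g. on merge conflicts)."""
    subprocess.run(("git",) + args, check=True)


def land(pr_branch, main_branch="master", staging_branch="staging"):
    """Advance main_branch to a merge of pr_branch only if tests pass."""
    # Recreate the staging branch from the current tip of main.
    git("checkout", "-B", staging_branch, main_branch)

    # Merge the approved change into staging; conflicts abort the attempt.
    git("merge", "--no-ff", pr_branch)

    # Run the test suite against the merge result (placeholder command).
    tests = subprocess.run(["python", "-m", "test"], check=False)
    if tests.returncode != 0:
        print("tests failed; not landing {}".format(pr_branch))
        return False

    # Tests passed: fast-forward main to the already-tested merge commit.
    git("checkout", main_branch)
    git("merge", "--ff-only", staging_branch)
    return True

Homu layers the GitHub integration on top of this (approvals and retries via PR comments) and can batch several approved changes into a single staging merge, which is the "roll up" behaviour mentioned above.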
Once the GH migration occurs I think we will take a look at Homu (it's been brought up previously).
On 4 July 2016 at 10:34, Chris Krycho <chris@chriskrycho.com> wrote:
The bots Mozilla runs around both Rust and Servo should make a lot of this much lower overhead if they can be repurposed (as I believe several other communities have already done).
Homu, the build manager tool, runs CI (including buildbots, Travis, etc.), is integrated with GitHub PRs so maintainers can trigger it with a comment there, and can also roll up a bunch of changes into one (handy to pull together e.g. a bunch of small documentation changes like typo fixes): https://github.com/barosl/homu That seems to keep the pain level of having an always-building-and-passing-tests nightly version much lower.
Aye, as Brett mentioned, we're definitely interested in the work Rust/Mozilla have been doing, and it's come up in previous discussions on the core-workflow list like https://mail.python.org/pipermail/core-workflow/2016-February/000480.html

However, automating the Mac OS X and Windows installer builds and the subsequent uploads to python.org gets more challenging, as at that point you're looking at either producing unsigned binaries, or else automating the creation of signed binaries, and the latter means you start running into secrets management problems that don't exist for plain CI builds.

Cheers,
Nick.

--
Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
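As one illustration of the trade-off described above, an automated pipeline can gate the signing step on whether credentials are actually available to the build environment, and otherwise publish clearly labelled unsigned nightlies. A minimal sketch, with the environment variable and signing command as assumptions rather than the real release process (a real pipeline would use the platform's own tools, e.g. Authenticode signing on Windows or codesign on Mac OS X, with the key material held in a dedicated secrets store rather than in CI configuration):

import os
import subprocess
from pathlib import Path


def sign_or_label(installer_path):
    """Sign an installer if credentials are present, else mark it unsigned."""
    installer = Path(installer_path)
    cert = os.environ.get("NIGHTLY_SIGNING_CERT")  # hypothetical secret
    if cert is None:
        # No credentials in this environment: ship an explicitly unsigned build.
        unsigned = installer.with_name("unsigned-" + installer.name)
        installer.rename(unsigned)
        return unsigned

    # Placeholder invocation; the real signing command is platform-specific.
    subprocess.run(["sign-tool", "--cert", cert, str(installer)], check=True)
    return installer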
participants (3):
- Brett Cannon
- Chris Krycho
- Nick Coghlan