[Numpy-discussion] Numpy 1.11.0b2 released

Ralf Gommers ralf.gommers at gmail.com
Fri Jan 29 02:49:32 EST 2016


On Fri, Jan 29, 2016 at 4:21 AM, Nathaniel Smith <njs at pobox.com> wrote:

> On Jan 28, 2016 3:25 PM, "Ralf Gommers" <ralf.gommers at gmail.com> wrote:
> >
> >
> >
> > On Thu, Jan 28, 2016 at 11:57 PM, Nathaniel Smith <njs at pobox.com> wrote:
> >>
> >> On Thu, Jan 28, 2016 at 2:23 PM, Ralf Gommers <ralf.gommers at gmail.com>
> wrote:
> >> >
> >> >
> >> > On Thu, Jan 28, 2016 at 11:03 PM, Charles R Harris
> >> > <charlesr.harris at gmail.com> wrote:
> >> >>
> >> >>
> >> >>
> >> >> On Thu, Jan 28, 2016 at 2:36 PM, Nathaniel Smith <njs at pobox.com>
> wrote:
> >> >>>
> >> >>> Maybe we should upload to pypi? This allows us to upload binaries
> for osx
> >> >>> at least, and in general will make the beta available to anyone who
> does
> >> >>> 'pip install --pre numpy'. (But not regular 'pip install numpy',
> because pip
> >> >>> is clever enough to recognize that this is a prerelease and should
> not be
> >> >>> used by default.)
> >> >>>
> >> >>> (For bonus points, start a campaign to convince everyone to add
> --pre to
> >> >>> their ci setups, so that merely uploading a prerelease will ensure
> that it
> >> >>> starts getting tested automatically.)
> >> >>>
> >> >>> On Jan 28, 2016 12:51 PM, "Charles R Harris" <
> charlesr.harris at gmail.com>
> >> >>> wrote:
> >> >>>>
> >> >>>> Hi All,
> >> >>>>
> >> >>>> I am pleased to announce the Numpy 1.11.0b2 release. The first
> >> >>>> beta was a damp squib due to missing files in the released source
> >> >>>> files; this release fixes that. The new source files may be
> >> >>>> downloaded from SourceForge; no binaries will be released until
> >> >>>> the mingw toolchain problems are sorted out.
> >> >>>>
> >> >>>> Please test and report any problems.
> >> >>
> >> >>
> >> >> So what happens if I use twine to upload a beta? Mind, I'd give it a
> try
> >> >> if pypi weren't an irreversible machine of doom.
> >> >
> >> >
> >> > One of the things that will probably happen but needs to be avoided
> is that
> >> > 1.11b2 becomes the visible release at
> https://pypi.python.org/pypi/numpy. By
> >> > default I think the status of all releases but the last uploaded one
> (or
> >> > highest version number?) is set to hidden.
> >>
> >> Huh, I had the impression that if it was ambiguous whether the "latest
> >> version" was a pre-release or not, then pypi would list all of them on
> >> that page -- at least I know I've seen projects where going to the
> >> main pypi URL gives a list of several versions like that. Or maybe the
> >> next-to-latest one gets hidden by default and you're supposed to go
> >> back and "un-hide" the last release manually.
> >>
> >> Could try uploading to
> >>
> >>   https://testpypi.python.org/pypi
> >>
> >> and see what happens...
> >
> >
> > That's worth a try, would be good to know what the behavior is.
> >
> >>
> >>
> >> > Other ways that users can get a pre-release by accident are:
> >> > - they have pip <1.4 (released in July 2013)
> >>
> >> It looks like ~a year ago this was ~20% of users --
> >> https://caremad.io/2015/04/a-year-of-pypi-downloads/
> >> I wouldn't be surprised if it dropped quite a bit since then, but if
> >> this is something that will affect our decision then we can ping
> >> @dstufft to ask for updated numbers.
> >
> >
> > Hmm, that's more than I expected. Even if it dropped by a factor of 10
> > over the last year, that would still be a lot of failed installs for
> > the current beta1. It looks to me like this is a bad trade-off. It
> > would be much better to encourage people to test against numpy master
> > instead of a pre-release (and we were trying to do that anyway). The
> > benefit is then fairly limited, mostly saving people from typing the
> > longer install line including wheels.scipy.org when they want to test
> > a pre-release.
>
> After the disastrous lack of testing for the 1.10 prereleases, it
> might almost be a good thing if we accidentally swept up some pip 1.3
> users into doing prerelease testing... I mean, if they don't test it
> now, they'll just end up testing it later, and at least there will be
> fewer of them to start with? Plus all they have to do to opt out is to
> maintain a vaguely up-to-date environment, which is a good thing for
> the ecosystem anyway :-). It's bad for everyone if pip and PyPI are
> collaborating to provide this rather nice, standard feature for
> distributing and QAing pre-releases, but we can't actually use it
> because of people not upgrading pip...
>

That's a fair point. And given the amount of brokenness in (especially
older versions of) pip, plus how easy it is to upgrade pip, we should
probably just say that we expect a recent pip (say, the last 3 major
releases).
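
For anyone following along, the check and the fix are each a single
command (a generic sketch, not specific to any one setup):

```shell
# pip >= 1.4 (July 2013) is what makes prereleases opt-in via --pre
# rather than installed by default; check the version in use:
python -m pip --version
# and upgrading an old pip in place is one more command:
python -m pip install --upgrade pip
```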


>
> Regarding CI setups and testing against master: I think of these as
> being complementary. The fact is that master *will* sometimes just be
> broken, or contain tentative API changes that get changed before the
> release, etc. So it's really great that there are some projects who
> are willing to take on the work of testing numpy master directly as
> part of their own CI setups, but it is going to be extra work and risk
> for them, they'll probably have to switch it off sometimes and then
> turn it back on, and they really need to have decent channels of
> communication with us whenever things go wrong because sometimes the
> answer will be "doh, we didn't mean to change that, please leave your
> code alone and we'll fix it on our end". (My nightmare here is that
> downstream projects start working around bugs in master, and then we
> find ourselves having to jump through hoops to maintain backcompat
> with code that was never even released. __numpy_ufunc__ is stuck in
> this situation -- we know that the final version will have to change
> its name, because scipy has been shipping code that assumes a
> different calling convention than the final released version will
> have.)
>
> So, testing master is *great*, but also tricky and not really
> something I think we should be advocating to all 5000 downstream
> projects [1].
>
> OTOH, once a project has put up a prerelease, then *everyone* wants to
> be testing that, because if they don't then things definitely *will*
> break soon. (And this isn't specific to numpy -- this applies to
> pretty much all upstream dependencies.) So IMO we should be teaching
> everyone that their CI setups should just always use --pre when
> running pip install, and this will automatically improve QA coverage
> for the whole ecosystem.
>

OK, persuasive argument. In the past this wouldn't have worked, but our CI
setup is much better now. Until we had Appveyor testing, for example, it
was almost the rule that MSVC builds were broken in every first beta.

So, with some hesitation: let's go for it.
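
As a concrete sketch of what we'd be asking downstream projects to do
(generic, not any actual project's config), the change to a CI install
step is a single flag:

```shell
# In a CI install step, --pre lets pip consider prereleases such as
# 1.11.0b2; without it, pip keeps resolving to the latest stable release.
python -m pip install --pre numpy
python -c "import numpy; print(numpy.__version__)"
```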


> ...It does help if we run at least some minimal QA against the sdist
> before uploading it though, to avoid the 1.11.0b1 problem :-). (Though
> the new travis test for sdists should cover that.) Something else for
> the release checklist I guess...
>

There's still a large number of ways that one can install numpy that aren't
tested (see the list in https://github.com/numpy/numpy/issues/6599), but the
only one relevant for pip is when easy_install is triggered by
`setup_requires=numpy`. It's actually not too hard to add that to TravisCI
testing (just install a dummy package that uses setup_requires). I'll add
that to the todo list.
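
A minimal sketch of such a dummy package (the names and paths here are
made up for illustration):

```shell
# Lay out a throwaway package whose only purpose is to exercise the
# setup_requires=numpy code path (easy_install) during CI runs.
mkdir -p dummy_setup_requires
cat > dummy_setup_requires/setup.py <<'EOF'
from setuptools import setup

setup(
    name="dummy-setup-requires-numpy",
    version="0.0.1",
    # Resolving this at build time goes through easy_install, which is
    # the install path that is currently untested.
    setup_requires=["numpy"],
)
EOF
```

Any build command run against it in CI (e.g. `python setup.py build`)
would then fail loudly if the setup_requires path breaks.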

Ralf