[Numpy-discussion] [SciPy-Dev] 1.8.0rc1
Christoph Gohlke
cgohlke at uci.edu
Mon Sep 30 20:13:07 EDT 2013
On 9/30/2013 4:44 PM, Julian Taylor wrote:
> On 01.10.2013 01:30, Charles R Harris wrote:
>>
>>
>>
>> On Mon, Sep 30, 2013 at 5:12 PM, Christoph Gohlke <cgohlke at uci.edu> wrote:
>>
>> On 9/30/2013 3:45 PM, Charles R Harris wrote:
>> >
>> >
>> >
>> > On Mon, Sep 30, 2013 at 3:51 PM, Christoph Gohlke <cgohlke at uci.edu> wrote:
>> >
>> >
>> >
>> > On 9/30/2013 11:02 AM, Nathaniel Smith wrote:
>> > > Everyone please do actually test this! It is really in your best
>> > > interest, and I think people don't always realize this.
>> > >
>> > > Here's how it works:
>> > > - If you test it *now*, and it breaks your code that worked with 1.7,
>> > >   and you *tell* us this now, then it's *our* problem and we hold up
>> > >   the release to fix the bug.
>> > > - If you test it *after* we release, and it breaks your code, then we
>> > >   are sad but you have to work around it (because we can't magically
>> > >   make that release not have happened, your users will be using it
>> > >   anyway), and we put it on the stack with all the other bugs. All of
>> > >   which we care about but it's a large enough stack that it's not
>> > >   going to get any special priority, because, see above about how at
>> > >   this point you'll have had to work around it anyway.
>> > >
>> > > -n
>> > >
>> > > On Mon, Sep 30, 2013 at 4:17 PM, Charles R Harris <charlesr.harris at gmail.com> wrote:
>> > >> Hi All,
>> > >>
>> > >> NumPy 1.8.0rc1 is up now on SourceForge. The binary builds are
>> > >> included except for Python 3.3 on Windows, which will arrive later.
>> > >> Many thanks to Ralf for the binaries, and to those who found and
>> > >> fixed the bugs in the last beta. Any remaining bugs are all my
>> > >> fault ;) I hope this will be the last release before final, so
>> > >> please test it thoroughly.
>> > >>
>> > >> Chuck
>> >
>> >
>> > Hello,
>> >
>> > NumPy 1.8.0rc1 looks good. All tests pass on Windows and most 3rd party
>> > packages test OK now. Thank you.
>> >
>> > A few tests still fail in the following packages when run with
>> > numpy-MKL-1.8.0rc1-win-amd64-py3.3 compared to
>> > numpy-MKL-1.7.1-win-amd64-py3.3:
>> >
>> > 1) Pandas 0.12.0
>> >
>> > ```
>> > ======================================================================
>> > FAIL: test_nansum_buglet (pandas.tests.test_series.TestNanops)
>> > ----------------------------------------------------------------------
>> > Traceback (most recent call last):
>> >   File "X:\Python33\lib\site-packages\pandas\tests\test_series.py", line 254, in test_nansum_buglet
>> >     assert_almost_equal(result, 1)
>> >   File "X:\Python33\lib\site-packages\pandas\util\testing.py", line 134, in assert_almost_equal
>> >     np.testing.assert_(isiterable(b))
>> >   File "D:\Dev\Compile\Test\numpy-build\numpy\testing\utils.py", line 44, in assert_
>> >     raise AssertionError(msg)
>> > AssertionError
>> > ```
>> >
>> > Possibly related:
>> >
>> > ```
>> > >>> import numpy as np
>> > >>> from pandas import Series
>> > >>> s = Series([0.0])
>> > >>> result = np.nansum(s)
>> > >>> print(result)
>> > Traceback (most recent call last):
>> > File "<stdin>", line 1, in <module>
>> > File "X:\Python33\lib\site-packages\pandas\core\base.py", line
>> > 19, in
>> > __str__
>> > return self.__unicode__()
>> > File
>> "X:\Python33\lib\site-packages\pandas\core\series.py", line
>> > 1115, in __unicode__
>> > length=len(self) > 50,
>> > TypeError: len() of unsized object
>> > ```
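A quick way to see what np.nansum hands back without going through the failing __str__ path; this is only a diagnostic sketch, not a fix:

```
import numpy as np
from pandas import Series

s = Series([0.0])
result = np.nansum(s)

# Look at the returned object without printing it, since printing is what
# raises the TypeError above.
print(type(result), getattr(result, 'ndim', None), getattr(result, 'shape', None))

# Coercing to a plain ndarray and then to a Python float prints fine.
print(float(np.asarray(result)))
```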
>> >
>> > 2) Bottleneck 0.7.0
>> >
>> >
>> > https://github.com/kwgoodman/bottleneck/issues/71#issuecomment-25331701
>> >
>> > 3) skimage 0.8.2
>> >
>> > These tests passed with numpy 1.8.0b2:
>> >
>> > ```
>> > ======================================================================
>> > FAIL: test_grey.test_non_square_image
>> > ----------------------------------------------------------------------
>> > Traceback (most recent call last):
>> >   File "X:\Python33\lib\site-packages\nose\case.py", line 198, in runTest
>> >     self.test(*self.arg)
>> >   File "X:\Python33\lib\site-packages\skimage\morphology\tests\test_grey.py", line 162, in test_non_square_image
>> >     testing.assert_array_equal(binary_res, grey_res)
>> >   File "X:\Python33\lib\site-packages\numpy\testing\utils.py", line 718, in assert_array_equal
>> >     verbose=verbose, header='Arrays are not equal')
>> >   File "X:\Python33\lib\site-packages\numpy\testing\utils.py", line 644, in assert_array_compare
>> >     raise AssertionError(msg)
>> > AssertionError:
>> > Arrays are not equal
>> >
>> > (mismatch 50.6328125%)
>> >  x: array([[False, False, False, ..., False, False, False],
>> >        [False, False, False, ..., False, False, False],
>> >        [False, False, False, ..., False, False, False],...
>> >  y: array([[ True, True, True, ..., True, False, False],
>> >        [ True, True, True, ..., False, False, False],
>> >        [ True, True, True, ..., False, False, False],...
>> >
>> > ======================================================================
>> > FAIL: test_grey.test_binary_erosion
>> > ----------------------------------------------------------------------
>> > Traceback (most recent call last):
>> >   File "X:\Python33\lib\site-packages\nose\case.py", line 198, in runTest
>> >     self.test(*self.arg)
>> >   File "X:\Python33\lib\site-packages\skimage\morphology\tests\test_grey.py", line 169, in test_binary_erosion
>> >     testing.assert_array_equal(binary_res, grey_res)
>> >   File "X:\Python33\lib\site-packages\numpy\testing\utils.py", line 718, in assert_array_equal
>> >     verbose=verbose, header='Arrays are not equal')
>> >   File "X:\Python33\lib\site-packages\numpy\testing\utils.py", line 644, in assert_array_compare
>> >     raise AssertionError(msg)
>> > AssertionError:
>> > Arrays are not equal
>> >
>> > (mismatch 48.260498046875%)
>> >  x: array([[False, False, False, ..., False, False, False],
>> >        [False, False, False, ..., False, False, False],
>> >        [False, False, False, ..., False, False, False],...
>> >  y: array([[ True, True, True, ..., True, False, False],
>> >        [ True, True, True, ..., False, False, False],
>> >        [ True, True, True, ..., False, False, False],...
>> >
>> > ======================================================================
>> > FAIL: test_grey.test_binary_closing
>> > ----------------------------------------------------------------------
>> > Traceback (most recent call last):
>> >   File "X:\Python33\lib\site-packages\nose\case.py", line 198, in runTest
>> >     self.test(*self.arg)
>> >   File "X:\Python33\lib\site-packages\skimage\morphology\tests\test_grey.py", line 183, in test_binary_closing
>> >     testing.assert_array_equal(binary_res, grey_res)
>> >   File "X:\Python33\lib\site-packages\numpy\testing\utils.py", line 718, in assert_array_equal
>> >     verbose=verbose, header='Arrays are not equal')
>> >   File "X:\Python33\lib\site-packages\numpy\testing\utils.py", line 644, in assert_array_compare
>> >     raise AssertionError(msg)
>> > AssertionError:
>> > Arrays are not equal
>> >
>> > (mismatch 66.302490234375%)
>> >  x: array([[False, False, False, ..., False, False, False],
>> >        [False, False, False, ..., False, False, False],
>> >        [False, False, False, ..., False, False, False],...
>> >  y: array([[ True, True, True, ..., True, True, True],
>> >        [ True, True, True, ..., True, True, True],
>> >        [ True, True, True, ..., False, False, False],...
>> >
>> > ======================================================================
>> > FAIL: test_grey.test_binary_opening
>> > ----------------------------------------------------------------------
>> > Traceback (most recent call last):
>> >   File "X:\Python33\lib\site-packages\nose\case.py", line 198, in runTest
>> >     self.test(*self.arg)
>> >   File "X:\Python33\lib\site-packages\skimage\morphology\tests\test_grey.py", line 190, in test_binary_opening
>> >     testing.assert_array_equal(binary_res, grey_res)
>> >   File "X:\Python33\lib\site-packages\numpy\testing\utils.py", line 718, in assert_array_equal
>> >     verbose=verbose, header='Arrays are not equal')
>> >   File "X:\Python33\lib\site-packages\numpy\testing\utils.py", line 644, in assert_array_compare
>> >     raise AssertionError(msg)
>> > AssertionError:
>> > Arrays are not equal
>> >
>> > (mismatch 58.465576171875%)
>> >  x: array([[False, False, False, ..., False, False, False],
>> >        [False, False, False, ..., False, False, False],
>> >        [False, False, False, ..., False, False, False],...
>> >  y: array([[ True, True, True, ..., True, True, False],
>> >        [ True, True, True, ..., True, True, False],
>> >        [ True, True, True, ..., False, False, False],...
>> > ```
>> >
>> > I'll bet the skimage problems come from
>> > https://github.com/numpy/numpy/pull/3811. They may be doing something
>> > naughty...
>> >
>> > Chuck
>> >
>>
>> A bool image is convolved with a uint8 kernel and the result is compared
>> for equality with a uint32 scalar...
>>
>> https://github.com/scikit-image/scikit-image/blob/master/skimage/morphology/binary.py#L32
>>
>>
>> Looks like the result of the convolution is probably output as a bool,
>> which now means 0 or 1, and that does not work when checking equality
>> with the number of pixels in the kernel. I'd call expressing the result
>> of a convolution as a boolean very naughty.
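To make the point concrete, here is a minimal sketch with made-up neighbourhood counts (plain NumPy, no skimage involved) of why the counts cannot survive a bool output:

```
import numpy as np

# Pretend these are neighbourhood pixel counts produced by the convolution.
counts = np.array([9, 4, 0, 9])

# Writing them into a bool output array clips every nonzero value to True (1),
# so a subsequent "== number of set kernel pixels" test can never succeed
# unless that number happens to be 1.
out = np.empty(counts.shape, dtype=bool)
out[...] = counts
print(out.view(np.uint8))   # [1 1 0 1] -- the counts are lost
print(out == 9)             # [False False False False]

# An integer output keeps the counts, and the erosion-style test works.
out_int = counts.astype(np.uint8)
print(out_int == 9)         # [ True False False  True]
```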
>
> Using a view should fix it:
> conv = ndimage.convolve((image > 0).view(np.uint8), selem, output=out,
> mode='constant', cval=1)
> but it also needs a check that sum(selem) < 255.
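Filled out into a self-contained sketch (the function name, toy test image, and overflow guard below are illustrative, not the actual scikit-image fix):

```
import numpy as np
from scipy import ndimage

def binary_erosion_sketch(image, selem):
    """Erosion via neighbourhood counting, along the lines of the suggestion above."""
    selem = np.asarray(selem, dtype=np.uint8)
    # uint8 counts would overflow for very large structuring elements.
    if selem.sum() >= 255:
        raise ValueError("structuring element too large for uint8 counts")
    # Convolve a uint8 view of the boolean image so the output holds real
    # neighbourhood counts instead of values clipped to 0/1.
    counts = ndimage.convolve((np.asarray(image) > 0).view(np.uint8), selem,
                              mode='constant', cval=1)
    return counts == selem.sum()

image = np.zeros((5, 5), dtype=bool)
image[1:4, 1:4] = True
print(binary_erosion_sketch(image, np.ones((3, 3))))
```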
I opened an issue at
<https://github.com/scikit-image/scikit-image/issues/745>
Christoph