
Hi all,

If I run numpy.test() with

>>> numpy.__version__
'1.2.0.dev5331'

I obtain:

======================================================================
FAIL: Tests count
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/tests/test_core.py", line 566, in test_count_func
    assert_equal(3, count(ott))
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 97, in assert_equal
    assert desired == actual, msg
AssertionError:
Items are not equal:
 ACTUAL: 3
 DESIRED: 4

======================================================================
FAIL: Tests reshape
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/tests/test_core.py", line 1461, in test_reshape
    assert_equal(y._mask.shape, (2,2,))
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 94, in assert_equal
    return _assert_equal_on_sequences(actual, desired, err_msg='')
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 66, in _assert_equal_on_sequences
    assert_equal(len(actual),len(desired),err_msg)
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 97, in assert_equal
    assert desired == actual, msg
AssertionError:
Items are not equal:
 ACTUAL: 0
 DESIRED: 2

======================================================================
FAIL: Tests dot product
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/tests/test_extras.py", line 223, in test_dot
    assert_equal(c.mask, [[1,1],[1,0]])
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 111, in assert_equal
    return assert_array_equal(actual, desired, err_msg)
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 177, in assert_array_equal
    header='Arrays are not equal')
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 171, in assert_array_compare
    verbose=verbose, header=header)
  File "/usr/local/lib64/python2.5/site-packages/numpy/testing/utils.py", line 240, in assert_array_compare
    assert cond, msg
AssertionError:
Arrays are not equal
(mismatch 75.0%)
 x: array([[False, False],
        [False, False]], dtype=bool)
 y: array([[1, 1],
        [1, 0]])

======================================================================
FAIL: Test of average.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/tests/test_extras.py", line 35, in test_testAverage1
    assert_equal(average(ott,axis=0), [2.0, 0.0])
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 111, in assert_equal
    return assert_array_equal(actual, desired, err_msg)
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 177, in assert_array_equal
    header='Arrays are not equal')
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 171, in assert_array_compare
    verbose=verbose, header=header)
  File "/usr/local/lib64/python2.5/site-packages/numpy/testing/utils.py", line 240, in assert_array_compare
    assert cond, msg
AssertionError:
Arrays are not equal
(mismatch 50.0%)
 x: array([ 1.,  1.])
 y: array([ 2.,  1.])

======================================================================
FAIL: Test of average.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/tests/test_old_ma.py", line 526, in test_testAverage1
    self.failUnless(eq(average(ott,axis=0), [2.0, 0.0]))
AssertionError

======================================================================
FAIL: Test count
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/tests/test_old_ma.py", line 159, in test_xtestCount
    self.failUnless (eq(3, count(ott)))
AssertionError

----------------------------------------------------------------------
Ran 1660 tests in 12.483s

FAILED (failures=6)

Nils
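For context, the masked-array semantics these tests exercise can be sketched in a few lines — this is illustrative, not the actual test code, and the array values here are made up:

```python
import numpy.ma as ma

# One masked element: count() should ignore it.
ott = ma.array([0.0, 1.0, 2.0, 3.0], mask=[1, 0, 0, 0])
print(ma.count(ott))    # 3 unmasked entries

# average() should likewise skip masked values: (1 + 2 + 3) / 3
print(ma.average(ott))  # 2.0

# Reshaping a masked array should reshape the mask along with the data.
y = ott.reshape(2, 2)
print(y.mask.shape)     # (2, 2)
```

The failures above suggest the mask was being dropped somewhere, so count() saw all four elements and the reshaped mask came back with shape ().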

This shows up on all the 64-bit buildbots also, but the 32-bit Mac still works.
Chuck
There are also new test failures in scipy:

======================================================================
FAIL: Tests the confidence intervals of the trimmed mean.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib64/python2.5/site-packages/scipy/stats/tests/test_mmorestats.py", line 35, in test_trimmedmeanci
    assert_almost_equal(ms.trimmed_mean(data,0.2), 596.2, 1)
  File "/usr/local/lib64/python2.5/site-packages/numpy/testing/utils.py", line 158, in assert_almost_equal
    assert round(abs(desired - actual),decimal) == 0, msg
AssertionError:
Items are not equal:
 ACTUAL: 600.26666666666665
 DESIRED: 596.20000000000005

======================================================================
FAIL: Tests trimming
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib64/python2.5/site-packages/scipy/stats/tests/test_mstats.py", line 213, in test_trim
    [None,1,2,3,4,5,6,7,None,None])
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 111, in assert_equal
    return assert_array_equal(actual, desired, err_msg)
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 177, in assert_array_equal
    header='Arrays are not equal')
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 171, in assert_array_compare
    verbose=verbose, header=header)
  File "/usr/local/lib64/python2.5/site-packages/numpy/testing/utils.py", line 240, in assert_array_compare
    assert cond, msg
AssertionError:
Arrays are not equal
(mismatch 30.0%)
 x: array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
 y: array([None, 1, 2, 3, 4, 5, 6, 7, None, None], dtype=object)

======================================================================
FAIL: Tests trimming.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib64/python2.5/site-packages/scipy/stats/tests/test_mstats.py", line 240, in test_trim_old
    assert_equal(mstats.trimboth(x).count(), 60)
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 97, in assert_equal
    assert desired == actual, msg
AssertionError:
Items are not equal:
 ACTUAL: 100
 DESIRED: 60

======================================================================
FAIL: Tests the trimmed mean.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib64/python2.5/site-packages/scipy/stats/tests/test_mstats.py", line 255, in test_trimmedmean
    assert_almost_equal(mstats.trimmed_mean(data,0.1), 343, 0)
  File "/usr/local/lib64/python2.5/site-packages/numpy/ma/testutils.py", line 145, in assert_almost_equal
    assert round(abs(desired - actual),decimal) == 0, msg
AssertionError:
Items are not equal:
 ACTUAL: 448.10526315789474
 DESIRED: 343

Nils
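For reference, the trimming semantics the failing test_trim_old checks: trimming a fraction off each end of 100 points and counting what survives. A numpy-only sketch of the idea (trimboth_sketch is a hypothetical helper written for illustration, not the scipy function itself):

```python
import numpy as np
import numpy.ma as ma

def trimboth_sketch(data, proportiontocut=0.2):
    """Mask the lowest and highest fractions of the data; keep the middle."""
    data = np.sort(np.asarray(data, dtype=float))
    n = len(data)
    cut = int(proportiontocut * n)
    mask = np.zeros(n, dtype=bool)
    mask[:cut] = True        # mask the low tail
    mask[n - cut:] = True    # mask the high tail
    return ma.array(data, mask=mask)

# Trimming 20% from each end of 100 points leaves 60 unmasked,
# which is the value test_trim_old asserts; the buggy run kept all 100.
print(trimboth_sketch(np.arange(100)).count())  # 60
```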

On Wed, 02 Jul 2008 19:41:56 +0200 "Nils Wagner" <nwagner@iam.uni-stuttgart.de> wrote:
This shows up on all the 64-bit buildbots also, but the 32-bit Mac still works.
Chuck
I can reproduce the test failures on my old 32-bit laptop.

Linux linux 2.6.11.4-21.17-default #1 Fri Apr 6 08:42:34 UTC 2007 i686 athlon i386 GNU/Linux

Cheers,
Nils

On Wed, Jul 2, 2008 at 1:26 PM, Pierre GM <pgmdevlist@gmail.com> wrote:
On Wednesday 02 July 2008 15:13:37 Nils Wagner wrote:
I can reproduce the test failures on my old 32-bit laptop.
As you should. My bad, I messed up on my last commit. I'll fix that later this afternoon.
Hmmm. So I checked the Mac output and it's almost all test failures, yet the test run shows up as a success. Something's not quite right.

Chuck

The buildbot test command should be using sys.exit to return the success flag from the test run, but it's not. The FreeBSD bot's test command is:

/usr/local/bin/python2.4 -c 'import numpy,sys;sys.exit(not numpy.test(level=9999,verbosity=9999).wasSuccessful())'

while the OSX bot's command is:

python -c 'import sys; sys.path=["numpy-install/lib/python2.5/site-packages"]+sys.path;import numpy;numpy.test(doctests=True)'

On Wed, Jul 2, 2008 at 4:03 PM, Charles R Harris <charlesr.harris@gmail.com> wrote:
On Wed, Jul 2, 2008 at 1:26 PM, Pierre GM <pgmdevlist@gmail.com> wrote:
On Wednesday 02 July 2008 15:13:37 Nils Wagner wrote:
I can reproduce the test failures on my old 32-bit laptop.
As you should. My bad, I messed up on my last commit. I'll fix that later this afternoon.
Hmmm. So I checked the Mac output and it's almost all test failures, yet the test run shows up as a success. Something's not quite right.
Chuck
_______________________________________________ Numpy-discussion mailing list Numpy-discussion@scipy.org http://projects.scipy.org/mailman/listinfo/numpy-discussion
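The fix Alan describes boils down to mapping unittest's wasSuccessful() onto a process exit status, since the buildbot only sees the exit code. A minimal illustration (the Example test case here is made up):

```python
import unittest

class Example(unittest.TestCase):
    def test_pass(self):
        self.assertEqual(1 + 1, 2)

# Run the suite and collect a TestResult, as numpy.test() does internally.
suite = unittest.TestLoader().loadTestsFromTestCase(Example)
result = unittest.TextTestRunner(verbosity=0).run(suite)

# A passing run must translate to exit status 0 and a failing run to
# nonzero, which is exactly what the FreeBSD bot's
# sys.exit(not result.wasSuccessful()) idiom achieves.
exit_code = int(not result.wasSuccessful())
print(exit_code)  # 0 here
```

Without that sys.exit, the OSX bot's command always exits 0, so a run full of failures still reports green.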

Fixed. Sorry.

On Wed, Jul 2, 2008 at 1:07 PM, Alan McIntyre <alan.mcintyre@gmail.com> wrote:
The buildbot test command should be using sys.exit to return the success flag from the test run, but it's not. The FreeBSD's test command is:
/usr/local/bin/python2.4 -c 'import numpy,sys;sys.exit(not numpy.test(level=9999,verbosity=9999).wasSuccessful())'
while the OSX bot's command is
python -c 'import sys; sys.path=["numpy-install/lib/python2.5/site-packages"]+sys.path;import numpy;numpy.test(doctests=True)'

On Wed, Jul 2, 2008 at 2:45 PM, Barry Wark <barrywark@gmail.com> wrote:
Fixed. Sorry.
The Mac seems to have a whole different set of errors than the other bots, lots of import errors like

ERROR: Failure: ImportError (cannot import name log)

I wonder if there is a path issue somewhere?

Chuck

very likely a path issue. i've had two hard drive crashes on the buildslave box this week. i'm sure something's fubar'd. i'll take a look. thanks for the heads up.

On Wed, Jul 2, 2008 at 2:22 PM, Charles R Harris <charlesr.harris@gmail.com> wrote:
On Wed, Jul 2, 2008 at 2:45 PM, Barry Wark <barrywark@gmail.com> wrote:
Fixed. Sorry.
The Mac seems to have a whole different set of errors than the other bots, lots of import errors like
ERROR: Failure: ImportError (cannot import name log)
I wonder if there is a path issue somewhere?
Chuck

On Wed, Jul 2, 2008 at 5:22 PM, Charles R Harris <charlesr.harris@gmail.com> wrote:
The Mac seems to have a whole different set of errors than the other bots, lots of import errors like
ERROR: Failure: ImportError (cannot import name log)
I wonder if there is a path issue somewhere?
At least one of those looks nose-related, and is probably due to a difference between 0.10 and 0.11. I was mistakenly using nose from svn locally, and I think I might need to check in a change to NoseTester. I won't be able to get to it for a few hours though. I'm not sure what those other import errors are about.

On Wednesday 02 July 2008 15:26:05 you wrote:
On Wednesday 02 July 2008 15:13:37 Nils Wagner wrote:
I can reproduce the test failures on my old 32-bit laptop.
As you should. My bad, I messed up on my last commit. I'll fix that later this afternoon.
OK, so it should be fixed in v5332. Sorry again for the noise.
participants (5)
- Alan McIntyre
- Barry Wark
- Charles R Harris
- Nils Wagner
- Pierre GM