Many failing doctests - release blocker? Enable for default test runs?
Hi,

I noticed that enabling the doctests on the 1.7.x maintenance branch caused lots and lots of doctest failures.

(np-devel)[mb312@blair ~/dev_trees]$ python -c 'import numpy as np; np.test(doctests=True)'
1.7.0rc1.dev-1e8fcdf
Running unit tests and doctests for numpy
NumPy version 1.7.0rc1.dev-1e8fcdf
NumPy is installed in /Users/mb312/.virtualenvs/np-devel/lib/python2.6/site-packages/numpy
Python version 2.6.6 (r266:84374, Aug 31 2010, 11:00:51) [GCC 4.0.1 (Apple Inc. build 5493)]
nose version 1.1.2
...
Ran 3839 tests in 59.928s
FAILED (KNOWNFAIL=4, SKIP=4, errors=23, failures=175)

The doctests also throw up somewhere around 10 matplotlib plots, so presumably those would also fail on a machine without a display, unless an 'Agg' or similar backend is forced.

I have never checked the doctests on Python 3. Has anyone run those recently?

For the projects I work on most, we enable doctests for the default test run - as in 'doctests=True' by default in the numpy testing machinery. Do y'all see any disadvantage in doing that for numpy?

In case someone gets to this before I do: we've also got some logic for doing conditional skips of doctests when optional packages such as matplotlib are not available, inspired by something similar in IPython:

https://github.com/nipy/nipy/blob/master/nipy/testing/doctester.py#L193

If Christmas allows I'll send a pull request with something like that in the next few days.

Cheers,
Matthew
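[Editor's note] The conditional-skip logic Matthew mentions can be sketched roughly as follows. This is an illustrative reconstruction, not the actual nipy doctester code; the function and decorator names here are made up for the example. The idea is simply to blank a function's docstring when an optional dependency is missing, so the doctest collector finds no examples to run.

```python
def optional_package_available(name):
    """Return True if the named optional package can be imported."""
    try:
        __import__(name)
        return True
    except ImportError:
        return False


def skip_doctest_if_missing(name):
    """Decorator: strip a function's docstring (and hence its doctests)
    when the optional package `name` is unavailable, so a doctest run
    silently skips those examples instead of erroring on the import."""
    def decorator(func):
        if not optional_package_available(name):
            func.__doc__ = None
        return func
    return decorator


@skip_doctest_if_missing('matplotlib')
def plot_squares(n):
    """Plot the first `n` squares.

    >>> plot_squares(3)  # doctest: +SKIP
    """
    import matplotlib.pyplot as plt
    plt.plot([i ** 2 for i in range(n)])
```

The real nipy implementation at the linked URL works at the doctest-plugin level rather than by mutating docstrings, but the effect is the same: examples that need a missing optional package are skipped rather than counted as failures.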
Hi Matthew,

On Sat, Dec 22, 2012 at 9:40 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
I noticed that enabling the doctests on the 1.7.x maintenance branch caused lots and lots of doctest failures.
(np-devel)[mb312@blair ~/dev_trees]$ python -c 'import numpy as np; np.test(doctests=True)'
1.7.0rc1.dev-1e8fcdf
Running unit tests and doctests for numpy
NumPy version 1.7.0rc1.dev-1e8fcdf
NumPy is installed in /Users/mb312/.virtualenvs/np-devel/lib/python2.6/site-packages/numpy
Python version 2.6.6 (r266:84374, Aug 31 2010, 11:00:51) [GCC 4.0.1 (Apple Inc. build 5493)]
nose version 1.1.2
...
Ran 3839 tests in 59.928s
FAILED (KNOWNFAIL=4, SKIP=4, errors=23, failures=175)

The doctests also throw up somewhere around 10 matplotlib plots, so presumably those would also fail on a machine without a display, unless an 'Agg' or similar backend is forced.
I have never checked the doctests on Python 3. Has anyone run those recently?
For the projects I work on most, we enable doctests for the default test run - as in 'doctests=True' by default in the numpy testing machinery. Do y'all see any disadvantage in doing that for numpy?
In case someone gets to this before I do: we've also got some logic for doing conditional skips of doctests when optional packages such as matplotlib are not available, inspired by something similar in IPython:
https://github.com/nipy/nipy/blob/master/nipy/testing/doctester.py#L193
If Christmas allows I'll send a pull request with something like that in the next few days.
Thanks for pointing this out. I think in the long term, we should definitely run doctests as part of the test suite on Travis-CI. Because what use is a doctest if it doesn't work?

Matthew, do you know if doctests fail for the 1.6 release as well? I am swamped with other bugs for the 1.7 release, and since I assume they also fail for 1.6, I want to get the release out as soon as we fix our current issues. However, I think it's a good idea to run doctests automatically on Travis, once they are all fixed.

Ondrej
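[Editor's note] For a Travis-CI job, the test run's success has to surface as the process exit code. A minimal sketch, assuming (as on the numpy versions discussed in this thread) that np.test() returns a nose result object with a wasSuccessful() method - worth verifying against the installed numpy:

```python
import sys


def exit_code_from_result(result):
    """Map a unittest/nose-style result object to a CI exit code:
    0 on a fully successful run, 1 on any error or failure."""
    return 0 if result.wasSuccessful() else 1


def run_numpy_doctests():
    """Run numpy's unit tests plus doctests and exit accordingly.
    Assumes np.test(doctests=True) returns a nose result object."""
    import numpy as np
    result = np.test(doctests=True)
    sys.exit(exit_code_from_result(result))
```

A CI script would then just invoke run_numpy_doctests(), and Travis would mark the build red whenever doctests regress.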
On Sun, Dec 23, 2012 at 7:54 PM, Ondřej Čertík <ondrej.certik@gmail.com> wrote:
Hi Matthew,
On Sat, Dec 22, 2012 at 9:40 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
I noticed that enabling the doctests on the 1.7.x maintenance branch caused lots and lots of doctest failures.
(np-devel)[mb312@blair ~/dev_trees]$ python -c 'import numpy as np; np.test(doctests=True)'
1.7.0rc1.dev-1e8fcdf
Running unit tests and doctests for numpy
NumPy version 1.7.0rc1.dev-1e8fcdf
NumPy is installed in /Users/mb312/.virtualenvs/np-devel/lib/python2.6/site-packages/numpy
Python version 2.6.6 (r266:84374, Aug 31 2010, 11:00:51) [GCC 4.0.1 (Apple Inc. build 5493)]
nose version 1.1.2
...
Ran 3839 tests in 59.928s
FAILED (KNOWNFAIL=4, SKIP=4, errors=23, failures=175)

The doctests also throw up somewhere around 10 matplotlib plots, so presumably those would also fail on a machine without a display, unless an 'Agg' or similar backend is forced.

I have never checked the doctests on Python 3. Has anyone run those recently?

For the projects I work on most, we enable doctests for the default test run - as in 'doctests=True' by default in the numpy testing machinery. Do y'all see any disadvantage in doing that for numpy?
Yes, I do. The doctest framework and reproducibility of reprs across Python versions and platforms are too poor to do this. And failing tests give new users a bad impression of the quality of numpy.

I'm +1 on enabling doctests on Travis for one Python version (2.7 probably) in order to reduce the number of out-of-date examples, -1 on default doctests=True.
In case someone gets to this before I do: we've also got some logic for doing conditional skips of doctests when optional packages such as matplotlib are not available, inspired by something similar in IPython:
https://github.com/nipy/nipy/blob/master/nipy/testing/doctester.py#L193
If Christmas allows I'll send a pull request with something like that in the next few days.
Thanks for pointing this out. I think in the long term, we should definitely run doctests as part of the test suite on Travis-CI. Because what use is a doctest if it doesn't work?
Since a "doctest" is an example and not a test, it's still quite useful.
Matthew, do you know if doctests fail for the 1.6 release as well?
I am swamped with other bugs for the 1.7 release and since I assume they also fail for 1.6, I want to get the release out as soon as we fix our current issues.
Agreed that this shouldn't be a release blocker.

Ralf
However, I think it's a good idea to run doctests automatically on Travis, once they are all fixed.
Ondrej
Hi,

On Sun, Dec 23, 2012 at 8:53 PM, Ralf Gommers <ralf.gommers@gmail.com> wrote:
On Sun, Dec 23, 2012 at 7:54 PM, Ondřej Čertík <ondrej.certik@gmail.com> wrote:
Hi Matthew,
On Sat, Dec 22, 2012 at 9:40 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
I noticed that enabling the doctests on the 1.7.x maintenance branch caused lots and lots of doctest failures.
(np-devel)[mb312@blair ~/dev_trees]$ python -c 'import numpy as np; np.test(doctests=True)'
1.7.0rc1.dev-1e8fcdf
Running unit tests and doctests for numpy
NumPy version 1.7.0rc1.dev-1e8fcdf
NumPy is installed in /Users/mb312/.virtualenvs/np-devel/lib/python2.6/site-packages/numpy
Python version 2.6.6 (r266:84374, Aug 31 2010, 11:00:51) [GCC 4.0.1 (Apple Inc. build 5493)]
nose version 1.1.2
...
Ran 3839 tests in 59.928s
FAILED (KNOWNFAIL=4, SKIP=4, errors=23, failures=175)

The doctests also throw up somewhere around 10 matplotlib plots, so presumably those would also fail on a machine without a display, unless an 'Agg' or similar backend is forced.

I have never checked the doctests on Python 3. Has anyone run those recently?

For the projects I work on most, we enable doctests for the default test run - as in 'doctests=True' by default in the numpy testing machinery. Do y'all see any disadvantage in doing that for numpy?
Yes, I do. The doctest framework and reproducibility of reprs across Python versions and platforms are too poor to do this. And failing tests give new users a bad impression of the quality of numpy.
I believe the repr problems are fairly easily soluble by using minor extensions to the current numpy doctest machinery. I think I was the last person to make big modifications to that bit of the numpy codebase, and I've been using small tweaks to that framework to run cross-version and cross-platform doctest runs by default for a while on lots of numpy stuff in nipy:

https://github.com/nipy/nipy/blob/master/nipy/testing/doctester.py#L155

Cheers,
Matthew
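[Editor's note] One minimal version of the kind of extension Matthew is describing - a sketch in the spirit of the linked nipy doctester, not its actual code - is a doctest.OutputChecker subclass that retries a failed comparison with all whitespace stripped, which absorbs platform- and version-dependent spacing in array reprs:

```python
import doctest
import re


class WhitespaceTolerantChecker(doctest.OutputChecker):
    """OutputChecker that falls back to comparing want/got with all
    whitespace removed, so spacing-only repr differences (e.g.
    'array([ 1.,  2.])' vs 'array([1., 2.])') don't fail a doctest.
    Deliberately crude: it would also equate '1 2' with '12'."""

    def check_output(self, want, got, optionflags):
        # Try the stock comparison first.
        if doctest.OutputChecker.check_output(self, want, got, optionflags):
            return True
        # Fall back to a whitespace-insensitive comparison.
        strip_ws = lambda s: re.sub(r'\s+', '', s)
        return strip_ws(want) == strip_ws(got)
```

Such a checker would be passed as doctest.DocTestRunner(checker=WhitespaceTolerantChecker()). The built-in NORMALIZE_WHITESPACE flag covers part of this, but it only collapses whitespace runs rather than removing them, and must be requested per-example.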
On Mon, Dec 24, 2012 at 9:15 AM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
On Sun, Dec 23, 2012 at 8:53 PM, Ralf Gommers <ralf.gommers@gmail.com> wrote:
On Sun, Dec 23, 2012 at 7:54 PM, Ondřej Čertík <ondrej.certik@gmail.com> wrote:
Hi Matthew,
On Sat, Dec 22, 2012 at 9:40 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
I noticed that enabling the doctests on the 1.7.x maintenance branch caused lots and lots of doctest failures.
(np-devel)[mb312@blair ~/dev_trees]$ python -c 'import numpy as np; np.test(doctests=True)'
1.7.0rc1.dev-1e8fcdf
Running unit tests and doctests for numpy
NumPy version 1.7.0rc1.dev-1e8fcdf
NumPy is installed in /Users/mb312/.virtualenvs/np-devel/lib/python2.6/site-packages/numpy
Python version 2.6.6 (r266:84374, Aug 31 2010, 11:00:51) [GCC 4.0.1 (Apple Inc. build 5493)]
nose version 1.1.2
...
Ran 3839 tests in 59.928s
FAILED (KNOWNFAIL=4, SKIP=4, errors=23, failures=175)

The doctests also throw up somewhere around 10 matplotlib plots, so presumably those would also fail on a machine without a display, unless an 'Agg' or similar backend is forced.

I have never checked the doctests on Python 3. Has anyone run those recently?

For the projects I work on most, we enable doctests for the default test run - as in 'doctests=True' by default in the numpy testing machinery. Do y'all see any disadvantage in doing that for numpy?
Yes, I do. The doctest framework and reproducibility of reprs across Python versions and platforms are too poor to do this. And failing tests give new users a bad impression of the quality of numpy.
I believe the repr problems are fairly easily soluble by using minor extensions to the current numpy doctest machinery. I think I was the last person to make big modifications to that bit of the numpy codebase, and I've been using small tweaks to that framework to run cross-version and cross-platform doctest runs by default for a while on lots of numpy stuff in nipy:
https://github.com/nipy/nipy/blob/master/nipy/testing/doctester.py#L155
My experience is different, but I'm happy to be proven wrong. Let's first see it running on all Python versions on Travis without issues for a while, then consider turning it on by default.

Ralf
Hi,

On Sun, Dec 23, 2012 at 6:54 PM, Ondřej Čertík <ondrej.certik@gmail.com> wrote:
Hi Matthew,
On Sat, Dec 22, 2012 at 9:40 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
I noticed that enabling the doctests on the 1.7.x maintenance branch caused lots and lots of doctest failures.
(np-devel)[mb312@blair ~/dev_trees]$ python -c 'import numpy as np; np.test(doctests=True)'
1.7.0rc1.dev-1e8fcdf
Running unit tests and doctests for numpy
NumPy version 1.7.0rc1.dev-1e8fcdf
NumPy is installed in /Users/mb312/.virtualenvs/np-devel/lib/python2.6/site-packages/numpy
Python version 2.6.6 (r266:84374, Aug 31 2010, 11:00:51) [GCC 4.0.1 (Apple Inc. build 5493)]
nose version 1.1.2
...
Ran 3839 tests in 59.928s
FAILED (KNOWNFAIL=4, SKIP=4, errors=23, failures=175)

The doctests also throw up somewhere around 10 matplotlib plots, so presumably those would also fail on a machine without a display, unless an 'Agg' or similar backend is forced.

I have never checked the doctests on Python 3. Has anyone run those recently?

For the projects I work on most, we enable doctests for the default test run - as in 'doctests=True' by default in the numpy testing machinery. Do y'all see any disadvantage in doing that for numpy?
In case someone gets to this before I do: we've also got some logic for doing conditional skips of doctests when optional packages such as matplotlib are not available, inspired by something similar in IPython:
https://github.com/nipy/nipy/blob/master/nipy/testing/doctester.py#L193
If Christmas allows I'll send a pull request with something like that in the next few days.
Thanks for pointing this out. I think in the long term, we should definitely run doctests as part of the test suite on Travis-CI. Because what use is a doctest if it doesn't work?
Matthew, do you know if doctests fail for the 1.6 release as well?
On 1.6.2:

FAILED (KNOWNFAIL=5, SKIP=3, errors=43, failures=167)

On Python 3.2, current 1.7.x maintenance:

FAILED (KNOWNFAIL=5, SKIP=4, errors=24, failures=211)

The last time I looked, I had the impression that we were not doing 2to3 conversion on doctests, but that was a while ago.

See you,
Matthew
participants (3)
- Matthew Brett
- Ondřej Čertík
- Ralf Gommers