linux wheels coming soon
Hi all,

Just a heads-up that we're planning to upload Linux wheels for numpy to PyPI soon. Unless there's some objection, these will be using ATLAS, just like the current Windows wheels, for the same reasons -- moving to something faster like OpenBLAS would be good, but given the concerns about OpenBLAS's reliability we want to get something working first and then worry about making it fast. (Plus it doesn't make sense to ship different BLAS libraries on Windows versus Linux -- that just multiplies our support burden for no reason.)

-n

-- Nathaniel J. Smith -- https://vorpus.org
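(A quick way to check which BLAS/LAPACK a given numpy install was actually built against -- useful once these wheels land; the exact output varies by build:)

$ python -c 'import numpy; numpy.show_config()'   # prints the BLAS/LAPACK libraries the build was linked with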
On Tue, Mar 15, 2016 at 5:33 PM, Nathaniel Smith <njs@pobox.com> wrote:
Hi all,
Just a heads-up that we're planning to upload Linux wheels for numpy to PyPI soon. Unless there's some objection, these will be using ATLAS, just like the current Windows wheels, for the same reasons -- moving to something faster like OpenBLAS would be good, but given the concerns about OpenBLAS's reliability we want to get something working first and then worry about making it fast. (Plus it doesn't make sense to ship different BLAS libraries on Windows versus Linux -- that just multiplies our support burden for no reason.)
Good news, thanks to all who have worked on this.

Question: what to do with the prerelease uploads on pypi after they are outdated? I'm inclined to delete them, as there may be four or five of them per release and that seems unnecessary clutter.

Chuck
On Mar 15, 2016 5:54 PM, "Charles R Harris" <charlesr.harris@gmail.com> wrote:
On Tue, Mar 15, 2016 at 5:33 PM, Nathaniel Smith <njs@pobox.com> wrote:
Hi all,
Just a heads-up that we're planning to upload Linux wheels for numpy to PyPI soon. Unless there's some objection, these will be using ATLAS, just like the current Windows wheels, for the same reasons -- moving to something faster like OpenBLAS would be good, but given the concerns about OpenBLAS's reliability we want to get something working first and then worry about making it fast. (Plus it doesn't make sense to ship different BLAS libraries on Windows versus Linux -- that just multiplies our support burden for no reason.)
Good news, thanks to all who have worked on this.
Question: what to do with the prerelease uploads on pypi after they are outdated? I'm inclined to delete them, as there may be four or five of them per release and that seems unnecessary clutter.

I'd just leave them? Pypi doesn't care, and who knows, they might be useful for archival purposes to someone. Plus this is less work :-)

-n
On Tue, Mar 15, 2016 at 7:10 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Mar 15, 2016 5:54 PM, "Charles R Harris" <charlesr.harris@gmail.com> wrote:
On Tue, Mar 15, 2016 at 5:33 PM, Nathaniel Smith <njs@pobox.com> wrote:
Hi all,
Just a heads-up that we're planning to upload Linux wheels for numpy to PyPI soon. Unless there's some objection, these will be using ATLAS, just like the current Windows wheels, for the same reasons -- moving to something faster like OpenBLAS would be good, but given the concerns about OpenBLAS's reliability we want to get something working first and then worry about making it fast. (Plus it doesn't make sense to ship different BLAS libraries on Windows versus Linux -- that just multiplies our support burden for no reason.)
Good news, thanks to all who have worked on this.
Question: what to do with the prerelease uploads on pypi after they are
outdated? I'm inclined to delete them, as there may be four or five of them per release and that seems unnecessary clutter.
I'd just leave them? Pypi doesn't care, and who knows, they might be useful for archival purposes to someone. Plus this is less work :-)
Less work than hitting the delete button? Oh, my aching finger ;) Chuck
Hi Nathaniel,

Will you be providing portable Linux wheels aka manylinux1? https://www.python.org/dev/peps/pep-0513/

Does this also open up the door to releasing wheels for SciPy too?

While speeding up "pip install" would be of benefit in itself, I am particularly keen to see this for use within automated testing frameworks like TravisCI where currently having to install NumPy (and SciPy) from source is an unreasonable overhead.

Many thanks to everyone working on this,

Peter

On Tue, Mar 15, 2016 at 11:33 PM, Nathaniel Smith <njs@pobox.com> wrote:
Hi all,
Just a heads-up that we're planning to upload Linux wheels for numpy to PyPI soon. Unless there's some objection, these will be using ATLAS, just like the current Windows wheels, for the same reasons -- moving to something faster like OpenBLAS would be good, but given the concerns about OpenBLAS's reliability we want to get something working first and then worry about making it fast. (Plus it doesn't make sense to ship different BLAS libraries on Windows versus Linux -- that just multiplies our support burden for no reason.)
-n
-- Nathaniel J. Smith -- https://vorpus.org
On Thu, Mar 24, 2016 at 4:04 PM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
Hi Nathaniel,
Will you be providing portable Linux wheels aka manylinux1? https://www.python.org/dev/peps/pep-0513/
Does this also open up the door to releasing wheels for SciPy too?
That should work just fine.
While speeding up "pip install" would be of benefit in itself, I am particularly keen to see this for use within automated testing frameworks like TravisCI where currently having to install NumPy (and SciPy) from source is an unreasonable overhead.
There's already http://travis-dev-wheels.scipy.org/ (latest dev versions of numpy and scipy) and http://travis-wheels.scikit-image.org/ (releases, there are multiple sources for this one) for TravisCI setups to reuse. Ralf
Many thanks to everyone working on this,
Peter
On Tue, Mar 15, 2016 at 11:33 PM, Nathaniel Smith <njs@pobox.com> wrote:
Hi all,
Just a heads-up that we're planning to upload Linux wheels for numpy to PyPI soon. Unless there's some objection, these will be using ATLAS, just like the current Windows wheels, for the same reasons -- moving to something faster like OpenBLAS would be good, but given the concerns about OpenBLAS's reliability we want to get something working first and then worry about making it fast. (Plus it doesn't make sense to ship different BLAS libraries on Windows versus Linux -- that just multiplies our support burden for no reason.)
-n
-- Nathaniel J. Smith -- https://vorpus.org
On Mar 24, 2016 8:04 AM, "Peter Cock" <p.j.a.cock@googlemail.com> wrote:
Hi Nathaniel,
Will you be providing portable Linux wheels aka manylinux1? https://www.python.org/dev/peps/pep-0513/
Matthew Brett will (probably) do the actual work, but yeah, that's the idea exactly. Note the author list on that PEP ;-) -n
On Thu, Mar 24, 2016 at 6:37 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Mar 24, 2016 8:04 AM, "Peter Cock" <p.j.a.cock@googlemail.com> wrote:
Hi Nathaniel,
Will you be providing portable Linux wheels aka manylinux1? https://www.python.org/dev/peps/pep-0513/
Matthew Brett will (probably) do the actual work, but yeah, that's the idea exactly. Note the author list on that PEP ;-)
-n
Yep - I was partly double checking, but also aware many folk skim the NumPy list and might not be aware of PEP-513 and the standardisation efforts going on.

Also in addition to http://travis-dev-wheels.scipy.org/ and http://travis-wheels.scikit-image.org/ mentioned by Ralf there is http://wheels.scipy.org/ which I presume will get the new Linux wheels once they go live. Is it possible to add a README to these listings explaining what they are intended to be used for?

P.S. To save anyone else Googling, you can do things like this:

pip install -r requirements.txt --timeout 60 --trusted-host travis-wheels.scikit-image.org -f http://travis-wheels.scikit-image.org/

Thanks,

Peter
On Thu, Mar 24, 2016 at 11:44 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Thu, Mar 24, 2016 at 6:37 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Mar 24, 2016 8:04 AM, "Peter Cock" <p.j.a.cock@googlemail.com> wrote:
Hi Nathaniel,
Will you be providing portable Linux wheels aka manylinux1? https://www.python.org/dev/peps/pep-0513/
Matthew Brett will (probably) do the actual work, but yeah, that's the idea exactly. Note the author list on that PEP ;-)
-n
Yep - I was partly double checking, but also aware many folk skim the NumPy list and might not be aware of PEP-513 and the standardisation efforts going on.
Also in addition to http://travis-dev-wheels.scipy.org/ and http://travis-wheels.scikit-image.org/ mentioned by Ralf there is http://wheels.scipy.org/ which I presume will get the new Linux wheels once they go live.
The new wheels will go up on pypi, and I guess once everyone has wheels on pypi then these ad-hoc wheel servers that existed only as a way to distribute Linux wheels will become obsolete.

(travis-dev-wheels will remain useful, though, because its purpose is to hold up-to-the-minute builds of project master branches to allow downstream projects to get early warning of breaking changes -- we don't plan to upload to pypi after every commit :-).)

-n

-- Nathaniel J. Smith -- https://vorpus.org
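(For reference, a downstream project's Travis setup would typically point pip at such a server with something along these lines -- a sketch only; --pre is needed because dev builds carry pre-release version numbers, and the exact server layout is whatever the wheel host provides:)

$ pip install --pre --upgrade --trusted-host travis-dev-wheels.scipy.org -f http://travis-dev-wheels.scipy.org/ numpy scipy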
I suspect that many of the maintainers of major scipy-ecosystem projects are aware of these (or other similar) travis wheel caches, but would guess that the pool of travis-ci python users who weren't aware of these wheel caches is much much larger. So there will still be a lot of travis-ci clock cycles saved by manylinux wheels.

-Robert

On Thu, Mar 24, 2016 at 10:46 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Thu, Mar 24, 2016 at 11:44 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Thu, Mar 24, 2016 at 6:37 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Mar 24, 2016 8:04 AM, "Peter Cock" <p.j.a.cock@googlemail.com> wrote:
Hi Nathaniel,
Will you be providing portable Linux wheels aka manylinux1? https://www.python.org/dev/peps/pep-0513/
Matthew Brett will (probably) do the actual work, but yeah, that's the idea exactly. Note the author list on that PEP ;-)
-n
Yep - I was partly double checking, but also aware many folk skim the NumPy list and might not be aware of PEP-513 and the standardisation efforts going on.
Also in addition to http://travis-dev-wheels.scipy.org/ and http://travis-wheels.scikit-image.org/ mentioned by Ralf there is http://wheels.scipy.org/ which I presume will get the new Linux wheels once they go live.
The new wheels will go up on pypi, and I guess once everyone has wheels on pypi then these ad-hoc wheel servers that existed only as a way to distribute Linux wheels will become obsolete.
(travis-dev-wheels will remain useful, though, because its purpose is to hold up-to-the-minute builds of project master branches to allow downstream projects to get early warning of breaking changes -- we don't plan to upload to pypi after every commit :-).)
-n
-- Nathaniel J. Smith -- https://vorpus.org
-- -Robert
On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgibbo@gmail.com> wrote:
I suspect that many of the maintainers of major scipy-ecosystem projects are aware of these (or other similar) travis wheel caches, but would guess that the pool of travis-ci python users who weren't aware of these wheel caches is much much larger. So there will still be a lot of travis-ci clock cycles saved by manylinux wheels.
-Robert
Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely something I would suggest adding to the release notes. Hopefully this will help trigger a general availability of wheels in the numpy-ecosystem :)

In the case of Travis CI, their VM images for Python already have a version of NumPy installed, but having the latest version of NumPy and SciPy etc available as Linux wheels would be very nice.

Peter

P.S. As an aside, PyPI seems to be having trouble displaying the main NumPy page https://pypi.python.org/pypi/numpy at the moment (Error 404 page): https://bitbucket.org/pypa/pypi/issues/423/version-less-page-for-numpy-broke...
On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgibbo@gmail.com> wrote:
I suspect that many of the maintainers of major scipy-ecosystem projects are aware of these (or other similar) travis wheel caches, but would guess that the pool of travis-ci python users who weren't aware of these wheel caches is much much larger. So there will still be a lot of travis-ci clock cycles saved by manylinux wheels.
-Robert
Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely something I would suggest adding to the release notes. Hopefully this will help trigger a general availability of wheels in the numpy-ecosystem :)
In the case of Travis CI, their VM images for Python already have a version of NumPy installed, but having the latest version of NumPy and SciPy etc available as Linux wheels would be very nice.
We're very nearly there now. The latest versions of numpy, scipy, scikit-image, pandas, numexpr, statsmodels wheels for testing at http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...

Please do test with:

python -m install --upgrade pip
pip install --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy scikit-learn numexpr
python -c 'import numpy; numpy.test("full")'
python -c 'import scipy; scipy.test("full")'

We would love to get any feedback as to whether these work on your machines.

Cheers,

Matthew
typo:

python -m install --upgrade pip

should read:

python -m pip install --upgrade pip

-- Olivier
I ran some tests on an image of the future ubuntu xenial that ships a version of pip recent enough to install manylinux1 wheels by default, and everything looks fine.

Just to clarify, those wheels use openblas 0.2.17, which has proven to be both fast and very stable on various CPU architectures, while we could not achieve similar results with atlas 3.10.

-- Olivier Grisel
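(As a quick sanity check before testing: manylinux1 support arrived in pip 8.1, if I remember the pip changelog correctly, so it is worth confirming the pip in use is at least that new:)

$ pip --version   # should report 8.1 or newer for manylinux1 wheels to be picked up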
On Sun, Apr 3, 2016 at 2:11 AM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgibbo@gmail.com> wrote:
I suspect that many of the maintainers of major scipy-ecosystem projects are aware of these (or other similar) travis wheel caches, but would guess that the pool of travis-ci python users who weren't aware of these wheel caches is much much larger. So there will still be a lot of travis-ci clock cycles saved by manylinux wheels.
-Robert
Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely something I would suggest adding to the release notes. Hopefully this will help trigger a general availability of wheels in the numpy-ecosystem :)
In the case of Travis CI, their VM images for Python already have a version of NumPy installed, but having the latest version of NumPy and SciPy etc available as Linux wheels would be very nice.
We're very nearly there now.
The latest versions of numpy, scipy, scikit-image, pandas, numexpr, statsmodels wheels for testing at http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...
Please do test with: ...
We would love to get any feedback as to whether these work on your machines.
Hi Matthew,

Testing on a 64bit CentOS 6 machine with Python 3.5 compiled from source under my home directory:

$ python3.5 -m pip install --upgrade pip
Requirement already up-to-date: pip in ./lib/python3.5/site-packages

$ python3.5 -m pip install --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy
Requirement already satisfied (use --upgrade to upgrade): numpy in ./lib/python3.5/site-packages
Requirement already satisfied (use --upgrade to upgrade): scipy in ./lib/python3.5/site-packages

$ python3.5 -m pip install --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy --upgrade
Collecting numpy
  Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (15.5MB)
    100% |████████████████████████████████| 15.5MB 42.1MB/s
Collecting scipy
  Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (40.8MB)
    100% |████████████████████████████████| 40.8MB 53.6MB/s
Installing collected packages: numpy, scipy
  Found existing installation: numpy 1.10.4
    Uninstalling numpy-1.10.4:
      Successfully uninstalled numpy-1.10.4
  Found existing installation: scipy 0.16.0
    Uninstalling scipy-0.16.0:
      Successfully uninstalled scipy-0.16.0
Successfully installed numpy-1.11.0 scipy-0.17.0

$ python3.5 -c 'import numpy; numpy.test("full")'
Running unit tests for numpy
NumPy version 1.11.0
NumPy relaxed strides checking option: False
NumPy is installed in /home/xxx/lib/python3.5/site-packages/numpy
Python version 3.5.0 (default, Sep 28 2015, 11:25:31) [GCC 4.4.7 20120313 (Red Hat 4.4.7-16)]
nose version 1.3.7
[... nose progress output (several thousand dots) snipped ...]
----------------------------------------------------------------------
Ran 6332 tests in 243.029s

OK (KNOWNFAIL=7, SKIP=2)

So far so good, but there are a lot of deprecation warnings etc from SciPy,

$ python3.5 -c 'import scipy; scipy.test("full")'
Running unit tests for scipy
NumPy version 1.11.0
NumPy relaxed strides checking option: False
NumPy is installed in /home/xxx/lib/python3.5/site-packages/numpy
SciPy version 0.17.0
SciPy is installed in /home/xxx/lib/python3.5/site-packages/scipy
Python version 3.5.0 (default, Sep 28 2015, 11:25:31) [GCC 4.4.7 20120313 (Red Hat 4.4.7-16)]
nose version 1.3.7
[snip]
/home/xxx/lib/python3.5/site-packages/numpy/lib/utils.py:99: DeprecationWarning: `rand` is deprecated! numpy.testing.rand is deprecated in numpy 1.11. Use numpy.random.rand instead.
  warnings.warn(depdoc, DeprecationWarning)
[snip]
/home/xxx/lib/python3.5/site-packages/scipy/io/arff/tests/test_arffread.py:254: DeprecationWarning: parsing timezone aware datetimes is deprecated; this will raise an error in the future
  ], dtype='datetime64[m]')
/home/xxx/lib/python3.5/site-packages/scipy/io/arff/arffread.py:638: PendingDeprecationWarning: generator '_loadarff.<locals>.generator' raised StopIteration
[snip]
/home/xxx/lib/python3.5/site-packages/scipy/sparse/tests/test_base.py:2425: DeprecationWarning: This function is deprecated. Please call randint(-5, 5 + 1) instead
  I = np.random.random_integers(-M + 1, M - 1, size=NUM_SAMPLES)
[snip]
0-th dimension must be fixed to 3 but got 15
[snip]
----------------------------------------------------------------------
Ran 21407 tests in 741.602s

OK (KNOWNFAIL=130, SKIP=1775)

Hopefully I didn't miss anything important in hand editing the scipy output.

Peter
Hi, On Mon, Apr 4, 2016 at 9:02 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Sun, Apr 3, 2016 at 2:11 AM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgibbo@gmail.com> wrote:
I suspect that many of the maintainers of major scipy-ecosystem projects are aware of these (or other similar) travis wheel caches, but would guess that the pool of travis-ci python users who weren't aware of these wheel caches is much much larger. So there will still be a lot of travis-ci clock cycles saved by manylinux wheels.
-Robert
Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely something I would suggest adding to the release notes. Hopefully this will help trigger a general availability of wheels in the numpy-ecosystem :)
In the case of Travis CI, their VM images for Python already have a version of NumPy installed, but having the latest version of NumPy and SciPy etc available as Linux wheels would be very nice.
We're very nearly there now.
The latest versions of numpy, scipy, scikit-image, pandas, numexpr, statsmodels wheels for testing at http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...
Please do test with: ...
We would love to get any feedback as to whether these work on your machines.
Hi Matthew,
Testing on a 64bit CentOS 6 machine with Python 3.5 compiled from source under my home directory:
$ python3.5 -m pip install --upgrade pip Requirement already up-to-date: pip in ./lib/python3.5/site-packages
$ python3.5 -m pip install --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy Requirement already satisfied (use --upgrade to upgrade): numpy in ./lib/python3.5/site-packages Requirement already satisfied (use --upgrade to upgrade): scipy in ./lib/python3.5/site-packages
$ python3.5 -m pip install --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy --upgrade Collecting numpy Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (15.5MB) 100% |████████████████████████████████| 15.5MB 42.1MB/s Collecting scipy Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (40.8MB) 100% |████████████████████████████████| 40.8MB 53.6MB/s Installing collected packages: numpy, scipy Found existing installation: numpy 1.10.4 Uninstalling numpy-1.10.4: Successfully uninstalled numpy-1.10.4 Found existing installation: scipy 0.16.0 Uninstalling scipy-0.16.0: Successfully uninstalled scipy-0.16.0 Successfully installed numpy-1.11.0 scipy-0.17.0
$ python3.5 -c 'import numpy; numpy.test("full")'
Running unit tests for numpy
NumPy version 1.11.0
NumPy relaxed strides checking option: False
NumPy is installed in /home/xxx/lib/python3.5/site-packages/numpy
Python version 3.5.0 (default, Sep 28 2015, 11:25:31) [GCC 4.4.7 20120313 (Red Hat 4.4.7-16)]
nose version 1.3.7
[... nose progress output (several thousand dots) snipped ...]
----------------------------------------------------------------------
Ran 6332 tests in 243.029s
OK (KNOWNFAIL=7, SKIP=2)
So far so good, but there are a lot of deprecation warnings etc from SciPy,
$ python3.5 -c 'import scipy; scipy.test("full")' Running unit tests for scipy NumPy version 1.11.0 NumPy relaxed strides checking option: False NumPy is installed in /home/xxx/lib/python3.5/site-packages/numpy SciPy version 0.17.0 SciPy is installed in /home/xxx/lib/python3.5/site-packages/scipy Python version 3.5.0 (default, Sep 28 2015, 11:25:31) [GCC 4.4.7 20120313 (Red Hat 4.4.7-16)] nose version 1.3.7 [snip] /home/xxx/lib/python3.5/site-packages/numpy/lib/utils.py:99: DeprecationWarning: `rand` is deprecated! numpy.testing.rand is deprecated in numpy 1.11. Use numpy.random.rand instead. warnings.warn(depdoc, DeprecationWarning) [snip] /home/xxx/lib/python3.5/site-packages/scipy/io/arff/tests/test_arffread.py:254: DeprecationWarning: parsing timezone aware datetimes is deprecated; this will raise an error in the future ], dtype='datetime64[m]') /home/xxx/lib/python3.5/site-packages/scipy/io/arff/arffread.py:638: PendingDeprecationWarning: generator '_loadarff.<locals>.generator' raised StopIteration [snip] /home/xxx/lib/python3.5/site-packages/scipy/sparse/tests/test_base.py:2425: DeprecationWarning: This function is deprecated. Please call randint(-5, 5 + 1) instead I = np.random.random_integers(-M + 1, M - 1, size=NUM_SAMPLES) [snip] 0-th dimension must be fixed to 3 but got 15 [snip] ---------------------------------------------------------------------- Ran 21407 tests in 741.602s
OK (KNOWNFAIL=130, SKIP=1775)
Hopefully I didn't miss anything important in hand editing the scipy output.
Thanks a lot for testing. I believe the deprecation warnings are expected, because numpy 1.11.0 introduced a new deprecation warning when using `random_integers`. Scipy 0.17.0 is using `random_integers` in a few places. Best, Matthew
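(For anyone hitting the same warning in their own code, the replacement is mechanical -- note that randint's upper bound is exclusive where random_integers' was inclusive, hence the + 1; a minimal example:)

$ python -c 'import numpy as np; print(np.random.randint(-5, 5 + 1, size=3))'   # replaces np.random.random_integers(-5, 5, size=3)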
Matthew, you are correct. A lot of things happened with random integer generation recently (including deprecating random_integers), but I believe those warnings should be squashed in the upcoming version of SciPy, from what I remember.

On Mon, Apr 4, 2016 at 6:47 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,

On Mon, Apr 4, 2016 at 9:02 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Sun, Apr 3, 2016 at 2:11 AM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgibbo@gmail.com> wrote:

I suspect that many of the maintainers of major scipy-ecosystem projects are aware of these (or other similar) travis wheel caches, but would guess that the pool of travis-ci python users who weren't aware of these wheel caches is much much larger. So there will still be a lot of travis-ci clock cycles saved by manylinux wheels.

-Robert

Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely something I would suggest adding to the release notes. Hopefully this will help trigger a general availability of wheels in the numpy-ecosystem :)
In the case of Travis CI, their VM images for Python already have a version of NumPy installed, but having the latest version of NumPy and SciPy etc available as Linux wheels would be very nice.
We're very nearly there now.
The latest versions of numpy, scipy, scikit-image, pandas, numexpr, statsmodels wheels for testing at
http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...
Please do test with: ...
We would love to get any feedback as to whether these work on your
machines.
Hi Matthew,
Testing on a 64bit CentOS 6 machine with Python 3.5 compiled from source under my home directory:
$ python3.5 -m pip install --upgrade pip Requirement already up-to-date: pip in ./lib/python3.5/site-packages
$ python3.5 -m pip install --trusted-host= ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com --find-links= http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy Requirement already satisfied (use --upgrade to upgrade): numpy in ./lib/python3.5/site-packages Requirement already satisfied (use --upgrade to upgrade): scipy in ./lib/python3.5/site-packages
$ python3.5 -m pip install --trusted-host= ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com --find-links= http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy --upgrade Collecting numpy Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (15.5MB) 100% |████████████████████████████████| 15.5MB 42.1MB/s Collecting scipy Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (40.8MB) 100% |████████████████████████████████| 40.8MB 53.6MB/s Installing collected packages: numpy, scipy Found existing installation: numpy 1.10.4 Uninstalling numpy-1.10.4: Successfully uninstalled numpy-1.10.4 Found existing installation: scipy 0.16.0 Uninstalling scipy-0.16.0: Successfully uninstalled scipy-0.16.0 Successfully installed numpy-1.11.0 scipy-0.17.0
$ python3.5 -c 'import numpy; numpy.test("full")' Running unit tests for numpy NumPy version 1.11.0 NumPy relaxed strides checking option: False NumPy is installed in /home/xxx/lib/python3.5/site-packages/numpy Python version 3.5.0 (default, Sep 28 2015, 11:25:31) [GCC 4.4.7 20120313 (Red Hat 4.4.7-16)] nose version 1.3.7
[... nose progress output (several thousand dots) snipped ...]
---------------------------------------------------------------------- Ran 6332 tests in 243.029s
OK (KNOWNFAIL=7, SKIP=2)
So far so good, but there are a lot of deprecation warnings etc from SciPy,
$ python3.5 -c 'import scipy; scipy.test("full")' Running unit tests for scipy NumPy version 1.11.0 NumPy relaxed strides checking option: False NumPy is installed in /home/xxx/lib/python3.5/site-packages/numpy SciPy version 0.17.0 SciPy is installed in /home/xxx/lib/python3.5/site-packages/scipy Python version 3.5.0 (default, Sep 28 2015, 11:25:31) [GCC 4.4.7 20120313 (Red Hat 4.4.7-16)] nose version 1.3.7 [snip] /home/xxx/lib/python3.5/site-packages/numpy/lib/utils.py:99: DeprecationWarning: `rand` is deprecated! numpy.testing.rand is deprecated in numpy 1.11. Use numpy.random.rand instead. warnings.warn(depdoc, DeprecationWarning) [snip]
/home/xxx/lib/python3.5/site-packages/scipy/io/arff/tests/test_arffread.py:254:
DeprecationWarning: parsing timezone aware datetimes is deprecated; this will raise an error in the future ], dtype='datetime64[m]') /home/xxx/lib/python3.5/site-packages/scipy/io/arff/arffread.py:638: PendingDeprecationWarning: generator '_loadarff.<locals>.generator' raised StopIteration [snip]
/home/xxx/lib/python3.5/site-packages/scipy/sparse/tests/test_base.py:2425:
DeprecationWarning: This function is deprecated. Please call randint(-5, 5 + 1) instead I = np.random.random_integers(-M + 1, M - 1, size=NUM_SAMPLES) [snip] 0-th dimension must be fixed to 3 but got 15 [snip] ---------------------------------------------------------------------- Ran 21407 tests in 741.602s
OK (KNOWNFAIL=130, SKIP=1775)
Hopefully I didn't miss anything important in hand editing the scipy output.
Thanks a lot for testing.
I believe the deprecation warnings are expected, because numpy 1.11.0 introduced a new deprecation warning when using `random_integers`. Scipy 0.17.0 is using `random_integers` in a few places.
Best,
Matthew
Hi, On Sat, Apr 2, 2016 at 6:11 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgibbo@gmail.com> wrote:
I suspect that many of the maintainers of major scipy-ecosystem projects are aware of these (or other similar) travis wheel caches, but would guess that the pool of travis-ci python users who weren't aware of these wheel caches is much much larger. So there will still be a lot of travis-ci clock cycles saved by manylinux wheels.
-Robert
Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely something I would suggest adding to the release notes. Hopefully this will help trigger a general availability of wheels in the numpy-ecosystem :)
In the case of Travis CI, their VM images for Python already have a version of NumPy installed, but having the latest version of NumPy and SciPy etc available as Linux wheels would be very nice.
We're very nearly there now.
The latest versions of numpy, scipy, scikit-image, pandas, numexpr, statsmodels wheels for testing at http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...
Please do test with:
python -m pip install --upgrade pip
pip install --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy scikit-learn numexpr
python -c 'import numpy; numpy.test("full")' python -c 'import scipy; scipy.test("full")'
We would love to get any feedback as to whether these work on your machines.
I've just rebuilt these wheels with the just-released OpenBLAS 0.2.18.

OpenBLAS is now passing all its own tests and tests on numpy / scipy / scikit-learn at http://build.openblas.net/builders

Our tests of the wheels look good too:

http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian
https://travis-ci.org/matthew-brett/manylinux-testing

So I think these are ready to go. I propose uploading these wheels for numpy and scipy to pypi tomorrow unless anyone has an objection.

Cheers,

Matthew
On Tue, Apr 12, 2016 at 7:15 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
On Sat, Apr 2, 2016 at 6:11 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgibbo@gmail.com> wrote:
I suspect that many of the maintainers of major scipy-ecosystem projects are aware of these (or other similar) travis wheel caches, but would guess that the pool of travis-ci python users who weren't aware of these wheel caches is much much larger. So there will still be a lot of travis-ci clock cycles saved by manylinux wheels.
-Robert
Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely something I would suggest adding to the release notes. Hopefully this will help trigger a general availability of wheels in the numpy-ecosystem :)
In the case of Travis CI, their VM images for Python already have a version of NumPy installed, but having the latest version of NumPy and SciPy etc available as Linux wheels would be very nice.
We're very nearly there now.
The latest versions of numpy, scipy, scikit-image, pandas, numexpr, statsmodels wheels for testing at http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...
Please do test with:
python -m pip install --upgrade pip
pip install --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy scikit-learn numexpr
python -c 'import numpy; numpy.test("full")' python -c 'import scipy; scipy.test("full")'
We would love to get any feedback as to whether these work on your machines.
I've just rebuilt these wheels with the just-released OpenBLAS 0.2.18.
OpenBLAS is now passing all its own tests and tests on numpy / scipy / scikit-learn at http://build.openblas.net/builders
Our tests of the wheels look good too:
http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian https://travis-ci.org/matthew-brett/manylinux-testing
So I think these are ready to go. I propose uploading these wheels for numpy and scipy to pypi tomorrow unless anyone has an objection.
Done. If y'all are on linux, and you have pip >= 8.1.1, you should now see this kind of thing:

$ pip install numpy scipy
Collecting numpy
  Downloading numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl (15.3MB)
    100% |████████████████████████████████| 15.3MB 61kB/s
Collecting scipy
  Downloading scipy-0.17.0-cp27-cp27mu-manylinux1_x86_64.whl (39.5MB)
    100% |████████████████████████████████| 39.5MB 24kB/s
Installing collected packages: numpy, scipy
Successfully installed numpy-1.11.0 scipy-0.17.0

Cheers,

Matthew
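(For anyone who would rather keep compiling against their own BLAS instead of using the bundled library, pip can be told to ignore the wheel and build from the sdist -- a hedged sketch, double-check the option spelling against your pip version:)

$ pip install --no-binary numpy numpy   # skip the manylinux wheel and build numpy from source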
\o/ Thank you very much Matthew. I will upload the scikit-learn wheels soon. -- Olivier
Woot! \o/ On Wed, Apr 13, 2016 at 12:15 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,

On Tue, Apr 12, 2016 at 7:15 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Sat, Apr 2, 2016 at 6:11 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon <rmcgibbo@gmail.com> wrote:

I suspect that many of the maintainers of major scipy-ecosystem projects are aware of these (or other similar) travis wheel caches, but would guess that the pool of travis-ci python users who weren't aware of these wheel caches is much much larger. So there will still be a lot of travis-ci clock cycles saved by manylinux wheels.

-Robert

Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely something I would suggest adding to the release notes. Hopefully this will help trigger a general availability of wheels in the numpy-ecosystem :)
In the case of Travis CI, their VM images for Python already have a version of NumPy installed, but having the latest version of NumPy and SciPy etc available as Linux wheels would be very nice.
We're very nearly there now.
The latest versions of numpy, scipy, scikit-image, pandas, numexpr, statsmodels wheels for testing at
http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...
Please do test with:
python -m pip install --upgrade pip
pip install --trusted-host=
ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com
--find-links= http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy scikit-learn numexpr
python -c 'import numpy; numpy.test("full")' python -c 'import scipy; scipy.test("full")'
We would love to get any feedback as to whether these work on your machines.
I've just rebuilt these wheels with the just-released OpenBLAS 0.2.18.
OpenBLAS is now passing all its own tests and tests on numpy / scipy / scikit-learn at http://build.openblas.net/builders
Our tests of the wheels look good too:
http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian https://travis-ci.org/matthew-brett/manylinux-testing
So I think these are ready to go. I propose uploading these wheels for numpy and scipy to pypi tomorrow unless anyone has an objection.
Done. If y'all are on linux, and you have pip >= 8.11, you should now see this kind of thing:
$ pip install numpy scipy Collecting numpy Downloading numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl (15.3MB) 100% |████████████████████████████████| 15.3MB 61kB/s Collecting scipy Downloading scipy-0.17.0-cp27-cp27mu-manylinux1_x86_64.whl (39.5MB) 100% |████████████████████████████████| 39.5MB 24kB/s Installing collected packages: numpy, scipy Successfully installed numpy-1.11.0 scipy-0.17.0
Cheers,
Matthew _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
-- Nathaniel J. Smith -- https://vorpus.org
![](https://secure.gravatar.com/avatar/664d320baa05c827ff08ed361fe77769.jpg?s=120&d=mm&r=g)
On 13 April 2016 at 20:15, Matthew Brett <matthew.brett@gmail.com> wrote:
Done. If y'all are on linux, and you have pip >= 8.11, you should now see this kind of thing:
That's fantastic. Thanks Matt!

I just test installed this and ran numpy.test(). All tests passed but then I got a segfault at the end by (semi-accidentally) hitting Ctrl-C at the prompt:

$ python
Python 2.7.9 (default, Apr  2 2015, 15:33:21)
[GCC 4.9.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy
>>> numpy.test()
Running unit tests for numpy
<snip>
Ran 5781 tests in 72.238s

OK (KNOWNFAIL=6, SKIP=15)
<nose.result.TextTestResult run=5781 errors=0 failures=0>
Segmentation fault (core dumped)

It was stopped at the prompt and then I did Ctrl-C and then the seg-fault message.

$ uname -a
Linux vnwulf 3.19.0-15-generic #15-Ubuntu SMP Thu Apr 16 23:32:37 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 15.04
Release:        15.04
Codename:       vivid

-- Oscar
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Wed, Apr 13, 2016 at 1:29 PM, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
On 13 April 2016 at 20:15, Matthew Brett <matthew.brett@gmail.com> wrote:
Done. If y'all are on linux, and you have pip >= 8.11, you should now see this kind of thing:
That's fantastic. Thanks Matt!
I just test installed this and ran numpy.test(). All tests passed but then I got a segfault at the end by (semi-accidentally) hitting Ctrl-C at the prompt:
$ python Python 2.7.9 (default, Apr 2 2015, 15:33:21) [GCC 4.9.2] on linux2 Type "help", "copyright", "credits" or "license" for more information.
import numpy numpy.test() Running unit tests for numpy <snip> Ran 5781 tests in 72.238s
OK (KNOWNFAIL=6, SKIP=15) <nose.result.TextTestResult run=5781 errors=0 failures=0>
Segmentation fault (core dumped)
It was stopped at the prompt and then I did Ctrl-C and then the seg-fault message.
$ uname -a Linux vnwulf 3.19.0-15-generic #15-Ubuntu SMP Thu Apr 16 23:32:37 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux $ lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 15.04 Release: 15.04 Codename: vivid
Thanks so much for testing - that's very useful. I get the same thing on my Debian Sid machine. Actually I also get the same thing with a local compile against Debian ATLAS, here's the stack trace after:
import numpy; numpy.test() # Ctrl-C
https://gist.github.com/f6d8fb42f24689b39536a2416d717056 Do you get this as well? Cheers, Matthew
![](https://secure.gravatar.com/avatar/664d320baa05c827ff08ed361fe77769.jpg?s=120&d=mm&r=g)
On 13 Apr 2016 21:48, "Matthew Brett" <matthew.brett@gmail.com> wrote:
On Wed, Apr 13, 2016 at 1:29 PM, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
On 13 April 2016 at 20:15, Matthew Brett <matthew.brett@gmail.com>
wrote:
Done. If y'all are on linux, and you have pip >= 8.11, you should now see this kind of thing:
That's fantastic. Thanks Matt!
I just test installed this and ran numpy.test(). All tests passed but then I got a segfault at the end by (semi-accidentally) hitting Ctrl-C at the prompt:
$ python Python 2.7.9 (default, Apr 2 2015, 15:33:21) [GCC 4.9.2] on linux2 Type "help", "copyright", "credits" or "license" for more information.
import numpy numpy.test() Running unit tests for numpy <snip> Ran 5781 tests in 72.238s
OK (KNOWNFAIL=6, SKIP=15) <nose.result.TextTestResult run=5781 errors=0 failures=0>
Segmentation fault (core dumped)
It was stopped at the prompt and then I did Ctrl-C and then the seg-fault message.
$ uname -a Linux vnwulf 3.19.0-15-generic #15-Ubuntu SMP Thu Apr 16 23:32:37 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux $ lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 15.04 Release: 15.04 Codename: vivid
Thanks so much for testing - that's very useful.
I get the same thing on my Debian Sid machine.
Actually I also get the same thing with a local compile against Debian ATLAS, here's the stack trace after:
import numpy; numpy.test() # Ctrl-C
https://gist.github.com/f6d8fb42f24689b39536a2416d717056
Do you get this as well?
It's late here but I'll test again tomorrow. What do I need to do to get comparable output? -- Oscar
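One low-effort way to get a comparable traceback (an editorial suggestion rather than what was used in the thread; it assumes the faulthandler module, which is a separate PyPI package on Python 2.7 and part of the standard library from Python 3.3) is to have the interpreter dump tracebacks on the fatal signal:

```python
# Sketch: dump Python-level tracebacks when the interpreter receives a fatal
# signal; on Python 2.7 this needs "pip install faulthandler" first.
import faulthandler
faulthandler.enable()  # installs handlers for SIGSEGV, SIGFPE, SIGABRT, SIGBUS, SIGILL

import numpy
numpy.test()
# then hit Ctrl-C at the interactive prompt; if the process segfaults,
# faulthandler prints the thread tracebacks before the core dump
```

This will not reproduce a gdb-style C backtrace like the gist above, but it is usually enough to see which Python frame the crash happens under.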
![](https://secure.gravatar.com/avatar/97c543aca1ac7bbcfb5279d0300c8330.jpg?s=120&d=mm&r=g)
I can reproduce in self-compiled 1.9, so it's not a new bug.

I think something's going wrong with NPY_SIGINT_ON / NPY_SIGINT_OFF, where our special sigint handler is getting left in place even after our code finishes running.

Skimming the code, my best guess is that this is due to a race condition in how we save/restore the original signal handler, when multiple threads are running numpy fftpack code at the same time (and thus using NPY_SIGINT_{ON,OFF} from multiple threads).

-n

On Wed, Apr 13, 2016 at 1:47 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Wed, Apr 13, 2016 at 1:29 PM, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
On 13 April 2016 at 20:15, Matthew Brett <matthew.brett@gmail.com> wrote:
Done. If y'all are on linux, and you have pip >= 8.11, you should now see this kind of thing:
That's fantastic. Thanks Matt!
I just test installed this and ran numpy.test(). All tests passed but then I got a segfault at the end by (semi-accidentally) hitting Ctrl-C at the prompt:
$ python Python 2.7.9 (default, Apr 2 2015, 15:33:21) [GCC 4.9.2] on linux2 Type "help", "copyright", "credits" or "license" for more information.
import numpy numpy.test() Running unit tests for numpy <snip> Ran 5781 tests in 72.238s
OK (KNOWNFAIL=6, SKIP=15) <nose.result.TextTestResult run=5781 errors=0 failures=0>
Segmentation fault (core dumped)
It was stopped at the prompt and then I did Ctrl-C and then the seg-fault message.
$ uname -a Linux vnwulf 3.19.0-15-generic #15-Ubuntu SMP Thu Apr 16 23:32:37 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux $ lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 15.04 Release: 15.04 Codename: vivid
Thanks so much for testing - that's very useful.
I get the same thing on my Debian Sid machine.
Actually I also get the same thing with a local compile against Debian ATLAS, here's the stack trace after:
import numpy; numpy.test() # Ctrl-C
https://gist.github.com/f6d8fb42f24689b39536a2416d717056
Do you get this as well?
Cheers,
Matthew _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
-- Nathaniel J. Smith -- https://vorpus.org
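The failure mode Nathaniel describes can be sketched in a few lines of pure Python. This is only an editorial illustration of the save/restore interleaving (numpy's NPY_SIGINT_ON / NPY_SIGINT_OFF are C macros, and temporary_sigint_handler below is a made-up stand-in), not the actual fftpack code:

```python
# Illustration of the suspected save/restore race, written out as a sequential
# interleaving of two hypothetical threads A and B (signal.signal may only be
# called from the main thread, so the "threads" here are just comments).
import signal

def temporary_sigint_handler(signum, frame):   # stand-in for the special handler
    print("temporary SIGINT handler")

original = signal.getsignal(signal.SIGINT)

saved_a = signal.getsignal(signal.SIGINT)                # A: save (the original handler)
signal.signal(signal.SIGINT, temporary_sigint_handler)   # A: install temporary handler
saved_b = signal.getsignal(signal.SIGINT)                # B: save -- but this is A's temporary handler
signal.signal(signal.SIGINT, temporary_sigint_handler)   # B: install temporary handler
signal.signal(signal.SIGINT, saved_a)                    # A: restore -> back to the original, fine
signal.signal(signal.SIGINT, saved_b)                    # B: restore -> reinstalls the temporary handler

# the temporary handler is left in place after both "threads" have finished
assert signal.getsignal(signal.SIGINT) is temporary_sigint_handler
signal.signal(signal.SIGINT, original)   # clean up for this demo
```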
![](https://secure.gravatar.com/avatar/97c543aca1ac7bbcfb5279d0300c8330.jpg?s=120&d=mm&r=g)
https://github.com/numpy/numpy/issues/7545 On Wed, Apr 13, 2016 at 3:38 PM, Nathaniel Smith <njs@pobox.com> wrote:
I can reproduce in self-compiled 1.9, so it's not a new bug.
I think something's going wrong with NPY_SIGINT_ON / NPY_SIGINT_OFF, where our special sigint handler is getting left in place even after our code finishes running.
Skimming the code, my best guess is that this is due to a race condition in how we save/restore the original signal handler, when multiple threads are running numpy fftpack code at the same time (and thus using NPY_SIGINT_{ON,OFF} from multiple threads).
-n
On Wed, Apr 13, 2016 at 1:47 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Wed, Apr 13, 2016 at 1:29 PM, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
On 13 April 2016 at 20:15, Matthew Brett <matthew.brett@gmail.com> wrote:
Done. If y'all are on linux, and you have pip >= 8.11, you should now see this kind of thing:
That's fantastic. Thanks Matt!
I just test installed this and ran numpy.test(). All tests passed but then I got a segfault at the end by (semi-accidentally) hitting Ctrl-C at the prompt:
$ python Python 2.7.9 (default, Apr 2 2015, 15:33:21) [GCC 4.9.2] on linux2 Type "help", "copyright", "credits" or "license" for more information.
import numpy numpy.test() Running unit tests for numpy <snip> Ran 5781 tests in 72.238s
OK (KNOWNFAIL=6, SKIP=15) <nose.result.TextTestResult run=5781 errors=0 failures=0>
Segmentation fault (core dumped)
It was stopped at the prompt and then I did Ctrl-C and then the seg-fault message.
$ uname -a Linux vnwulf 3.19.0-15-generic #15-Ubuntu SMP Thu Apr 16 23:32:37 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux $ lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 15.04 Release: 15.04 Codename: vivid
Thanks so much for testing - that's very useful.
I get the same thing on my Debian Sid machine.
Actually I also get the same thing with a local compile against Debian ATLAS, here's the stack trace after:
import numpy; numpy.test() # Ctrl-C
https://gist.github.com/f6d8fb42f24689b39536a2416d717056
Do you get this as well?
Cheers,
Matthew _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
-- Nathaniel J. Smith -- https://vorpus.org
-- Nathaniel J. Smith -- https://vorpus.org
![](https://secure.gravatar.com/avatar/21d2fbbb409915032f42c1bafa70cfc5.jpg?s=120&d=mm&r=g)
I have tried testing the wheels in a project that runs tests on Travis's Trusty infrastructure. The wheels work great for python 3.5 and save us several minutes of runtime.
However, I am having trouble using the wheels on python 2.7 on the same Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl) whereas pip.pep425tags.get_abi_tag() returns cp27m on this particular python version. (Stock python 2.7 installed on Travis 14.04 VMs)
Any chance of a cp27m compatible wheel build?
best
Jens
On Thu, 14 Apr 2016 at 01:46 Nathaniel Smith <njs@pobox.com> wrote:
https://github.com/numpy/numpy/issues/7545
On Wed, Apr 13, 2016 at 3:38 PM, Nathaniel Smith <njs@pobox.com> wrote:
I can reproduce in self-compiled 1.9, so it's not a new bug.
I think something's going wrong with NPY_SIGINT_ON / NPY_SIGINT_OFF, where our special sigint handler is getting left in place even after our code finishes running.
Skimming the code, my best guess is that this is due to a race condition in how we save/restore the original signal handler, when multiple threads are running numpy fftpack code at the same time (and thus using NPY_SIGINT_{ON,OFF} from multiple threads).
-n
On Wed, Apr 13, 2016 at 1:47 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Wed, Apr 13, 2016 at 1:29 PM, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
On 13 April 2016 at 20:15, Matthew Brett <matthew.brett@gmail.com> wrote:
Done. If y'all are on linux, and you have pip >= 8.11, you should now see this kind of thing:
That's fantastic. Thanks Matt!
I just test installed this and ran numpy.test(). All tests passed but then I got a segfault at the end by (semi-accidentally) hitting Ctrl-C at the prompt:
$ python Python 2.7.9 (default, Apr 2 2015, 15:33:21) [GCC 4.9.2] on linux2 Type "help", "copyright", "credits" or "license" for more information.
> import numpy > numpy.test() Running unit tests for numpy <snip> Ran 5781 tests in 72.238s
OK (KNOWNFAIL=6, SKIP=15) <nose.result.TextTestResult run=5781 errors=0 failures=0>
> Segmentation fault (core dumped)
It was stopped at the prompt and then I did Ctrl-C and then the seg-fault message.
$ uname -a Linux vnwulf 3.19.0-15-generic #15-Ubuntu SMP Thu Apr 16 23:32:37 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux $ lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 15.04 Release: 15.04 Codename: vivid
Thanks so much for testing - that's very useful.
I get the same thing on my Debian Sid machine.
Actually I also get the same thing with a local compile against Debian ATLAS, here's the stack trace after:
import numpy; numpy.test() # Ctrl-C
https://gist.github.com/f6d8fb42f24689b39536a2416d717056
Do you get this as well?
Cheers,
Matthew _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
-- Nathaniel J. Smith -- https://vorpus.org
-- Nathaniel J. Smith -- https://vorpus.org _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
Hi, On Thu, Apr 14, 2016 at 8:02 AM, Jens Nielsen <jenshnielsen@gmail.com> wrote:
I have tried testing the wheels in a project that runs tests on Travis's Trusty infrastructure which. The wheels work great for python 3.5 and saves us several minuts of runtime.
However, I am having trouble using the wheels on python 2.7 on the same Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl) where as pip.pep425tags.get_abi_tag() returns cp27m on this particular python version. (Stock python 2.7 installed on Travis 14.04 VMs) Any chance of a cp27m compatible wheel build?
Ouch - do you know where travis-ci's Python 2.7 comes from? I see that the standard apt-get install -y python is a wide (mu) build... Cheers, Matthew
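For anyone trying to work out which of the two 2.7 wheels their interpreter will accept, a quick check is shown below. It is a sketch using pip's internal pep425tags module (pip >= 8.1), which is not a stable public API, so treat it as a debugging aid:

```python
# Report the ABI tag pip matches against wheel filenames on this interpreter:
# 'cp27mu' means a wide-unicode (UCS-4) CPython 2.7, 'cp27m' a narrow build.
from pip import pep425tags

print(pep425tags.get_abi_tag())               # e.g. 'cp27mu' or 'cp27m'
print(pep425tags.is_manylinux1_compatible())  # True on a manylinux1-compatible system
```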
![](https://secure.gravatar.com/avatar/697900d3a29858ea20cc109a2aee0af6.jpg?s=120&d=mm&r=g)
Are we going to have to have documentation somewhere making it clear that the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because that is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons? Or is this not an issue? Ben Root On Thu, Apr 14, 2016 at 2:04 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
On Thu, Apr 14, 2016 at 8:02 AM, Jens Nielsen <jenshnielsen@gmail.com> wrote:
I have tried testing the wheels in a project that runs tests on Travis's Trusty infrastructure which. The wheels work great for python 3.5 and saves us several minuts of runtime.
However, I am having trouble using the wheels on python 2.7 on the same Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl) where as pip.pep425tags.get_abi_tag() returns cp27m on this particular python version. (Stock python 2.7 installed on Travis 14.04 VMs) Any chance of a cp27m compatible wheel build?
Ouch - do you know where travis-ci's Python 2.7 comes from? I see that the standard apt-get install -y python is a wide (mu) build...
Cheers,
Matthew _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
Hi, On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root <ben.v.root@gmail.com> wrote:
Are we going to have to have documentation somewhere making it clear that the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because that is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons?
Or is this not an issue?
I'm afraid I don't know conda at all, but I'm guessing that pip will not install numpy when it is installed via conda.

So the potential difference is that, pre-wheel, if numpy was not installed in your conda environment, then pip would build numpy from source, whereas now you'll get a binary install.

I _think_ that Python's binary API specification (pip.pep425tags.get_abi_tag()) should prevent pip from installing an incompatible wheel. Are there any conda experts out there who can give more detail, or more convincing assurance?

Cheers,
Matthew
![](https://secure.gravatar.com/avatar/d495a692ec174d3053818ff85b96fbb7.jpg?s=120&d=mm&r=g)
Actually, conda pip will install the wheels that you put up. The good news is: they all (by which I mean *numpy* and *scipy* both on 2.7 and 3.5) pass! On Thu, Apr 14, 2016 at 7:26 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
Are we going to have to have documentation somewhere making it clear that the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because
On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root <ben.v.root@gmail.com> wrote: that
is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons?
Or is this not an issue?
I'm afraid I don't know conda at all, but I'm guessing that pip will not install numpy when it is installed via conda.
So the potential difference is that, pre-wheel, if numpy was not installed in your conda environment, then pip would build numpy from source, whereas now you'll get a binary install.
I _think_ that Python's binary API specification (pip.pep425tags.get_abi_tag()) should prevent pip from installing an incompatible wheel. Are there any conda experts out there who can give more detail, or more convincing assurance?
Cheers,
Matthew _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
![](https://secure.gravatar.com/avatar/cd71bb17a5ef04a06383cdcde1f370ad.jpg?s=120&d=mm&r=g)
On 4/14/16 1:26 PM, Matthew Brett wrote:
Hi,
On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root <ben.v.root@gmail.com> wrote:
Are we going to have to have documentation somewhere making it clear that the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because that is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons?
Or is this not an issue?
I'm afraid I don't know conda at all, but I'm guessing that pip will not install numpy when it is installed via conda.
Correct, pip will not (or at least should not, and did not in my tests) install numpy over top of an existing conda installed numpy. Unfortunately from my testing, conda will install a conda version of numpy over top of a pip installed version. This may be the expected behavior as conda maintains its own list of installed packages.
So the potential difference is that, pre-wheel, if numpy was not installed in your conda environment, then pip would build numpy from source, whereas now you'll get a binary install.
I _think_ that Python's binary API specification (pip.pep425tags.get_abi_tag()) should prevent pip from installing an incompatible wheel. Are there any conda experts out there who can give more detail, or more convincing assurance?
Cheers,
Matthew
I tested "pip install numpy" in conda environments (conda's equivalent to virtualenvs) which did not have numpy installed previously for Python 2.7, 3.4 and 3.5 in a Ubuntu 14.04 Docker container. In all cases numpy was installed from the whl file and appeared to be functional. Running the numpy test suite found three failing tests for Python 2.7 and 3.5 and 21 errors in Python 3.4. The 2.7 and 3.5 failures do not look concerning but the 3.4 errors are a bit strange. Logs are in https://gist.github.com/jjhelmus/a433a66d56fb0e39b8ebde248ad3fe36
Cheers,
- Jonathan Helmus
_______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Thu, Apr 14, 2016 at 12:25 PM, Jonathan Helmus <jjhelmus@gmail.com> wrote:
On 4/14/16 1:26 PM, Matthew Brett wrote:
Hi,
On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root <ben.v.root@gmail.com> wrote:
Are we going to have to have documentation somewhere making it clear that the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because that is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons?
Or is this not an issue?
I'm afraid I don't know conda at all, but I'm guessing that pip will not install numpy when it is installed via conda.
Correct, pip will not (or at least should not, and did not in my tests) install numpy over top of an existing conda installed numpy. Unfortunately from my testing, conda will install a conda version of numpy over top of a pip installed version. This may be the expected behavior as conda maintains its own list of installed packages.
So the potential difference is that, pre-wheel, if numpy was not installed in your conda environment, then pip would build numpy from source, whereas now you'll get a binary install.
I _think_ that Python's binary API specification (pip.pep425tags.get_abi_tag()) should prevent pip from installing an incompatible wheel. Are there any conda experts out there who can give more detail, or more convincing assurance?
I tested "pip install numpy" in conda environments (conda's equivalent to virtualenvs) which did not have numpy installed previously for Python 2.7, 3.4 and 3.5 in a Ubuntu 14.04 Docker container. In all cases numpy was installed from the whl file and appeared to be functional. Running the numpy test suite found three failing tests for Python 2.7 and 3.5 and 21 errors in Python 3.4. The 2.7 and 3.5 failures do not look concerning but the 3.4 errors are a bit strange. Logs are in https://gist.github.com/jjhelmus/a433a66d56fb0e39b8ebde248ad3fe36
Thanks for testing. For:

docker run -ti --rm ubuntu:14.04 /bin/bash

apt-get update && apt-get install -y curl
curl -LO https://bootstrap.pypa.io/get-pip.py
python3 get-pip.py
pip install numpy nose
python3 -c "import numpy; numpy.test()"

I get:

FAILED (KNOWNFAIL=7, SKIP=17, errors=21)

This is stock Python 3.4 - so not a conda issue. It is definitely a problem with the wheel because a compiled numpy wheel on the same docker image:

apt-get update && apt-get install -y curl python3-dev
curl -LO https://bootstrap.pypa.io/get-pip.py
python3 get-pip.py
pip install --no-binary=:all: numpy nose
python3 -c "import numpy; numpy.test()"

gives no test errors.

It looks like we have some more work to do...

Cheers,
Matthew
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Thu, Apr 14, 2016 at 12:57 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Thu, Apr 14, 2016 at 12:25 PM, Jonathan Helmus <jjhelmus@gmail.com> wrote:
On 4/14/16 1:26 PM, Matthew Brett wrote:
Hi,
On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root <ben.v.root@gmail.com> wrote:
Are we going to have to have documentation somewhere making it clear that the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because that is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons?
Or is this not an issue?
I'm afraid I don't know conda at all, but I'm guessing that pip will not install numpy when it is installed via conda.
Correct, pip will not (or at least should not, and did not in my tests) install numpy over top of an existing conda installed numpy. Unfortunately from my testing, conda will install a conda version of numpy over top of a pip installed version. This may be the expected behavior as conda maintains its own list of installed packages.
So the potential difference is that, pre-wheel, if numpy was not installed in your conda environment, then pip would build numpy from source, whereas now you'll get a binary install.
I _think_ that Python's binary API specification (pip.pep425tags.get_abi_tag()) should prevent pip from installing an incompatible wheel. Are there any conda experts out there who can give more detail, or more convincing assurance?
I tested "pip install numpy" in conda environments (conda's equivalent to virtualenvs) which did not have numpy installed previously for Python 2.7, 3.4 and 3.5 in a Ubuntu 14.04 Docker container. In all cases numpy was installed from the whl file and appeared to be functional. Running the numpy test suite found three failing tests for Python 2.7 and 3.5 and 21 errors in Python 3.4. The 2.7 and 3.5 failures do not look concerning but the 3.4 errors are a bit strange. Logs are in https://gist.github.com/jjhelmus/a433a66d56fb0e39b8ebde248ad3fe36
Thanks for testing. For:
docker run -ti --rm ubuntu:14.04 /bin/bash
apt-get update && apt-get install -y curl curl -LO https://bootstrap.pypa.io/get-pip.py python3 get-pip.py pip install numpy nose python3 -c "import numpy; numpy.test()"
I get:
FAILED (KNOWNFAIL=7, SKIP=17, errors=21)
This is stock Python 3.4 - so not a conda issue. It is definitely a problem with the wheel because a compiled numpy wheel on the same docker image:
apt-get update && apt-get install -y curl python3-dev curl -LO https://bootstrap.pypa.io/get-pip.py python3 get-pip.py pip install --no-binary=:all: numpy nose python3 -c "import numpy; numpy.test()"
gives no test errors.
It looks like we have some more work to do...
Actually, I can solve these errors by first doing:

apt-get install gcc

I think these must be bugs in the numpy tests where numpy is assuming a functional compiler.

Does the conda numpy give test errors when there is no compiler?

Cheers,
Matthew
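A quick way to check for the condition Matthew is describing (a small editorial sketch, not part of the numpy test suite) is to look for a C compiler on the PATH before running the tests:

```python
# Check whether a C compiler is available; some numpy tests at the time
# assumed one was present and reported errors when it was not.
from distutils.spawn import find_executable

for cc in ("cc", "gcc", "clang"):
    print(cc, "->", find_executable(cc) or "not found")
```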
![](https://secure.gravatar.com/avatar/cd71bb17a5ef04a06383cdcde1f370ad.jpg?s=120&d=mm&r=g)
On 4/14/16 3:11 PM, Matthew Brett wrote:
On Thu, Apr 14, 2016 at 12:57 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Thu, Apr 14, 2016 at 12:25 PM, Jonathan Helmus <jjhelmus@gmail.com> wrote:
On 4/14/16 1:26 PM, Matthew Brett wrote:
Hi,
On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root <ben.v.root@gmail.com> wrote:
Are we going to have to have documentation somewhere making it clear that the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because that is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons?
Or is this not an issue?
I'm afraid I don't know conda at all, but I'm guessing that pip will not install numpy when it is installed via conda.
Correct, pip will not (or at least should not, and did not in my tests) install numpy over top of an existing conda installed numpy. Unfortunately from my testing, conda will install a conda version of numpy over top of a pip installed version. This may be the expected behavior as conda maintains its own list of installed packages.
So the potential difference is that, pre-wheel, if numpy was not installed in your conda environment, then pip would build numpy from source, whereas now you'll get a binary install.
I _think_ that Python's binary API specification (pip.pep425tags.get_abi_tag()) should prevent pip from installing an incompatible wheel. Are there any conda experts out there who can give more detail, or more convincing assurance?
I tested "pip install numpy" in conda environments (conda's equivalent to virtualenvs) which did not have numpy installed previously for Python 2.7, 3.4 and 3.5 in a Ubuntu 14.04 Docker container. In all cases numpy was installed from the whl file and appeared to be functional. Running the numpy test suite found three failing tests for Python 2.7 and 3.5 and 21 errors in Python 3.4. The 2.7 and 3.5 failures do not look concerning but the 3.4 errors are a bit strange. Logs are in https://gist.github.com/jjhelmus/a433a66d56fb0e39b8ebde248ad3fe36
Thanks for testing. For:
docker run -ti --rm ubuntu:14.04 /bin/bash
apt-get update && apt-get install -y curl curl -LO https://bootstrap.pypa.io/get-pip.py python3 get-pip.py pip install numpy nose python3 -c "import numpy; numpy.test()"
I get:
FAILED (KNOWNFAIL=7, SKIP=17, errors=21)
This is stock Python 3.4 - so not a conda issue. It is definitely a problem with the wheel because a compiled numpy wheel on the same docker image:
apt-get update && apt-get install -y curl python3-dev curl -LO https://bootstrap.pypa.io/get-pip.py python3 get-pip.py pip install --no-binary=:all: numpy nose python3 -c "import numpy; numpy.test()"
gives no test errors.
It looks like we have some more work to do... Actually, I can solve these errors by first doing:
apt-get install gcc
I think these must be bugs in the numpy tests where numpy is assuming a functional compiler.
Does the conda numpy give test errors when there is no compiler?
Cheers,
Matthew
Yes, both the wheel and conda numpy packages give errors when there is not a compiler. These errors clear when gcc is installed. Looks like the wheels are fine, just forgot about a compiler. Cheers, - Jonathan Helmus
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Thu, Apr 14, 2016 at 1:47 PM, Jonathan Helmus <jjhelmus@gmail.com> wrote:
On 4/14/16 3:11 PM, Matthew Brett wrote:
On Thu, Apr 14, 2016 at 12:57 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Thu, Apr 14, 2016 at 12:25 PM, Jonathan Helmus <jjhelmus@gmail.com> wrote:
On 4/14/16 1:26 PM, Matthew Brett wrote:
Hi,
On Thu, Apr 14, 2016 at 11:11 AM, Benjamin Root <ben.v.root@gmail.com> wrote:
Are we going to have to have documentation somewhere making it clear that the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because that is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons?
Or is this not an issue?
I'm afraid I don't know conda at all, but I'm guessing that pip will not install numpy when it is installed via conda.
Correct, pip will not (or at least should not, and did not in my tests) install numpy over top of an existing conda installed numpy. Unfortunately from my testing, conda will install a conda version of numpy over top of a pip installed version. This may be the expected behavior as conda maintains its own list of installed packages.
So the potential difference is that, pre-wheel, if numpy was not installed in your conda environment, then pip would build numpy from source, whereas now you'll get a binary install.
I _think_ that Python's binary API specification (pip.pep425tags.get_abi_tag()) should prevent pip from installing an incompatible wheel. Are there any conda experts out there who can give more detail, or more convincing assurance?
I tested "pip install numpy" in conda environments (conda's equivalent to virtualenvs) which did not have numpy installed previously for Python 2.7, 3.4 and 3.5 in a Ubuntu 14.04 Docker container. In all cases numpy was installed from the whl file and appeared to be functional. Running the numpy test suite found three failing tests for Python 2.7 and 3.5 and 21 errors in Python 3.4. The 2.7 and 3.5 failures do not look concerning but the 3.4 errors are a bit strange. Logs are in https://gist.github.com/jjhelmus/a433a66d56fb0e39b8ebde248ad3fe36
Thanks for testing. For:
docker run -ti --rm ubuntu:14.04 /bin/bash
apt-get update && apt-get install -y curl curl -LO https://bootstrap.pypa.io/get-pip.py python3 get-pip.py pip install numpy nose python3 -c "import numpy; numpy.test()"
I get:
FAILED (KNOWNFAIL=7, SKIP=17, errors=21)
This is stock Python 3.4 - so not a conda issue. It is definitely a problem with the wheel because a compiled numpy wheel on the same docker image:
apt-get update && apt-get install -y curl python3-dev curl -LO https://bootstrap.pypa.io/get-pip.py python3 get-pip.py pip install --no-binary=:all: numpy nose python3 -c "import numpy; numpy.test()"
gives no test errors.
It looks like we have some more work to do...
Actually, I can solve these errors by first doing:
apt-get install gcc
I think these must be bugs in the numpy tests where numpy is assuming a functional compiler.
Does the conda numpy give test errors when there is no compiler?
Cheers,
Matthew
Yes, both the wheel and conda numpy packages give errors when there is not a compiler. These errors clear when gcc is installed. Looks like the wheels are fine, just forgot about a compiler.
Thanks for checking. I think the problem is fixed here: https://github.com/numpy/numpy/pull/7549 Cheers, Matthew
![](https://secure.gravatar.com/avatar/97c543aca1ac7bbcfb5279d0300c8330.jpg?s=120&d=mm&r=g)
On Apr 14, 2016 11:11 AM, "Benjamin Root" <ben.v.root@gmail.com> wrote:
Are we going to have to have documentation somewhere making it clear that
the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because that is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons?
Or is this not an issue?
There are always issues when you have two different package managers maintaining separate and out-of-sync metadata about what they think is installed, but that's true for any mixed use of conda and pip.

But:
- pip won't install a numpy that is incompatible with your python, unless Anaconda is actively breaking cpython's standard abi (they aren't) or there's a bug in pip (possible, but no reports yet).
- conda packages for python packages like numpy do generally include the .egg-info / .dist-info directories that pip uses to store its installation metadata, so pip can "see" packages installed by conda (but not vice-versa). So "pip install matplotlib" won't drag in a pypi numpy if there's already a conda numpy installed.

AFAIK the one case that's nasty is if you first install a conda X, and then install a pypi X, and then try to use conda to (explicitly, or implicitly via dependencies) upgrade X. And maybe this is particularly nasty for X=numpy just because numpy is so low in the stack, but it's not really numpy specific. (NB I'm not an expert on the internals of conda though :-).)

Actually, from the numpy developer point of view, one of the major advantages of having wheels is that we can ask people to test prereleases with 'pip install -U --pre numpy'. If you're a conda user you should only do this in a temporary environment (like any use of pip really), but I definitely hope that some conda users will do exactly that to test things :-).

Also note that there's nothing Linux specific about this scenario. We've been shipping osx wheels for ages, and AFAIK it hasn't caused any disaster.

-n
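The metadata point above is easy to check for yourself: because conda's python packages ship the same .egg-info / .dist-info directories, ordinary setuptools tooling can enumerate them. A small sketch, run inside a conda environment or any other Python:

```python
# List the installed distributions visible through .egg-info / .dist-info
# metadata, the same records pip consults before deciding what to install.
import pkg_resources

for dist in sorted(pkg_resources.working_set, key=lambda d: d.project_name.lower()):
    print(dist.project_name, dist.version, dist.location)
```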
![](https://secure.gravatar.com/avatar/7840b7a3579d2a7065d5c0aa804b5b92.jpg?s=120&d=mm&r=g)
On Thu, Apr 14, 2016 at 12:07 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Apr 14, 2016 11:11 AM, "Benjamin Root" <ben.v.root@gmail.com> wrote:
Are we going to have to have documentation somewhere making it clear
that the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because that is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons?
Or is this not an issue?
There are always issues when you have two different package managers maintaining separate and out-of-sync metadata about what they think is installed, but that's true for any mixed use of conda and pip.
But: - pip won't install a numpy that is incompatible with your python, unless Anaconda is actively breaking cpython's standard abi (they aren't) or there's a bug in pip (possible, but no reports yet). - conda packages for python packages like numpy do generally include the .egg-info / .dist-info directories that pip uses to store its installation metadata, so pip can "see" packages installed by conda (but not vice-versa). So "pip install matplotlib" won't drag in a pypi numpy if there's already a conda numpy installed.
Minor clarification: I believe conda can see pip-installed packages. If I execute "conda list" in an environment, I can see packages installed by pip, conda, and locally (i.e., "pip install -e ."). -paul
![](https://secure.gravatar.com/avatar/697900d3a29858ea20cc109a2aee0af6.jpg?s=120&d=mm&r=g)
I am honestly surprised that these worked (I haven't gotten around to testing for myself). I could have sworn there was a difference in how Continuum compiled python such that any binaries built against a stock python would not work in a conda environment. I ran into issues a couple years ago where a modwsgi package provided through yum wouldn't work with miniconda because of link-time differences. I cannot for the life of me remember the error message, though. Ben Root On Thu, Apr 14, 2016 at 3:59 PM, Paul Hobson <pmhobson@gmail.com> wrote:
On Thu, Apr 14, 2016 at 12:07 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Apr 14, 2016 11:11 AM, "Benjamin Root" <ben.v.root@gmail.com> wrote:
Are we going to have to have documentation somewhere making it clear
that the numpy wheel shouldn't be used in a conda environment? Not that I would expect this issue to come up all that often, but I could imagine a scenario where a non-scientist is simply using a base conda distribution because that is what IT put on their system. Then they do "pip install ipython" that indirectly brings in numpy (through the matplotlib dependency), and end up with an incompatible numpy because they would have been linked against different pythons?
Or is this not an issue?
There are always issues when you have two different package managers maintaining separate and out-of-sync metadata about what they think is installed, but that's true for any mixed use of conda and pip.
But: - pip won't install a numpy that is incompatible with your python, unless Anaconda is actively breaking cpython's standard abi (they aren't) or there's a bug in pip (possible, but no reports yet). - conda packages for python packages like numpy do generally include the .egg-info / .dist-info directories that pip uses to store its installation metadata, so pip can "see" packages installed by conda (but not vice-versa). So "pip install matplotlib" won't drag in a pypi numpy if there's already a conda numpy installed.
Minor clarification:. I believe conda can see pip-installed packages.
If I execute "conda list" in an environment, I can see packaged installed by both pip, conda, and locally (i.e., "pip install . -e").
-paul
_______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
Hi, On Thu, Apr 14, 2016 at 11:04 AM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
On Thu, Apr 14, 2016 at 8:02 AM, Jens Nielsen <jenshnielsen@gmail.com> wrote:
I have tried testing the wheels in a project that runs tests on Travis's Trusty infrastructure which. The wheels work great for python 3.5 and saves us several minuts of runtime.
However, I am having trouble using the wheels on python 2.7 on the same Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl) where as pip.pep425tags.get_abi_tag() returns cp27m on this particular python version. (Stock python 2.7 installed on Travis 14.04 VMs) Any chance of a cp27m compatible wheel build?
Ouch - do you know where travis-ci's Python 2.7 comes from? I see that the standard apt-get install -y python is a wide (mu) build...
I built some narrow unicode builds (numpy-1.11.0-cp27-cp27m-manylinux1_x86_64.whl etc) here: http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... Would you mind testing them to see if they work on travis-ci? Thanks, Matthew
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Sat, Apr 16, 2016 at 8:02 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
On Thu, Apr 14, 2016 at 11:04 AM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
On Thu, Apr 14, 2016 at 8:02 AM, Jens Nielsen <jenshnielsen@gmail.com> wrote:
I have tried testing the wheels in a project that runs tests on Travis's Trusty infrastructure which. The wheels work great for python 3.5 and saves us several minuts of runtime.
However, I am having trouble using the wheels on python 2.7 on the same Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl) where as pip.pep425tags.get_abi_tag() returns cp27m on this particular python version. (Stock python 2.7 installed on Travis 14.04 VMs) Any chance of a cp27m compatible wheel build?
Ouch - do you know where travis-ci's Python 2.7 comes from? I see that the standard apt-get install -y python is a wide (mu) build...
I built some narrow unicode builds (numpy-1.11.0-cp27-cp27m-manylinux1_x86_64.whl etc) here:
http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...
Would you mind testing them to see if they work on travis-ci?
I tried testing on trusty with travis-ci, but it appears to pick up the mu builds as on precise... https://travis-ci.org/matthew-brett/manylinux-testing/jobs/123652670#L161 Cheers, Matthew
![](https://secure.gravatar.com/avatar/aee56554ec30edfd680e1c937ed4e54d.jpg?s=120&d=mm&r=g)
I tried on trusty and it also picked numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl using the system python 2.7 (in a virtualenv with pip 8.1.1):

>>> import pip
>>> pip.pep425tags.get_abi_tag()
'cp27mu'

Outside of the virtualenv I still have the pip version from ubuntu trusty and it cannot detect ABI tags:

$ /usr/bin/pip --version
pip 1.5.4 from /usr/lib/python2.7/dist-packages (python 2.7)

>>> import pip
>>> pip.pep425tags.get_abi_tag()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute 'get_abi_tag'
But we don't really care because manylinux1 wheels can only be installed by pip 8.1 and later. Previous versions of pip should just ignore those wheels and try to install from the source tarball instead. -- Olivier
![](https://secure.gravatar.com/avatar/21d2fbbb409915032f42c1bafa70cfc5.jpg?s=120&d=mm&r=g)
I have tested the new cp27m wheels and they seem to work great.
@Matthew I am using the:
```
sudo: required
dist: trusty
```
images mentioned here https://docs.travis-ci.com/user/ci-environment/. As far as I can see you are doing:
```
sudo: false
dist: trusty
```
I had no idea such an image exists, since it's not documented on https://docs.travis-ci.com/user/ci-environment/
Anyway, your tests run with python 2.7.9 whereas the sudo: required image ships python 2.7.10, so it's clearly a different python version.
@Olivier Grisel this only applies to Travis's own home-built versions of python 2.7 on the Trusty image running on google compute engine. It ships its own prebuilt python version. I don't have any issues with the stock versions on Ubuntu, which pip tells me are indeed cp27mu.
It seems like the new cp27m wheels work as expected. Thanks a lot. Doing:
```
python -c "from pip import pep425tags; print(pep425tags.is_manylinux1_compatible()); print(pep425tags.have_compatible_glibc(2, 5)); print(pep425tags.get_abi_tag())"
pip install --timeout=60 --no-index --trusted-host "ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com" --find-links "http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn..." numpy scipy --upgrade
```
results in:
```
True
True
cp27m
Ignoring indexes: https://pypi.python.org/simple
Collecting numpy
  Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (15.3MB)
    100% |████████████████████████████████| 15.3MB 49.0MB/s
Collecting scipy
  Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (39.5MB)
    100% |████████████████████████████████| 39.5MB 21.1MB/s
Installing collected packages: numpy, scipy
  Found existing installation: numpy 1.10.1
    Uninstalling numpy-1.10.1:
      Successfully uninstalled numpy-1.10.1
Successfully installed numpy-1.11.0 scipy-0.17.0
```
And all my tests pass as expected. Thanks a lot for all the work.
Best
Jens
On Sun, 17 Apr 2016 at 11:05 Olivier Grisel <olivier.grisel@ensta.org> wrote:
I tried on trusty and is also picked numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl using the system python 2.7 (in a virtualenv with pip 8.1.1):
import pip pip.pep425tags.get_abi_tag() 'cp27mu'
Outside of the virtualenv I still have the pip version from ubuntu trusty and it does cannot detect ABI tags:
$ /usr/bin/pip --version pip 1.5.4 from /usr/lib/python2.7/dist-packages (python 2.7)
import pip pip.pep425tags.get_abi_tag() Traceback (most recent call last): File "<stdin>", line 1, in <module> AttributeError: 'module' object has no attribute 'get_abi_tag'
But we don't really care because manylinux1 wheels can only be installed by pip 8.1 and later. Previous versions of pip should just ignore those wheels and try to install from the source tarball instead.
-- Olivier _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
![](https://secure.gravatar.com/avatar/aee56554ec30edfd680e1c937ed4e54d.jpg?s=120&d=mm&r=g)
Thanks for the clarification, I read your original report too quickly. I wonder why the travis maintainers built Python 2.7 with a non-standard unicode option. Edit (after googling): this is a known issue. The image with Python 2.7.11 will be fixed: https://github.com/travis-ci/travis-ci/issues/5107 -- Olivier
![](https://secure.gravatar.com/avatar/697900d3a29858ea20cc109a2aee0af6.jpg?s=120&d=mm&r=g)
Yeah! That's the bug I encountered! So, that would explain why this seems to work fine now (I tried it out a bit on Friday on a CentOS6 system, but didn't run the test suite). Cheers! Ben Root On Sun, Apr 17, 2016 at 1:46 PM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
Thanks for the clarification, I read your original report too quickly.
I wonder why the travis maintainers built Python 2.7 with a non-standard unicode option.
Edit (after googling): this is a known issue. The image with Python 2.7.11 will be fixed:
https://github.com/travis-ci/travis-ci/issues/5107
-- Olivier _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org https://mail.scipy.org/mailman/listinfo/numpy-discussion
![](https://secure.gravatar.com/avatar/97c543aca1ac7bbcfb5279d0300c8330.jpg?s=120&d=mm&r=g)
On Apr 17, 2016 10:47 AM, "Olivier Grisel" <olivier.grisel@ensta.org> wrote:
Thanks for the clarification, I read your original report too quickly.
I wonder why the travis maintainers built Python 2.7 with a non-standard unicode option.
Because for some reason cpython's configure script (in the now somewhat ancient versions we're talking about) defaults to non-standard broken Unicode support, and you have to explicitly override it if you want working standard Unicode support. I guess this made sense in like the 90s before people realized how unicode was going to go down. Same issue affects pyenv users (or used to, I think they might have just fixed it [0]) and Enthought Canopy. -n [0] https://github.com/yyuu/pyenv/issues/257
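A runtime check for which kind of 2.7 build you are on (a small sketch; this applies to Python 2 only, since Python 3.3+ dropped the narrow/wide distinction):

```python
# sys.maxunicode distinguishes the two Python 2 configure-time unicode options:
# 0xFFFF for a narrow (UCS-2, cp27m) build, 0x10FFFF for a wide (UCS-4, cp27mu) build.
import sys

if sys.maxunicode > 0xFFFF:
    print("wide unicode build (UCS-4) -> cp27mu wheels")
else:
    print("narrow unicode build (UCS-2) -> cp27m wheels")
```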
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Sun, Apr 17, 2016 at 9:48 AM, Jens Nielsen <jenshnielsen@gmail.com> wrote:
I have tested the new cp27m wheels and they seem to work great.
@Matthew I am using the:
``` sudo: required dist: trusty
images mentioned here https://docs.travis-ci.com/user/ci-environment/. As far as I can see you are doing: sudo: false dist: trusty
I had no idea such an image exist since it's not documented on https://docs.travis-ci.com/user/ci-environment/
Anyway your tests runs with python 2.7.9 where as the sudo: requires ships python 2.7.10 so it's clearly a different python version:
@Olivier Grisel this only applies to Travis's own home build versions of python 2.7 on the Trusty running on google compute engine. It ships it's own prebuild python version. I don't have any issues with the stock versions on Ubuntu which pip tells me are indeed cp27mu.
It seems like the new cp27m wheels works as expected. Thanks a lot Doing:
``` python -c "from pip import pep425tags; print(pep425tags.is_manylinux1_compatible()); print(pep425tags.have_compatible_glibc(2, 5)); print(pep425tags.get_abi_tag())" pip install --timeout=60 --no-index --trusted-host "ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com" --find-links "http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn..." numpy scipy --upgrade ``` results in:
``` True True cp27m Ignoring indexes: https://pypi.python.org/simple Collecting numpy Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (15.3MB) 100% |████████████████████████████████| 15.3MB 49.0MB/s Collecting scipy Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (39.5MB) 100% |████████████████████████████████| 39.5MB 21.1MB/s Installing collected packages: numpy, scipy Found existing installation: numpy 1.10.1 Uninstalling numpy-1.10.1: Successfully uninstalled numpy-1.10.1 Successfully installed numpy-1.11.0 scipy-0.17.0 ``` And all my tests pass as expected.
Thanks for testing. I set up a buildbot test to run against a narrow unicode build of Python: http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian-narrow/builds/1 All tests pass for me too, so I've done the pypi upload for the narrow unicode numpy, scipy, cython wheels. Cheers, Matthew
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
Hi, On Mon, Apr 18, 2016 at 2:49 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Sun, Apr 17, 2016 at 9:48 AM, Jens Nielsen <jenshnielsen@gmail.com> wrote:
I have tested the new cp27m wheels and they seem to work great.
@Matthew I am using the:
``` sudo: required dist: trusty
images mentioned here https://docs.travis-ci.com/user/ci-environment/. As far as I can see you are doing: sudo: false dist: trusty
I had no idea such an image exist since it's not documented on https://docs.travis-ci.com/user/ci-environment/
Anyway your tests runs with python 2.7.9 where as the sudo: requires ships python 2.7.10 so it's clearly a different python version:
@Olivier Grisel this only applies to Travis's own home build versions of python 2.7 on the Trusty running on google compute engine. It ships it's own prebuild python version. I don't have any issues with the stock versions on Ubuntu which pip tells me are indeed cp27mu.
It seems like the new cp27m wheels works as expected. Thanks a lot Doing:
``` python -c "from pip import pep425tags; print(pep425tags.is_manylinux1_compatible()); print(pep425tags.have_compatible_glibc(2, 5)); print(pep425tags.get_abi_tag())" pip install --timeout=60 --no-index --trusted-host "ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com" --find-links "http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn..." numpy scipy --upgrade ``` results in:
``` True True cp27m Ignoring indexes: https://pypi.python.org/simple Collecting numpy Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (15.3MB) 100% |████████████████████████████████| 15.3MB 49.0MB/s Collecting scipy Downloading http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... (39.5MB) 100% |████████████████████████████████| 39.5MB 21.1MB/s Installing collected packages: numpy, scipy Found existing installation: numpy 1.10.1 Uninstalling numpy-1.10.1: Successfully uninstalled numpy-1.10.1 Successfully installed numpy-1.11.0 scipy-0.17.0 ``` And all my tests pass as expected.
Thanks for testing.
I've also tested a range of numpy and scipy wheels built with the manylinux docker image.

Built numpy and scipy wheels here: http://nipy.bic.berkeley.edu/manylinux/
Test script and output here: http://nipy.bic.berkeley.edu/manylinux/tests/

There are some test failures in the logs there, but I think they are all known failures from old numpy / scipy versions, particularly https://github.com/scipy/scipy/issues/5370

Y'all can test for yourselves with something like:

python -m pip install -U pip
pip install -f https://nipy.bic.berkeley.edu/manylinux numpy==1.6.2 scipy==0.16.0

I propose to upload these historical wheels to pypi to make it easier to test against older versions of numpy / scipy. Any objections?

Cheers,
Matthew
![](https://secure.gravatar.com/avatar/aee56554ec30edfd680e1c937ed4e54d.jpg?s=120&d=mm&r=g)
I think that would be very useful, e.g. for downstream projects to check that they work properly with old versions using a simple pip install command on their CI workers. -- Olivier
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Tue, Apr 19, 2016 at 1:12 AM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
I think that would be very useful, e.g. for downstream projects to check that they work properly with old versions using a simple pip install command on their CI workers.
Done for numpy 1.6.0 through 1.10.4, scipy 0.9 through scipy 0.16.1. Please let me know of any problems, Matthew
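With those uploads in place, a downstream CI job can pin a historical version straight from PyPI instead of building it from source; for example (the particular versions here are just an illustration):
```
# Illustrative only: install an older numpy / scipy pair from the newly
# uploaded manylinux1 wheels, then confirm which versions landed.
pip install "numpy==1.6.2" "scipy==0.16.1"
python -c "import numpy, scipy; print(numpy.__version__, scipy.__version__)"
```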
![](https://secure.gravatar.com/avatar/aee56554ec30edfd680e1c937ed4e54d.jpg?s=120&d=mm&r=g)
Thanks, I think next we could upgrade the travis configuration of numpy and scipy to build and upload manylinux1 wheels to http://travis-dev-wheels.scipy.org/ for downstream projects to test against the master branch of numpy and scipy without having to build those from source. However that would require publishing an official pre-built libopenblas.so (+headers) archive or RPM package. That archive would serve as the reference library to build scipy stack manylinux1 wheels. -- Olivier
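As a sketch of how that could look on the downstream side (assuming dev wheels really do get uploaded to travis-dev-wheels.scipy.org; the exact pip flags are illustrative):
```
# Hypothetical downstream CI step: prefer in-development numpy/scipy
# manylinux1 wheels from the proposed dev wheelhouse, falling back to PyPI.
pip install --upgrade pip
pip install --pre --upgrade --timeout=60 \
    -f http://travis-dev-wheels.scipy.org/ numpy scipy
```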
![](https://secure.gravatar.com/avatar/21d2fbbb409915032f42c1bafa70cfc5.jpg?s=120&d=mm&r=g)
Thanks. I can confirm that the new narrow unicode build wheels of Scipy work as expected for my project. @Olivier Grisel Thanks for finding the Travis issue; it's probably worth considering switching the Travis build to 2.7.11 to avoid other similar issues. The old versions of numpy are very handy for downstream testing. I have verified that they work as expected in the Matplotlib tests here: https://github.com/jenshnielsen/matplotlib/tree/travisnowheelhouse where we are testing against numpy 1.6 as the earliest. This branch switches matplotlib from the scikit image wheelhouse to manylinux wheels, which seems to work great. best Jens On Wed, 20 Apr 2016 at 09:59 Olivier Grisel <olivier.grisel@ensta.org> wrote:
Thanks,
I think next we could upgrade the travis configuration of numpy and scipy to build and upload manylinux1 wheels to http://travis-dev-wheels.scipy.org/ for downstream projects to test against the master branch of numpy and scipy without having to build those from source.
However that would require publishing an official pre-built libopenblas.so (+headers) archive or RPM package. That archive would serve as the reference library to build scipy stack manylinux1 wheels. -- Olivier
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
Hi, On Wed, Apr 20, 2016 at 3:33 AM, Jens Nielsen <jenshnielsen@gmail.com> wrote:
Thanks
I can confirm that the new narrow unicode build wheels of Scipy work as expected for my project. @Olivier Grisel Thanks for finding the Travis issue; it's probably worth considering switching the Travis build to 2.7.11 to avoid other similar issues.
The old versions of numpy are very handy for downstream testing. I have verified that they work as expected in the Matplotlib tests here: https://github.com/jenshnielsen/matplotlib/tree/travisnowheelhouse where we are testing against numpy 1.6 as the earliest. This branch switches matplotlib from the scikit image wheelhouse to manylinux wheels which seems to work great.
Jens - any interest in working together on a good matplotlib build recipe? Matthew
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Wed, Apr 20, 2016 at 1:59 AM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
Thanks,
I think next we could upgrade the travis configuration of numpy and scipy to build and upload manylinux1 wheels to http://travis-dev-wheels.scipy.org/ for downstream projects to test against the master branch of numpy and scipy without having to build those from source.
However that would require publishing an official pre-built libopenblas.so (+headers) archive or RPM package. That archive would serve as the reference library to build scipy stack manylinux1 wheels.
There's an OpenBLAS archive up at : http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... - is that the right place for it? It gets uploaded by the manylinux-builds travis run. Cheers, Matthew
![](https://secure.gravatar.com/avatar/aee56554ec30edfd680e1c937ed4e54d.jpg?s=120&d=mm&r=g)
2016-04-20 16:57 GMT+02:00 Matthew Brett <matthew.brett@gmail.com>:
On Wed, Apr 20, 2016 at 1:59 AM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
Thanks,
I think next we could upgrade the travis configuration of numpy and scipy to build and upload manylinux1 wheels to http://travis-dev-wheels.scipy.org/ for downstream projects to test against the master branch of numpy and scipy without having to build those from source.
However that would require publishing an official pre-built libopenblas.so (+headers) archive or RPM package. That archive would serve as the reference library to build scipy stack manylinux1 wheels.
There's an OpenBLAS archive up at : http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...
Thanks.
- is that the right place for it? It gets uploaded by the manylinux-builds travis run.
The only problem with rackspace cloud files is that as of now there is no way to put a short domain name (CNAME) with https. Maybe we could use the github "release" system on a github repo under the numpy github organization to host it. Or alternatively use an external binary file host that uses github credentials for upload rights, for instance bintray (I have no experience with this yet though). -- Olivier http://twitter.com/ogrisel - http://github.com/ogrisel
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Thu, Apr 21, 2016 at 1:47 AM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
2016-04-20 16:57 GMT+02:00 Matthew Brett <matthew.brett@gmail.com>:
On Wed, Apr 20, 2016 at 1:59 AM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
Thanks,
I think next we could upgrade the travis configuration of numpy and scipy to build and upload manylinux1 wheels to http://travis-dev-wheels.scipy.org/ for downstream projects to test against the master branch of numpy and scipy without having to build those from source.
However that would require publishing an official pre-built libopenblas.so (+headers) archive or RPM package. That archive would serve as the reference library to build scipy stack manylinux1 wheels.
There's an OpenBLAS archive up at : http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...
Thanks.
- is that the right place for it? It gets uploaded by the manylinux-builds travis run.
The only problem with rackspace cloud files is that as of now there is no way to put a short domain name (CNAME) with https. Maybe we could use the github "release" system on a github repo under the numpy github organization to host it. Or alternatively use an external binary file host that uses github credentials for upload rights, for instance bintray (I have no experience with this yet though).
The github releases idea sounds intriguing. Do you have any experience with that? Are there good examples other than the API documentation? https://developer.github.com/v3/repos/releases/ Cheers, Matthew
![](https://secure.gravatar.com/avatar/aee56554ec30edfd680e1c937ed4e54d.jpg?s=120&d=mm&r=g)
2016-04-22 20:17 GMT+02:00 Matthew Brett <matthew.brett@gmail.com>:
The github releases idea sounds intriguing. Do you have any experience with that? Are there good examples other than the API documentation?
I never used it, but I assume we could create a numpy-openblas repo to host official builds of each stable OpenBLAS release, suitable for embedding in numpy wheels. There is also a travis deployment target: https://docs.travis-ci.com/user/deployment/releases I am not sure that the travis timeout is long enough to build openblas. I believe so, but I have not tried it myself yet. -- Olivier http://twitter.com/ogrisel - http://github.com/ogrisel
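For illustration, publishing a prebuilt OpenBLAS archive as a GitHub release asset is a single call against the releases API; the repository name, release id, and archive name below are hypothetical placeholders:
```
# Hypothetical upload of a prebuilt OpenBLAS archive as a GitHub release
# asset ("numpy/openblas-libs", $RELEASE_ID and the file name are placeholders).
curl -H "Authorization: token $GITHUB_TOKEN" \
     -H "Content-Type: application/gzip" \
     --data-binary @openblas-0.2.18-manylinux1_x86_64.tar.gz \
     "https://uploads.github.com/repos/numpy/openblas-libs/releases/$RELEASE_ID/assets?name=openblas-0.2.18-manylinux1_x86_64.tar.gz"
```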
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Fri, Apr 22, 2016 at 11:27 AM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
2016-04-22 20:17 GMT+02:00 Matthew Brett <matthew.brett@gmail.com>:
The github releases idea sounds intriguing. Do you have any experience with that? Are there good examples other than the API documentation?
I never used it, but I assume we could create a numpy-openblas repo to host official builds of each stable OpenBLAS release, suitable for embedding in numpy wheels.
There is also a travis deployment target.
Ah - thanks - that's good resource.
I am not sure that the travis timeout is long enough to build openblas. I believe so, but I have not tried it myself yet.
Yes, the manylinux-builds repo currently builds openblas for each entry in the build matrix, so it's easily within time: https://travis-ci.org/matthew-brett/manylinux-builds/builds/123643313 It would be good to think of a way of supporting a set of libraries, such as libpng, freetype, openblas. We might need to support both 64-bit and 32-bit versions as well. Then, some automated build script would by default pick up the latest of these for numpy, matplotlib etc. Matthew
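One hedged sketch of what that automated pickup could look like inside a wheel-building script (the URL and archive name are placeholders for wherever the prebuilt library archives end up being hosted):
```
# Placeholder URL: fetch the latest prebuilt library bundle and unpack it
# into the build environment before compiling the wheel.
LIBS_URL="https://example.invalid/openblas-latest-manylinux1_x86_64.tar.gz"
curl -sSL "$LIBS_URL" | tar -xz -C /usr/local
```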
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
On Thu, Apr 14, 2016 at 8:02 AM, Jens Nielsen <jenshnielsen@gmail.com> wrote:
I have tried testing the wheels in a project that runs tests on Travis's Trusty infrastructure. The wheels work great for Python 3.5 and save us several minutes of runtime.
However, I am having trouble using the wheels on python 2.7 on the same Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl), whereas pip.pep425tags.get_abi_tag() returns cp27m on this particular python version (stock python 2.7 installed on Travis 14.04 VMs). Any chance of a cp27m-compatible wheel build?
Nathaniel / other pip experts - I can't remember the history of these tags. Is there any danger that an older pip will install a cp27m wheel on a cp27mu system? Matthew
![](https://secure.gravatar.com/avatar/97c543aca1ac7bbcfb5279d0300c8330.jpg?s=120&d=mm&r=g)
On Thu, Apr 14, 2016 at 1:22 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Thu, Apr 14, 2016 at 8:02 AM, Jens Nielsen <jenshnielsen@gmail.com> wrote:
I have tried testing the wheels in a project that runs tests on Travis's Trusty infrastructure. The wheels work great for Python 3.5 and save us several minutes of runtime.
However, I am having trouble using the wheels on python 2.7 on the same Trusty machines. It seems to be because the wheels are tagged as cp27-cp27mu (numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl), whereas pip.pep425tags.get_abi_tag() returns cp27m on this particular python version (stock python 2.7 installed on Travis 14.04 VMs). Any chance of a cp27m-compatible wheel build?
Nathaniel / other pip experts - I can't remember the history of these tags.
Is there any danger that an older pip will install a cp27m wheel on a cp27mu system?
No, support for cp27m/cp27mu tags went in before support for manylinux tags. And in any case, a pip that doesn't know about cp27m/cp27mu will just not install such wheels. The dangerous case is if you were to use an old version of bdist_wheel that generated a wheel with the "none" abi tag instead of a cp27m/cp27mu abi tag -- this will mess up all versions of pip, new and old. But the manylinux docker image definitely has a new enough version of the wheel package that this is not a problem. ...I guess the other dangerous case is if you generate a wheel that simply has the wrong name -- this happened to the gevent packager due to some distutils brokenness involving using the same source directory to build both wheels. So don't do that :-). (IIRC there's an open bug against auditwheel to check for all these problems -- belt *and* suspenders -- but that hasn't been implemented yet.) -n -- Nathaniel J. Smith -- https://vorpus.org
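Until auditwheel grows that check, one manual belt-and-suspenders step is to confirm that the tags recorded inside a wheel match its filename before uploading; a minimal sketch, using the cp27mu wheel name from earlier in this thread:
```
# Print the Tag lines from the wheel's WHEEL metadata; they should agree
# with the tags embedded in the wheel's filename.
unzip -p numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl "*.dist-info/WHEEL" | grep '^Tag:'
# Expected: Tag: cp27-cp27mu-manylinux1_x86_64
```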
![](https://secure.gravatar.com/avatar/96dd777e397ab128fedab46af97a3a4a.jpg?s=120&d=mm&r=g)
On Wed, Apr 13, 2016 at 1:15 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
On Sat, Apr 2, 2016 at 6:11 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
On Fri, Mar 25, 2016 at 6:39 AM, Peter Cock <p.j.a.cock@googlemail.com> wrote:
On Fri, Mar 25, 2016 at 3:02 AM, Robert T. McGibbon < rmcgibbo@gmail.com> wrote:
I suspect that many of the maintainers of major scipy-ecosystem projects are aware of these (or other similar) travis wheel caches, but would guess that the pool of travis-ci python users who weren't aware of these wheel caches is much much larger. So there will still be a lot of travis-ci clock cycles saved by manylinux wheels.
-Robert
Yes exactly. Availability of NumPy Linux wheels on PyPI is definitely something I would suggest adding to the release notes. Hopefully this will help trigger a general availability of wheels in the numpy-ecosystem :) In the case of Travis CI, their VM images for Python already have a version of NumPy installed, but having the latest version of NumPy and SciPy etc available as Linux wheels would be very nice.
On Tue, Apr 12, 2016 at 7:15 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
We're very nearly there now.
The latest versions of the numpy, scipy, scikit-image, pandas, numexpr, statsmodels wheels are up for testing at
http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn...
Please do test with:
python -m pip install --upgrade pip
pip install --trusted-host=ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn.com --find-links=http://ccdd0ebb5a931e58c7c5-aae005c4999d7244ac63632f8b80e089.r77.cf2.rackcdn... numpy scipy scikit-learn numexpr
python -c 'import numpy; numpy.test("full")'
python -c 'import scipy; scipy.test("full")'
We would love to get any feedback as to whether these work on your machines.
I've just rebuilt these wheels with the just-released OpenBLAS 0.2.18.
OpenBLAS is now passing all its own tests and tests on numpy / scipy / scikit-learn at http://build.openblas.net/builders
Our tests of the wheels look good too:
http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian http://nipy.bic.berkeley.edu/builders/manylinux-2.7-debian https://travis-ci.org/matthew-brett/manylinux-testing
So I think these are ready to go. I propose uploading these wheels for numpy and scipy to pypi tomorrow unless anyone has an objection.
Done. If y'all are on linux, and you have pip >= 8.1.1, you should now see this kind of thing:
$ pip install numpy scipy
Collecting numpy
  Downloading numpy-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl (15.3MB)
    100% |████████████████████████████████| 15.3MB 61kB/s
Collecting scipy
  Downloading scipy-0.17.0-cp27-cp27mu-manylinux1_x86_64.whl (39.5MB)
    100% |████████████████████████████████| 39.5MB 24kB/s
Installing collected packages: numpy, scipy
Successfully installed numpy-1.11.0 scipy-0.17.0
Great work. It is nice that we are finally getting the Windows thing squared away after all these years. Chuck
participants (13)
- Benjamin Root
- Charles R Harris
- G Young
- Jens Nielsen
- Jonathan Helmus
- Matthew Brett
- Nathaniel Smith
- Olivier Grisel
- Oscar Benjamin
- Paul Hobson
- Peter Cock
- Ralf Gommers
- Robert T. McGibbon