After having used and built bdist_wininst installers for years (well, I should say decades) I have recently started playing with pip. In theory, it is a great solution: it respects virtual environments (although I haven't tested them yet), it lets you freeze to a requirements file, and so on. It would allow us to freeze and then commit a requirements file into our version control system, add a 'pip install -r requirements.txt' to our build scripts, and any user in our company, as well as the build server, would automatically have all the required stuff installed. (We have 8 developers working on our source code, plus a build server that does automatic builds.)

However, does it work in real life? There are no wheels for important packages like pywin32, numpy, lxml, at least not on PyPI. 'pip install' from the source packages fails to build for all but the simplest stuff, probably because distutils has been extended too much, or because header files or libraries are missing, or whatever...

Is there a solution to this? I've seen that the wheel tool can convert bdist_wininst installers into wheels - does this work for the packages I mentioned above? Do we have to build or convert those packages to wheels, and set up a central place where we store them and make them available to our developers?

BTW: It seems that pip 1.4.1 doesn't download the wheel built with the most recent setuptools/wheel version; instead it downloads the source package and tries to build from that.

Thanks, Thomas
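A minimal sketch of the workflow described above (requirements.txt is just the conventional file name):

    pip freeze > requirements.txt     # record the exact versions currently installed
    # commit requirements.txt, then on each developer machine / build server:
    pip install -r requirements.txt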
On 23 January 2014 11:48, Thomas Heller <theller@ctypes.org> wrote:
After having used and built bdist_wininst installers for years (well, I should say decades) I have recently started playing with pip.
Thanks for trying this - it's really useful to have a newcomer's perspective on the pip experience.
However, does it work in real life?
There are no wheels for important packages like pywin32, numpy, lxml, at least not on PyPI. 'pip install' from the source packages fails to build for all but the simplest stuff, probably because distutils has been extended too much, or because header files or libraries are missing, or whatever...
This is the key point - pip install from source fails on Windows for pretty much any package with C code and non-trivial dependencies. Building any of the above packages is murder on Windows, and it's 100% critical to the user experience that you shouldn't have to.
Is there a solution to this? I've seen that the wheel tool can convert bdist_wininst installers into wheels - does this work for the packages I mentioned above? Do we have to build or convert those packages to wheels, and set up a central place where we store them and make them available to our developers?
The good news is that "wheel convert XXX.exe" works on all the above for creating a wheel from the wininst installer. The official numpy installer uses some complex magic to select the right binaries based on your CPU, and this means that the official numpy "superpack" wininst files don't convert (at least I don't think they do, it's a while since I tried).

But happily, Christoph Gohlke hosts a huge list of ready-made wininst installers for hard-to-build projects, and the 3 you mention are all there. He's very good about building for the latest Pythons, too (3.4 is already there for many packages). Anyone working on Windows who doesn't know his site (http://www.lfd.uci.edu/~gohlke/pythonlibs/) should check it out.

So, to summarise, yes, you can get wheels for pretty much everything you need by using wheel convert on wininst installers. You do need to manually download, run wheel convert, and host the wheels locally (a simple directory is fine though).

I'd love to see some or all of these projects host wheels themselves on PyPI, or someone like Christoph host a PyPI-style index of wheels, so we could just point pip at that index as well as PyPI (note: I don't know if pip works when a package is split across 2 index sites; that's something I should check, but if it doesn't, I'd like to see that added). But that's in the future for now - we probably need more uptake of wheels before the demand becomes sufficient to persuade anyone to do this.
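For anyone wanting to try this, the commands involved are roughly as follows (the installer name and the share path are made-up examples; wheel's convert command accepts a destination directory):

    wheel convert lxml-3.2.3.win32-py3.3.exe -d \\server\wheels
    pip install --use-wheel --no-index --find-links=\\server\wheels lxml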
BTW: It seems that pip 1.4.1 doesn't download the wheel built with the most recent setuptools/wheel version; instead it downloads the source package and tries to build from that.
Pip 1.4 doesn't use wheels by default - you should add "--use-wheel" to the install command (or put it in your pip.ini). "--use-wheel" is the default from pip 1.5 onwards. Hope this helps, Paul.
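Concretely, that means something like the following (the package name is a placeholder, and the config file location varies by platform and pip version):

    pip install --use-wheel somepackage

or, in pip.ini:

    [install]
    use-wheel = yes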
On 23.01.2014 13:16, Paul Moore wrote:
On 23 January 2014 11:48, Thomas Heller <theller@ctypes.org> wrote:
Is there a solution to this? I've seen that the wheel tool can convert bdist_wininst installers into wheels - does this work for the packages I mentioned above? Do we have to build or convert those packages to wheels, and set up a central place where we store them and make them available to our developers?
The good news is that "wheel convert XXX.exe" works on all the above for creating a wheel from the wininst installer. The official numpy installer uses some complex magic to select the right binaries based on your CPU, and this means that the official numpy "superpack" wininst files don't convert (at least I don't think they do, it's a while since I tried). But happily, Christoph Gohlke hosts a huge list of ready-made wininst installers for hard-to-build projects, and the 3 you mention are all there. He's very good about building for the latest Pythons, too (3.4 is already there for many packages). Anyone working on Windows who doesn't know his site (http://www.lfd.uci.edu/~gohlke/pythonlibs/) should check it out.
So, to summarise, yes, you can get wheels for pretty much everything you need by using wheel convert on wininst installers. You do need to manually download, run wheel convert, and host the wheels locally (a simple directory is fine though).
Thanks Paul, for this info. Thomas (over to the next topic)
On Thu, Jan 23, 2014 at 12:16:02PM +0000, Paul Moore wrote:
The official numpy installer uses some complex magic to select the right binaries based on your CPU, and this means that the official numpy "superpack" wininst files don't convert (at least I don't think they do, it's a while since I tried).
It's probably worth noting that numpy are toying around with wheels and have uploaded a number of them for testing: http://sourceforge.net/projects/numpy/files/wheels_to_test/

Currently there are only OSX wheels there (excluding the pure Python ones) and they're not available on PyPI. I assume that they're waiting for a solution for the Windows installer (a post-install script for wheels). That would give a lot more impetus to put wheels up on PyPI.

The Sourceforge OSX wheels are presumably not getting that much use right now. The OSX-specific numpy wheel has been downloaded 4 times in the last week: twice on Windows and twice on Linux!
But happily, Christoph Gohlke hosts a huge list of ready-made wininst installers for hard-to-build projects, and the 3 you mention are all there. He's very good about building for the latest Pythons, too (3.4 is already there for many packages). Anyone working on Windows who doesn't know his site (http://www.lfd.uci.edu/~gohlke/pythonlibs/) should check it out.
Also, I've seen Christoph mention on the numpy-discussion list that he was at least testing building wheels, although none seem to be available on his site at the moment. Oscar
On Thu, Jan 23, 2014 at 3:42 PM, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
On Thu, Jan 23, 2014 at 12:16:02PM +0000, Paul Moore wrote:
The official numpy installer uses some complex magic to select the right binaries based on your CPU, and this means that the official numpy "superpack" wininst files don't convert (at least I don't think they do, it's a while since I tried).
It's probably worth noting that numpy are toying around with wheels and have uploaded a number of them for testing: http://sourceforge.net/projects/numpy/files/wheels_to_test/
Currently there are only OSX wheels there (excluding the pure Python ones) and they're not available on PyPI. I assume that they're waiting for a solution for the Windows installer (a post-install script for wheels). That would give a lot more impetus to put wheels up on PyPI.
Indeed. We discussed just picking the SSE2 or SSE3 build and putting that up as a wheel, but that was deemed a not so great idea: http://article.gmane.org/gmane.comp.python.numeric.general/56072

The Sourceforge OSX wheels are presumably not getting that much use right now. The OSX-specific numpy wheel has been downloaded 4 times in the last week: twice on Windows and twice on Linux!
Some feedback from the people who did try those wheels would help. I asked for that on the numpy list after creating them, but didn't get much. So I haven't been in a hurry to move them over to PyPI. Ralf
On 23.01.2014 19:52, Ralf Gommers wrote:
On Thu, Jan 23, 2014 at 3:42 PM, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
On Thu, Jan 23, 2014 at 12:16:02PM +0000, Paul Moore wrote:
The official numpy installer uses some complex magic to select the right binaries based on your CPU, and this means that the official numpy "superpack" wininst files don't convert (at least I don't think they do, it's a while since I tried).
It's probably worth noting that numpy are toying around with wheels and have uploaded a number of them for testing: http://sourceforge.net/projects/numpy/files/wheels_to_test/
Currently there are only OSX wheels there (excluding the pure Python ones) and they're not available on PyPI. I assume that they're waiting for a solution for the Windows installer (a post-install script for wheels). That would give a lot more impetus to put wheels up on PyPI.
Indeed. We discussed just picking the SSE2 or SSE3 build and putting that up as a wheel, but that was deemed a not so great idea: http://article.gmane.org/gmane.comp.python.numeric.general/56072
Did I say this before? I would suggest that numpy develop a way for all the SSE binary variations to be installed, with the appropriate ones loaded at runtime depending on the user's CPU capabilities. This would also allow py2exe'd distributions to include them all.
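A minimal sketch of that kind of runtime selection, assuming hypothetical side-by-side builds named _core_sse2 and _core_nosse (the Windows API call is real; the module names and layout are made up):

    import ctypes
    import sys

    def cpu_has_sse2():
        # PF_XMMI64_INSTRUCTIONS_AVAILABLE == 10 asks Windows whether
        # this processor supports SSE2 instructions.
        if sys.platform == "win32":
            return bool(ctypes.windll.kernel32.IsProcessorFeaturePresent(10))
        return True  # assume a modern CPU on other platforms

    # Load whichever compiled variant matches this machine.
    if cpu_has_sse2():
        import _core_sse2 as _core
    else:
        import _core_nosse as _core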
The Sourceforge OSX wheels are presumably not getting that much use right now. The OSX-specific numpy wheel has been downloaded 4 times in the last week: twice on Windows and twice on Linux!
Some feedback from the people who did try those wheels would help. I asked for that on the numpy list after creating them, but didn't get much. So I haven't been in a hurry to move them over to PyPI.
I would have tried wheels for windows, python 3.3 or 3.4, but there aren't any. Thomas
On 24 January 2014 06:25, Thomas Heller <theller@ctypes.org> wrote:
On 23.01.2014 19:52, Ralf Gommers wrote:
On Thu, Jan 23, 2014 at 3:42 PM, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
On Thu, Jan 23, 2014 at 12:16:02PM +0000, Paul Moore wrote:
The official numpy installer uses some complex magic to select the right binaries based on your CPU, and this means that the official numpy "superpack" wininst files don't convert (at least I don't think they do, it's a while since I tried).
It's probably worth noting that numpy are toying around with wheels and have uploaded a number of them for testing: http://sourceforge.net/projects/numpy/files/wheels_to_test/
Currently there are only OSX wheels there (excluding the pure Python ones) and they're not available on PyPI. I assume that they're waiting for a solution for the Windows installer (a post-install script for wheels). That would give a lot more impetus to put wheels up on PyPI.
Indeed. We discussed just picking the SSE2 or SSE3 build and putting that up as a wheel, but that was deemed a not so great idea: http://article.gmane.org/gmane.comp.python.numeric.general/56072
Did I say this before? I would suggest that numpy develop a way for all the SSE binary variations to be installed, with the appropriate ones loaded at runtime depending on the user's CPU capabilities. This would also allow py2exe'd distributions to include them all.
I believe I suggested that at one point, but the dependencies were scattered throughout NumPy rather than being in one place (so you had to toggle several modules, rather than just one), and it made for some interesting build problems further up the stack. Agreed that runtime selection would be better, though - the current approach not only makes it difficult to do things like create universal py2exe and wheel binaries, it also makes it difficult to run NumPy from portable media like a USB key, because different systems may need different SSE modules. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On Thu, Jan 23, 2014 at 12:25 PM, Thomas Heller <theller@ctypes.org> wrote:
Did I say this before? I would suggest that numpy develop a way for all the SSE binary variations to be installed, with the appropriate ones loaded at runtime depending on the user's CPU capabilities. This would also allow py2exe'd distributions to include them all.
That was discussed on the numpy list, and would be really nice, but it may also be really difficult. OS-X has built-in support for multi-architecture binaries, but Windows does not, and while selecting a particular .dll (or .pyd) to load at run-time would be fairly straightforward, numpy has more than one, and then there is the whole scipy stack, and all the third-party stuff compiled against it. I suspect this would have to be built into the Python importing and distutils build system to be workable. But maybe someone smarter than me will figure it out.

Some feedback from the people who did try those wheels would help. I asked for that on the numpy list after creating them, but didn't get much. So I haven't been in a hurry to move them over to PyPI.
Serious chicken-egg problem there....
I would have tried wheels for windows, python 3.3 or 3.4, but there aren't any.
Yeah we need to get those up -- SSE2 only ones would work for MOST people. -CHB -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker@noaa.gov
On 24 Jan 2014 08:03, "Chris Barker" <chris.barker@noaa.gov> wrote:
On Thu, Jan 23, 2014 at 12:25 PM, Thomas Heller <theller@ctypes.org> wrote:
Did I say this before? I would suggest that numpy develop a way for all the SSE binary variations to be installed, with the appropriate ones loaded at runtime depending on the user's CPU capabilities. This would also allow py2exe'd distributions to include them all.

That was discussed on the numpy list, and would be really nice, but it may also be really difficult. OS-X has built-in support for multi-architecture binaries, but Windows does not, and while selecting a particular .dll (or .pyd) to load at run-time would be fairly straightforward, numpy has more than one, and then there is the whole scipy stack, and all the third-party stuff compiled against it. I suspect this would have to be built into the Python importing and distutils build system to be workable. But maybe someone smarter than me will figure it out.
Some feedback from the people who did try those wheels would help. I asked for that on the numpy list after creating them, but didn't get much. So I haven't been in a hurry to move them over to PyPI.
Serious chicken-egg problem there....
I would have tried wheels for windows, python 3.3 or 3.4, but there aren't any.
Yeah we need to get those up -- SSE2 only ones would work for MOST people.
I really think that's our best near term workaround - still room for improvement, but "pip install numpy assumes SSE2" is a much better situation than "pip install numpy doesn't work on Windows". Such a change would help a lot of people *right now*, while still leaving room to eventually figure out something more sophisticated (like postinstall hooks, or simpler runtime multi-build support, or NumPy changing to a dependency that internally makes this decision at runtime). Cheers, Nick.
On 23 January 2014 23:58, Nick Coghlan <ncoghlan@gmail.com> wrote:
I really think that's our best near term workaround - still room for improvement, but "pip install numpy assumes SSE2" is a much better situation than "pip install numpy doesn't work on Windows".
Is it? Do you have any idea what proportion of (the relevant) people would be using Windows with hardware that doesn't support SSE2? I feel confident that it's less than 10% but I don't know how to justify a tighter bound than that. You need to bear in mind that people currently have a variety of ways to install numpy on Windows that do work already without limitations on CPU instruction set. Most numpy users will not get any immediate benefit from the fact that "it works using pip" rather than "it works using the .exe installer" (or any of a number of other options). It's the unfortunate end users and the numpy folks who would have to pick up the pieces if/when the SSE2 assumption fails.
Such a change would help a lot of people *right now*, while still leaving room to eventually figure out something more sophisticated (like postinstall hooks or simpler runtime multi-build support or NumPy changing to a dependency that internally makes this decision at runtime).
Postinstall hooks are not that sophisticated and most packaging systems have them. You're advocating for numpy to take a dodgy compromise here but can it not be the other way round? Wheel, pip etc. could quickly agree on and implement a postinstall hook that would work for numpy and then that could be made more sophisticated later on. Oscar
On Jan 23, 2014, at 4:17 PM, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
On 23 January 2014 23:58, Nick Coghlan <ncoghlan@gmail.com> wrote:
I really think that's our best near term workaround - still room for improvement, but "pip install numpy assumes SSE2" is a much better situation than "pip install numpy doesn't work on Windows".
Is it? Do you have any idea what proportion of (the relevant) people would be using Windows with hardware that doesn't support SSE2? I feel confident that it's less than 10% but I don't know how to justify a tighter bound than that.
You need to bear in mind that people currently have a variety of ways to install numpy on Windows that do work already without limitations on CPU instruction set. Most numpy users will not get any immediate benefit from the fact that "it works using pip" rather than "it works using the .exe installer" (or any of a number of other options). It's the unfortunate end users and the numpy folks who would have to pick up the pieces if/when the SSE2 assumption fails.
This all sounds very similar to the issues with Linux binary wheels and varying system ABIs. Should probably keep in mind for any solution that might apply to both. --Noah
On 24 January 2014 00:17, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
You need to bear in mind that people currently have a variety of ways to install numpy on Windows that do work already without limitations on CPU instruction set. Most numpy users will not get any immediate benefit from the fact that "it works using pip" rather than "it works using the .exe installer" (or any of a number of other options). It's the unfortunate end users and the numpy folks who would have to pick up the pieces if/when the SSE2 assumption fails.
The people who would benefit are those who (like me!) don't have a core requirement for numpy, but who just want to "try it out" casually, or for experimenting or one-off specialised scripts. These are the people who won't be using one of the curated distributions, and quite possibly will be using a virtualenv, so the exe installers won't work. Giving these people a means to try numpy could introduce a wider audience to it. Having said that, I can understand the reluctance to have to deal with non-specialist users hitting obscure "your CPU is too old" errors - that's *not* a good initial experience. And your point that it's just as reasonable for pip to adopt a partial solution in the short term is also fair - although it would be harder for pip to replace an API we added and which people are using, than it would be for numpy to switch to deploying better wheels when the facilities become available. So the comparison isn't entirely equal. Paul
On 24 Jan 2014 19:41, "Paul Moore" <p.f.moore@gmail.com> wrote:
On 24 January 2014 00:17, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
You need to bear in mind that people currently have a variety of ways to install numpy on Windows that do work already without limitations on CPU instruction set. Most numpy users will not get any immediate benefit from the fact that "it works using pip" rather than "it works using the .exe installer" (or any of a number of other options). It's the unfortunate end users and the numpy folks who would have to pick up the pieces if/when the SSE2 assumption fails.
The people who would benefit are those who (like me!) don't have a core requirement for numpy, but who just want to "try it out" casually, or for experimenting or one-off specialised scripts. These are the people who won't be using one of the curated distributions, and quite possibly will be using a virtualenv, so the exe installers won't work. Giving these people a means to try numpy could introduce a wider audience to it.
Having said that, I can understand the reluctance to have to deal with non-specialist users hitting obscure "your CPU is too old" errors - that's *not* a good initial experience.
And your point that it's just as reasonable for pip to adopt a partial solution in the short term is also fair - although it would be harder for pip to replace an API we added and which people are using, than it would be for numpy to switch to deploying better wheels when the facilities become available. So the comparison isn't entirely equal.
There's also the fact that we're still trying to recover from the setup.py situation (which was a "quick and easy" alternative to a declarative build system), so quick hacks in the core metadata specs that will then be locked in for years by backwards compatibility requirements are definitely *not* acceptable. We already have more than enough of those in the legacy metadata we're aiming to replace :P

All NumPy should need to reduce end user confusion to tolerable levels is an import time CPU check that raises an error that includes a link to a stable URL explaining the limitations of the published wheel file, and alternative ways of obtaining NumPy (like Christoph's installers, or a science & data analysis focused distribution like Anaconda or EPD, or bootstrapping conda). In return, as Paul points out, it becomes substantially easier for people that *aren't* wholly invested in the scientific Python stack to try it out with their regular tools, rather than having to completely change how they work with Python.

Also consider that, given the status quo, any users that might see that new error instead get even *more* incomprehensible errors as pip attempts to build NumPy from source and fails at doing so. The choice given the current metadata standards isn't between confusing Windows users or not, it's between confusing 100% of those that try "pip install numpy" with cryptic errors from a failed build at install time, and confusing a much smaller percentage of those with a CPU compatibility error at runtime.

Is the latter a desirable *final* state? No, and metadata 2.0 will aim to address that. It is, however, substantially better than the status quo and doesn't run the risk of compromising interoperability standards we're going to have to live with indefinitely.

Cheers, Nick.
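The import time check Nick describes could be as small as the following sketch (the feature-detection call is a real Windows API; the URL is a placeholder):

    import ctypes
    import sys

    if sys.platform == "win32":
        # PF_XMMI64_INSTRUCTIONS_AVAILABLE == 10: does this CPU support SSE2?
        if not ctypes.windll.kernel32.IsProcessorFeaturePresent(10):
            raise ImportError(
                "This NumPy build requires SSE2 support, which your CPU "
                "lacks. See http://example.invalid/numpy-windows-install "
                "for other ways of obtaining NumPy.")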
On Fri, Jan 24, 2014 at 2:18 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
In return, as Paul points out, it becomes substantially easier for people that *aren't* wholly invested in the scientific Python stack to try it out with their regular tools, rather than having to completely change how they work with Python.
This is a really important constituency, actually. And one that has been neglected for a while.
Also consider that, given the status quo, any users that might see that new error instead get even *more* incomprehensible errors as pip attempts to build NumPy from source and fails at doing so.
well, numpy _should_ build out of the box with nothing special if you are set up to build regular extensions. I understand that a lot of Windows users are not set up to build extensions at all, but they are presumably used to getting "compiler not found" errors (or whatever the message is). But you won't get an optimized numpy, and much of the rest of the "stack" is harder to build: scipy, matplotlib. So a set of working binary wheels would be great.

And while we in the numpy community don't really want a lot of "numpy is slower than MATLAB" FUD out there, I still think it's better to get a sub-optimum but working build out there. The "should I use python instead of MATLAB?" crowd would be better served by one of the other options anyway...

So how rare are non-SSE2 systems? Any way to find out? I'm guessing rare enough that we can a) not worry about it, and b) those users will know they have an old system and may expect issues, particularly with something billed as being for high-performance computation. So I say SSE2 -- but if we do think there are a lot of non-SSE2 users out there, then do SSE1-only; it would still work just fine for casual use.

-Chris
On 24 January 2014 22:21, Chris Barker <chris.barker@noaa.gov> wrote:
well, numpy _should_ build out of the box with nothing special if you are set up to build regular extensions. I understand that a lot of Windows users are not set up to build extensions at all, but they are presumably used to getting "compiler not found" errors (or whatever the message is). But you won't get an optimized numpy, and much of the rest of the "stack" is harder to build: scipy, matplotlib.
Seriously? If I have MSVC 2010 installed, pip install numpy will correctly build numpy from source? It's a *long* time since I tried this, but I really thought building numpy was harder than that.

A quick test later: no BLAS/ATLAS/LAPACK causes a string of warnings, and ignoring the rest of the error stack (which I'm frankly not interested in investing the time to diagnose and fix) I get "RuntimeError: Broken toolchain: cannot link a simple C program". Which is utter rubbish - I routinely build extensions with this installation.

So no, numpy does not build out of the box. Ah well. Paul
On Fri, Jan 24, 2014 at 2:40 PM, Paul Moore <p.f.moore@gmail.com> wrote:
So no, numpy does not build out of the box. Ah well.
Darn -- it used to, and it should. It has shipped for years with a "LAPACK light", and shouldn't need any Fortran. It used to not even look for LAPACK with a default configuration. But I haven't done it for years, so who knows when this might have been broken? -CHB
On 24 January 2014 22:40, Paul Moore <p.f.moore@gmail.com> wrote:
On 24 January 2014 22:21, Chris Barker <chris.barker@noaa.gov> wrote:
well, numpy _should_ build out of the box with nothing special if you are set up to build regular extensions. I understand that a lot of Windows users are not set up to build extensions at all, but they are presumably used to getting "compiler not found" errors (or whatever the message is). But you won't get an optimized numpy, and much of the rest of the "stack" is harder to build: scipy, matplotlib.
Seriously? If I have MSVC 2010 installed, pip install numpy will correctly build numpy from source? It's a *long* time since I tried this, but I really thought building numpy was harder than that.
A quick test later: no BLAS/ATLAS/LAPACK causes a string of warnings, and ignoring the rest of the error stack (which I'm frankly not interested in investing the time to diagnose and fix) I get "RuntimeError: Broken toolchain: cannot link a simple C program". Which is utter rubbish - I routinely build extensions with this installation.
So no, numpy does not build out of the box. Ah well.
Last time I tried with mingw it worked (I've since departed the Windows world). I think official numpy binaries for Windows are built with mingw (Christoph uses MSVC though). Oscar
On 25 January 2014 21:33, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
Last time I tried with mingw it worked (I've since departed the Windows world). I think official numpy binaries for Windows are built with mingw (Christoph uses MSVC though).
That may well be the case, but MSVC (Express or full) is the "standard" for building extensions - it's reasonable to expect that that is what the "casual user" we're talking about here would have (anyone who has taken the time to set up mingw for builds can be assumed to be "experienced" to some level or other...) Paul
On 24 January 2014 10:18, Nick Coghlan <ncoghlan@gmail.com> wrote:
On 24 Jan 2014 19:41, "Paul Moore" <p.f.moore@gmail.com> wrote:
On 24 January 2014 00:17, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
You need to bear in mind that people currently have a variety of ways to install numpy on Windows that do work already without limitations on CPU instruction set. Most numpy users will not get any immediate benefit from the fact that "it works using pip" rather than "it works using the .exe installer" (or any of a number of other options). It's the unfortunate end users and the numpy folks who would have to pick up the pieces if/when the SSE2 assumption fails.
The people who would benefit are those who (like me!) don't have a core requirement for numpy, but who just want to "try it out" casually, or for experimenting or one-off specialised scripts. These are the people who won't be using one of the curated distributions, and quite possibly will be using a virtualenv, so the exe installers won't work. Giving these people a means to try numpy could introduce a wider audience to it.
Having said that, I can understand the reluctance to have to deal with non-specialist users hitting obscure "your CPU is too old" errors - that's *not* a good initial experience.
And your point that it's just as reasonable for pip to adopt a partial solution in the short term is also fair - although it would be harder for pip to replace an API we added and which people are using, than it would be for numpy to switch to deploying better wheels when the facilities become available. So the comparison isn't entirely equal.
There's also the fact that we're still trying to recover from the setup.py situation (which was a "quick and easy" alternative to a declarative build system), so quick hacks in the core metadata specs that will then be locked in for years by backwards compatibility requirements are definitely *not* acceptable. We already have more than enough of those in the legacy metadata we're aiming to replace :P
It wasn't a totally serious suggestion: I knew what your response would be. ;)

I'll try to summarise your take on this: You would like to take the time to ensure that Python packaging is done properly. That may mean that some functionality isn't available for some time, but you think that it's better to "get it right" than rush something out the door just to "get it working fast". That's not an unreasonable position to take, but I wanted to contrast it with your advice to numpy: Just rush something out of the door even if it has obvious problems. Don't worry about getting it right; we'll do that later...

We all want a solution that definitely works, so that you can advise any old noob to use it. So if you could say 'just use pip' then that would be great. But if you say...

"""
Just use pip... unless your CPU doesn't support SSE2. Don't worry if you've never heard of SSE2, just do 'pip install numpy' and then 'python -c "import numpy"'. If you see an error like "your CPU doesn't support SSE2, please install the non-SSE version of numpy" then you'll need to install numpy using one of the other options listed below, and make sure that you do that before trying to use pip to install any of these other packages. And if you use Christoph's .exe for numpy then you can't use pip for scipy and some other set of packages (I'm not totally sure which), so you shouldn't use pip for anything after that. Unless it's a pure Python package. Don't worry if you don't know what a pure Python package is, just try it with pip, and if it doesn't work just try something else...
"""

... then putting the wheel on PyPI becomes substantially less attractive. Just having to explain that pip might not work, and then trying to describe when it will and won't and what to do about it, is a pain. I wouldn't want to recommend to my students that they do this unless I was confident that it would work.

Also, note that I don't really think a post-install script is the best solution for something like this. It would be better to have an extensible system for querying things like CPU capability. It would also be better to have an extensible system for detecting things like Fortran ABI compatibility - this can also be handled with a post-install script, but it's not the best solution. Are there any plans to solve these problems? Also, is there a roadmap describing the expected timeline for future packaging features?

Oscar
On 25 January 2014 21:56, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
I'll try to summarise your take on this: You would like to take the time to ensure that Python packaging is done properly. That may mean that some functionality isn't available for some time, but you think that it's better to "get it right" than rush something out the door just to "get it working fast".
That's not an unreasonable position to take but I wanted to contrast that with your advice to numpy: Just rush something out of the door even if it has obvious problems. Don't worry about getting it right; we'll do that later...
Just to be clear, that's *not* my position. (It may be the position of some of the other pip developers.) My view is that the sooner "pip install X" works out of the box for as many of the cases where it didn't in the past, the faster we'll get adoption and the sooner people will start reporting any remaining issues. It won't be a perfect solution, but it will be better than the current status quo (at least, I'm looking for solutions that *are* better than the status quo :-)).

I'd also like to see some visible impact from wheels - at the moment, even though (for example) pip and setuptools publish wheels, I doubt anyone sees any difference. I sort of wish we hadn't managed to make wheels quite so transparent in use, ironically :-) Wheels are an improvement over the wininst status quo because they support virtualenvs. So I want to see numpy wheels available, because that would be a significant step in making people aware of some of the improvements we're making.

I do *not* want to see existing numpy users, or the specialists already well served by the scientific python community, being harmed by the existence of wheels - but my impression was that the advice given to people who want to use numpy/scipy seriously is to use one of the curated stacks like conda or Enthought. So I'm looking at people who at the moment don't use numpy, but are somewhat interested in trying it out (not enough to install a whole new Python stack, though). Those people currently either use the wininst installers (which I'm *not* advocating that we remove) or they use virtualenvs, and have the view that they have to at a minimum jump through some hoops to get numpy to work. It's purely the people who can't use the wininst installers, and don't want a curated stack, that are my focus at the moment.

It seems to me that there are a number of solutions for them:

1. Ignore them. It's a small enough group as to not matter. I have a personal dislike of this option, because I'm in this group :-) But it *is* a fair position to take.

2. The pip developers add facilities to pip to allow the numpy folks to generate multi-architecture wheels.

3. The numpy folks put up interim wheels that work for some, but not all, users.

Option 2 has an issue because developing a "proper" solution is still a long way off. A "quick fix" postinstall script solution is possible, but even this requires some development work, and once that has been done, a new pip release is needed, and the numpy folks then need to add the relevant postinstall scripts and update their build process to incorporate the multi-architecture DLLs into their wheels. So it's a non-trivial amount of work and time, even if it's better than a "full" solution.

Option 3 has a problem because there's a support cost with people who try to use the wheels and get obscure runtime errors. But that's the only cost I can see - the numpy people can build wheels by just adding "setup.py bdist_wheel" to their build process alongside the current "setup.py bdist_wininst", and then upload them. Maybe to testpypi, if it's important to make using the wheels an opt-in process. Also, maybe we could distribute a script that allowed people to check in advance if the numpy wheels would work on their PC.

I'm not giving conflicting advice here - all I'm doing is looking for the short-term action that gives the most benefit for the least cost. It seems to me that option 2 takes longer and involves more effort than option 3. I'm not even convinced that option 2 is less effort for the numpy developers (if we ignore the pip development work). But I know nothing about the numpy build process, so my assumptions could be way off here - I'd be more than happy for someone to clarify what effort is involved in publishing wheels for numpy (which already exist, is that right?). As general information, it would be very valuable, because I'm pretty sure most people round here assume it's little more than adding "bdist_wheel" to an existing binary distribution production process.

My apologies for this email being so long, but hopefully it's explained my position more clearly than I seem to have done previously. Paul
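For reference, the assumption Paul describes would amount to something like this in the release process (the bdist_wheel command requires the wheel package to be installed):

    python setup.py bdist_wininst bdist_wheel
    # both the .exe and the .whl are left in dist/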
On Sat, 25/1/14, Paul Moore <p.f.moore@gmail.com> wrote:
Wheels are an improvement over the wininst status quo because they support virtualenvs. So I want to see numpy wheels available, because that would be a significant step in making people aware of some of the improvements we're making. ...
Those people currently either use the wininst installers (which I'm *not* advocating that we remove) or they use virtualenvs, and have the view that they have to at a minimum jump through some hoops to get numpy to work.
I don't know if you're aware that recent versions of distil can convert bdist_wininst installers to wheels. The feature has not been extensively tested, but AFAIK it basically works. For example, with one of Christoph Gohlke's installers (I just happened to run the conversion on POSIX, it should work the same way on Windows):

    $ distil package --format=wheel /tmp/numpy-MKL-1.8.0.win32-py3.3.exe
    The following packages were built:
      /tmp/numpy-1.8.0-cp33-none-win32.whl

Installing on Windows in a freshly created venv created with python -m venv:

    C:\Documents and Settings\Vinay>distil -e \temp\venv33 install \temp\numpy-1.8.0-cp33-none-win32.whl
    Checking requirements for numpy (1.8.0) ... done.
    The following new packages will be installed from a local location:
      numpy (1.8.0)
    Installing numpy (1.8.0) ...
    Installation completed.

    C:\Documents and Settings\Vinay>\temp\venv33\Scripts\python
    ActivePython 3.3.2.0 (ActiveState Software Inc.) based on
    Python 3.3.2 (default, Sep 16 2013, 23:10:06) [MSC v.1600 32 bit (Intel)] on win32
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import numpy
    >>> numpy.__file__
    'C:\\temp\\venv33\\lib\\site-packages\\numpy\\__init__.py'
Of course, that doesn't prove much, other than that the conversion and installation processes can be fairly painless. I'd be interested in any feedback about these distil features. Regards, Vinay Sajip
On 25 January 2014 23:36, Vinay Sajip <vinay_sajip@yahoo.co.uk> wrote:
Those people currently either use the wininst installers (which I'm *not* advocating that we remove) or they use virtualenvs, and have the view that they have to at a minimum jump through some hoops to get numpy to work.
I don't know if you're aware that recent versions of distil can convert bdist_wininst installers to wheels. The feature has not been extensively tested, but AFAIK it basically works. For example, with one of Christoph Gohlke's installers (I just happened to run the conversion on POSIX, it should work the same way on Windows):
Wheel convert also has that ability, pretty much since the start - I added the code for it </showoff>. But yes, it's what I meant by "jumping through hoops". Just for information, you can also convert the official numpy superpack exes, but they are multiple wininsts combined in a single exe, and you have to open them up in a zip manager, grab the individual wininsts out, and convert the one you need. (If I recall the process correctly - it was some time ago and may have changed.) Paul
Paul's position exactly mirrors my own - I am perfectly fine with the recommended advice to scientific users continuing to be "NumPy doesn't officially support pip and virtualenv because of the way it is built and installed, so you will have to get one of the curated scientific stacks, bootstrap conda, use the Windows installers or use the version provided by your Linux distro vendor."

The metadata 2.0 standards *will not* be accepted until the pip 1.6 or 1.7 time frame, and it's more likely the latter, since I don't want to distract anyone from the current security work (I know I said otherwise recently, but I managed to temporarily forget that the Warehouse transition and implementing PEP 458 was next on the to-do list when I said that). So, if the NumPy community choose to wait for general post-install script support in wheel files, they're likely to be waiting at least until the release of pip 1.7. In the meantime, the failure mode for people attempting to try out the Scientific Python stack via "pip install numpy" in an existing Python installation or virtualenv will remain a failure to build with a likely cryptic error.

I do see a few possible workarounds, but none of them would change the metadata 2.0 standards:

1. Add explicit NumPy support *directly* to pip. This would be the quick hack, private API support that Oscar is requesting, since it would be a special arrangement between the pip devs and the numpy devs, and eventually replaced by a general purpose post-install mechanism in metadata 2.0.

2. Add support to pip to request the conversion of available wininst installers (and bdist_dumb?) to wheels for installation with pip. Vinay has this working from a technical perspective, so it may be something the pip devs are interested in exploring for pip 1.6.

3. Both of the above options require waiting for pip 1.6 (at the earliest), which means neither will improve the behaviour in CPython 3.4 (which will ship pip 1.5.1). The only folks with the power to improve *that* situation are the NumPy devs, who have the ability to choose between the "doesn't work for anyone except experienced build engineers" status quo and "works for a large percentage of users, but will still fail at runtime for users on hardware without SSE2 support".

To put the "but what if the user doesn't have SSE2 support?" concern in context, it should only affect Intel users with CPUs older than a Pentium 4 (released 2001), and AMD users with a CPU older than an Opteron or Athlon 64 (both released 2003). All x86/x86_64 CPUs released in the past decade should be able to handle SSE2 binaries, so our caveat can be "if your computer is more than a decade old, 'pip install numpy' may not work for you, but it should do the right thing on newer systems".

Now, the NumPy devs may feel that persisting with the status quo for another 6 to 12 months while waiting for still hypothetical additional changes in pip specifically to accommodate NumPy's current installation practices is a better alternative than taking option 3. However, from my perspective, having NumPy readily available to users using the python.org Windows installers for Python 3.4 would *significantly* lower the barrier to entry to the Scientific Python stack for new users on relatively modern systems when compared to the 4 current options (while we accept the Linux distro problem is on distutils-sig to deal with, that's far from being a NumPy specific problem).

Cheers, Nick.
On Sat, Jan 25, 2014 at 4:29 PM, Nick Coghlan <ncoghlan@gmail.com> wrote:
To put the "but what if the user doesn't have SSE2 support?" concern in context, it should only affect Intel users with CPUs older than a Pentium 4 (released 2001), and AMD users with a CPU older than an Opteron or Athlon 64 (both released 2003). All x86/x86_64 CPUs released in the past decade should be able to handle SSE2 binaries, so our caveat can be "if your computer is more than a decade old, 'pip install numpy' may not work for you, but it should do the right thing on newer systems".
Exactly
However, from my perspective, having NumPy readily available to users using the python.org Windows installers for Python 3.4 would *significantly* lower the barrier to entry to the Scientific Python stack for new users on relatively modern systems when compared to the 4 current options
+1 with a note: This isn't just for users of the SciPy Stack -- there are a LOT of use-cases for just numpy by itself. Not that I don't want folks to have easy access to the rest of the stack as well -- just sayin' -Chris
On 29 January 2014 05:06, Chris Barker <chris.barker@noaa.gov> wrote:
This isn't just for users of the SciPy Stack -- there are a LOT of use-cases for just numpy by itself. Not that I don't want folks to have easy access to the rest of the stack as well -- just sayin'
Agreed - my main use for NumPy is with Pandas for data analysis tasks, which is a truly great combination. Paul
On 29 January 2014 18:06, Paul Moore <p.f.moore@gmail.com> wrote:
On 29 January 2014 05:06, Chris Barker <chris.barker@noaa.gov> wrote:
This isn't just for users of the SciPy Stack -- there are a LOT of use-cases for just numpy by itself. Not that I don't want folks to have easy access to the rest of the stack as well -- just sayin'
Agreed - my main use for NumPy is with Pandas for data analysis tasks, which is a truly great combination.
I confess I tend to lump the many and varied data analysis tools that depend on NumPy under the phrase "Scientific Python Stack", rather than intending to refer specifically to SciPy :) Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
On Sun, Jan 26, 2014 at 12:29 AM, Nick Coghlan <ncoghlan@gmail.com> wrote:
Paul's position exactly mirrors my own - I am perfectly fine with the recommended advice to scientific users continuing to be "NumPy doesn't officially support pip and virtualenv because of the way it is built and installed, so you will have to get one of the curated scientific stacks, bootstrap conda, use the Windows installers or use the version provided by your Linux distro vendor."
The metadata 2.0 standards *will not* be accepted until the pip 1.6 or 1.7 time frame, and it's more likely the latter, since I don't want to distract anyone from the current security work (I know I said otherwise recently, but I managed to temporarily forget that the Warehouse transition and implementing PEP 458 was next on the to do list when I said that).
So, if the NumPy community choose to wait for general post-install script support in wheel files, they're likely to be waiting at least until the release of pip 1.7.
In the meantime, the failure mode for people attempting to try out the Scientific Python stack via "pip install numpy" in an existing Python installation or virtualenv will remain a failure to build with a likely cryptic error.
I do see a few possible workarounds, but none of them would change the metadata 2.0 standards:
1. Add explicit NumPy support *directly* to pip. This would be the quick hack, private API support that Oscar is requesting, since it would be a special arrangement between the pip devs and the numpy devs, and eventually replaced by a general purpose post-install mechanism in metadata 2.0.
I am not speaking for the whole numpy team, but as the former maintainer of numpy.distutils, I think this will be more hurtful than helpful. I think the SSE issue is a bit of a side discussion: most people who care about performance already know how to install numpy. What we care about here are people who don't care so much about fast eigenvalue decomposition, but want to use e.g. pandas. Building numpy in a way that supports every architecture is both doable and acceptable IMO.

2. Add support to pip to request the conversion of available wininst installers (and bdist_dumb?) to wheels for installation with pip. Vinay has this working from a technical perspective, so it may be something the pip devs are interested in exploring for pip 1.6.
Building numpy wheels is not hard, we can do that fairly easily (I have already done so several times, the hard parts have nothing to do with wheel or even python, and are related to mingw issues on win 64 bits).
3. Both of the above options require waiting for pip 1.6 (at the earliest), which means neither will improve the behaviour in CPython 3.4 (which will ship pip 1.5.1). The only folks with the power to improve *that* situation are the NumPy devs, who have the ability to choose between the "doesn't work for anyone except experienced build engineers" status quo and "works for a large percentage of users, but will still fail at runtime for users on hardware without SSE2 support".
To put the "but what if the user doesn't have SSE2 support?" concern in context, it should only affect Intel users with CPUs older than a Pentium 4 (released 2001), and AMD users with a CPU older than an Opteron or Athlon 64 (both released 2003). All x86/x86_64 CPUs released in the past decade should be able to handle SSE2 binaries, so our caveat can be "if your computer is more than a decade old, 'pip install numpy' may not work for you, but it should do the right thing on newer systems".
Now, the NumPy devs may feel that persisting with the status quo for another 6 to 12 months while waiting for still hypothetical additional changes in pip specifically to accommodate NumPy's current installation practices is a better alternative than taking option 3. However, from my perspective, having NumPy readily available to users using the python.org Windows installers for Python 3.4 would *significantly* lower the barrier to entry to the Scientific Python stack for new users on relatively modern systems when compared to the 4 current options (while we accept the Linux distro problem is on distutils-sig to deal with, that's far from being a NumPy specific problem).
Just to clarify: you actually can install numpy on windows with python.org installers fairly easily by using easy_install already (we upload a bdist_wininst compatible binary which should not use any CPU-specific instructions). It looks like those are missing for 1.8.0, but we can fix this fairly easily. David
On Wed, Jan 29, 2014 at 2:04 PM, David Cournapeau <cournape@gmail.com> wrote:
I think the SSE issue is a bit of a side discussion: most people who care about performance already know how to install numpy. What we care about here are people who don't care so much about fast eigenvalue decomposition, but want to use e.g. pandas. Building numpy in a way that supports every architecture is both doable and acceptable IMO.
Exactly -- I'm pretty sure SSE2 is being suggested because that's the lowest common denominator that we expect to see a lot of -- if there really are a lot of non-SSE-2 machines out there we could leave that off, too.
Building numpy wheels is not hard, we can do that fairly easily (I have already done so several times, the hard parts have nothing to do with wheel or even python, and are related to mingw issues on win 64 bits).
David,

Where is numpy at with building "out of the box" with the python.org binary for Windows, and the "standard" MS compilers that are used with those builds? That used to be an easy "python setup.py install" away -- has that changed? If so, is this a known bug, or a known we-aren't-supporting-that? i.e. it would be nice if anyone set up to build C extensions could "just build numpy". -Chris

Just to clarify: you actually can install numpy on windows with python.org installers fairly easily by using easy_install already (we upload a bdist_wininst compatible binary which should not use any CPU-specific instructions). It looks like those are missing for 1.8.0, but we can fix this fairly easily.
presumably just as easy to do a binary wheel then -- I vote for that. -Chris
On Wed, Jan 29, 2014 at 10:27 PM, Chris Barker <chris.barker@noaa.gov> wrote:
On Wed, Jan 29, 2014 at 2:04 PM, David Cournapeau <cournape@gmail.com> wrote:
I think the SSE issue is a bit of a side discussion: most people who care about performance already know how to install numpy. What we care about here are people who don't care so much about fast eigenvalue decomposition, but want to use e.g. pandas. Building numpy in a way that supports every architecture is both doable and acceptable IMO.
Exactly -- I'm pretty sure SSE2 is being suggested because that's the lowest common denominator that we expect to see a lot of -- if there really are a lot of non-SSE-2 machines out there we could leave that off, too.
The failure mode is fairly horrible though, and the gain is not that substantial anyway compared to really optimized installation (MKL, etc... as provided by Continuum or us).
Building numpy wheels is not hard, we can do that fairly easily (I have already done so several times, the hard parts have nothing to do with wheel or even python, and are related to mingw issues on win 64 bits).
David,
Where is numpy at with building "out of the box" with the python.org binary for Windows, and the "standard" MS compilers that are used with those builds? That used to be an easy "python setup.py install" away -- has that changed? If so, is this a known bug, or a known we-aren't-supporting-that?
i.e. it would be nice if anyone setup to build C extensions could "just build numpy".
This has always been possible, and if not, that's certainly considered a bug (I would be eager to fix it). Numpy is actually fairly easy to build if you have a C compiler (which is the obvious pain point on windows). Scipy and Fortran are where things fall apart. David
I don’t see any reason why SSE couldn’t be added as tags in the Wheel filename fwiw. That doesn’t help for things like MKL though.
----------------- Donald Stufft PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On Wed, Jan 29, 2014 at 10:52 PM, Donald Stufft <donald@stufft.io> wrote:
I don’t see any reason why SSE couldn’t be added as tags in the Wheel filename fwiw.
You still need to decide when to install what, but I would be interested in talking more about that part.
That doesn’t help for things like MKL though.
Nope, but MKL is actually easy in the sense that it deals with architectures at runtime. OSS numerical libraries generally don't (lots of work, and often a non-issue when you can build stuff by yourself :) ). David
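To make the tagging idea concrete, a hypothetical scheme might extend the platform tag, e.g.:

    numpy-1.8.0-cp33-none-win32.whl         (baseline, no SSE assumed)
    numpy-1.8.0-cp33-none-win32_sse2.whl    (hypothetical SSE2 variant)

Those names are illustrative only; no such tags are defined today, and as noted above the installer would still need some runtime CPU detection to decide which one to pick.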
On 29 January 2014 22:50, David Cournapeau <cournape@gmail.com> wrote:
i.e. it would be nice if anyone set up to build C extensions could "just build numpy".
This has always been possible, and if not, that's certainly considered a bug (I would be eager to fix it).
I don't know if you saw my comment earlier in this thread:
A quick test later: no BLAS/ATLAS/LAPACK causes a string of warnings. And ignoring the rest of the error stack (which I'm frankly not interested in investing the time to diagnose and fix) I get "RuntimeError: Broken toolchain: cannot link a simple C program". Which is utter rubbish - I routinely build extensions with this installation.
This is a straight "pip install numpy" run in a virtualenv on Windows 7 64-bit with MSVC 2010 (full edition) installed. I still don't have the time to do detailed diagnosis or digging, but if you cannot generate the same error yourself I can rerun the command and put the output into a bug report if that helps. (Please provide a link to the bug tracker; I didn't see it mentioned on www.numpy.org or on the PyPI page for numpy.) Paul
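For anyone who wants to reproduce this, the steps were essentially (a sketch; "numpy-test" is just a placeholder name):

    virtualenv numpy-test
    numpy-test\Scripts\activate
    pip install numpy

run from a normal command prompt on the machine described above.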
On Thu, Jan 30, 2014 at 7:50 AM, Paul Moore <p.f.moore@gmail.com> wrote:
This is a straight "pip install numpy" run in a virtualenv on Windows 7 64-bit with MSVC 2010 (full edition) installed.
Which version of Python?
For 2.x, it is expected to fail at that point, since you can't build C extensions with anything other than VS 2008 (with python.org builds).
For 3.3, it means there is a bug that needs to be fixed.
David
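For completeness, the commonly circulated (and officially unsupported) workaround for building 2.x extensions with a newer Visual Studio is to point the environment variable distutils looks for at the newer toolchain before building, roughly:

    SET VS90COMNTOOLS=%VS100COMNTOOLS%
    pip install numpy

with the usual caveat that mixing C runtimes between the interpreter and the extension can break in subtle ways.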
On 30 January 2014 09:12, David Cournapeau <cournape@gmail.com> wrote:
On Thu, Jan 30, 2014 at 7:50 AM, Paul Moore <p.f.moore@gmail.com> wrote:
On 29 January 2014 22:50, David Cournapeau <cournape@gmail.com> wrote:
i.e. it would be nice if anyone setup to build C extensions could "just build numpy".
This has always been possible, and if not, that's certainly considered as a bug (I would be eager to fix).
I don't know if you saw my comment earlier in this thread:
A quick test later: No BLAS/ATLAS/LAPACK causes a string of warnings, And ignoring the rest of the error stack (which I'm frankly not interested in investing the time to diagnose and fix) I get "RuntimeError: Broken toolchain: cannot link a simple C program". Which is utter rubbish - I routinely build extensions with this installation.
This is a straight "pip install numpy" run in a virtualenv on Windows 7 64-bit with MSVC 2010 (full edition) installed.
Which version of Python?
For 2.x, it is expected to fail at that point, since you can't build C extensions with anything other than VS 2008 (with python.org builds).
For 3.3, it means there is a bug that needs to be fixed.
Doh. Sorry, I knew I'd forget something. 3.3. Paul
On Thu, Jan 30, 2014 at 10:20 AM, Paul Moore <p.f.moore@gmail.com> wrote:
Doh. Sorry, I knew I'd forget something. 3.3.
Here we go: https://github.com/numpy/numpy/issues/4245 David
On 30 January 2014 10:25, David Cournapeau <cournape@gmail.com> wrote:
Here we go: https://github.com/numpy/numpy/issues/4245
Thanks. I've added the installation details and output from a test run. Paul
On 30 January 2014 10:56, Paul Moore <p.f.moore@gmail.com> wrote:
Here we go: https://github.com/numpy/numpy/issues/4245
Thanks. I've added the installation details and output from a test run.
That bug report was just closed blaming a distutils issue which apparently the numpy project isn't going to work around :-( I don't know if you want to pick up on the issue and argue the case with the guy who closed it. So no, numpy won't build from source on Windows. Maybe in Python 3.5, if someone aggressively pushes for a distutils fix. But I'm not holding my breath. Sigh. Paul
Might I suggest you could upload some wheels (both Windows and Linux) to testpypi, which afaik is pretty much made for this purpose? https://wiki.python.org/moin/TestPyPI
People can easily install them with e.g. `pip install --index-url https://testpypi.python.org/pypi numpy`, and see what tends to break or what doesn't.
On 24 January 2014 05:52, Ralf Gommers <ralf.gommers@gmail.com> wrote:
On Thu, Jan 23, 2014 at 3:42 PM, Oscar Benjamin <oscar.j.benjamin@gmail.com> wrote:
On Thu, Jan 23, 2014 at 12:16:02PM +0000, Paul Moore wrote:
The official numpy installer uses some complex magic to select the right binaries based on your CPU, and this means that the official numpy "superpack" wininst files don't convert (at least I don't think they do, it's a while since I tried).
It's probably worth noting that numpy are toying around with wheels and have uploaded a number of them to PyPI for testing: http://sourceforge.net/projects/numpy/files/wheels_to_test/
Currently there are only OSX wheels there (excluding the pure Python ones) and they're not available on PyPI. I assume that they're waiting for a solution for the Windows installer (a post-install script for wheels). That would give a lot more impetus to put wheels up on PyPI.
Indeed. We discussed just picking the SSE2 or SSE3 build and putting that up as a wheel, but that was deemed a not so great idea: http://article.gmane.org/gmane.comp.python.numeric.general/56072
The Sourceforge OSX wheels are presumably not getting that much use right now. The OSX-specific numpy wheel has been downloaded 4 times in the last week: twice on Windows and twice on Linux!
Some feedback from the people who did try those wheels would help. I asked for that on the numpy list after creating them, but didn't get much. So I haven't been in a hurry to move them over to PyPI.
Ralf
On 24 January 2014 20:09, Matthew Iversen <matt@notevencode.com> wrote:
Might I suggest you could upload some wheels (both Windows and Linux) to testpypi, which afaik is pretty much made for this purpose?
Well, Windows and Mac OS X - we don't allow PyPI wheels for Linux at the moment (since it turns out the compatibility tagging needs to be updated to distinguish distros as well before that's a good idea), although Armin Ronacher pointed out our documentation of that limitation is rather lacking at this point in time :(
https://wiki.python.org/moin/TestPyPI
People can easily install them with e.g. `pip install --index-url https://testpypi.python.org/pypi numpy`, and see what tends to break or what doesn't.
And yes, using testpypi to experiment with wheels is a good idea. Cheers, Nick. -- Nick Coghlan | ncoghlan@gmail.com | Brisbane, Australia
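For a project wanting to try that, the upload side means registering a separate account on testpypi (credentials are not shared with the main index) and pointing the upload command at it; roughly, with these entries in ~/.pypirc:

    [distutils]
    index-servers =
        test

    [test]
    repository = https://testpypi.python.org/pypi
    username = your-testpypi-account

and then:

    python setup.py bdist_wheel upload -r test

(a sketch only; "your-testpypi-account" is a placeholder.)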
participants (11): Chris Barker, David Cournapeau, Donald Stufft, Matthew Iversen, Nick Coghlan, Noah Kantrowitz, Oscar Benjamin, Paul Moore, Ralf Gommers, Thomas Heller, Vinay Sajip