new mingw-w64 based numpy and scipy wheel (still experimental)
I took the time to create mingw-w64 based wheels of the numpy-1.9.1 and scipy-0.15.1 source distributions and put them on https://bitbucket.org/carlkl/mingw-w64-for-python/downloads as well as on binstar.org. The test matrix is python-2.7 and 3.4, for both 32 bit and 64 bit. Feedback is welcome. The wheels can be pip installed with:

```
pip install -i https://pypi.binstar.org/carlkl/simple numpy
pip install -i https://pypi.binstar.org/carlkl/simple scipy
```

Some technical details: the binaries are built upon OpenBLAS as the accelerated BLAS/LAPACK. OpenBLAS itself is built with dynamic kernels (similar to MKL) and automatic runtime selection depending on the CPU. The minimum CPU feature required is SSE2; SSE1 and non-SSE CPUs are not supported by these builds. (This is the default for 64 bit binaries anyway.) OpenBLAS is deployed as part of the numpy wheel. That said, the scipy wheels mentioned above depend on the installation of the OpenBLAS based numpy and won't work with, e.g., an installed numpy-MKL. For the numpy 32 bit builds there are 3 failures in special FP value tests, due to a bug in mingw-w64 that is still present. All scipy versions show 7 failures with some numerical noise that could be ignored (or corrected with relaxed asserts in the test code). PRs for numpy and scipy are in preparation. The mingw-w64 compiler used for building can be found at https://bitbucket.org/carlkl/mingw-w64-for-python/downloads.
2015-01-23 9:25 GMT+01:00 Carl Kleffner <cmkleffner@gmail.com>:
All tests for the 64bit builds passed.
Thanks very much Carl. Did you have to patch the numpy / distutils source to build those wheels, or is this using the source code from the official releases? -- Olivier http://twitter.com/ogrisel - http://github.com/ogrisel
On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner <cmkleffner@gmail.com> wrote:
According to the steam hardware survey, 99.98% of windows computers have SSE2. (http://store.steampowered.com/hwsurvey , click on "other settings" at the bottom). So this is probably OK :-).
This sounds like it probably needs to be fixed before we can recommend the scipy wheels for anyone? OTOH it might be fine to start distributing numpy wheels first.
Correct me if I'm wrong, but it looks like there aren't any details on how exactly the compiler was set up? Which is fine, I know you've been doing a ton of work on this and it's much appreciated :-). But eventually I do think a prerequisite for us adopting these as official builds is that we'll need a text document (or an executable script!) that walks through all the steps in setting up the toolchain etc., so that someone starting from scratch could get it all up and running. Otherwise we run the risk of eventually ending up back where we are today, with a creaky old mingw binary snapshot that no one knows how it works or how to reproduce... -n -- Nathaniel J. Smith Postdoctoral researcher - Informatics - University of Edinburgh http://vorpus.org
On 22-Jan-15 6:23 PM, Nathaniel Smith wrote:
Carl, I tried and failed, even after adding --pre. My log file is here:

```
------------------------------------------------------------
C:\Python27\Scripts\pip run on 01/24/15 07:51:10
Downloading/unpacking https://pypi.binstar.org/carlkl/simple
  Downloading simple
  Downloading from URL https://pypi.binstar.org/carlkl/simple
Cleaning up...
Exception:
Traceback (most recent call last):
  File "C:\Python27\lib\site-packages\pip\basecommand.py", line 122, in main
    status = self.run(options, args)
  File "C:\Python27\lib\site-packages\pip\commands\install.py", line 278, in run
    requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
  File "C:\Python27\lib\site-packages\pip\req.py", line 1197, in prepare_files
    do_download,
  File "C:\Python27\lib\site-packages\pip\req.py", line 1375, in unpack_url
    self.session,
  File "C:\Python27\lib\site-packages\pip\download.py", line 582, in unpack_http_url
    unpack_file(temp_location, location, content_type, link)
  File "C:\Python27\lib\site-packages\pip\util.py", line 627, in unpack_file
    and is_svn_page(file_contents(filename))):
  File "C:\Python27\lib\site-packages\pip\util.py", line 210, in file_contents
    return fp.read().decode('utf-8')
  File "C:\Python27\lib\encodings\utf_8.py", line 16, in decode
    return codecs.utf_8_decode(input, errors, True)
UnicodeDecodeError: 'utf8' codec can't decode byte 0x8b in position 1: invalid start byte
```

Do you have any suggestions? Colin W.
Just a wild guess:

(1) update your pip and try again

(2) use the bitbucket wheels with:

```
pip install --no-index -f https://bitbucket.org/carlkl/mingw-w64-for-python/downloads numpy
pip install --no-index -f https://bitbucket.org/carlkl/mingw-w64-for-python/downloads scipy
```

(3) check if there is something left in site-packages\numpy in case you have uninstalled another numpy distribution before.

Carl

2015-01-24 15:48 GMT+01:00 cjw <cjw@ncf.ca>:
On 24-Jan-15 12:14 PM, Carl Kleffner wrote:
```
C:\Python27\Lib>cd site-packages

C:\Python27\Lib\site-packages>python
Python 2.7.9 (default, Dec 10 2014, 12:28:03) [MSC v.1500 64 bit (AMD64)] on win32

C:\Python27\Lib\site-packages>dir
 Volume in drive C has no label.
 Volume Serial Number is 9691-2C8F

 Directory of C:\Python27\Lib\site-packages

25-Jan-15  06:11 AM    <DIR>          .
25-Jan-15  06:11 AM    <DIR>          ..
26-Nov-14  06:30 PM               909 apsw-3.8.7-py2.7.egg-info
26-Nov-14  06:30 PM           990,208 apsw.pyd
11-Dec-14  06:35 PM    <DIR>          astroid
11-Dec-14  06:35 PM    <DIR>          astroid-1.3.2.dist-info
11-Dec-14  06:35 PM    <DIR>          colorama
11-Dec-14  06:35 PM    <DIR>          colorama-0.3.2-py2.7.egg-info
09-Sep-14  09:38 AM    <DIR>          dateutil
29-Dec-14  10:16 PM               126 easy_install.py
29-Dec-14  10:16 PM               343 easy_install.pyc
27-Dec-14  09:18 AM    <DIR>          epydoc
21-Jan-13  03:19 PM               297 epydoc-3.0.1-py2.7.egg-info
11-Dec-14  06:35 PM    <DIR>          logilab
11-Dec-14  06:35 PM               309 logilab_common-0.63.2-py2.7-nspkg.pth
11-Dec-14  06:35 PM    <DIR>          logilab_common-0.63.2-py2.7.egg-info
16-Nov-14  04:02 PM    <DIR>          matplotlib
22-Oct-14  03:11 PM               324 matplotlib-1.4.2-py2.7-nspkg.pth
16-Nov-14  04:02 PM    <DIR>          matplotlib-1.4.2-py2.7.egg-info
16-Nov-14  04:02 PM    <DIR>          mpl_toolkits
25-Jan-15  06:07 AM    <DIR>          numpy
25-Jan-15  06:07 AM    <DIR>          numpy-1.9.1.dist-info
25-Jan-15  06:01 AM    <DIR>          pip
25-Jan-15  06:01 AM    <DIR>          pip-6.0.6.dist-info
29-Dec-14  10:16 PM           101,530 pkg_resources.py
29-Dec-14  10:16 PM           115,360 pkg_resources.pyc
10-Sep-14  12:30 PM    <DIR>          pycparser
10-Sep-14  12:30 PM    <DIR>          pycparser-2.10-py2.7.egg-info
16-Dec-14  08:21 AM    <DIR>          pygame
25-Mar-14  11:03 AM               543 pygame-1.9.2a0-py2.7.egg-info
24-Nov-14  06:55 AM    <DIR>          pygit2
24-Nov-14  06:55 AM    <DIR>          pygit2-0.21.3-py2.7.egg-info
26-Mar-14  01:23 PM                90 pylab.py
16-Nov-14  04:02 PM               237 pylab.pyc
16-Nov-14  04:02 PM               237 pylab.pyo
11-Dec-14  06:35 PM    <DIR>          pylint
11-Dec-14  06:35 PM    <DIR>          pylint-1.4.0.dist-info
11-Sep-14  08:26 PM    <DIR>          pyparsing-2.0.2-py2.7.egg-info
24-Nov-14  06:27 AM           157,300 pyparsing.py
30-Nov-14  08:51 AM           154,996 pyparsing.pyc
09-Sep-14  09:38 AM    <DIR>          python_dateutil-2.2-py2.7.egg-info
09-Sep-14  10:03 AM    <DIR>          pytz
09-Sep-14  10:03 AM    <DIR>          pytz-2014.7-py2.7.egg-info
30-Apr-14  08:54 AM               119 README.txt
25-Jan-15  06:11 AM    <DIR>          scipy
25-Jan-15  06:11 AM    <DIR>          scipy-0.15.1.dist-info
29-Dec-14  10:16 PM    <DIR>          setuptools
29-Dec-14  10:16 PM    <DIR>          setuptools-7.0.dist-info
09-Sep-14  09:50 AM    <DIR>          six-1.7.3.dist-info
09-Sep-14  09:50 AM            26,518 six.py
09-Sep-14  09:50 AM            28,288 six.pyc
21-Dec-14  07:56 PM    <DIR>          System
21-Dec-14  07:55 PM    <DIR>          User
21-Sep-14  06:00 PM           878,592 _cffi__xf1819144xd61e91d9.pyd
29-Dec-14  10:16 PM    <DIR>          _markerlib
21-Sep-14  06:00 PM           890,368 _pygit2.pyd
              20 File(s)      3,346,694 bytes
              36 Dir(s)   9,810,276,352 bytes free

C:\Python27\Lib\site-packages>
```
2015-01-23 0:23 GMT+01:00 Nathaniel Smith <njs@pobox.com>:
I very much prefer dynamic linking to numpy\core\libopenblas.dll instead of static linking, to avoid bloat. This matters because libopenblas.dll is a heavy library (around 30 MB for amd64). As a consequence, all packages with dynamic linkage to OpenBLAS depend on numpy-openblas. This is no different from scipy-MKL, which has a dependency on numpy-MKL - see C. Gohlke's site.
This has to be done and is in preparation, but not ready for consumption right now. Some preliminary information is given here: https://bitbucket.org/carlkl/mingw-w64-for-python/downloads/mingwstatic-2014...
Carl Kleffner <cmkleffner@gmail.com> wrote:
It is probably OK if we name the OpenBLAS DLL something other than libopenblas.dll. We could e.g. add to the filename a combined hash for NumPy version, CPU, OpenBLAS version, Python version, C compiler, platform, build number, etc. Sturla
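That naming scheme could be sketched roughly as below. This is a hypothetical helper, not anything the thread settled on; the parameter set and the 8-character hash length are assumptions:

```python
# Hypothetical sketch: derive the DLL filename from a combined hash of the
# build parameters, so incompatible OpenBLAS builds never shadow each other.
import hashlib

def openblas_dll_name(numpy_version, openblas_version, python_version,
                      compiler, platform, build_number):
    """Return a name like 'libopenblas-1a2b3c4d.dll' for this combination."""
    tag = "|".join([numpy_version, openblas_version, python_version,
                    compiler, platform, str(build_number)])
    digest = hashlib.sha1(tag.encode("ascii")).hexdigest()[:8]
    return "libopenblas-%s.dll" % digest
```

Any build that differs in even one parameter gets a different filename, so a stale pre-loaded copy cannot be picked up by accident.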
On Sat, Jan 24, 2015 at 5:29 PM, Carl Kleffner <cmkleffner@gmail.com> wrote:
The difference is that if we upload this as the standard scipy wheel, and then someone goes "hey, look, a new scipy release just got announced, 'pip upgrade scipy'", then the result will often be that they just get random unexplained crashes. I think we should try to avoid that kind of outcome, even if it means making some technical compromises. The whole idea of having the wheels is to make fetching particular versions seamless and robust, and the other kinds of builds will still be available for those willing to invest more effort.

One solution would be for the scipy wheel to explicitly depend on a numpy+openblas wheel, so that someone doing 'pip install scipy' would also force a numpy upgrade. But I think we should forget about trying this given the current state of python packaging tools: pip/setuptools/etc. are not really sophisticated enough to let us do this without a lot of kluges and compromises, and anyway it is nicer to allow scipy and numpy to be upgraded separately.

Another solution would be to just include openblas in both. This bloats downloads, but I'd rather waste 30 MiB than waste users' time fighting with random library incompatibility nonsense that they don't care about.

Another solution would be to split the openblas library off into its own "python package", that just drops the binary somewhere it can be found later, and then have both the numpy and scipy wheels depend on this package.

We could start with the brute force solution (just including openblas in both) for the first release, and then upgrade to the fancier solution (both depend on a separate package) later.
Right, I read that :-). There's no way that I could sit down with that document and a clean windows install and replicate your mingw-w64 toolchain, though :-). Which, like I said, is totally fine at this stage in the process, I just wanted to make sure that this step is on the radar, b/c it will eventually become crucial. -n -- Nathaniel J. Smith Postdoctoral researcher - Informatics - University of Edinburgh http://vorpus.org
2015-01-25 16:46 GMT+01:00 Nathaniel Smith <njs@pobox.com>:
I've learned that marking numpy with something like numpy+openblas is called a "local version identifier": https://www.python.org/dev/peps/pep-0440/#local-version-identifiers These identifiers are not allowed on PyPI, however.
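For illustration, the `packaging` library (an assumption here; it is the modern reference implementation of PEP 440, not something used in this thread) parses such identifiers like this:

```python
# PEP 440 local version identifiers: everything after '+' is the local part,
# and it is excluded from the public version.
from packaging.version import Version

v = Version("1.9.1+openblas")
print(v.public)  # 1.9.1
print(v.local)   # openblas
```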
Creating a dedicated OpenBLAS package and adding it as a dependency to numpy/scipy would also allow independent upgrade paths for OpenBLAS, numpy and scipy. The API of OpenBLAS seems to be stable enough to allow for that. Having an additional package dependency is a minor problem, as pip can handle it automatically for the user.
On 25 Jan 2015 18:46, "Carl Kleffner" <cmkleffner@gmail.com> wrote:
2015-01-25 16:46 GMT+01:00 Nathaniel Smith <njs@pobox.com>:
On Sat, Jan 24, 2015 at 5:29 PM, Carl Kleffner <cmkleffner@gmail.com> wrote:
Right, it's fine for the testing wheels, but even if it were allowed on pypi then it still wouldn't let us specify the correct dependency -- we'd have to say that scipy build X depends on exactly numpy 1.9.1+openblas, not numpy <anything>+openblas. So then when a new version of numpy was uploaded it'd be impossible to upgrade without also rebuilding numpy. Alternatively pip would be within its rights to simply ignore the local version part, because "Local version identifiers are used to denote fully API (and, if applicable, ABI) compatible patched versions of upstream projects." Here the +openblas is exactly designed to communicate ABI incompatibility. Soooooo yeah this is ugly all around. Pip and friends are getting better but they're just not up to this kind of thing.
Exactly. We might even want to give it a tiny python wrapper, e.g. you do

```
import openblas
openblas.add_to_library_path()
```

and that would be a little function that modifies LD_LIBRARY_PATH or calls AddDllDirectory etc. as appropriate, so that code linking to openblas can ignore all these details. -n
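Such a wrapper might look roughly like the sketch below. The `openblas` module name, the function, and the library directory are all hypothetical; only the environment-variable mechanics are standard:

```python
# Minimal sketch of the proposed helper: prepend the directory holding the
# bundled OpenBLAS binary to the dynamic loader's search path.
import os
import sys

def add_to_library_path(lib_dir):
    """Make shared libraries in lib_dir findable by the dynamic loader."""
    if sys.platform == "win32":
        # The Windows DLL loader consults PATH; newer Pythons could call
        # os.add_dll_directory() instead.
        var = "PATH"
    else:
        var = "LD_LIBRARY_PATH"
    current = os.environ.get(var, "")
    os.environ[var] = lib_dir + (os.pathsep + current if current else "")
```

One caveat: on Linux the loader reads LD_LIBRARY_PATH only at process startup, so a real wrapper would more likely dlopen the library by absolute path; the sketch just shows the shape of the API.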
Hi, On Sun, Jan 25, 2015 at 12:23 PM, Nathaniel Smith <njs@pobox.com> wrote:
I agree that shipping openblas with both numpy and scipy seems perfectly reasonable - I don't think anyone will much care about the 30M, and I think our job is to make something that works with the least complexity and likelihood of error. It would be good to rename the dll according to the package and version, though, to avoid a scipy binary using a pre-loaded but incompatible 'libopenblas.dll'. Say something like openblas-scipy-0.15.1.dll - on the basis that there can only be one copy of scipy loaded at a time. Cheers, Matthew
Thanks for all your ideas. The next version will contain an augmented libopenblas.dll in both numpy and scipy. In the long term I would prefer an external openblas wheel package, if there is agreement about this among numpy-dev. Another idea for the future is to conditionally load a debug version of libopenblas instead. Together with backtrace.dll (part of mingwstatic, but undocumented right now) this will give a meaningful stacktrace in case of segfaults inside code compiled with mingwstatic. 2015-01-26 2:16 GMT+01:00 Sturla Molden <sturla.molden@gmail.com>:
On 26/01/15 16:30, Carl Kleffner wrote:
Thanks for all your great work on this. An OpenBLAS wheel might be a good idea. Probably we should have some sort of instructions on the website for how to install the binary wheels, and then we could include the OpenBLAS wheel in those instructions. Or we could have the OpenBLAS wheel as part of the scipy stack. But make the bloated SciPy and NumPy wheels work first; then we can worry about a dedicated OpenBLAS wheel later :-)
An OpenBLAS wheel could also include multiple architectures. We can compile OpenBLAS for any kind of CPU and install the one that fits the computer best. Also note that an OpenBLAS wheel could be useful on Linux. It is clearly superior to the ATLAS libraries that most distros ship. If we make a binary wheel that works for Windows, we are almost there for Linux too :-) For Apple we don't need OpenBLAS anymore. On OSX 10.9 and 10.10 Accelerate Framework is actually faster than MKL under many circumstances. DGEMM is about the same, but e.g. DAXPY and DDOT are faster in Accelerate. Sturla
2015-01-27 0:16 GMT+01:00 Sturla Molden <sturla.molden@gmail.com>:
OpenBLAS in the test wheels is built with DYNAMIC_ARCH, that is, all assembler based kernels are included and are chosen at runtime. Non-optimized parts of LAPACK have been built with -march=sse2.
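As a rough sketch, a DYNAMIC_ARCH build is requested through OpenBLAS Makefile variables like these (BINARY and DYNAMIC_ARCH are real OpenBLAS build options; the exact combination used for these wheels is an assumption):

```shell
# Build a 64-bit OpenBLAS with all CPU-specific kernels compiled in,
# selected automatically at runtime on the target machine.
make BINARY=64 DYNAMIC_ARCH=1
```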
Bear in mind that binary wheels are not supported for Linux. Maybe this could be done as a conda package for Anaconda/Miniconda, as an OSS alternative to MKL.
On 27/01/15 11:32, Carl Kleffner wrote:
OpenBLAS in the test wheels is built with DYNAMIC_ARCH, that is, all assembler based kernels are included and are chosen at runtime.
Ok, I wasn't aware of that option. Last time I built OpenBLAS I think I had to specify the target CPU.
Non-optimized parts of LAPACK have been built with -march=sse2.
Since LAPACK delegates almost all of its heavy lifting to BLAS, there is probably not a lot to gain from SSE3, SSE4 or AVX here. Sturla
On Mon, Jan 26, 2015 at 4:30 PM, Carl Kleffner <cmkleffner@gmail.com> wrote:
Sounds fine in principle, but reliable dependency handling will be hard to support in setup.py. You'd want the dependency on Openblas when installing a complete set of wheels, but not make it impossible to use: - building against ATLAS/MKL/... from source with pip or distutils - allowing use of a local wheelhouse which uses ATLAS/MKL/... wheels - pip install numpy --no-use-wheel - etc. Static bundling is a lot easier to get right.
+1 Ralf
On Tue, Jan 27, 2015 at 8:53 PM, Ralf Gommers <ralf.gommers@gmail.com> wrote:
In principle I think this should be easy: when installing a .whl, pip or whatever looks at the dependencies declared in the distribution metadata file inside the wheel. When installing via setup.py, pip or whatever uses the dependencies declared by setup.py. We just have to make sure that the wheels we distribute have the right metadata inside them and everything should work. Accomplishing this may be somewhat awkward with existing tools, but as a worst-case/proof-of-concept approach we could just have a step in the wheel build that opens up the .whl and edits it to add the dependency. Ugly, but it'd work. -n -- Nathaniel J. Smith Postdoctoral researcher - Informatics - University of Edinburgh http://vorpus.org
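That worst-case step could be sketched like the proof of concept below. It is illustrative only: real wheels also carry a RECORD file with per-file hashes that would need regenerating (ignored here), and the dependency name is made up:

```python
# Open a built .whl (a zip archive), append a Requires-Dist header to the
# METADATA file in its .dist-info directory, and write the archive back.
import zipfile

def add_wheel_dependency(whl_path, requirement):
    with zipfile.ZipFile(whl_path, "r") as zf:
        items = [(name, zf.read(name)) for name in zf.namelist()]
    with zipfile.ZipFile(whl_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in items:
            if name.endswith(".dist-info/METADATA"):
                # METADATA is an RFC 822-style header block; add one header.
                data = data.rstrip(b"\n") + (
                    "\nRequires-Dist: %s\n" % requirement).encode("ascii")
            zf.writestr(name, data)
```

For example, `add_wheel_dependency("scipy-0.15.1-....whl", "numpy-openblas (>=1.9.1)")` would make pip see the extra requirement when installing that wheel.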
On Tue, Jan 27, 2015 at 10:13 PM, Nathaniel Smith <njs@pobox.com> wrote:
Good point, that should work. Not all that much uglier than some of the other stuff we do in release scripts for Windows binaries. Ralf
Hi Carl, Could you please provide some details on how you used your mingw-static toolchain to build OpenBLAS, numpy & scipy? I would like to replicate, but apparently the default Makefile in the openblas project expects unix commands such as `uname` and `perl` that are not part of your archive. Did you compile those utilities from source or did you use another distribution of mingw with additional tools such as MSYS? For numpy and scipy, besides applying your patches, did you configure anything in site.cfg? I understand that you put the libopenblas.dll in the numpy/core folder, but where do you put the BLAS / LAPACK header files? I would like to help automate that build in some CI environment (with either Windows or Linux + wine) but I am afraid that I am not familiar enough with the windows build of numpy & scipy to get it working all by myself. -- Olivier
Two quick comments:

- You need MSYS or Cygwin to build OpenBLAS. MSYS has uname and perl. Carl probably used MSYS.
- BLAS and LAPACK are Fortran libs, hence there are no header files. NumPy and SciPy include their own cblas headers.

Sturla

Olivier Grisel <olivier.grisel@ensta.org> wrote:
Basically you need:

(1) site.cfg or %HOME%\.numpy-site.cfg with the following content (change the paths according to your installation):

```
[openblas]
libraries = openblas
library_dirs = D:/devel/packages/openblas/amd64/lib
include_dirs = D:/devel/packages/openblas/amd64/include
```

OpenBLAS was built with the help of msys2, the successor of msys.

(2) I created an import library for python##.dll in <python>\libs\. I copied python##.dll into a temporary folder and executed (example for python-2.7):

```
gendef python27.dll
dlltool --dllname python27.dll --def python27.def --output-lib libpython27.dll.a
copy libpython27.dll.a <python_root>\libs\libpython27.dll.a
```

(3) Before starting the numpy build I copied libopenblas.dll to numpy\core\.

I am reworking the overall procedure to allow installation of the toolchain as a wheel, with some postprocessing to handle these intermediate steps.

Cheers, Carl

2015-02-09 22:22 GMT+01:00 Sturla Molden <sturla.molden@gmail.com>:
+1 for bundling OpenBLAS both in scipy and numpy in the short term. Introducing a new dependency project for OpenBLAS sounds like a good idea but this is probably more work. -- Olivier
![](https://secure.gravatar.com/avatar/841d38374d3c672e47dd35f88f6f8b44.jpg?s=120&d=mm&r=g)
![](https://secure.gravatar.com/avatar/aee56554ec30edfd680e1c937ed4e54d.jpg?s=120&d=mm&r=g)
2015-01-23 9:25 GMT+01:00 Carl Kleffner <cmkleffner@gmail.com>:
All tests for the 64bit builds passed.
Thanks very much Carl. Did you have to patch the numpy / distutils source to build those wheels are is this using the source code from the official releases? -- Olivier http://twitter.com/ogrisel - http://github.com/ogrisel
![](https://secure.gravatar.com/avatar/97c543aca1ac7bbcfb5279d0300c8330.jpg?s=120&d=mm&r=g)
On Thu, Jan 22, 2015 at 9:29 PM, Carl Kleffner <cmkleffner@gmail.com> wrote:
According to the steam hardware survey, 99.98% of windows computers have SSE2. (http://store.steampowered.com/hwsurvey , click on "other settings" at the bottom). So this is probably OK :-).
This sounds like it probably needs to be fixed before we can recommend the scipy wheels for anyone? OTOH it might be fine to start distributing numpy wheels first.
Correct me if I'm wrong, but it looks like there isn't any details on how exactly the compiler was set up? Which is fine, I know you've been doing a ton of work on this and it's much appreciated :-). But eventually I do think a prerequisite for us adopting these as official builds is that we'll need a text document (or an executable script!) that walks through all the steps in setting up the toolchain etc., so that someone starting from scratch could get it all up and running. Otherwise we run the risk of eventually ending up back where we are today, with a creaky old mingw binary snapshot that no-one knows how it works or how to reproduce... -n -- Nathaniel J. Smith Postdoctoral researcher - Informatics - University of Edinburgh http://vorpus.org
![](https://secure.gravatar.com/avatar/841d38374d3c672e47dd35f88f6f8b44.jpg?s=120&d=mm&r=g)
On 22-Jan-15 6:23 PM, Nathaniel Smith wrote:
Karl, I tried and failed, even after adding --pre. My log file is here: ------------------------------------------------------------ C:\Python27\Scripts\pip run on 01/24/15 07:51:10 Downloading/unpacking https://pypi.binstar.org/carlkl/simple Downloading simple Downloading from URL https://pypi.binstar.org/carlkl/simple Cleaning up... Exception: Traceback (most recent call last): File "C:\Python27\lib\site-packages\pip\basecommand.py", line 122, in main status = self.run(options, args) File "C:\Python27\lib\site-packages\pip\commands\install.py", line 278, in run requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle) File "C:\Python27\lib\site-packages\pip\req.py", line 1197, in prepare_files do_download, File "C:\Python27\lib\site-packages\pip\req.py", line 1375, in unpack_url self.session, File "C:\Python27\lib\site-packages\pip\download.py", line 582, in unpack_http_url unpack_file(temp_location, location, content_type, link) File "C:\Python27\lib\site-packages\pip\util.py", line 627, in unpack_file and is_svn_page(file_contents(filename))): File "C:\Python27\lib\site-packages\pip\util.py", line 210, in file_contents return fp.read().decode('utf-8') File "C:\Python27\lib\encodings\utf_8.py", line 16, in decode return codecs.utf_8_decode(input, errors, True) UnicodeDecodeError: 'utf8' codec can't decode byte 0x8b in position 1: invalid start byte Do you have any suggestions? Colin W.
![](https://secure.gravatar.com/avatar/29a8424a5c1ddc5e0e79104965a85011.jpg?s=120&d=mm&r=g)
Just a wild guess: (1) update your pip and try again (2) use the bitbucket wheels with: pip install --no-index -f https://bitbucket.org/carlkl/mingw-w64-for-python/downloads numpy pip install --no-index -f https://bitbucket.org/carlkl/mingw-w64-for-python/downloads scipy (3) check if there i something left in site-packages\numpy in the case you have uninstalled another numpy distribution before. Carl 2015-01-24 15:48 GMT+01:00 cjw <cjw@ncf.ca>:
![](https://secure.gravatar.com/avatar/841d38374d3c672e47dd35f88f6f8b44.jpg?s=120&d=mm&r=g)
On 24-Jan-15 12:14 PM, Carl Kleffner wrote:
C:\Python27\Lib\site-packages>python Python 2.7.9 (default, Dec 10 2014, 12:28:03) [MSC v.1500 64 bit (AMD64)] on win32 C:\Python27\Lib>cd site-packages C:\Python27\Lib\site-packages> dir Volume in drive C has no label. Volume Serial Number is 9691-2C8F Directory of C:\Python27\Lib\site-packages 25-Jan-15 06:11 AM <DIR> . 25-Jan-15 06:11 AM <DIR> .. 26-Nov-14 06:30 PM 909 apsw-3.8.7-py2.7.egg-info 26-Nov-14 06:30 PM 990,208 apsw.pyd 11-Dec-14 06:35 PM <DIR> astroid 11-Dec-14 06:35 PM <DIR> astroid-1.3.2.dist-info 11-Dec-14 06:35 PM <DIR> colorama 11-Dec-14 06:35 PM <DIR> colorama-0.3.2-py2.7.egg-info 09-Sep-14 09:38 AM <DIR> dateutil 29-Dec-14 10:16 PM 126 easy_install.py 29-Dec-14 10:16 PM 343 easy_install.pyc 27-Dec-14 09:18 AM <DIR> epydoc 21-Jan-13 03:19 PM 297 epydoc-3.0.1-py2.7.egg-info 11-Dec-14 06:35 PM <DIR> logilab 11-Dec-14 06:35 PM 309 logilab_common-0.63.2-py2.7-nspkg.pth 11-Dec-14 06:35 PM <DIR> logilab_common-0.63.2-py2.7.egg-info 16-Nov-14 04:02 PM <DIR> matplotlib 22-Oct-14 03:11 PM 324 matplotlib-1.4.2-py2.7-nspkg.pth 16-Nov-14 04:02 PM <DIR> matplotlib-1.4.2-py2.7.egg-info 16-Nov-14 04:02 PM <DIR> mpl_toolkits 25-Jan-15 06:07 AM <DIR> numpy 25-Jan-15 06:07 AM <DIR> numpy-1.9.1.dist-info 25-Jan-15 06:01 AM <DIR> pip 25-Jan-15 06:01 AM <DIR> pip-6.0.6.dist-info 29-Dec-14 10:16 PM 101,530 pkg_resources.py 29-Dec-14 10:16 PM 115,360 pkg_resources.pyc 10-Sep-14 12:30 PM <DIR> pycparser 10-Sep-14 12:30 PM <DIR> pycparser-2.10-py2.7.egg-info 16-Dec-14 08:21 AM <DIR> pygame 25-Mar-14 11:03 AM 543 pygame-1.9.2a0-py2.7.egg-info 24-Nov-14 06:55 AM <DIR> pygit2 24-Nov-14 06:55 AM <DIR> pygit2-0.21.3-py2.7.egg-info 26-Mar-14 01:23 PM 90 pylab.py 16-Nov-14 04:02 PM 237 pylab.pyc 16-Nov-14 04:02 PM 237 pylab.pyo 11-Dec-14 06:35 PM <DIR> pylint 11-Dec-14 06:35 PM <DIR> pylint-1.4.0.dist-info 11-Sep-14 08:26 PM <DIR> pyparsing-2.0.2-py2.7.egg-info 24-Nov-14 06:27 AM 157,300 pyparsing.py 30-Nov-14 08:51 AM 154,996 pyparsing.pyc 09-Sep-14 09:38 AM <DIR> 
python_dateutil-2.2-py2.7.egg-info 09-Sep-14 10:03 AM <DIR> pytz 09-Sep-14 10:03 AM <DIR> pytz-2014.7-py2.7.egg-info 30-Apr-14 08:54 AM 119 README.txt 25-Jan-15 06:11 AM <DIR> scipy 25-Jan-15 06:11 AM <DIR> scipy-0.15.1.dist-info 29-Dec-14 10:16 PM <DIR> setuptools 29-Dec-14 10:16 PM <DIR> setuptools-7.0.dist-info 09-Sep-14 09:50 AM <DIR> six-1.7.3.dist-info 09-Sep-14 09:50 AM 26,518 six.py 09-Sep-14 09:50 AM 28,288 six.pyc 21-Dec-14 07:56 PM <DIR> System 21-Dec-14 07:55 PM <DIR> User 21-Sep-14 06:00 PM 878,592 _cffi__xf1819144xd61e91d9.pyd 29-Dec-14 10:16 PM <DIR> _markerlib 21-Sep-14 06:00 PM 890,368 _pygit2.pyd 20 File(s) 3,346,694 bytes 36 Dir(s) 9,810,276,352 bytes free C:\Python27\Lib\site-packages>
![](https://secure.gravatar.com/avatar/29a8424a5c1ddc5e0e79104965a85011.jpg?s=120&d=mm&r=g)
2015-01-23 0:23 GMT+01:00 Nathaniel Smith <njs@pobox.com>:
I very much prefer dynamic linking to numpy\core\libopenblas.dll instead of static linking to avoid bloat. This matters, because libopenblas.dll is a heavy library (around 30Mb for amd64). As a consequence all packages with dynamic linkage to OpenBLAS depend on numpy-openblas. This is not different to scipy-MKL that has a dependancy to numpy-MKL - see C. Gohlke's site.
This has to be done and is in preperation, but not ready for consumption right now. Some preliminary information is given here: https://bitbucket.org/carlkl/mingw-w64-for-python/downloads/mingwstatic-2014...
![](https://secure.gravatar.com/avatar/2a9d09b311f11f92cdc6a91b3c6519b1.jpg?s=120&d=mm&r=g)
Carl Kleffner <cmkleffner@gmail.com> wrote:
It is probably ok if we name the OpenBLAS DLL something else than libopenblas.dll. We could e.g. add to the filename a combined hash for NumPy version, CPU, OpenBLAS version, Python version, C compiler, platform, build number, etc. Sturla
![](https://secure.gravatar.com/avatar/97c543aca1ac7bbcfb5279d0300c8330.jpg?s=120&d=mm&r=g)
On Sat, Jan 24, 2015 at 5:29 PM, Carl Kleffner <cmkleffner@gmail.com> wrote:
The difference is that if we upload this as the standard scipy wheel, and then someone goes "hey, look, a new scipy release just got announced, 'pip upgrade scipy'", then the result will often be that they just get random unexplained crashes. I think we should try to avoid that kind of outcome, even if it means making some technical compromises. The whole idea of having the wheels is to make fetching particular versions seamless and robust, and the other kinds of builds will still be available for those willing to invest more effort. One solution would be for the scipy wheel to explicitly depend on a numpy+openblas wheel, so that someone doing 'pip install scipy' also forced a numpy upgrade. But I think we should forget about trying this given the current state of python packaging tools: pip/setuptools/etc. are not really sophisticated enough to let us do this without a lot of kluges and compromises, and anyway it is nicer to allow scipy and numpy to be upgraded separately. Another solution would be to just include openblas in both. This bloats downloads, but I'd rather waste 30 MiB then waste users' time fighting with random library incompatibility nonsense that they don't care about. Another solution would be to split the openblas library off into its own "python package", that just dropped the binary somewhere where it could be found later, and then have both the numpy and scipy wheels depend on this package. We could start with the brute force solution (just including openblas in both) for the first release, and then upgrade to the fancier solution (both depend on a separate package) later.
Right, I read that :-). There's no way that I could sit down with that document and a clean windows install and replicate your mingw-w64 toolchain, though :-). Which, like I said, is totally fine at this stage in the process, I just wanted to make sure that this step is on the radar, b/c it will eventually become crucial. -n -- Nathaniel J. Smith Postdoctoral researcher - Informatics - University of Edinburgh http://vorpus.org
![](https://secure.gravatar.com/avatar/29a8424a5c1ddc5e0e79104965a85011.jpg?s=120&d=mm&r=g)
2015-01-25 16:46 GMT+01:00 Nathaniel Smith <njs@pobox.com>:
I've learned that marking numpy with something like numpy+openblas is called a "local version identifier": https://www.python.org/dev/peps/pep-0440/#local-version-identifiers These identifiers are not allowed on PyPI, however.
Creating a dedicated OpenBLAS package and adding it as a dependency to numpy/scipy would also allow independent upgrade paths for OpenBLAS, numpy and scipy. The API of OpenBLAS seems to be stable enough to allow for that. Having an additional package dependency is a minor problem, as pip can handle it automatically for the user.
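As a small illustration of the PEP 440 idea discussed above, a version string like "1.9.1+openblas" decomposes into a public part and a local part. The sketch below is only illustrative (real tools should use a proper parser such as the `packaging` library; the validation regex here is simplified):

```python
import re

def split_local(version):
    """Split a PEP 440 version string into (public, local) parts.

    Simplified sketch: PEP 440 local segments are alphanumeric
    runs separated by periods, e.g. "1.9.1+openblas".
    """
    public, _, local = version.partition("+")
    if local and not re.fullmatch(r"[A-Za-z0-9]+(\.[A-Za-z0-9]+)*", local):
        raise ValueError("invalid local version identifier: %r" % local)
    return public, local or None

print(split_local("1.9.1+openblas"))  # -> ('1.9.1', 'openblas')
```

This also shows why pip is allowed to treat "1.9.1+openblas" as interchangeable with "1.9.1": the local part is defined as an API/ABI-compatible patch marker, which is exactly the opposite of what it would be used for here.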
![](https://secure.gravatar.com/avatar/97c543aca1ac7bbcfb5279d0300c8330.jpg?s=120&d=mm&r=g)
On 25 Jan 2015 18:46, "Carl Kleffner" <cmkleffner@gmail.com> wrote:
2015-01-25 16:46 GMT+01:00 Nathaniel Smith <njs@pobox.com>:
On Sat, Jan 24, 2015 at 5:29 PM, Carl Kleffner <cmkleffner@gmail.com> wrote:
Right, it's fine for the testing wheels, but even if it were allowed on pypi then it still wouldn't let us specify the correct dependency -- we'd have to say that scipy build X depends on exactly numpy 1.9.1+openblas, not numpy <anything>+openblas. So then when a new version of numpy was uploaded it'd be impossible to upgrade without also rebuilding numpy. Alternatively pip would be within its rights to simply ignore the local version part, because "Local version identifiers are used to denote fully API (and, if applicable, ABI) compatible patched versions of upstream projects." Here the +openblas is exactly designed to communicate ABI incompatibility. Soooooo yeah this is ugly all around. Pip and friends are getting better but they're just not up to this kind of thing.
Exactly. We might even want to give it a tiny python wrapper, e.g. you do

    import openblas
    openblas.add_to_library_path()

and that would be a little function that modifies LD_LIBRARY_PATH or calls AddDllDirectory etc. as appropriate, so that code linking to openblas can ignore all these details. -n
![](https://secure.gravatar.com/avatar/b4929294417e9ac44c17967baae75a36.jpg?s=120&d=mm&r=g)
Hi, On Sun, Jan 25, 2015 at 12:23 PM, Nathaniel Smith <njs@pobox.com> wrote:
I agree, that shipping openblas with both numpy and scipy seems perfectly reasonable to me - I don't think anyone will much care about the 30M, and I think our job is to make something that works with the least complexity and likelihood of error. It would be good to rename the dll according to the package and version though, to avoid a scipy binary using a pre-loaded but incompatible 'libopenblas.dll'. Say something like openblas-scipy-0.15.1.dll - on the basis that there can only be one copy of scipy loaded at a time. Cheers, Matthew
![](https://secure.gravatar.com/avatar/29a8424a5c1ddc5e0e79104965a85011.jpg?s=120&d=mm&r=g)
Thanks for all your ideas. The next version will contain an augmented libopenblas.dll in both numpy and scipy. In the long term I would prefer an external openblas wheel package, if there is an agreement about this among numpy-dev. Another idea for the future is to conditionally load a debug version of libopenblas instead. Together with backtrace.dll (part of mingwstatic, but undocumented right now) this would give a meaningful stacktrace in case of segfaults inside code compiled with mingwstatic. 2015-01-26 2:16 GMT+01:00 Sturla Molden <sturla.molden@gmail.com>:
![](https://secure.gravatar.com/avatar/2a9d09b311f11f92cdc6a91b3c6519b1.jpg?s=120&d=mm&r=g)
On 26/01/15 16:30, Carl Kleffner wrote:
Thanks for all your great work on this. An OpenBLAS wheel might be a good idea. We should probably have some instructions on the website for how to install the binary wheel, and we could include the OpenBLAS wheel in those instructions. Or we could have the OpenBLAS wheel as a part of the scipy stack. But make the bloated SciPy and NumPy wheels work first, then we can worry about a dedicated OpenBLAS wheel later :-)
An OpenBLAS wheel could also include multiple architectures. We can compile OpenBLAS for any kind of CPU and install the one that fits the computer best. Also note that an OpenBLAS wheel could be useful on Linux. It is clearly superior to the ATLAS libraries that most distros ship. If we make a binary wheel that works for Windows, we are almost there for Linux too :-) For Apple we don't need OpenBLAS anymore. On OSX 10.9 and 10.10 the Accelerate Framework is actually faster than MKL under many circumstances. DGEMM is about the same, but e.g. DAXPY and DDOT are faster in Accelerate. Sturla
![](https://secure.gravatar.com/avatar/29a8424a5c1ddc5e0e79104965a85011.jpg?s=120&d=mm&r=g)
2015-01-27 0:16 GMT+01:00 Sturla Molden <sturla.molden@gmail.com>:
OpenBLAS in the test wheels is built with DYNAMIC_ARCH, that is, all assembler-based kernels are included and chosen at runtime. Non-optimized parts of LAPACK have been built with -march=sse2.
As far as I know, binary wheels are not supported for Linux. Maybe this could be done as a conda package for Anaconda/Miniconda, as an OSS alternative to MKL.
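For reference, a DYNAMIC_ARCH build of OpenBLAS is driven by Makefile flags along these lines (run from an MSYS2 shell on Windows; the install PREFIX is just an example path, and the exact flag set is a sketch rather than Carl's actual build recipe):

```shell
# Build OpenBLAS with all CPU-specific kernels compiled in,
# selected at runtime by CPU detection (similar to MKL).
make BINARY=64 DYNAMIC_ARCH=1 USE_THREAD=1

# Install headers and libraries to a prefix that site.cfg can point at.
make PREFIX=/d/devel/packages/openblas/amd64 install
```

Without DYNAMIC_ARCH, the build instead targets a single CPU via TARGET=..., which is why a wheel meant for arbitrary machines needs the runtime-selection option.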
![](https://secure.gravatar.com/avatar/2a9d09b311f11f92cdc6a91b3c6519b1.jpg?s=120&d=mm&r=g)
On 27/01/15 11:32, Carl Kleffner wrote:
OpenBLAS in the test wheels is built with DYNAMIC_ARCH, that is, all assembler-based kernels are included and chosen at runtime.
Ok, I wasn't aware of that option. Last time I built OpenBLAS I think I had to specify the target CPU.
Non-optimized parts of LAPACK have been built with -march=sse2.
Since LAPACK delegates almost all of its heavy lifting to BLAS, there is probably not a lot to gain from SSE3, SSE4 or AVX here. Sturla
![](https://secure.gravatar.com/avatar/5f88830d19f9c83e2ddfd913496c5025.jpg?s=120&d=mm&r=g)
On Mon, Jan 26, 2015 at 4:30 PM, Carl Kleffner <cmkleffner@gmail.com> wrote:
Sounds fine in principle, but reliable dependency handling will be hard to support in setup.py. You'd want the dependency on OpenBLAS when installing a complete set of wheels, but without making it impossible to:
- build against ATLAS/MKL/... from source with pip or distutils
- use a local wheelhouse which contains ATLAS/MKL/... wheels
- pip install numpy --no-use-wheel
- etc.
Static bundling is a lot easier to get right.
+1 Ralf
![](https://secure.gravatar.com/avatar/97c543aca1ac7bbcfb5279d0300c8330.jpg?s=120&d=mm&r=g)
On Tue, Jan 27, 2015 at 8:53 PM, Ralf Gommers <ralf.gommers@gmail.com> wrote:
In principle I think this should be easy: when installing a .whl, pip or whatever looks at the dependencies declared in the distribution metadata file inside the wheel. When installing via setup.py, pip or whatever uses the dependencies declared by setup.py. We just have to make sure that the wheels we distribute have the right metadata inside them and everything should work. Accomplishing this may be somewhat awkward with existing tools, but as a worst-case/proof-of-concept approach we could just have a step in the wheel build that opens up the .whl and edits it to add the dependency. Ugly, but it'd work. -n -- Nathaniel J. Smith Postdoctoral researcher - Informatics - University of Edinburgh http://vorpus.org
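The worst-case "open up the .whl and edit it" step can be sketched with nothing but the standard library, since a wheel is just a zip archive containing a .dist-info/METADATA file. File names and the requirement string below are illustrative, and a real tool would also have to update the hashes in the RECORD file, which this sketch skips:

```python
import os
import tempfile
import zipfile

def add_requires_dist(whl_path, requirement):
    """Rewrite a wheel in place, appending a Requires-Dist line to METADATA."""
    tmp_path = whl_path + ".tmp"
    with zipfile.ZipFile(whl_path) as src, \
         zipfile.ZipFile(tmp_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            data = src.read(item.filename)
            if item.filename.endswith(".dist-info/METADATA"):
                data = (data.rstrip(b"\n") + b"\nRequires-Dist: "
                        + requirement.encode("ascii") + b"\n")
            dst.writestr(item, data)
    os.replace(tmp_path, whl_path)

# Demo on a minimal fake wheel (names are purely illustrative):
workdir = tempfile.mkdtemp()
whl = os.path.join(workdir, "scipy-0.15.1-cp27-none-win_amd64.whl")
with zipfile.ZipFile(whl, "w") as z:
    z.writestr("scipy-0.15.1.dist-info/METADATA",
               "Metadata-Version: 2.0\nName: scipy\nVersion: 0.15.1\n")
add_requires_dist(whl, "numpy (>=1.9.1)")
```

Running a step like this at the end of the wheel build would give pip the dependency metadata it needs, without touching the numpy/scipy setup.py files at all.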
![](https://secure.gravatar.com/avatar/5f88830d19f9c83e2ddfd913496c5025.jpg?s=120&d=mm&r=g)
On Tue, Jan 27, 2015 at 10:13 PM, Nathaniel Smith <njs@pobox.com> wrote:
Good point, that should work. Not all that much uglier than some of the other stuff we do in release scripts for Windows binaries. Ralf
![](https://secure.gravatar.com/avatar/aee56554ec30edfd680e1c937ed4e54d.jpg?s=120&d=mm&r=g)
Hi Carl, Could you please provide some details on how you used your mingw-static toolchain to build OpenBLAS, numpy & scipy? I would like to replicate it, but apparently the default Makefile in the openblas project expects unix commands such as `uname` and `perl` that are not part of your archive. Did you compile those utilities from source, or did you use another distribution of mingw with additional tools such as MSYS? For numpy and scipy, besides applying your patches, did you configure anything in site.cfg? I understand that you put the libopenblas.dll in the numpy/core folder, but where do you put the BLAS / LAPACK header files? I would like to help automate that build in some CI environment (with either Windows or Linux + wine), but I am afraid that I am not familiar enough with the windows build of numpy & scipy to get it working all by myself. -- Olivier
![](https://secure.gravatar.com/avatar/2a9d09b311f11f92cdc6a91b3c6519b1.jpg?s=120&d=mm&r=g)
Two quick comments: - You need MSYS or Cygwin to build OpenBLAS. MSYS has uname and perl. Carl probably used MSYS. - BLAS and LAPACK are Fortran libs, hence there are no header files. NumPy and SciPy include their own cblas headers. Sturla Olivier Grisel <olivier.grisel@ensta.org> wrote:
![](https://secure.gravatar.com/avatar/29a8424a5c1ddc5e0e79104965a85011.jpg?s=120&d=mm&r=g)
Basically you need:

(1) site.cfg or %HOME%\.numpy-site.cfg with the following content (change the paths according to your installation):

[openblas]
libraries = openblas
library_dirs = D:/devel/packages/openblas/amd64/lib
include_dirs = D:/devel/packages/openblas/amd64/include

OpenBLAS was built with the help of msys2, the successor of msys.

(2) I created an import library for python##.dll in <python>\libs\. I copied python##.dll into a temporary folder and executed (example for python-2.7):

gendef python27.dll
dlltool --dllname python27.dll --def python27.def --output-lib libpython27.dll.a
copy libpython27.dll.a <python_root>\libs\libpython27.dll.a

(3) Before starting the numpy build I copied libopenblas.dll to numpy\core\.

Actually I am reworking the overall procedure to allow installation of the toolchain as a wheel, with some postprocessing to handle all these intermediate steps. Cheers, Carl 2015-02-09 22:22 GMT+01:00 Sturla Molden <sturla.molden@gmail.com>:
![](https://secure.gravatar.com/avatar/aee56554ec30edfd680e1c937ed4e54d.jpg?s=120&d=mm&r=g)
+1 for bundling OpenBLAS both in scipy and numpy in the short term. Introducing a new dependency project for OpenBLAS sounds like a good idea but this is probably more work. -- Olivier
participants (7)
- Carl Kleffner
- cjw
- Matthew Brett
- Nathaniel Smith
- Olivier Grisel
- Ralf Gommers
- Sturla Molden