
Hello,

I'm happy to announce the release of Numpy 1.8.1. This is a bugfix-only release supporting Python 2.6 - 2.7 and 3.2 - 3.4.

More than 48 issues have been fixed; the most important ones are listed in the release notes: https://github.com/numpy/numpy/blob/maintenance/1.8.x/doc/release/1.8.1-note...

Compared to the last release candidate we have fixed a regression of the 1.8 series that prevented using some gufunc-based linalg functions on larger matrices on 32-bit systems. This implied a few changes in the NDIter C-API which might expose insufficient checks for error conditions in third party applications. Please check the release notes for details.

Source tarballs, Windows installers and release notes can be found at https://sourceforge.net/projects/numpy/files/NumPy/1.8.1

Cheers, Julian Taylor

Hi, On Tue, Mar 25, 2014 at 4:38 PM, Julian Taylor <jtaylor.debian@googlemail.com> wrote:
Hello,
I'm happy to announce the release of Numpy 1.8.1. This is a bugfix-only release supporting Python 2.6 - 2.7 and 3.2 - 3.4.
More than 48 issues have been fixed, the most important issues are listed in the release notes: https://github.com/numpy/numpy/blob/maintenance/1.8.x/doc/release/1.8.1-note...
Compared to the last release candidate we have fixed a regression of the 1.8 series that prevented using some gufunc based linalg functions on larger matrices on 32 bit systems. This implied a few changes in the NDIter C-API which might expose insufficient checks for error conditions in third party applications. Please check the release notes for details.
Source tarballs, windows installers and release notes can be found at https://sourceforge.net/projects/numpy/files/NumPy/1.8.1
Thanks a lot for this. I've just posted OSX wheels for Pythons 2.7, 3.3, 3.4. It's a strange feeling doing this:

$ pip install numpy
Downloading/unpacking numpy
  Downloading numpy-1.8.1-cp27-none-macosx_10_6_intel.whl (3.6MB): 3.6MB downloaded
Installing collected packages: numpy
Successfully installed numpy
Cleaning up...

5 seconds waiting on a home internet connection and a numpy install.... Nice.

Cheers, Matthew

On Tue, Mar 25, 2014 at 9:47 PM, Matthew Brett <matthew.brett@gmail.com>wrote:
Hi,
On Tue, Mar 25, 2014 at 4:38 PM, Julian Taylor <jtaylor.debian@googlemail.com> wrote:
Hello,
I'm happy to announce the release of Numpy 1.8.1. This is a bugfix-only release supporting Python 2.6 - 2.7 and 3.2 - 3.4.
More than 48 issues have been fixed, the most important issues are listed in the release notes:
https://github.com/numpy/numpy/blob/maintenance/1.8.x/doc/release/1.8.1-note...
Compared to the last release candidate we have fixed a regression of the 1.8 series that prevented using some gufunc based linalg functions on larger matrices on 32 bit systems. This implied a few changes in the NDIter C-API which might expose insufficient checks for error conditions in third party applications. Please check the release notes for details.
Source tarballs, windows installers and release notes can be found at https://sourceforge.net/projects/numpy/files/NumPy/1.8.1
Thanks a lot for this. I've just posted OSX wheels for Pythons 2.7, 3.3, 3.4.
It's a strange feeling doing this:
$ pip install numpy
Downloading/unpacking numpy
  Downloading numpy-1.8.1-cp27-none-macosx_10_6_intel.whl (3.6MB): 3.6MB downloaded
Installing collected packages: numpy
Successfully installed numpy
Cleaning up...
5 seconds waiting on a home internet connection and a numpy install.... Nice.
That's pretty neat. Now if we can get the windows versions to be as easy. Chuck

On Wed, Mar 26, 2014 at 8:56 AM, Charles R Harris <charlesr.harris@gmail.com
wrote:
5 seconds waiting on a home internet connection and a numpy install....
Nice.
That's pretty neat. Now if we can get the windows versions to be as easy.
Indeed -- where are we on that? Wasn't there more or less a consensus to put up Windows wheels with SSE2? Or did we decide that was going to break a few too many systems...

I also recall that some folks were working with a new BLAS (OpenBLAS?) that might support multi-architecture binaries... that would be a great solution.

-Chris

-- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker@noaa.gov

Hi, On Wed, Mar 26, 2014 at 3:02 PM, Chris Barker <chris.barker@noaa.gov> wrote:
On Wed, Mar 26, 2014 at 8:56 AM, Charles R Harris <charlesr.harris@gmail.com> wrote:
5 seconds waiting on a home internet connection and a numpy install.... Nice.
That's pretty neat. Now if we can get the windows versions to be as easy.
Indeed -- where are we on that? Wasn't there more or less a consensus to put up Windows Wheels with SSE2?
Or did we decide that was going to break a few too many systems...
I also recall that some folks were working with a new BLAS (OpenBLAS ? ) that might support multi-architecture binaries...that would be a great solution.
From what Julian said elsewhere, we can completely rely on SSE2 being present for 64 bit, correct? [1]

From [2] it looks like Windows XP 64-bit is about 20 times less common than 32-bit XP, meaning likely something less than 2 percent of Windows users overall.

In another conversation it looked as though OpenBLAS was not robust enough for a standard distribution.

So - can we build windows 64-bit SSE2 wheels for windows 7? With ATLAS for example? It sounds like they would be fairly safe.

Cheers, Matthew

[1] http://en.wikipedia.org/wiki/X86-64#Architectural_features
[2] http://store.steampowered.com/hwsurvey?platform=pc

This is great! Has anyone started to work on OSX whl packages for scipy? I assume the libgfortran, libquadmath & libgcc_s dylibs will not make it as easy as for numpy. Would it be possible to use a static gcc toolchain as Carl Kleffner is using for his experimental windows whl packages? -- Olivier

Hi, On Fri, Mar 28, 2014 at 3:09 PM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
This is great! Has anyone started to work on OSX whl packages for scipy? I assume the libgfortran, libquadmath & libgcc_s dylibs will not make it as easy as for numpy. Would it be possible to use a static gcc toolchain as Carl Kleffner is using for his experimental windows whl packages?
Yes, these are already done for the beta release, and for matplotlib: https://nipy.bic.berkeley.edu/scipy_installers/ Luckily OSX has a sensible way of setting relative paths to required libraries, so it's pretty easy to copy the required dlls into the binary distribution: https://github.com/matthew-brett/delocate Cheers, Matthew
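For anyone curious how the relative-path trick works: on OSX, a delocate-style tool copies each dependency into a directory inside the wheel and rewrites the extension's link entry (via install_name_tool) to an @loader_path-relative name. A minimal sketch of just that path computation - the function name and directory layout here are illustrative assumptions, not delocate's actual API:

```python
import posixpath  # Mach-O install names are POSIX-style paths

def relocated_install_name(dep_path, dylibs_dir, ext_dir):
    """Return the @loader_path-relative install name that would be
    written into an extension module in ext_dir, after copying the
    dependency at dep_path into dylibs_dir inside the package."""
    copied = posixpath.join(dylibs_dir, posixpath.basename(dep_path))
    rel = posixpath.relpath(copied, ext_dir)
    return "@loader_path/" + rel

print(relocated_install_name("/usr/local/lib/libgfortran.3.dylib",
                             "scipy/.dylibs", "scipy/linalg"))
# -> @loader_path/../.dylibs/libgfortran.3.dylib
```

The real tool then runs `install_name_tool -change <old> <new>` on the extension, so the dynamic loader resolves the copy shipped in the wheel instead of a system library.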

2014-03-28 23:13 GMT+01:00 Matthew Brett <matthew.brett@gmail.com>:
Hi,
On Fri, Mar 28, 2014 at 3:09 PM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
This is great! Has anyone started to work on OSX whl packages for scipy? I assume the libgfortran, libquadmath & libgcc_s dylibs will not make it as easy as for numpy. Would it be possible to use a static gcc toolchain as Carl Kleffner is using for his experimental windows whl packages?
Yes, these are already done for the beta release, and for matplotlib:
https://nipy.bic.berkeley.edu/scipy_installers/
Luckily OSX has a sensible way of setting relative paths to required libraries, so it's pretty easy to copy the required dlls into the binary distribution:
Great! Do you think it would be possible to upload such a delocated .whl package for scipy 0.13.3 on pypi if all tests pass? Bonus question: do you think a similar solution could work for windows and / or linux? -- Olivier http://twitter.com/ogrisel - http://github.com/ogrisel

2014-03-31 13:53 GMT+02:00 Olivier Grisel <olivier.grisel@ensta.org>:
2014-03-28 23:13 GMT+01:00 Matthew Brett <matthew.brett@gmail.com>:
Hi,
On Fri, Mar 28, 2014 at 3:09 PM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
This is great! Has anyone started to work on OSX whl packages for scipy? I assume the libgfortran, libquadmath & libgcc_s dylibs will not make it as easy as for numpy. Would it be possible to use a static gcc toolchain as Carl Kleffner is using for his experimental windows whl packages?
Yes, these are already done for the beta release, and for matplotlib:
https://nipy.bic.berkeley.edu/scipy_installers/
Luckily OSX has a sensible way of setting relative paths to required libraries, so it's pretty easy to copy the required dlls into the binary distribution:
Great! Do you think it would be possible to upload such a delocated .whl package for scipy 0.13.3 on pypi if all tests pass?
I built such a whl package for the v0.13.3 tag of scipy, delocated it, "brew uninstall gfortran" to make sure that the dynlib loader would not be able to find the system libs, installed the resulting whl package in a new virtualenv and ran the tests:

$ python -c "import scipy; scipy.test()"
[...]
Ran 8775 tests in 123.315s

OK (KNOWNFAIL=113, SKIP=221)

This is built on OSX 10.9. You can find the resulting wheel package on my dropbox: https://dl.dropboxusercontent.com/u/5743203/sklearn/wheelhouse/scipy-0.13.3-...

If scipy maintainers would like to upload such wheel packages for scipy 0.13.3 I can also prepare them for Python 2.7 and Python 3.3.

-- Olivier

Hi, On Mon, Mar 31, 2014 at 5:17 AM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
2014-03-31 13:53 GMT+02:00 Olivier Grisel <olivier.grisel@ensta.org>:
2014-03-28 23:13 GMT+01:00 Matthew Brett <matthew.brett@gmail.com>:
Hi,
On Fri, Mar 28, 2014 at 3:09 PM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
This is great! Has anyone started to work on OSX whl packages for scipy? I assume the libgfortran, libquadmath & libgcc_s dylibs will not make it as easy as for numpy. Would it be possible to use a static gcc toolchain as Carl Kleffner is using for his experimental windows whl packages?
Yes, these are already done for the beta release, and for matplotlib:
https://nipy.bic.berkeley.edu/scipy_installers/
Luckily OSX has a sensible way of setting relative paths to required libraries, so it's pretty easy to copy the required dlls into the binary distribution:
Great! Do you think it would be possible to upload such a delocated .whl package for scipy 0.13.3 on pypi if all tests pass?
I built such a whl package for the v0.13.3 tag of scipy, delocated it, "brew uninstall gfortran" to make sure that the dynlib loader would not be able to find the system libs, installed the resulting whl package in a new virtualenv and ran the tests:
$ python -c "import scipy; scipy.test()"
[...]
Ran 8775 tests in 123.315s
OK (KNOWNFAIL=113, SKIP=221)
This is built on OSX 10.9. You can find the resulting wheel package on my dropbox:
https://dl.dropboxusercontent.com/u/5743203/sklearn/wheelhouse/scipy-0.13.3-...
If scipy maintainers would like to upload such wheel packages for scipy 0.13.3 I can also prepare them for Python 2.7 and Python 3.3.
Thanks for doing those checks. Yes, I think it would be good to upload the scipy wheels, if nothing else they'd allow us to get early warning of any problems. Ralf, Pauli - any objections to uploading binary wheels for 0.13.3? I will test on a clean 10.6 installation before I upload them. Cheers, Matthew

On Mon, Mar 31, 2014 at 6:30 PM, Matthew Brett <matthew.brett@gmail.com>wrote:
Hi,
On Mon, Mar 31, 2014 at 5:17 AM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
2014-03-31 13:53 GMT+02:00 Olivier Grisel <olivier.grisel@ensta.org>:
2014-03-28 23:13 GMT+01:00 Matthew Brett <matthew.brett@gmail.com>:
Hi,
On Fri, Mar 28, 2014 at 3:09 PM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
This is great! Has anyone started to work on OSX whl packages for scipy? I assume the libgfortran, libquadmath & libgcc_s dylibs will not make it as easy as for numpy. Would it be possible to use a static gcc toolchain as Carl Kleffner is using for his experimental windows whl packages?
Yes, these are already done for the beta release, and for matplotlib:
https://nipy.bic.berkeley.edu/scipy_installers/
Luckily OSX has a sensible way of setting relative paths to required libraries, so it's pretty easy to copy the required dlls into the binary distribution:
Great! Do you think it would be possible to upload such a delocated .whl package for scipy 0.13.3 on pypi if all tests pass?
I built such a whl package for the v0.13.3 tag of scipy, delocated it, "brew uninstall gfortran" to make sure that the dynlib loader would not be able to find the system libs, installed the resulting whl package in a new virtualenv and ran the tests:
$ python -c "import scipy; scipy.test()"
[...]
Ran 8775 tests in 123.315s
OK (KNOWNFAIL=113, SKIP=221)
This is built on OSX 10.9. You can find the resulting wheel package on my dropbox:
https://dl.dropboxusercontent.com/u/5743203/sklearn/wheelhouse/scipy-0.13.3-...
If scipy maintainers would like to upload such wheel packages for scipy 0.13.3 I can also prepare them for Python 2.7 and Python 3.3.
Thanks for doing those checks. Yes, I think it would be good to upload the scipy wheels, if nothing else they'd allow us to get early warning of any problems.
Ralf, Pauli - any objections to uploading binary wheels for 0.13.3? I will test on a clean 10.6 installation before I upload them.
No objections, looks like a good idea to me. Ralf

Hi, On Mon, Mar 31, 2014 at 4:53 AM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
2014-03-28 23:13 GMT+01:00 Matthew Brett <matthew.brett@gmail.com>:
Hi,
On Fri, Mar 28, 2014 at 3:09 PM, Olivier Grisel <olivier.grisel@ensta.org> wrote:
This is great! Has anyone started to work on OSX whl packages for scipy? I assume the libgfortran, libquadmath & libgcc_s dylibs will not make it as easy as for numpy. Would it be possible to use a static gcc toolchain as Carl Kleffner is using for his experimental windows whl packages?
Yes, these are already done for the beta release, and for matplotlib:
https://nipy.bic.berkeley.edu/scipy_installers/
Luckily OSX has a sensible way of setting relative paths to required libraries, so it's pretty easy to copy the required dlls into the binary distribution:
Great! Do you think it would be possible to upload such a delocated .whl package for scipy 0.13.3 on pypi if all tests pass?
Bonus question: do you think a similar solution could work for windows and / or linux?
For linux - yes - I think that should be easy with a combination of ``ldd`` to find the dependencies and ``patchelf`` to set the rpath to point to the copied library.

For Windows - I believe that it is not possible to set relative paths for windows DLLs, but I'd be very happy to be corrected. There is a function SetDllDirectory [1], but this would need some extra extension code in the package. Windows experts - is that an option?

Cheers, Matthew

[1] http://msdn.microsoft.com/en-us/library/ms686203(VS.85).aspx
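The ``ldd`` half of that scheme amounts to parsing its output and separating base-system libraries (which must not be bundled) from everything else. A minimal sketch under stated assumptions - the whitelist, the function name, and the sample output are all mine, not from an existing tool:

```python
# Libraries assumed to be part of any base Linux system; real tools
# would need a much more careful list than this illustrative one.
SYSTEM_WHITELIST = ("linux-vdso", "libc.so", "libm.so",
                    "libpthread.so", "ld-linux")

def deps_to_copy(ldd_output):
    """Parse ldd-style output and return resolved paths of the
    dependencies that would be copied next to the extension
    (with patchelf then pointing the rpath at the copies)."""
    copies = []
    for line in ldd_output.splitlines():
        line = line.strip()
        if "=>" not in line:
            continue  # e.g. the vdso line has no resolved path
        name, _, rest = line.partition("=>")
        fields = rest.split()
        path = fields[0] if fields else ""
        if path.startswith("/") and not any(w in name for w in SYSTEM_WHITELIST):
            copies.append(path)
    return copies

sample = """\
linux-vdso.so.1 (0x00007ffd2a5fe000)
libgfortran.so.3 => /usr/lib/libgfortran.so.3 (0x00007f1a2c000000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f1a2ba00000)
"""
print(deps_to_copy(sample))
# -> ['/usr/lib/libgfortran.so.3']
```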

On Mon, Mar 31, 2014 at 10:18 AM, Matthew Brett <matthew.brett@gmail.com>wrote:
Bonus question: do you think a similar solution could work for windows and / or linux?
For linux - yes - I think that should be easy with a combination of ``ldd`` to find the dependencies and ``patchelf`` to set the rpath to point to the copied library.
that part, yes, but isn't Linux too much of a varying target for there to be any point anyway?
For Windows - I believe that it is not possible to set relative paths for windows DLLs, but I'd be very happy to be corrected. There is a function SetDllDirectory [1], but this would need some extra extension code in the package. Windows experts - is that an option?
The "usual" way is to put the dll next to where it is needed. I _think_ when one dll (the python extension) is linked to another one, the first place windows looks is right next to the one loading it -- same as for dlls linked to main executables.

Unfortunately, anywhere else and all bets are off -- I was fighting with this a while back and found what I think is the source of "DLL Hell" -- it's the combination of these two:

1) Windows looks next to the executable for dlls.
2) The search PATH for executables and dlls is the same.

So some folks put dlls next to the executable, and other folks get bit because the search PATH finds dlls next to unrelated executables.

The python.org python install has a DLLs directory: C:\Python27\DLLs

Maybe putting them there with nice long, non-standard names would work.

Has anyone looked at how Anaconda does it?

-Chris

Hi, On Mon, Mar 31, 2014 at 11:55 AM, Chris Barker <chris.barker@noaa.gov> wrote:
On Mon, Mar 31, 2014 at 10:18 AM, Matthew Brett <matthew.brett@gmail.com> wrote:
Bonus question: do you think a similar solution could work for windows and / or linux?
For linux - yes - I think that should be easy with a combination of ``ldd`` to find the dependencies and ``patchelf`` to set the rpath to point to the copied library.
that part, yes, but isn't Linux too much of a varying target for there to be any point anyway?
You mean, the /usr/lib stuff varies too much, so that any copied dynamic libraries would have little chance of binary compatibility with the system libs?
For Windows - I believe that it is not possible to set relative paths for windows DLLs, but I'd be very happy to be corrected. There is a function SetDllDirectory [1], but this would need some extra extension code in the package. Windows experts - is that an option?
The "usual" way is to put the dll next to where it is needed. I _think_ when one dll (the python extension) is linked to another one, the first place windows looks is right next to the one loading it -- same as for dlls linked to main executables.
I had assumed from [1] that it's the path of the executable, not the loading DLL, that is on the DLL search path, but I might well be wrong.

I guess, if it was the path of the loading DLL, you'd run into trouble because you'd likely have python extensions in several directories, and then you'd need to copy the dependencies into all of them.
Unfortunately, anywhere else and all bets are off -- I was fighting with this a while back and found what I think is the source of "DLL Hell" -- it's the combination of these two:
1) Windows looks next to the executable for dlls. 2) The search PATH for executables and dlls is the same.
So some folks put dlls next to the executable, and other folks get bit because the search PATH finds dlls next to unrelated executables.
The python.org python install has a DLLs directory:
C:\Python27\DLLs
Maybe putting them there with nice long, non-standard names would work.
Sounds reasonable to me.
Has anyone looked at how Anaconda does it?
Not me - would be interested to know. Cheers, Matthew [1] http://msdn.microsoft.com/en-us/library/ms682586(v=vs.85).aspx#standard_sear...

On Mon, Mar 31, 2014 at 12:05 PM, Matthew Brett <matthew.brett@gmail.com>wrote:
that part, yes, but isn't Linux too much of a varying target for there to be any point anyway?
You mean, the /usr/lib stuff varies too much, so that any copied dynamic libraries would have little chance of binary compatibility with the system libs?
exactly.
The "usual" way is to put the dll next to where it is needed. I _think_ when one dll (the python extension) is linked to another one, the first place windows looks is right next to the one loading it -- same as for dlls linked to main executables.

I had assumed from [1] that it's the path of the executable, not the loading DLL, that is on the DLL search path, but I might well be wrong.

I could be wrong, too -- I'm pretty sure I tested this at some point, but it could be getting lost in the fog of memory.

I guess, if it was the path of the loading DLL, you'd run into trouble because you'd likely have python extensions in several directories, and then you'd need to copy the dependencies into all of them.

yup -- not ideal
The python.org python install has a DLLs directory:
C:\Python27\DLLs
Maybe putting them there with nice long, non-standard names would work.
Sounds reasonable to me.
on that note -- looking at my Python install on Windows, I don't see "C:\Python27\DLLs" in PATH. So there must be some run-time way to tell Windows to look there. Maybe that could be leveraged. This may be a question for distutils-sig or something....

-Chris

Hi, On Mon, Mar 31, 2014 at 12:27 PM, Chris Barker <chris.barker@noaa.gov> wrote:
On Mon, Mar 31, 2014 at 12:05 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
that part, yes, but isn't Linux too much of a varying target for there to be any point anyway?
You mean, the /usr/lib stuff varies too much, so that any copied dynamic libraries would have little chance of binary compatibility with the system libs?
exactly.
The "usual" way is to put the dll next to where it is needed. I _think_ when one dll (the python extension) is linked to another one, the first place windows looks is right next to the one loading it -- same as for dlls linked to main executables.
I had assumed from [1] that it's the path of the executable, not the loading DLL, that is on the DLL search path, but I might well be wrong.
I could be wrong, too -- I'm pretty sure I tested this at some point, but it could be getting lost in the fog of memory.
I am hopelessly lost here, but it looks as though Python extension modules get loaded via

hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);

See: http://hg.python.org/cpython/file/3a1db0d2747e/Python/dynload_win.c#l195

I think this means that the first directory on the search path is indeed the path containing the extension module: http://msdn.microsoft.com/en-us/library/windows/desktop/ms682586(v=vs.85).as...

So I'm guessing that it would not work putting DLLs into the 'DLLs' directory - unless the extension modules went in there too. I _think_ (David?) this means it would not work to copy the dependent DLLs into sys.exec_prefix.

Cheers, Matthew
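For anyone who wants to experiment with this search-path behaviour directly, the same call can be made from Python via ctypes. A hedged, Windows-only sketch - the wrapper name is made up, and only the flag value (from winbase.h) is taken as given:

```python
import ctypes
import sys

# Flag value from winbase.h; the same one CPython's dynload_win.c passes.
LOAD_WITH_ALTERED_SEARCH_PATH = 0x00000008

def load_extension_dll(path):
    """Load a DLL so that its *own* directory heads the search path
    for its dependent DLLs, rather than the executable's directory.
    Windows-only; guarded so the sketch imports cleanly elsewhere."""
    if sys.platform != "win32":
        raise OSError("LoadLibraryEx is Windows-only")
    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
    kernel32.LoadLibraryExW.restype = ctypes.c_void_p
    kernel32.LoadLibraryExW.argtypes = [ctypes.c_wchar_p,
                                        ctypes.c_void_p,
                                        ctypes.c_uint32]
    handle = kernel32.LoadLibraryExW(path, None,
                                     LOAD_WITH_ALTERED_SEARCH_PATH)
    if not handle:
        raise ctypes.WinError(ctypes.get_last_error())
    return handle
```

One could then compare the dependency resolution of a DLL loaded this way against a plain LoadLibrary call to settle the executable-vs-loading-DLL question empirically.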

On Mon, Mar 31, 2014 at 3:09 PM, Matthew Brett <matthew.brett@gmail.com>wrote:
I am hopelessly lost here, but it looks as though Python extension modules get loaded via
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
See: http://hg.python.org/cpython/file/3a1db0d2747e/Python/dynload_win.c#l195
I think this means that the first directory on the search path is indeed the path containing the extension module:
http://msdn.microsoft.com/en-us/library/windows/desktop/ms682586(v=vs.85).as...
yup -- that seems to be what it says...

So I'm guessing that it would not work putting DLLs into the 'DLLs' directory - unless the extension modules went in there too.
and yet there is a bunch of stuff there, so something is going on... It looks like my Windows box is down at the moment, but I _think_ there are a bunch of dependency dlls in there -- and not the extensions themselves. But I'm way out of my depth, too.

-Chris

Hi, On Tue, Apr 1, 2014 at 9:04 AM, Chris Barker <chris.barker@noaa.gov> wrote:
On Mon, Mar 31, 2014 at 3:09 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
I am hopelessly lost here, but it looks as though Python extension modules get loaded via
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
See: http://hg.python.org/cpython/file/3a1db0d2747e/Python/dynload_win.c#l195
I think this means that the first directory on the search path is indeed the path containing the extension module:
http://msdn.microsoft.com/en-us/library/windows/desktop/ms682586(v=vs.85).as...
yup -- that seems to be what it says...
So I'm guessing that it would not work putting DLLs into the 'DLLs' directory - unless the extension modules went in there too.
and yet there is a bunch of stuff there, so something is going on... It looks like my Windows box is down at the moment, but I _think_ there are a bunch of dependency dlls in there -- and not the extensions themselves.
I'm guessing that the LOAD_WITH_ALTERED_SEARCH_PATH means that a DLL loaded via: hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH); will in turn (by default) search for its dependent DLLs in their own directory. Or maybe in the directory of the first DLL to be loaded with LOAD_WITH_ALTERED_SEARCH_PATH, damned if I can follow the documentation. Looking forward to doing my tax return after this. But - anyway - that means that any extensions in the DLLs directory will get their dependencies from the DLLs directory, but that is only true for extensions in that directory. Cheers, Matthew

On Tue, Apr 1, 2014 at 6:26 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
I'm guessing that the LOAD_WITH_ALTERED_SEARCH_PATH means that a DLL loaded via:
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
will in turn (by default) search for its dependent DLLs in their own directory. Or maybe in the directory of the first DLL to be loaded with LOAD_WITH_ALTERED_SEARCH_PATH, damned if I can follow the documentation. Looking forward to doing my tax return after this.
But - anyway - that means that any extensions in the DLLs directory will get their dependencies from the DLLs directory, but that is only true for extensions in that directory.
So in conclusion, if we just drop our compiled dependencies next to the compiled module files then we're good, even on older Windows versions? That sounds much simpler than previous discussions, but good news if it's true... -- Nathaniel J. Smith Postdoctoral researcher - Informatics - University of Edinburgh http://vorpus.org

Hi, On Tue, Apr 1, 2014 at 10:43 AM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:26 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
I'm guessing that the LOAD_WITH_ALTERED_SEARCH_PATH means that a DLL loaded via:
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
will in turn (by default) search for its dependent DLLs in their own directory. Or maybe in the directory of the first DLL to be loaded with LOAD_WITH_ALTERED_SEARCH_PATH, damned if I can follow the documentation. Looking forward to doing my tax return after this.
But - anyway - that means that any extensions in the DLLs directory will get their dependencies from the DLLs directory, but that is only true for extensions in that directory.
So in conclusion, if we just drop our compiled dependencies next to the compiled module files then we're good, even on older Windows versions? That sounds much simpler than previous discussions, but good news if it's true...
I think that's right, but as you can see, I am not sure. It might explain why Carl Kleffner found that he could drop libopenblas.dll in numpy/core and it just worked [1]. Well, if all the extensions using blas / lapack are in fact in numpy/core. Christoph - have you tried doing the same with MKL? Cheers, Matthew [1] http://numpy-discussion.10968.n7.nabble.com/Default-builds-of-OpenBLAS-devel...

Hi, I just noticed this C reference implementation of blas: https://github.com/rljames/coblas No lapack, no benchmarks, but tests, and BSD. I wonder if it is possible to craft a Frankenlibrary from OpenBLAS and reference implementations to avoid broken parts of OpenBLAS? Cheers, Matthew

On Tue, Apr 1, 2014 at 6:43 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:26 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
I'm guessing that the LOAD_WITH_ALTERED_SEARCH_PATH means that a DLL loaded via:
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
will in turn (by default) search for its dependent DLLs in their own directory. Or maybe in the directory of the first DLL to be loaded with LOAD_WITH_ALTERED_SEARCH_PATH, damned if I can follow the documentation. Looking forward to doing my tax return after this.
But - anyway - that means that any extensions in the DLLs directory will get their dependencies from the DLLs directory, but that is only true for extensions in that directory.
So in conclusion, if we just drop our compiled dependencies next to the compiled module files then we're good, even on older Windows versions? That sounds much simpler than previous discussions, but good news if it's true...
That does not work very well in my experience:

- numpy has extension modules in multiple directories, so we would need to copy the dlls into multiple subdirectories
- copying dlls means that windows will load that dll multiple times, with all the ensuing problems (I don't know for MKL/OpenBlas, but we've seen serious issues when doing something similar for the hdf5 dll and pytables/h5py).

David

On Tue, Apr 1, 2014 at 11:58 PM, David Cournapeau <cournape@gmail.com> wrote:
On Tue, Apr 1, 2014 at 6:43 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:26 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
I'm guessing that the LOAD_WITH_ALTERED_SEARCH_PATH means that a DLL loaded via:
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
will in turn (by default) search for its dependent DLLs in their own directory. Or maybe in the directory of the first DLL to be loaded with LOAD_WITH_ALTERED_SEARCH_PATH, damned if I can follow the documentation. Looking forward to doing my tax return after this.
But - anyway - that means that any extensions in the DLLs directory will get their dependencies from the DLLs directory, but that is only true for extensions in that directory.
So in conclusion, if we just drop our compiled dependencies next to the compiled module files then we're good, even on older Windows versions? That sounds much simpler than previous discussions, but good news if it's true...
That does not work very well in my experience:
- numpy has extension modules in multiple directories, so we would need to copy the dlls into multiple subdirectories
- copying dlls means that windows will load that dll multiple times, with all the ensuing problems (I don't know for MKL/OpenBlas, but we've seen serious issues when doing something similar for hdf5 dll and pytables/h5py).
We could just ship all numpy's extension modules in the same directory if we wanted. It would be pretty easy to stick some code at the top of numpy/__init__.py to load them from numpy/all_dlls/ and then slot them into the appropriate places in the package namespace.

Of course scipy and numpy will still both have to ship BLAS etc., and so I guess it will get loaded at least twice in *any* binary install system. I'm not sure why this would be a problem (Windows, unlike Unix, carefully separates DLL namespaces, right?), but if it is a problem then it's a very fundamental one for any binaries we ship. Do the binaries we ship now have this problem? Or are we currently managing to statically link everything?

-n
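The load-from-one-directory idea can be sketched with importlib. This is only an illustration of the namespace-slotting step: plain .py files stand in for compiled extension modules, and the package name, directory name, and function name are all hypothetical:

```python
import importlib.util
import sys
import tempfile
from pathlib import Path

def load_from_shared_dir(pkg_name, shared_dir, names):
    """Load each module from a single shared directory, then register
    it in sys.modules under its usual dotted name so the package
    namespace looks unchanged to importers."""
    loaded = {}
    for name in names:
        path = Path(shared_dir) / (name + ".py")  # .pyd in the real case
        full = pkg_name + "." + name
        spec = importlib.util.spec_from_file_location(full, path)
        mod = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(mod)
        sys.modules[full] = mod  # now importable as pkg_name.name
        loaded[name] = mod
    return loaded

# demo: one fake "extension" living in a shared directory
shared = tempfile.mkdtemp()
Path(shared, "multiarray.py").write_text("answer = 42\n")
mods = load_from_shared_dir("fakepkg", shared, ["multiarray"])
print(mods["multiarray"].answer)
# -> 42
```

Whether this sidesteps the multiple-load problem David raises is a separate question: it keeps the extensions together, but a BLAS dll shipped by both numpy and scipy would still be loaded once per package.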

On Wed, Apr 2, 2014 at 12:36 AM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:43 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:26 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
I'm guessing that the LOAD_WITH_ALTERED_SEARCH_PATH means that a DLL loaded via:
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
will in turn (by default) search for its dependent DLLs in their own directory. Or maybe in the directory of the first DLL to be loaded with LOAD_WITH_ALTERED_SEARCH_PATH, damned if I can follow the documentation. Looking forward to doing my tax return after this.
But - anyway - that means that any extensions in the DLLs directory will get their dependencies from the DLLs directory, but that is only true for extensions in that directory.
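The LoadLibraryEx behavior Matthew is describing can be experimented with from Python via ctypes; a Windows-only sketch, with the flag value taken from winbase.h:

```python
import ctypes
import sys

LOAD_WITH_ALTERED_SEARCH_PATH = 0x00000008  # flag value from winbase.h

def load_with_altered_search_path(pathname):
    """Call LoadLibraryExW with LOAD_WITH_ALTERED_SEARCH_PATH, so the
    loader resolves the DLL's own dependencies relative to the DLL's
    directory rather than the application's.  Sketch for probing the
    behavior discussed above; Windows-only."""
    if sys.platform != "win32":
        raise OSError("LoadLibraryEx is only available on Windows")
    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
    kernel32.LoadLibraryExW.restype = ctypes.c_void_p
    kernel32.LoadLibraryExW.argtypes = (ctypes.c_wchar_p, ctypes.c_void_p,
                                        ctypes.c_uint32)
    handle = kernel32.LoadLibraryExW(pathname, None,
                                     LOAD_WITH_ALTERED_SEARCH_PATH)
    if not handle:
        raise ctypes.WinError(ctypes.get_last_error())
    return handle
```

Loading a test DLL this way and watching which dependent DLLs get mapped (e.g. with Process Explorer) is one way to settle what the documentation leaves ambiguous.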
So in conclusion, if we just drop our compiled dependencies next to the compiled module files then we're good, even on older Windows versions? That sounds much simpler than previous discussions, but good news if it's true...
On Tue, Apr 1, 2014 at 11:58 PM, David Cournapeau <cournape@gmail.com> wrote:
That does not work very well in my experience:
- numpy has extension modules in multiple directories, so we would need to copy the dlls in multiple subdirectories - copying dlls means that windows will load that dll multiple times, with all the ensuing problems (I don't know for MKL/OpenBlas, but we've seen serious issues when doing something similar for hdf5 dll and pytables/h5py).
We could just ship all numpy's extension modules in the same directory if we wanted. It would be pretty easy to stick some code at the top of numpy/__init__.py to load them from numpy/all_dlls/ and then slot them into the appropriate places in the package namespace.
Of course scipy and numpy will still both have to ship BLAS etc., and so I guess it will get loaded at least twice in *any* binary install system. I'm not sure why this would be a problem (Windows, unlike Unix, carefully separates DLL namespaces, right?), but if it is a problem then it's a very fundamental one for any binaries we ship.
It does not really matter here. For pure blas/lapack, that may be ok because the functions are "stateless", but I would not count on it either. The cleanest solution I can think of is to have 'privately shared DLLs', but that would AFAIK require patching python, so not really an option.
Do the binaries we ship now have this problem? Or are we currently managing to statically link everything?
We currently statically link everything. The main challenge is that 'new' (>= 4) versions of mingw don't easily allow statically linking all the mingw-related dependencies. While the options are there, every time I tried to do it with an official build of mingw, I had some weird, very hard to track crashes. The other alternative that has been suggested is to build one's own toolchain where everything is static by default. I am not sure why that works, and that brings the risk of depending on a toolchain that we can't really maintain. David
-n
-- Nathaniel J. Smith Postdoctoral researcher - Informatics - University of Edinburgh http://vorpus.org _______________________________________________ NumPy-Discussion mailing list NumPy-Discussion@scipy.org http://mail.scipy.org/mailman/listinfo/numpy-discussion

Hi, On Tue, Apr 1, 2014 at 4:46 PM, David Cournapeau <cournape@gmail.com> wrote:
On Wed, Apr 2, 2014 at 12:36 AM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 11:58 PM, David Cournapeau <cournape@gmail.com> wrote:
On Tue, Apr 1, 2014 at 6:43 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:26 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
I'm guessing that the LOAD_WITH_ALTERED_SEARCH_PATH means that a DLL loaded via:
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
will in turn (by default) search for its dependent DLLs in their own directory. Or maybe in the directory of the first DLL to be loaded with LOAD_WITH_ALTERED_SEARCH_PATH, damned if I can follow the documentation. Looking forward to doing my tax return after this.
But - anyway - that means that any extensions in the DLLs directory will get their dependencies from the DLLs directory, but that is only true for extensions in that directory.
So in conclusion, if we just drop our compiled dependencies next to the compiled module files then we're good, even on older Windows versions? That sounds much simpler than previous discussions, but good news if it's true...
That does not work very well in my experience:
- numpy has extension modules in multiple directories, so we would need to copy the dlls in multiple subdirectories - copying dlls means that windows will load that dll multiple times, with all the ensuing problems (I don't know for MKL/OpenBlas, but we've seen serious issues when doing something similar for hdf5 dll and pytables/h5py).
We could just ship all numpy's extension modules in the same directory if we wanted. It would be pretty easy to stick some code at the top of numpy/__init__.py to load them from numpy/all_dlls/ and then slot them into the appropriate places in the package namespace.
Of course scipy and numpy will still both have to ship BLAS etc., and so I guess it will get loaded at least twice in *any* binary install system. I'm not sure why this would be a problem (Windows, unlike Unix, carefully separates DLL namespaces, right?)
It does not really matter here. For pure blas/lapack, that may be ok because the functions are "stateless", but I would not count on it either.
The cleanest solution I can think of is to have 'privately shared DLL', but that would AFAIK require patching python, so not really an option.
David - do you know anything about private assemblies [1]? Might they work for our problem? How about AddDllDirectory [2]? Cheers, Matthew [1] http://msdn.microsoft.com/en-us/library/windows/desktop/ff951638(v=vs.85).as... [2] http://msdn.microsoft.com/en-us/library/windows/desktop/hh310513(v=vs.85).as...
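AddDllDirectory [2] can also be reached from ctypes for experimentation (it needs Windows 8, or Windows 7 with the KB2533623 update). A sketch; note that, as David observes, its effect is process-wide:

```python
import ctypes
import sys

def add_dll_directory(path):
    """Append `path` to the process-wide default DLL search path via
    AddDllDirectory (Windows 8+, or Windows 7 with KB2533623).
    Sketch only; the effect is process-wide, which is why calling it
    from inside one extension module is questionable."""
    if sys.platform != "win32":
        raise OSError("AddDllDirectory is only available on Windows")
    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
    kernel32.AddDllDirectory.restype = ctypes.c_void_p  # DLL_DIRECTORY_COOKIE
    kernel32.AddDllDirectory.argtypes = (ctypes.c_wchar_p,)
    cookie = kernel32.AddDllDirectory(path)  # path must be absolute
    if not cookie:
        raise ctypes.WinError(ctypes.get_last_error())
    return cookie  # pass to RemoveDllDirectory to undo
```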

On Wed, Apr 2, 2014 at 7:52 AM, Matthew Brett <matthew.brett@gmail.com> wrote:
Hi,
On Tue, Apr 1, 2014 at 4:46 PM, David Cournapeau <cournape@gmail.com> wrote:
On Wed, Apr 2, 2014 at 12:36 AM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 11:58 PM, David Cournapeau <cournape@gmail.com> wrote:
On Tue, Apr 1, 2014 at 6:43 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:26 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
I'm guessing that the LOAD_WITH_ALTERED_SEARCH_PATH means that a DLL loaded via:
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
will in turn (by default) search for its dependent DLLs in their own directory. Or maybe in the directory of the first DLL to be loaded with LOAD_WITH_ALTERED_SEARCH_PATH, damned if I can follow the documentation. Looking forward to doing my tax return after this.
But - anyway - that means that any extensions in the DLLs directory will get their dependencies from the DLLs directory, but that is only true for extensions in that directory.
So in conclusion, if we just drop our compiled dependencies next to the compiled module files then we're good, even on older Windows versions? That sounds much simpler than previous discussions, but good news if it's true...
That does not work very well in my experience:
- numpy has extension modules in multiple directories, so we would need to copy the dlls in multiple subdirectories - copying dlls means that windows will load that dll multiple times, with all the ensuing problems (I don't know for MKL/OpenBlas, but we've seen serious issues when doing something similar for hdf5 dll and pytables/h5py).
We could just ship all numpy's extension modules in the same directory if we wanted. It would be pretty easy to stick some code at the top of numpy/__init__.py to load them from numpy/all_dlls/ and then slot them into the appropriate places in the package namespace.
Of course scipy and numpy will still both have to ship BLAS etc., and so I guess it will get loaded at least twice in *any* binary install system. I'm not sure why this would be a problem (Windows, unlike Unix, carefully separates DLL namespaces, right?)
It does not really matter here. For pure blas/lapack, that may be ok because the functions are "stateless", but I would not count on it either.
The cleanest solution I can think of is to have 'privately shared DLL', but that would AFAIK require patching python, so not really an option.
David - do you know anything about private assemblies [1]?
I never managed to make that work properly.
Might they work for our problem? How about AddDllDirectory [2]?
I don't think it is appropriate to use those functions in a C extension module (as it impacts the whole process). David
Cheers,
Matthew
[1] http://msdn.microsoft.com/en-us/library/windows/desktop/ff951638(v=vs.85).as... [2] http://msdn.microsoft.com/en-us/library/windows/desktop/hh310513(v=vs.85).as...

It's time for me to come back to the discussion after a longer break.
some personal history: I was looking for a 64-bit mingw more than a year ago (unrelated to Python) for Fortran development and tried out quite a few mingw toolchain variants based on the mingw-w64 project. In a nutshell: the most appropriate and best documented solutions are the toolchains provided by the mingw-builds project, IMHO.
The next step was the idea to use this toolchain for compiling Python extensions (C and Fortran) and then to try compiling numpy and scipy with OpenBLAS.
Despite the fact that a mingw-w64 based toolchain is rock solid today, the following possible issues should be considered for Python development:
(1) deploy problem: mingw runtime DLLs cannot be found at runtime. Solution: use flags for static linking or use a dedicated 'static' GCC toolchain for compiling and linking. Both solutions should work.
(2) Win32 default stack alignment incompatibility: GCC uses 16 bytes since GCC 4.6, MSVC uses 4 bytes. Solution: use the -mincoming-stack-boundary=2 flag for compiling. Win64 x86_64 is not affected. This issue is the major cause of segmentation faults on 32-bit systems.
(3) Import library problem: numpy distutils does not play well with mingw-w64. Solution: create a Python import library with the mingw-w64 tools and use a patched numpy distutils. A detailed description can be found here: http://article.gmane.org/gmane.comp.python.numeric.general/56749 .
(4) linking against the correct msvcrXXX version. Solution: create a 'specs' file (for a howto see http://www.mingw.org/wiki/HOWTO_Use_the_GCC_specs_file ) that includes the necessary information.
(5) manifest resources. Solution: extend the GCC toolchain with the manifest resource files and ensure linkage with the help of the 'specs' file.
(6) BLAS/LAPACK for numpy and scipy: there is no silver bullet! A trade-off between licence acceptance, performance and stability remains to be found. OpenBLAS on Win32 seems to be quite stable.
Some OpenBLAS issues on Win64 can be addressed with a single-threaded version of that library.
On my google drive: https://drive.google.com/folderview?id=0B4DmELLTwYmldUVpSjdpZlpNM1k&usp=sharing I provide the necessary parts to try the procedures described at http://article.gmane.org/gmane.comp.python.numeric.general/56749 and http://article.gmane.org/gmane.comp.python.numeric.general/56767
Up to now I didn't find time to put a comprehensive description on the web and to update all that stuff (MSVCR100 support for the toolchain is still missing), so I add my incomplete, not yet published mingw-w64 FAQ at the end of my longish e-mail for further discussion.
Carl
---
my personal mingw-w64 FAQ =========================
what is mingw-w64 -----------------
mingw-w64 is a fork of the mingw32 project - http://sourceforge.net/apps/trac/mingw-w64/wiki/History
why choose mingw-w64 over mingw -------------------------------
- 32 AND 64-bit support - large file support - winpthreads pthreads implementation, MIT licensed - cross-compiler toolchains available for Linux
official mingw-w64 releases ---------------------------
source releases of the mingw-w64 repository - http://sourceforge.net/projects/mingw-w64/files/mingw-w64/mingw-w64-release/
official mingw-w64 GCC toolchains ---------------------------------
'recommended' builds are available from the mingw-builds project http://mingw-w64.sourceforge.net/download.php#mingw-builds for example - http://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Wi... - http://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Wi...
These are common combinations of exception and thread models. You can find other combinations as well. Exception handling affects C++ development. Don't ever link object code with different types of exception and/or thread handling!
threads concerning the question 'where to find mingw-w64 builds' - http://article.gmane.org/gmane.comp.gnu.mingw.w64.general/7700 - http://article.gmane.org/gmane.comp.gnu.mingw.w64.general/8484
how to build a mingw-w64 based GCC toolchain on Windows -------------------------------------------------------
"mingw-builds" is a set of scripts and patches for compiling the GCC toolchain under Windows with the help of the msys2 POSIX environment - https://github.com/niXman/mingw-builds/tree/develop
recent 'mingw-builds' GCC toolchains can be downloaded from the mingw-w64 sf.net (4)
what is msys2 -------------
msys2 is the successor of msys. Msys2 is necessary as the environment for the mingw build process on Windows. - http://sourceforge.net/p/msys2/wiki/MSYS2%20installation/
where to get precompiled mingw-w64 compiled libraries -----------------------------------------------------
recent mingw-w64 based tools and library packages together with sources and patches are available from archlinux as well as from the msys2 maintainers. - http://sourceforge.net/projects/mingw-w64-archlinux/files/ (i686: Sjlj | x86_64: SEH) - http://sourceforge.net/projects/msys2/files/REPOS/MINGW/ (i686: Dwarf | x86_64: SEH)
what is a static GCC toolchain ------------------------------
GCC as well as all the necessary libraries for a toolchain can be compiled with "--disable-shared". This is supported by the mingw-builds scripts as an option. All the necessary object code from the GCC runtimes will then be statically linked into the binaries. As a consequence the binary size is increased in comparison to the standard toolchains. The advantage is that there is no dependency on external GCC runtime libraries, so the deployment of Python extensions is greatly improved. Using such a toolchain is more reliable than using -static-XXX flags. However, exception-heavy C++ programs (i.e. Qt) should be compiled with shared runtimes to avoid problems with exception handling across DLL boundaries.
For building typical Python extensions a customized static GCC toolchain is the best compromise IMHO.
customizations over standard mingw-builds releases --------------------------------------------------
- two dedicated GCC toolchains for both 32-bit (posix threads, Dwarf exceptions) and 64-bit (posix threads, SEH exceptions) - statically built toolchain based on gcc-4.8.2 and mingw-w64 v3.1.0 - languages: C, C++, gfortran, LTO - customized 'specs' file for MSVCR90 linkage and manifest support (MSVCR100 linkage coming soon) - additional ftime64 patch to allow winpthreads and OpenMP to work with MSVCR90 linkage - openblas-2.9rc1 with Windows thread support (OpenMP disabled) included
2014-04-02 1:46 GMT+02:00 David Cournapeau <cournape@gmail.com>:
On Wed, Apr 2, 2014 at 12:36 AM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:43 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:26 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
I'm guessing that the LOAD_WITH_ALTERED_SEARCH_PATH means that a DLL loaded via:
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
will in turn (by default) search for its dependent DLLs in their own directory. Or maybe in the directory of the first DLL to be loaded with LOAD_WITH_ALTERED_SEARCH_PATH, damned if I can follow the documentation. Looking forward to doing my tax return after this.
But - anyway - that means that any extensions in the DLLs directory will get their dependencies from the DLLs directory, but that is only true for extensions in that directory.
So in conclusion, if we just drop our compiled dependencies next to the compiled module files then we're good, even on older Windows versions? That sounds much simpler than previous discussions, but good news if it's true...
On Tue, Apr 1, 2014 at 11:58 PM, David Cournapeau <cournape@gmail.com> wrote:
That does not work very well in my experience:
- numpy has extension modules in multiple directories, so we would need to copy the dlls in multiple subdirectories - copying dlls means that windows will load that dll multiple times, with all the ensuing problems (I don't know for MKL/OpenBlas, but we've seen serious issues when doing something similar for hdf5 dll and pytables/h5py).
We could just ship all numpy's extension modules in the same directory if we wanted. It would be pretty easy to stick some code at the top of numpy/__init__.py to load them from numpy/all_dlls/ and then slot them into the appropriate places in the package namespace.
Of course scipy and numpy will still both have to ship BLAS etc., and so I guess it will get loaded at least twice in *any* binary install system. I'm not sure why this would be a problem (Windows, unlike Unix, carefully separates DLL namespaces, right?), but if it is a problem then it's a very fundamental one for any binaries we ship.
It does not really matter here. For pure blas/lapack, that may be ok because the functions are "stateless", but I would not count on it either.
The cleanest solution I can think of is to have 'privately shared DLLs', but that would AFAIK require patching python, so not really an option.
Do the binaries we ship now have this problem? Or are we currently managing to statically link everything?
We currently statically link everything. The main challenge is that 'new' (>= 4) versions of mingw don't easily allow statically linking all the mingw-related dependencies. While the options are there, every time I tried to do it with an official build of mingw, I had some weird, very hard to track crashes. The other alternative that has been suggested is to build one's own toolchain where everything is static by default. I am not sure why that works, and that brings the risk of depending on a toolchain that we can't really maintain.
David
-n
-- Nathaniel J. Smith Postdoctoral researcher - Informatics - University of Edinburgh http://vorpus.org

Hi, On Fri, Apr 4, 2014 at 8:19 AM, Carl Kleffner <cmkleffner@gmail.com> wrote:
It's time for me to come back to the discussion after a longer break.
some personal history: I was looking for a 64-bit mingw more than a year ago (unrelated to Python) for Fortran development and tried out quite a few mingw toolchain variants based on the mingw-w64 project. In a nutshell: the most appropriate and best documented solutions are the toolchains provided by the mingw-builds project, IMHO.
The next step was the idea to use this toolchain for compiling Python extensions (C and Fortran) and then to try compiling numpy and scipy with OpenBLAS.
Despite the fact that a mingw-w64 based toolchain is rock solid today, the following possible issues should be considered for Python development:
(1) deploy problem: mingw runtime DLLs cannot be found at runtime. Solution: use flags for static linking or use a dedicated 'static' GCC toolchain for compiling and linking. Both solutions should work.
(2) Win32 default stack alignment incompatibility: GCC uses 16 bytes since GCC 4.6, MSVC uses 4 bytes. Solution: use the -mincoming-stack-boundary=2 flag for compiling. Win64 x86_64 is not affected. This issue is the major cause of segmentation faults on 32-bit systems.
(3) Import library problem: numpy distutils does not play well with mingw-w64. Solution: create a Python import library with the mingw-w64 tools and use a patched numpy distutils. A detailed description can be found here: http://article.gmane.org/gmane.comp.python.numeric.general/56749 .
(4) linking against the correct msvcrXXX version. Solution: create a 'specs' file (for a howto see http://www.mingw.org/wiki/HOWTO_Use_the_GCC_specs_file ) that includes the necessary information.
(5) manifest resources. Solution: extend the GCC toolchain with the manifest resource files and ensure linkage with the help of the 'specs' file.
(6) BLAS/LAPACK for numpy and scipy: there is no silver bullet! A trade-off between licence acceptance, performance and stability remains to be found. OpenBLAS on Win32 seems to be quite stable. Some OpenBLAS issues on Win64 can be addressed with a single-threaded version of that library.
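Items (1) and (2) in Carl's list boil down to a concrete set of GCC flags. The helper below just collects them for a 32-bit build; the function name is hypothetical and in a real build the flags would be passed through numpy distutils rather than hand-assembled:

```python
def mingw_win32_cflags(static_runtimes=True):
    """Collect the GCC flags suggested in items (1) and (2) for
    building Python extensions with a 32-bit mingw-w64 toolchain.
    Hypothetical helper for illustration only."""
    flags = [
        # (2) match MSVC's 4-byte incoming stack alignment on Win32;
        # without this, code compiled by GCC >= 4.6 can segfault when
        # called from MSVC-compiled Python on 32-bit systems.
        "-mincoming-stack-boundary=2",
    ]
    if static_runtimes:
        # (1) link the GCC runtimes statically so no libgcc/libstdc++/
        # libgfortran DLLs have to ship next to the extension module.
        flags += ["-static-libgcc", "-static-libstdc++",
                  "-static-libgfortran"]
    return flags
```

With a dedicated 'static' toolchain (Carl's alternative for item (1)) the -static-* flags become unnecessary, since the runtimes are static by default.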
On my google drive: https://drive.google.com/folderview?id=0B4DmELLTwYmldUVpSjdpZlpNM1k&usp=sharing I provide the necessary parts to try the procedures described at http://article.gmane.org/gmane.comp.python.numeric.general/56749 and http://article.gmane.org/gmane.comp.python.numeric.general/56767
Up to now I didn't find time to put a comprehensive description on the Web and to update all that stuff (MSVCR100 support for the toolchain still missing), so I add my incomplete, not yet published mingw-w64 FAQ at the end of my longish E-Mail for further discussions.
Carl
---
my personal mingw-w64 FAQ =========================
what is mingw-w64 -----------------
mingw-w64 is a fork of the mingw32 project - http://sourceforge.net/apps/trac/mingw-w64/wiki/History
why choose mingw-w64 over mingw -------------------------------
- 32 AND 64-bit support - large file support - winpthreads pthreads implementation, MIT licensed - cross-compiler toolchains available for Linux
official mingw-w64 releases ---------------------------
source releases of the mingw-w64 repository - http://sourceforge.net/projects/mingw-w64/files/mingw-w64/mingw-w64-release/
official mingw-w64 GCC toolchains ---------------------------------
'recommended' builds are available from the mingw-builds project http://mingw-w64.sourceforge.net/download.php#mingw-builds for example - http://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Wi... - http://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Wi...
These are common combinations of exception and thread models. You can find other combinations as well. Exception handling affects C++ development. Don't ever link object code with different types of exception and/or thread handling!
threads concerning the question 'where to find mingw-w64 builds' - http://article.gmane.org/gmane.comp.gnu.mingw.w64.general/7700 - http://article.gmane.org/gmane.comp.gnu.mingw.w64.general/8484
how to build a mingw-w64 based GCC toolchain on Windows -------------------------------------------------------
"mingw-builds" is a set of scripts and patches for compiling the GCC toolchain under Windows with the help of msys2 POSIX enviroment - https://github.com/niXman/mingw-builds/tree/develop recent 'mingw-builds' GCC toolchains can be downloaded from the mingw-w64 sf.net (4)
what is msys2 -------------
msys2 is the successor of msys. Msys2 is necessary as the environment for the mingw build process on Windows. - http://sourceforge.net/p/msys2/wiki/MSYS2%20installation/
where to get precompiled mingw-w64 compiled libraries -----------------------------------------------------
recent mingw-w64 based tools and library packages together with sources and patches are available from archlinux as well as from the msys2 maintainers. - http://sourceforge.net/projects/mingw-w64-archlinux/files/ (i686: Sjlj | x86_64: SEH) - http://sourceforge.net/projects/msys2/files/REPOS/MINGW/ (i686: Dwarf | x86_64: SEH)
what is a static GCC toolchain ------------------------------
GCC as well as all the necessary libraries for a toolchain can be compiled with "--disable-shared". This is supported by the mingw-builds scripts as an option. All the necessary object code from the GCC runtimes will then be statically linked into the binaries. As a consequence the binary size is increased in comparison to the standard toolchains. The advantage is that there is no dependency on external GCC runtime libraries, so the deployment of Python extensions is greatly improved. Using such a toolchain is more reliable than using -static-XXX flags. However, exception-heavy C++ programs (i.e. Qt) should be compiled with shared runtimes to avoid problems with exception handling across DLL boundaries. For building typical Python extensions a customized static GCC toolchain is the best compromise IMHO.
customizations over standard mingw-builds releases --------------------------------------------------
- two dedicated GCC toolchains for both 32-bit (posix threads, Dwarf exceptions) and 64-bit (posix threads, SEH exceptions) - statically built toolchain based on gcc-4.8.2 and mingw-w64 v3.1.0 - languages: C, C++, gfortran, LTO - customized 'specs' file for MSVCR90 linkage and manifest support (MSVCR100 linkage coming soon) - additional ftime64 patch to allow winpthreads and OpenMP to work with MSVCR90 linkage - openblas-2.9rc1 with Windows thread support (OpenMP disabled) included
Thanks very much for this. Would you consider putting this up as a numpy wiki page? https://github.com/numpy/numpy/wiki I think it would be very valuable... Cheers, Matthew

Hi, I will add a wiki page ASAP. BTW: I copied my tools (gcc toolchain, numpy and scipy wheels) from my google drive to bitbucket: https://bitbucket.org/carlkl/mingw-w64-for-python/downloads Regards Carl 2014-04-04 21:56 GMT+02:00 Matthew Brett <matthew.brett@gmail.com>:
Hi,
On Fri, Apr 4, 2014 at 8:19 AM, Carl Kleffner <cmkleffner@gmail.com> wrote:
It's time for me to come back to the discussion after a longer break.
some personal history: I was looking for a 64-bit mingw more than a year ago (unrelated to Python) for Fortran development and tried out quite a few mingw toolchain variants based on the mingw-w64 project. In a nutshell: the most appropriate and best documented solutions are the toolchains provided by the mingw-builds project, IMHO.
The next step was the idea to use this toolchain for compiling Python extensions (C and Fortran) and then to try compiling numpy and scipy with OpenBLAS.
Despite the fact that a mingw-w64 based toolchain is rock solid today, the following possible issues should be considered for Python development:
(1) deploy problem: mingw runtime DLLs cannot be found at runtime. Solution: use flags for static linking or use a dedicated 'static' GCC toolchain for compiling and linking. Both solutions should work.
(2) Win32 default stack alignment incompatibility: GCC uses 16 bytes since GCC 4.6, MSVC uses 4 bytes. Solution: use the -mincoming-stack-boundary=2 flag for compiling. Win64 x86_64 is not affected. This issue is the major cause of segmentation faults on 32-bit systems.
(3) Import library problem: numpy distutils does not play well with mingw-w64. Solution: create a Python import library with the mingw-w64 tools and use a patched numpy distutils. A detailed description can be found here: http://article.gmane.org/gmane.comp.python.numeric.general/56749 .
(4) linking against the correct msvcrXXX version. Solution: create a 'specs' file (for a howto see http://www.mingw.org/wiki/HOWTO_Use_the_GCC_specs_file ) that includes the necessary information.
(5) manifest resources. Solution: extend the GCC toolchain with the manifest resource files and ensure linkage with the help of the 'specs' file.
(6) BLAS/LAPACK for numpy and scipy: there is no silver bullet! A trade-off between licence acceptance, performance and stability remains to be found. OpenBLAS on Win32 seems to be quite stable. Some OpenBLAS issues on Win64 can be addressed with a single-threaded version of that library.
On my google drive:
https://drive.google.com/folderview?id=0B4DmELLTwYmldUVpSjdpZlpNM1k&usp=sharing
I provide the necessary parts to try the procedures described at http://article.gmane.org/gmane.comp.python.numeric.general/56749 and http://article.gmane.org/gmane.comp.python.numeric.general/56767
Up to now I didn't find time to put a comprehensive description on the Web and to update all that stuff (MSVCR100 support for the toolchain still missing), so I add my incomplete, not yet published mingw-w64 FAQ at the end of my longish E-Mail for further discussions.
Carl
---
my personal mingw-w64 FAQ =========================
what is mingw-w64 -----------------
mingw-w64 is a fork of the mingw32 project - http://sourceforge.net/apps/trac/mingw-w64/wiki/History
why choose mingw-w64 over mingw -------------------------------
- 32 AND 64-bit support - large file support - winpthreads pthreads implementation, MIT licensed - cross-compiler toolchains available for Linux
official mingw-w64 releases ---------------------------
source releases of the mingw-w64 repository -
http://sourceforge.net/projects/mingw-w64/files/mingw-w64/mingw-w64-release/
official mingw-w64 GCC toolchains ---------------------------------
'recommended' builds are available from the mingw-builds project http://mingw-w64.sourceforge.net/download.php#mingw-builds for example -
http://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Wi...
-
http://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Wi...
These are common combinations of exception and thread models. You can find other combinations as well. Exception handling affects C++ development. Don't ever link object code with different types of exception and/or thread handling!
threads concerning the question 'where to find mingw-w64 builds' - http://article.gmane.org/gmane.comp.gnu.mingw.w64.general/7700 - http://article.gmane.org/gmane.comp.gnu.mingw.w64.general/8484
how to build a mingw-w64 based GCC toolchain on Windows -------------------------------------------------------
"mingw-builds" is a set of scripts and patches for compiling the GCC toolchain under Windows with the help of the msys2 POSIX environment - https://github.com/niXman/mingw-builds/tree/develop recent 'mingw-builds' GCC toolchains can be downloaded from the mingw-w64 sf.net (4)
what is msys2 -------------
msys2 is the successor of msys. Msys2 is necessary as the environment for the mingw build process on Windows. - http://sourceforge.net/p/msys2/wiki/MSYS2%20installation/
where to get precompiled mingw-w64 compiled libraries -----------------------------------------------------
recent mingw-w64 based tools and library packages together with sources and patches are available from archlinux as well as from the msys2 maintainers.
- http://sourceforge.net/projects/mingw-w64-archlinux/files/ (i686: Sjlj | x86_64: SEH) - http://sourceforge.net/projects/msys2/files/REPOS/MINGW/ (i686: Dwarf | x86_64: SEH)
what is a static GCC toolchain ------------------------------
GCC as well as all the necessary libraries for a toolchain can be compiled with "--disable-shared". This is supported by the mingw-builds scripts as an option. All the necessary object code from the GCC runtimes will then be statically linked into the binaries. As a consequence the binary size is increased in comparison to the standard toolchains. The advantage is that there is no dependency on external GCC runtime libraries, so the deployment of Python extensions is greatly improved. Using such a toolchain is more reliable than using -static-XXX flags. However, exception-heavy C++ programs (i.e. Qt) should be compiled with shared runtimes to avoid problems with exception handling across DLL boundaries. For building typical Python extensions a customized static GCC toolchain is the best compromise IMHO.
customizations over standard mingw-builds releases --------------------------------------------------
- two dedicated GCC toolchains for both 32-bit (posix threads, Dwarf exceptions) and 64-bit (posix threads, SEH exceptions) - statically built toolchain based on gcc-4.8.2 and mingw-w64 v3.1.0 - languages: C, C++, gfortran, LTO - customized 'specs' file for MSVCR90 linkage and manifest support (MSVCR100 linkage coming soon) - additional ftime64 patch to allow winpthreads and OpenMP to work with MSVCR90 linkage - openblas-2.9rc1 with Windows thread support (OpenMP disabled) included
Thanks very much for this.
Would you consider putting this up as a numpy wiki page?
https://github.com/numpy/numpy/wiki
I think it would be very valuable...
Cheers,
Matthew
_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion

It's time for me to come back to the discussion after a longer break.

Some personal history: I was looking for a 64-bit mingw more than a year ago (unrelated to python) for Fortran development and tried out quite a few mingw toolchain variants based on the mingw-w64 project. In a nutshell: the most appropriate and best documented solution, IMHO, are the toolchains provided by the mingw-builds project. The next step was the idea to use this toolchain for compiling python extensions (C and Fortran) and then to try out compiling numpy and scipy with OpenBLAS.

Despite the fact that a mingw-w64 based toolchain is rock solid today, the following possible issues should be considered for Python development:

(1) deployment problem: mingw runtime DLLs cannot be found at runtime
Solution: use flags for static linking, or use a dedicated 'static' GCC toolchain for compiling and linking. Both solutions should work.

(2) Win32 default stack alignment incompatibility: GCC uses 16 bytes since GCC 4.6, MSVC uses 4 bytes
Solution: use the -mincoming-stack-boundary=2 flag for compiling. Win64 X86_64 is not affected. This issue is the major cause of segmentation faults on 32-bit systems.

(3) import library problem: numpy distutils does not play well with mingw-w64
Solution: create a Python import library with the mingw-w64 tools. Use a patched numpy distutils. A detailed description can be found here: http://article.gmane.org/gmane.comp.python.numeric.general/56749 .

(4) linking against the correct msvcrXXX version
Solution: create a 'specs' file (for a howto see http://www.mingw.org/wiki/HOWTO_Use_the_GCC_specs_file ) that includes the necessary information.

(5) manifest resources
Solution: extend the GCC toolchain with the manifest resource files and ensure linkage with the help of the 'specs' file.

(6) BLAS/LAPACK for numpy and scipy
There is no silver bullet! A trade-off between licence acceptance, performance and stability remains to be found. OpenBLAS on Win32 seems to be quite stable.
Some OpenBLAS issues on Win64 can be addressed with a single-threaded version of that library. On my google drive: https://drive.google.com/folderview?id=0B4DmELLTwYmldUVpSjdpZlpNM1k&usp=sharing I provide the necessary parts to try the procedures described at http://article.gmane.org/gmane.comp.python.numeric.general/56749 and http://article.gmane.org/gmane.comp.python.numeric.general/56767

Up to now I didn't find time to put a comprehensive description on the Web and to update all that stuff (MSVCR100 support for the toolchain is still missing), so I append my incomplete, not yet published mingw-w64 FAQ at the end of my longish E-Mail for further discussion.

Carl

---

my personal mingw-w64 FAQ
=========================

what is mingw-w64
-----------------
mingw-w64 is a fork of the mingw32 project
- http://sourceforge.net/apps/trac/mingw-w64/wiki/History

why choose mingw-w64 over mingw
-------------------------------
- 32 AND 64-bit support
- large file support
- winpthread pthreads implementation, MIT licensed
- cross-compiler toolchains available for Linux

official mingw-w64 releases
---------------------------
source releases of the mingw-w64 repository
- http://sourceforge.net/projects/mingw-w64/files/mingw-w64/mingw-w64-release/

official mingw-w64 GCC toolchains
---------------------------------
'recommended' builds are available from the mingw-builds project http://mingw-w64.sourceforge.net/download.php#mingw-builds for example
- http://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Wi...
- http://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Wi...
These are common combinations of exception and thread models. You can find other combinations as well. Exception handling affects C++ development. Don't ever link object code with different types of exception and/or thread handling!
threads concerning the question 'where to find mingw-w64 builds'
- http://article.gmane.org/gmane.comp.gnu.mingw.w64.general/7700
- http://article.gmane.org/gmane.comp.gnu.mingw.w64.general/8484

how to build a mingw-w64 based GCC toolchain on Windows
-------------------------------------------------------
"mingw-builds" is a set of scripts and patches for compiling the GCC toolchain under Windows with the help of the msys2 POSIX environment
- https://github.com/niXman/mingw-builds/tree/develop
recent 'mingw-builds' GCC toolchains can be downloaded from the mingw-w64 sf.net (4)

what is msys2
-------------
msys2 is the successor of msys. Msys2 provides the necessary environment for the mingw build process on Windows.
- http://sourceforge.net/p/msys2/wiki/MSYS2%20installation/

where to get precompiled mingw-w64 libraries
--------------------------------------------
recent mingw-w64 based tools and library packages, together with sources and patches, are available from archlinux as well as from the msys2 maintainers.
- http://sourceforge.net/projects/mingw-w64-archlinux/files/ (i686: Sjlj | x86_64: SEH)
- http://sourceforge.net/projects/msys2/files/REPOS/MINGW/ (i686: Dwarf | x86_64: SEH)

what is a static GCC toolchain
------------------------------
GCC, as well as all the necessary libraries for a toolchain, can be compiled with "--disable-shared". This is supported by the mingw-builds scripts as an option. All the necessary object code from the GCC runtimes will be statically linked into the binaries. As a consequence the binary size will be increased in comparison to the standard toolchains. The advantage is that there will be no dependency on external GCC runtime libraries, so the deployment of python extensions is greatly improved. Using such a toolchain is more reliable than using -static-XXX flags. However, exception-heavy C++ programs (i.e. QT) should be compiled with shared runtimes to avoid problems with exception handling over DLL boundaries. For building typical Python extensions a customized static GCC toolchain is the best compromise IMHO.

customizations over standard mingw-builds releases
--------------------------------------------------
- two dedicated GCC toolchains for both 32-bit (posix threads, Dwarf exceptions) and 64-bit (posix threads, SEH exceptions)
- statically built toolchain based on gcc-4.8.2 and mingw-w64 v3.1.0
- languages: C, C++, gfortran, LTO
- customized 'specs' file for MSVCR90 linkage and manifest support (MSVCR100 linkage coming soon)
- additional ftime64 patch to allow winpthreads and OpenMP to work with MSVCR90 linkage
- openblas-2.9rc1 with windows thread support (OpenMP disabled) included

2014-04-02 1:46 GMT+02:00 David Cournapeau <cournape@gmail.com>:
On Wed, Apr 2, 2014 at 12:36 AM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:43 PM, Nathaniel Smith <njs@pobox.com> wrote:
On Tue, Apr 1, 2014 at 6:26 PM, Matthew Brett <matthew.brett@gmail.com> wrote:
I'm guessing that the LOAD_WITH_ALTERED_SEARCH_PATH flag means that a DLL loaded via:
hDLL = LoadLibraryEx(pathname, NULL, LOAD_WITH_ALTERED_SEARCH_PATH);
will in turn (by default) search for its dependent DLLs in their own directory. Or maybe in the directory of the first DLL to be loaded with LOAD_WITH_ALTERED_SEARCH_PATH, damned if I can follow the documentation. Looking forward to doing my tax return after this.
But - anyway - that means that any extensions in the DLLs directory will get their dependencies from the DLLs directory, but that is only true for extensions in that directory.
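For reference, the same LoadLibraryEx call can be made from Python through ctypes; a minimal sketch, where the helper name is mine, the flag value is the winbase.h constant, and the call is guarded so it only runs on Windows:

```python
# Hedged sketch: calling LoadLibraryEx with LOAD_WITH_ALTERED_SEARCH_PATH
# from Python via ctypes. The flag value (0x8) is the winbase.h constant;
# the call itself is Windows-only, hence the platform guard.
import ctypes
import sys

LOAD_WITH_ALTERED_SEARCH_PATH = 0x00000008  # from winbase.h

def load_dll_with_altered_search(pathname):
    """Load a DLL so its dependent DLLs are searched for next to it."""
    if sys.platform != "win32":
        raise OSError("LoadLibraryEx is only available on Windows")
    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
    handle = kernel32.LoadLibraryExW(pathname, None,
                                     LOAD_WITH_ALTERED_SEARCH_PATH)
    if not handle:
        raise ctypes.WinError(ctypes.get_last_error())
    return handle
```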
So in conclusion, if we just drop our compiled dependencies next to the compiled module files then we're good, even on older Windows versions? That sounds much simpler than previous discussions, but good news if it's true...
That does not work very well in my experience:
- numpy has extension modules in multiple directories, so we would need to copy the dlls in multiple subdirectories
- copying dlls means that windows will load that dll multiple times, with all the ensuing problems (I don't know for MKL/OpenBlas, but we've seen serious issues when doing something similar for the hdf5 dll and pytables/h5py).

On Tue, Apr 1, 2014 at 11:58 PM, David Cournapeau <cournape@gmail.com> wrote:
We could just ship all numpy's extension modules in the same directory if we wanted. It would be pretty easy to stick some code at the top of numpy/__init__.py to load them from numpy/all_dlls/ and then slot them into the appropriate places in the package namespace.
Of course scipy and numpy will still both have to ship BLAS etc., and so I guess it will get loaded at least twice in *any* binary install system. I'm not sure why this would be a problem (Windows, unlike Unix, carefully separates DLL namespaces, right?), but if it is a problem then it's a very fundamental one for any binaries we ship.

It does not really matter here. For pure blas/lapack, that may be ok because the functions are "stateless", but I would not count on it either.

The cleanest solution I can think of is to have a 'privately shared DLL', but that would AFAIK require patching python, so not really an option.
Do the binaries we ship now have this problem? Or are we currently managing to statically link everything?
We currently statically link everything. The main challenge is that 'new' (>= 4) versions of mingw don't easily allow statically linking all the mingw-related dependencies. While the options are there, every time I tried to do it with an official build of mingw, I had some weird, very hard to track crashes. The other alternative that has been suggested is to build one's own toolchain where everything is static by default. I am not sure why that works, and it brings the risk of depending on a toolchain that we can't really maintain.
David
-n
--
Nathaniel J. Smith
Postdoctoral researcher - Informatics - University of Edinburgh
http://vorpus.org

On Apr 1, 2014, at 4:36 PM, Nathaniel Smith <njs@pobox.com> wrote:
We could just ship all numpy's extension modules in the same directory if we wanted. It would be pretty easy to stick some code at the top of numpy/__init__.py to load them from numpy/all_dlls/ and then slot them into the appropriate places in the package namespace.
I like this, and it could play well with wheels and virtualenv. (which reminds me -- looking at what Anaconda does may be instructive)
Of course scipy and numpy will still both have to ship BLAS etc., and so I guess it will get loaded at least twice in *any* binary install
Since scipy has a strict dependency on numpy, and the same people are maintaining the builds, couldn't scipy use the libs provided by numpy? (maybe adding more)

-Chris

On 28.03.2014 23:09, Olivier Grisel wrote:
This is great! Has anyone started to work on OSX whl packages for scipy? I assume the libgfortran, libquadmath & libgcc_s dylibs will not make it as easy as for numpy. Would it be possible to use a static gcc toolchain as Carl Kleffner is using for his experimental windows whl packages?
You can get rid of libgfortran and quadmath with the -static-libgfortran flag. libgcc_s is probably more tricky, as scipy uses C++, so -static-libgcc may need checking before using it. Doesn't mac provide libgcc_s anyway? Even though they have clang by default now, I doubt they can remove libgcc very soon.
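Whether a libgcc_s shared library is actually resolvable on a given machine can be probed quickly from Python; a small sketch using only the standard library (the helper name is mine):

```python
# Quick probe: ask the stdlib whether a libgcc_s (or libgcc) shared
# library is resolvable on this machine. Returns a path or None.
from ctypes.util import find_library

def probe_libgcc():
    """Return the resolved libgcc_s/libgcc library name, or None."""
    for name in ("gcc_s", "gcc"):
        path = find_library(name)
        if path is not None:
            return path
    return None
```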

Julian Taylor <jtaylor.debian@googlemail.com> wrote:
On 28.03.2014 23:09, Olivier Grisel wrote:
you can get rid of libgfortran and quadmath with the -static-libgfortran flag libgcc_s is probably more tricky as scipy uses c++ so -static-libgcc may need checking before using it doesn't mac provide libgcc_s anyway? Even though they have clang by default now, I doubt they can remove libgcc very soon.
As of OS X 10.9 (Mavericks), -static-libgcc is not supported by the C compiler. Sturla

On 29.03.2014 00:02, Sturla Molden wrote:
Julian Taylor <jtaylor.debian@googlemail.com> wrote:
On 28.03.2014 23:09, Olivier Grisel wrote:
you can get rid of libgfortran and quadmath with the -static-libgfortran flag libgcc_s is probably more tricky as scipy uses c++ so -static-libgcc may need checking before using it doesn't mac provide libgcc_s anyway? Even though they have clang by default now, I doubt they can remove libgcc very soon.
As of OS X 10.9 (Mavericks), -static-libgcc is not supported by the C compiler.
Because the C compiler is not gcc, there is obviously also no libgcc. 10.9 uses clang by default. But the library is still installed on the system (at least on the 10.9 macs I saw).

On Fri, Mar 28, 2014 at 5:09 PM, Sturla Molden <sturla.molden@gmail.com> wrote:
On 29/03/14 00:05, Julian Taylor wrote:
But the library is still installed in the system (at least on the 10.9 macs I saw)
I only find it in the gfortran 4.8 I installed separately. Nowhere else.
Have a look at the README for delocate: https://github.com/matthew-brett/delocate

The worked example is scipy; you can see it copying these libs into the binary wheel:

/usr/local/Cellar/gfortran/4.8.2/gfortran/lib/libgcc_s.1.dylib
/usr/local/Cellar/gfortran/4.8.2/gfortran/lib/libgfortran.3.dylib
/usr/local/Cellar/gfortran/4.8.2/gfortran/lib/libquadmath.0.dylib

in this case from a homebrew installation. The resulting wheel is here:
https://nipy.bic.berkeley.edu/scipy_installers/numpy-1.8.0-cp27-none-macosx_...

If you do:

pip install --upgrade pip
pip install --pre --find-links https://nipy.bic.berkeley.edu/scipy_installers scipy

you should find that you get a scipy version that passes its tests, even if you rename your libgcc file.

Cheers,
Matthew
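Since a wheel is just a zip archive, one can verify that the libraries really were bundled by listing the archive entries. A small sketch using only the standard library; the example filename and path in the trailing comment are hypothetical:

```python
# Sketch: list the shared libraries bundled inside a wheel. A wheel is a
# plain zip archive, so the stdlib zipfile module is enough.
import zipfile

LIB_SUFFIXES = (".dylib", ".dll", ".so")

def bundled_libs(wheel_path):
    """Return the archive members of a wheel that look like shared libraries."""
    with zipfile.ZipFile(wheel_path) as wheel:
        return sorted(
            name for name in wheel.namelist()
            if name.endswith(LIB_SUFFIXES) or ".so." in name  # versioned .so too
        )

# e.g. bundled_libs("scipy-0.13.3-cp27-none-macosx_10_6_intel.whl") might
# list entries such as scipy/.dylibs/libgfortran.3.dylib (hypothetical path)
```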
participants (11)
- Carl Kleffner
- Charles R Harris
- Chris Barker
- Chris Barker - NOAA Federal
- David Cournapeau
- Julian Taylor
- Matthew Brett
- Nathaniel Smith
- Olivier Grisel
- Ralf Gommers
- Sturla Molden