Specifying compiler command line options for numpy.distutils.core

Hi, I've been trying to use the Intel C Compiler for some extensions, and as a matter of fact, only numpy.distutils seems to support it... (hope that the next version of setuptools will...) Is it possible to change the compiler command line options on the command line or in a .cfg file? For the moment, I have only -shared; I'd like to add -xP, for instance. This seems to be related to rex's last mail, but I did not find anywhere a way to specify these options in (numpy.)distutils or setuptools, even though it is available for every other non-Python "library builder". Matthieu
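[Editor's note: the usual per-extension route for extra flags is the Extension object itself; numpy.distutils.core accepts the same fields as plain distutils. A minimal sketch — the module name, source file, and the -xP flag are illustrative, not from the original setup:]

```python
from setuptools import Extension  # numpy.distutils.core accepts the same Extension fields

# Hypothetical extension: extra_compile_args / extra_link_args are appended
# to the compile and link command lines for this extension only, so an
# Intel-specific flag like -xP can be added without patching distutils.
ext = Extension(
    'fastmod',                    # illustrative module name
    sources=['fastmod.c'],
    extra_compile_args=['-xP'],   # illustrative icc optimisation flag
    extra_link_args=['-shared'],
)
print(ext.extra_compile_args)
```

This does not change the global compiler options, only those of the one extension, which is often what is wanted anyway.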

Hi, I think the default for the standard python distutils is to use the compiler and the compiler settings for the C compiler that were used to build Python itself. There might be ways to specify other compilers; but if you have a shared python library built with one compiler and modules built with another, you might run into trouble if the two compilers use different system libraries which are not resolved by the standard python build. I believe that numpy is similar in the sense that you can always build additional modules with the compilers that were used to build the numpy core; then, using two fortran based modules (say) will work well because both require the same shared system libraries of the compiler. Probably, the compiler options used to build numpy will also work for your additional modules (with respect to paths to linear algebra libraries and so on). Again, I think there could be ways to build with different compilers, but you do run the risk of incompatibilities with the shared libraries. Therefore, I have become used to building python with the C-compiler I'd like to use, even if that means a lot of work. Hope this helps, Chris. On Thu, June 14, 2007 11:10, Matthieu Brucher wrote:
Hi,
I've been trying to use the Intel C Compiler for some extensions, and as a matter of fact, only numpy.distutils seems to support it... (hope that the next version of setuptools will...) Is it possible to change the compiler command line options on the command line or in a .cfg file? For the moment, I have only -shared; I'd like to add -xP, for instance.
This seems to be related to rex's last mail, but I did not find anywhere a way to specify these options in (numpy.)distutils or setuptools, even though it is available for every other non-Python "library builder".
Matthieu
_______________________________________________
Numpy-discussion mailing list Numpy-discussion@scipy.org http://projects.scipy.org/mailman/listinfo/numpy-discussion
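[Editor's note: Chris's advice above — match the compiler that built Python itself — can be checked from the interpreter; the build-time configuration records the compiler and flags. A minimal sketch using the standard-library `sysconfig` module (in 2007 the same variables lived in `distutils.sysconfig`); the values are platform-dependent and may be `None` on Windows:]

```python
import sysconfig

# The interpreter's build-time configuration records which C compiler and
# which flags were used to build Python itself; new extensions default to them.
cc = sysconfig.get_config_var('CC')
cflags = sysconfig.get_config_var('CFLAGS')
print(cc, cflags)
```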

I think the default for the standard python distutils is to use the compiler and the compiler settings for the C compiler that were used to build Python itself. There might be ways to specify other compilers; but if you have a shared python library built with one compiler and modules built with another, you might run into trouble if the two compilers use different system libraries which are not resolved by the standard python build.
Well, the Intel compiler uses the same libraries as gcc on Linux; on Windows, I don't know, but it is possible to mix VS2003 and VS2005, which is forbidden by distutils, so I find this too restrictive, although understandable.
I believe that numpy is similar in the sense that you can always build additional modules with the compilers that were used to build the numpy core; then, using two fortran based modules (say) will work well because both require the same shared system libraries of the compiler. Probably, the compiler options used to build numpy will also work for your additional modules (with respect to paths to linear algebra libraries and so on).
No, in this case, I want to build with icc and special compiler options. I tried to build the libraries by hand - and with CMake - and it works like a charm; it is very, very fast compared to gcc :(
Again, I think there could be ways to build with different compilers, but you do run the risk of incompatibilities with the shared libraries. Therefore, I have become used to building python with the C-compiler I'd like to use, even if that means a lot of work.
This would mean building every other added module - numpy, scipy, matplotlib, wxPython, ... Doable, but I'd prefer not to; if it is not possible, though, I will have to live with it... Matthieu
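[Editor's note: one way to experiment without rebuilding Python and every package is the environment-variable route — on Unix-like systems, distutils consults `CC` and `CFLAGS` when compiling extensions. A minimal sketch; the icc flag values below are illustrative:]

```python
import os

# On Unix, distutils reads CC (the compiler to invoke) and CFLAGS (extra
# options appended to the compile line) from the environment at build time,
# so flags can be injected without patching any setup.py.
os.environ['CC'] = 'icc'         # illustrative: use the Intel compiler
os.environ['CFLAGS'] = '-xP -ipo'  # illustrative optimisation flags
print(os.environ['CC'], os.environ['CFLAGS'])
```

Running `python setup.py build` in the same environment would then pick these up; whether the resulting binaries are compatible with a gcc-built Python is exactly the question debated in this thread.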

Matthieu Brucher wrote:
I think the default for the standard python distutils is to use the compiler and the compiler settings for the C compiler that were used to build Python itself. There might be ways to specify other compilers; but if you have a shared python library built with one compiler and modules built with another, you might run into trouble if the two compilers use different system libraries which are not resolved by the standard python build.
Well, the Intel compiler uses the same libraries as gcc on Linux; on Windows, I don't know, but it is possible to mix VS2003 and VS2005, which is forbidden by distutils, so I find this too restrictive, although understandable.
It is possible to mix object code, but not runtimes, which is the problem AFAIK. VS2003 and VS2005 have different C runtimes (msvcr71.dll against msvcr80.dll). The problem is (at least for me, who just went through the pain for windows users :) ) that VS2003 is not available for free anymore...
I believe that numpy is similar in the sense that you can always build additional modules with the compilers that were used to build the numpy core; then, using two fortran based modules (say) will work well because both require the same shared system libraries of the compiler. Probably, the compiler options used to build numpy will also work for your additional modules (with respect to paths to linear algebra libraries and so on).
No, in this case, I want to build with icc and special compiler options. I tried to build the libraries by hand - and with CMake - and it works like a charm; it is very, very fast compared to gcc :(
Which libraries are you talking about? Also, beware that ICC uses some potentially dangerous flags by default (I don't know if this is still true, but ICC used to use the equivalent of gcc's --ffast-math by default: http://david.monniaux.free.fr/dotclear/index.php/2006/03/17/4-l-art-de-calcu...). For libraries like atlas, I don't think there will be a huge difference between ICC and gcc; if you use the mkl, then you don't care :)
Again, I think there could be ways to build with different compilers, but you do run the risk of incompatibilities with the shared libraries. Therefore, I have become used to build python with the C-compiler kI'd like to use, even if that means a lot of work.
This would mean building every other modules added - numpy, scipy, matplotlib, wxPython, ... -, doable, but I'd prefer not to do it, but if it is not possible, I would have to live with it...
I think it is important to separate different issues: object code compatibility, runtime compatibility, etc. Those are different issues. First, mixing ICC compiled code and gcc code *has* to be possible (I have never tried), otherwise I don't see much use for it under linux. Then you have the problem of runtime services: I really doubt that the ICC runtime is not compatible with gcc, and more globally with the GNU runtime (glibc, etc.); actually, ICC used to use the "standard" linux runtime, and I would be surprised if that changed. To say it simply: on linux at least, what should matter is whether the runtime services are compatible (on windows, it looks like they are not: official python is compiled with visual studio 2003, and you cannot use VS 2005; note that mingw seems to work). David

It is possible to mix object code, but not runtimes, which is the problem AFAIK. VS2003 and VS2005 have different C runtimes (msvcr71.dll against msvcr80.dll). The problem is (at least for me, who just went through the pain for windows users :) ) that VS2003 is not available for free anymore...
Exactly. Well, there is an ongoing discussion on Python-dev on this specific point (VS2005 building).
No, in this case, I want to build with icc and special compiler options. I tried by build by hand - and CMake - the libraries, it works like a charm and it is very very fast compared to gcc :(
Which libraries are you talking about? Also, beware that ICC uses some potentially dangerous flags by default (I don't know if this is still true, but ICC used to use the equivalent of gcc's --ffast-math by default: http://david.monniaux.free.fr/dotclear/index.php/2006/03/17/4-l-art-de-calcu...). For libraries like atlas, I don't think there will be a huge difference between ICC and gcc; if you use the mkl, then you don't care :)
My libraries on manifold learning ;) There is a difference in performance because of the -ipo and -xP flags. I still have to install the MKL; I just compiled Python from the svn trunk yesterday.
This would mean building every other added module - numpy, scipy, matplotlib, wxPython, ... Doable, but I'd prefer not to; if it is not possible, though, I will have to live with it...
I think it is important to separate different issues: object code compatibility, runtime compatibility, etc. Those are different issues. First, mixing ICC compiled code and gcc code *has* to be possible (I have never tried), otherwise I don't see much use for it under linux.
Exactly. They are binary compatible (C and C++), they use the same headers, ... it has to be possible. Well, it is possible with numpy.distutils; the only thing missing was linking with stdc++... but no specific compiler options :(
Then you have the problem of runtime services: I really doubt that the ICC runtime is not compatible with gcc, and more globally with the GNU runtime (glibc, etc.); actually, ICC used to use the "standard" linux runtime, and I would be surprised if that changed.
As I said, _my_ problem is that I'd like specific compiler options.
To say it simply: on linux at least, what should matter is whether the runtime services are compatible (on windows, it looks like they are not: official python is compiled with visual studio 2003, and you cannot use VS 2005; note that mingw seems to work).
Some people reported that it is possible. The only catch seems to be that everything allocated by one runtime must be freed by that same runtime's allocator. Matthieu
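[Editor's note: the allocator rule above — memory must be released by the same runtime that allocated it — can be illustrated from Python with ctypes. A POSIX-only sketch: `CDLL(None)` loads the symbols of the running process, so `malloc` and `free` below are guaranteed to come from the same C runtime; mixing a `malloc` from one runtime with a `free` from another is exactly the cross-runtime bug being discussed:]

```python
import ctypes

# CDLL(None) gives access to the C runtime the current process is linked
# against (POSIX behaviour), so malloc and free come from the same runtime.
libc = ctypes.CDLL(None)
libc.malloc.restype = ctypes.c_void_p   # avoid pointer truncation on 64-bit
libc.malloc.argtypes = [ctypes.c_size_t]
libc.free.argtypes = [ctypes.c_void_p]

buf = libc.malloc(64)  # allocated by this runtime's allocator...
libc.free(buf)         # ...so it must be freed by the same runtime's free
print(bool(buf))
```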

On Fri, June 15, 2007 06:01, David Cournapeau wrote:
I think it is important to separate different issues: object code compatibility, runtime compatibility, etc. Those are different issues. First, mixing ICC compiled code and gcc code *has* to be possible (I have never tried), otherwise I don't see much use for it under linux. Then you have the problem of runtime services: I really doubt that the ICC runtime is not compatible with gcc, and more globally with the GNU runtime (glibc, etc.); actually, ICC used to use the "standard" linux runtime, and I would be surprised if that changed.
Yes, this is possible - icc does use the standard system libraries. But depending on the compiler options, icc will require additional libraries from its own set of libs. For example, with the -x[...] and -ax[...] options, which exploit the floating point pipelines of the Intel cpus, it is using its very own libsvml (vector math lib or something) which replaces some of the math functions in the system lib. If the linker - runtime or static - doesn't know about these, linking will fail. Therefore, if an icc object with certain optimisations is linked with gcc without specifying the required optimised libraries, linking fails. I remember that this also happened for me when building an optimised version of numpy and trying to load it from a gcc-compiled and linked version of Python. Actually, if I remember correctly, this is even a problem for icc itself; try to link a program from optimised objects with icc without giving the same -x[...] options to the linker... It might be possible that the shared libraries can be told where additional required shared libraries are located (--rpath?), but I was never brave enough to try that out... I simply rebuilt python, with the additional benefit that everything in python gets faster. Or so one hopes... It should be straightforward to link gcc objects and shared libs with icc being the linker, though. Has anyone ever tried to build the core python and numpy with icc, but continue to use the standard gcc-built extensions? Just a thought... maybe a bit over the top :-(( Chris.

Christian Marquardt wrote:
Yes, this is possible - icc does use the standard system libraries. But depending on the compiler options, icc will require additional libraries from its own set of libs. For example, with the -x[...] and -ax[...] options, which exploit the floating point pipelines of the Intel cpus, it is using its very own libsvml (vector math lib or something) which replaces some of the math functions in the system lib. If the linker - runtime or static - doesn't know about these, linking will fail.
Therefore, if an icc object with certain optimisations is linked with gcc without specifying the required optimised libraries, linking fails. I remember that this also happened for me when building an optimised version of numpy and trying to load it from a gcc-compiled and linked version of Python. Actually, if I remember correctly, this is even a problem for icc itself; try to link a program from optimised objects with icc without giving the same -x[...] options to the linker...
It might be possible that the shared libraries can be told where additional required shared libraries are located (--rpath?),
If I understand correctly, that would be --rpath-link. --rpath only helps locate libraries at runtime, whereas --rpath-link helps locate libraries at link time. Values given to --rpath-link may reuse --rpath values, though. David
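[Editor's note: from distutils the rpath discussion above maps onto the `runtime_library_dirs` field of an Extension, which on GNU toolchains typically becomes a `-Wl,-rpath,...` entry, while `library_dirs` only affects the link-time search path (`-L`). A minimal sketch; the Intel install path and the svml library name are illustrative:]

```python
from setuptools import Extension  # plain distutils' Extension exposes the same fields

# runtime_library_dirs embeds a run-time search path (rpath) in the built
# module, so it can find libsvml and friends at import time without
# LD_LIBRARY_PATH tricks; library_dirs only helps the linker itself.
# Paths and the svml library are illustrative, not verified.
ext = Extension(
    'optmod',                                  # illustrative module name
    sources=['optmod.c'],
    libraries=['svml'],                        # icc's vector math library
    library_dirs=['/opt/intel/cc/lib'],        # -L: search path at link time
    runtime_library_dirs=['/opt/intel/cc/lib'],  # rpath: search path at run time
)
print(ext.runtime_library_dirs)
```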

David Cournapeau wrote:
The problem is (at least for me, who just go through the pain for windows users :) ) that VS2003 is not available anymore for free...
while MS isn't distributing it, there are a lot of copies floating around, and I don't think it's illegal to distribute them (anyone know different?) However, I found that it's a pain to set up distutils to use the free version -- at least with python2.5, using MinGW is much easier. -Chris -- Christopher Barker, Ph.D. Oceanographer Emergency Response Division NOAA/NOS/OR&R (206) 526-6959 voice 7600 Sand Point Way NE (206) 526-6329 fax Seattle, WA 98115 (206) 526-6317 main reception Chris.Barker@noaa.gov
participants (4)
- Christian Marquardt
- Christopher Barker
- David Cournapeau
- Matthieu Brucher