scipy build with gcc on solaris problems
Hi, I'm on solaris using gcc:

SunOS mahler 5.10 Generic_118833-36 sun4u sparc SUNW,Sun-Fire-880 Solaris

Using built-in specs.
Target: sparc-sun-solaris2.10
Configured with: ../configure --prefix=/usr/local --with-gnu-as --with-as=/usr/local/bin/as --with-gnu-ld --with-ld=/usr/local/bin/ld --with-libiconv --enable-libada --enable-libssp --enable-objc-gc --enable-threads --enable-languages=c,c++,objc,obj-c++,fortran
Thread model: posix
gcc version 4.2.2

The scipy tests generate a core dump. The numpy tests seem fine except for a few warnings about invalid values. What am I doing wrong?

I built atlas following the advice here:
http://www.scipy.org/Installing_SciPy/Linux#head-89e1f6afaa3314d98a22c79b063...

I install scipy using "easy_install scipy", which gives warnings like the following:

scipy/ndimage/src/nd_image.c: In function 'Py_Filter1DFunc':
scipy/ndimage/src/nd_image.c:273: warning: function called through a non-compatible type
scipy/ndimage/src/nd_image.c:273: note: if this code is reached, the program will abort
scipy/ndimage/src/nd_image.c:274: warning: function called through a non-compatible type
scipy/ndimage/src/nd_image.c:274: note: if this code is reached, the program will abort
scipy/ndimage/src/nd_image.c: In function 'Py_FilterFunc':
scipy/ndimage/src/nd_image.c:351: warning: function called through a non-compatible type
scipy/ndimage/src/nd_image.c:351: note: if this code is reached, the program will abort
scipy/ndimage/src/nd_image.c: In function 'Py_Histogram':
scipy/ndimage/src/nd_image.c:1100: warning: function called through a non-compatible type
scipy/ndimage/src/nd_image.c:1100: note: if this code is reached, the program will abort

and when I run the tests I get a core dump:

ipython
import scipy
scipy.test()
Here is the "gdb python core" output:

Core was generated by `/usr/local/bin/python /usr/local/bin/ipython'.
Program terminated with signal 4, Illegal instruction.
#0  Py_FilterFunc (buffer=0xc91778, filter_size=2, output=0xffbfc118, data=0xffbfc1c4) at scipy/ndimage/src/nd_image.c:351
351     scipy/ndimage/src/nd_image.c: No such file or directory.
        in scipy/ndimage/src/nd_image.c
(gdb) where
#0  Py_FilterFunc (buffer=0xc91778, filter_size=2, output=0xffbfc118, data=0xffbfc1c4) at scipy/ndimage/src/nd_image.c:351
#1  0xfdc16eb8 in NI_GenericFilter (input=0x900708, function=0xfdc130b8 <Py_FilterFunc>, data=0xffbfc1c4, footprint=0xb96be0, output=0xb96c40, mode=<value optimized out>, cvalue=0, origins=0xbfcd60) at scipy/ndimage/src/ni_filters.c:858
#2  0xfdc14a54 in Py_GenericFilter (obj=0x0, args=0xb493f0) at scipy/ndimage/src/nd_image.c:411
#3  0x000f7a24 in PyCFunction_Call (func=0x838878, arg=0xb493f0, kw=0x0) at Objects/methodobject.c:108
#4  0x000a188c in PyEval_EvalFrameEx (f=0xbf6570, throwflag=-4209780) at Python/ceval.c:3564
#5  0x000a3784 in PyEval_EvalCodeEx (co=0x7f7b60, globals=0x7f7b60, locals=0x1, args=0x1a210c, argcount=2, kws=0x28, kwcount=3, defs=0x833dfc, defcount=8, closure=0x0) at Python/ceval.c:2831
#6  0x000a1804 in PyEval_EvalFrameEx (f=0xbf63c0, throwflag=-4209276) at Python/ceval.c:3659
#7  0x000a3784 in PyEval_EvalCodeEx (co=0x9fdf98, globals=0x9fdf98, locals=0x1, args=0x43d2b4, argcount=10432304, kws=0x20, kwcount=1, defs=0x0, defcount=8, closure=0x0) at Python/ceval.c:2831
#8  0x000a1804 in PyEval_EvalFrameEx (f=0xa4ee30, throwflag=-4208772) at Python/ceval.c:3659
#9  0x000a3784 in PyEval_EvalCodeEx (co=0x2bdf50, globals=0x2bdf50, locals=0x1, args=0x138800, argcount=2, kws=0x9db3c8, kwcount=0, defs=0x2d7c9c, defcount=1, closure=0x0) at Python/ceval.c:2831
#10 0x000f72ec in function_call (func=0x2dd2b0, arg=0x93c788, kw=0xb4b270) at Objects/funcobject.c:517
#11 0x00026448 in PyObject_Call (func=0x2dd2b0, arg=0x93c788, kw=0xb4b270) at Objects/abstract.c:1860
#12 0x0009f6f8 in PyEval_EvalFrameEx (f=0xa4ecc0, throwflag=0) at Python/ceval.c:3844
#13 0x000a3784 in PyEval_EvalCodeEx (co=0x2bdf98, globals=0x2bdf98, locals=0x1, args=0x138800, argcount=2, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2831
#14 0x000f72ec in function_call (func=0x2dd2f0, arg=0x93c620, kw=0x0) at Objects/funcobject.c:517
#15 0x00026448 in PyObject_Call (func=0x2dd2f0, arg=0x93c620, kw=0x0) at Objects/abstract.c:1860
#16 0x00031310 in instancemethod_call (func=0x2, arg=0x93c620, kw=0x0) at Objects/classobject.c:2497
#17 0x00026448 in PyObject_Call (func=0x8c7d78, arg=0x93c620, kw=0x0) at Objects/abstract.c:1860
#18 0x0009fe10 in PyEval_EvalFrameEx (f=0xa4eb40, throwflag=-4206524) at Python/ceval.c:3775
#19 0x000a3784 in PyEval_EvalCodeEx (co=0x2a35c0, globals=0x2a35c0, locals=0x1, args=0x138800, argcount=2, kws=0x0, kwcount=0, defs=0x61e25c, defcount=1, closure=0x0) at Python/ceval.c:2831
#20 0x000f72ec in function_call (func=0x67fdf0, arg=0x93c738, kw=0x0) at Objects/funcobject.c:517
#21 0x00026448 in PyObject_Call (func=0x67fdf0, arg=0x93c738, kw=0x0) at Objects/abstract.c:1860
#22 0x00031310 in instancemethod_call (func=0x0, arg=0x93c738, kw=0x0) at Objects/classobject.c:2497
#23 0x00026448 in PyObject_Call (func=0x8ce3f0, arg=0xaf9d70, kw=0x0) at Objects/abstract.c:1860
#24 0x00074ad8 in slot_tp_call (self=0xf, args=0xaf9d70, kwds=0x0) at Objects/typeobject.c:4633
#25 0x00026448 in PyObject_Call (func=0xa8de30, arg=0xaf9d70, kw=0x0) at Objects/abstract.c:1860
#26 0x0009fe10 in PyEval_EvalFrameEx (f=0xa57fa8, throwflag=-4204804) at Python/ceval.c:3775
#27 0x000a3784 in PyEval_EvalCodeEx (co=0x2d46e0, globals=0x2d46e0, locals=0x1, args=0x138800, argcount=2, kws=0x9db4f8, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2831
#28 0x000f72ec in function_call (func=0x2dd730, arg=0x93c558, kw=0xb40e40) at Objects/funcobject.c:517
#29 0x00026448 in PyObject_Call (func=0x2dd730, arg=0x93c558, kw=0xb40e40) at Objects/abstract.c:1860
#30 0x0009f6f8 in PyEval_EvalFrameEx (f=0xa57e38, throwflag=0) at Python/ceval.c:3844
#31 0x000a3784 in PyEval_EvalCodeEx (co=0x2d4728, globals=0x2d4728, locals=0x1, args=0x138800, argcount=2, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2831
#32 0x000f72ec in function_call (func=0x2dd770, arg=0x942c10, kw=0x0) at Objects/funcobject.c:517
#33 0x00026448 in PyObject_Call (func=0x2dd770, arg=0x942c10, kw=0x0) at Objects/abstract.c:1860
#34 0x00031310 in instancemethod_call (func=0x0, arg=0x942c10, kw=0x0) at Objects/classobject.c:2497
#35 0x00026448 in PyObject_Call (func=0x759a58, arg=0x7d6bd0, kw=0x0) at Objects/abstract.c:1860
#36 0x00074ad8 in slot_tp_call (self=0xc, args=0x7d6bd0, kwds=0x0) at Objects/typeobject.c:4633
#37 0x00026448 in PyObject_Call (func=0x7bea30, arg=0x7d6bd0, kw=0x0) at Objects/abstract.c:1860
#38 0x0009fe10 in PyEval_EvalFrameEx (f=0x96e7a8, throwflag=-4202332) at Python/ceval.c:3775
#39 0x000a2b58 in PyEval_EvalFrameEx (f=0x95e978, throwflag=-4201972) at Python/ceval.c:3650
#40 0x000a3784 in PyEval_EvalCodeEx (co=0x2a3068, globals=0x2a3068, locals=0x1, args=0x610880, argcount=3, kws=0x18, kwcount=0, defs=0x6779fc, defcount=5, closure=0x0) at Python/ceval.c:2831
#41 0x000a1804 in PyEval_EvalFrameEx (f=0x78e3f8, throwflag=-4201468) at Python/ceval.c:3659
#42 0x000a3784 in PyEval_EvalCodeEx (co=0x60b188, globals=0x60b188, locals=0x1, args=0x1a2100, argcount=0, kws=0x8, kwcount=0, defs=0x774c44, defcount=2, closure=0x0) at Python/ceval.c:2831
#43 0x000a1804 in PyEval_EvalFrameEx (f=0x6baa30, throwflag=-4200964) at Python/ceval.c:3659
#44 0x000a3784 in PyEval_EvalCodeEx (co=0x775ba8, globals=0x775ba8, locals=0x1, args=0x0, argcount=0, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2831
#45 0x000a22e8 in PyEval_EvalFrameEx (f=0x804e78, throwflag=7822248) at Python/ceval.c:494
#46 0x000a3784 in PyEval_EvalCodeEx (co=0x4e5020, globals=0x4e5020, locals=0x1, args=0x138800, argcount=2, kws=0x804e44, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2831
#47 0x000a1804 in PyEval_EvalFrameEx (f=0x804cf0, throwflag=-4199956) at Python/ceval.c:3659
#48 0x000a3784 in PyEval_EvalCodeEx (co=0x4dcf50, globals=0x4dcf50, locals=0x1, args=0x2aa020, argcount=3, kws=0x10, kwcount=0, defs=0x58d424, defcount=2, closure=0x0) at Python/ceval.c:2831
#49 0x000a1804 in PyEval_EvalFrameEx (f=0x5d7aa0, throwflag=-4199452) at Python/ceval.c:3659
#50 0x000a2b58 in PyEval_EvalFrameEx (f=0x5e43c0, throwflag=-4199092) at Python/ceval.c:3650
#51 0x000a3784 in PyEval_EvalCodeEx (co=0x4dccc8, globals=0x4dccc8, locals=0x1, args=0x138800, argcount=2, kws=0x5df1f8, kwcount=0, defs=0x5901fc, defcount=1, closure=0x0) at Python/ceval.c:2831
#52 0x000a1804 in PyEval_EvalFrameEx (f=0x5df0b0, throwflag=-4198588) at Python/ceval.c:3659
#53 0x000a3784 in PyEval_EvalCodeEx (co=0x4dcbf0, globals=0x4dcbf0, locals=0x1, args=0x138800, argcount=2, kws=0x37c9fc, kwcount=0, defs=0x5901dc, defcount=1, closure=0x0) at Python/ceval.c:2831
#54 0x000a1804 in PyEval_EvalFrameEx (f=0x37c8b0, throwflag=-4198084) at Python/ceval.c:3659
#55 0x000a3784 in PyEval_EvalCodeEx (co=0x448410, globals=0x448410, locals=0x1, args=0x13796c, argcount=1, kws=0xc, kwcount=0, defs=0x358b04, defcount=2, closure=0x0) at Python/ceval.c:2831
#56 0x000a1804 in PyEval_EvalFrameEx (f=0x5d7338, throwflag=-4197580) at Python/ceval.c:3659
#57 0x000a3784 in PyEval_EvalCodeEx (co=0x428d10, globals=0x428d10, locals=0x1, args=0x13796c, argcount=0, kws=0x4, kwcount=0, defs=0x43d87c, defcount=1, closure=0x0) at Python/ceval.c:2831
#58 0x000a1804 in PyEval_EvalFrameEx (f=0x216480, throwflag=-4197076) at Python/ceval.c:3659
#59 0x000a3784 in PyEval_EvalCodeEx (co=0x1d1578, globals=0x1d1578, locals=0x1, args=0x0, argcount=0, kws=0x0, kwcount=0, defs=0x0, defcount=0, closure=0x0) at Python/ceval.c:2831
#60 0x000a3918 in PyEval_EvalCode (co=0x1d1578, globals=0x176d20, locals=0x176d20) at Python/ceval.c:494
#61 0x000c8c6c in PyRun_FileExFlags (fp=0x0, filename=0xffbffa52 "/usr/local/bin/ipython", start=-4, globals=0x176d20, locals=0x176d20, closeit=1, flags=0xffbff814) at Python/pythonrun.c:1271
#62 0x000c9ba8 in PyRun_SimpleFileExFlags (fp=0x15c4e8, filename=0xffbffa52 "/usr/local/bin/ipython", closeit=1, flags=0xffbff814) at Python/pythonrun.c:877
#63 0x0001de58 in Py_Main (argc=2, argv=0x0) at Modules/main.c:523
#64 0x0001d440 in _start ()
Also when I import scipy.optimize I have the following error. Could easy_install be using the solaris CC compiler? Would this cause this problem?

In [6]: import scipy.optimize
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
/usr/local/lib/python2.5/site-packages/<ipython console> in <module>()
/usr/local/lib/python2.5/site-packages/scipy-0.6.0-py2.5-solaris-2.10-sun4u.egg/scipy/optimize/__init__.py in <module>()
      9 from zeros import *
     10 from anneal import *
---> 11 from lbfgsb import fmin_l_bfgs_b
        global lbfgsb = undefined
        global fmin_l_bfgs_b = undefined
     12 from tnc import fmin_tnc
     13 from cobyla import fmin_cobyla
/usr/local/lib/python2.5/site-packages/scipy-0.6.0-py2.5-solaris-2.10-sun4u.egg/scipy/optimize/lbfgsb.py in <module>()
     28
     29 from numpy import zeros, float64, array, int32
---> 30 import _lbfgsb
        global _lbfgsb = undefined
     31 import optimize
     32
ImportError: ld.so.1: python: fatal: relocation error: file /usr/local/lib/python2.5/site-packages/scipy-0.6.0-py2.5-solaris-2.10-sun4u.egg/scipy/optimize/_lbfgsb.so: symbol etime_: referenced symbol not found
John Reid wrote:
Also when I import scipy.optimize I have the following error. Could easy_install be using the solaris CC compiler? Would this cause this problem?
I found out how to change the default compiler with easy_install but now I am told I cannot compile the code on Solaris with gcc:

Building modules...
        Building module "mvn"...
                Constructing wrapper function "mvnun"...
                  value,inform = mvnun(lower,upper,means,covar,[maxpts,abseps,releps])
                Constructing wrapper function "mvndst"...
                  error,value,inform = mvndst(lower,upper,infin,correl,[maxpts,abseps,releps])
                Constructing COMMON block support for "dkblck"...
                  ivls
Wrote C/API module "mvn" to file "build/src.solaris-2.10-sun4u-2.5/scipy/stats/mvnmodule.c"
Fortran 77 wrappers are saved to "build/src.solaris-2.10-sun4u-2.5/scipy/stats/mvn-f2pywrappers.f"
error: Setup script exited with error: don't know how to compile C/C++ code on platform 'posix' with 'gcc' compiler

To change the default compiler, I created the following ~/.pydistutils.cfg:

[build]
compiler = gcc

Any help appreciated,
John.
On Mon, Mar 3, 2008 at 5:29 AM, John Reid <j.reid@mail.cryst.bbk.ac.uk> wrote:
John Reid wrote:
Also when I import scipy.optimize I have the following error. Could easy_install be using the solaris CC compiler? Would this cause this problem?
I found out how to change the default compiler with easy_install but now I am told I cannot compile the code on Solaris with gcc:
Using --compiler=gcc doesn't help, actually. That option chooses between different classes of compilers rather than specific executables; 'gcc' is not an option. Look at "python setup.py build_ext --help-compiler" for the available options. Almost certainly, you shouldn't change it. The only real choices are for Windows and old Mac OS.

The correct compiler *should* be picked up from the Python Makefile which is stored in your Python installation. Usually, you will need the same compiler that built your Python.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
Robert Kern wrote:
The correct compiler *should* be picked up from the Python Makefile which is stored in your Python installation. Usually, you will need the same compiler that built your Python.
Thanks for the info. Now I've compiled LAPACK with f77 rather than gfortran, using "-dalign -native -xO5" (which is what was in Make.inc for the ATLAS config as recommended). I've configured ATLAS using:

../configure -C ic cc -F ic -KPIC -F gc -fPIC --with-netlib-lapack=../../lapack-3.1.1/lapack_LINUX.a

so I think the C API to LAPACK should now be compiled with cc, which is what I believe my python was compiled with. I seem to get the exact same core dump though. Any ideas what else I could check?

Thanks,
John.
How do I know which compilers I should be using? Which compilers does scipy use by default on Solaris 10? My guess is that I need to compile LAPACK and ATLAS with a combination of f77 and cc. The documentation for ATLAS strongly suggests using gcc though. Should I ignore this? Thanks, John.
On Tue, Mar 4, 2008 at 4:17 AM, John Reid <j.reid@mail.cryst.bbk.ac.uk> wrote:
How do I know which compilers I should be using? Which compilers does scipy use by default on Solaris 10?
For C, it should (in both the descriptive and normative sense) use whatever compiler that built Python. Look in $PREFIX/lib/python2.x/config/Makefile for the CC variable. For Fortran, the build will probably use f77 if you don't pick anything else using --fcompiler. You may or may not need to change that.
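The CC variable Robert mentions can also be read from Python itself rather than by opening the Makefile; on Python 2.5 the module was distutils.sysconfig, while modern Pythons expose the same data as sysconfig:

```python
# Report the C compiler recorded in the Makefile that built this Python.
# distutils-based builds (including numpy.distutils) default to this value.
import sysconfig  # on Python 2.5: from distutils import sysconfig

cc = sysconfig.get_config_var("CC")
print("Python was built with CC =", cc)
```

If this prints a Sun Studio cc while the extensions were compiled with gcc, that mismatch is a plausible source of the runtime breakage discussed in this thread.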
My guess is that I need to compile LAPACK and ATLAS with a combination of f77 and cc. The documentation for ATLAS strongly suggests using gcc though. Should I ignore this?
I don't know any specific details about building on Solaris, but using a different compiler for the ATLAS library than your Python is probably not going to work too well.
Robert Kern wrote:
My guess is that I need to compile LAPACK and ATLAS with a combination of f77 and cc. The documentation for ATLAS strongly suggests using gcc though. Should I ignore this?
I don't know any specific details about building on Solaris, but using a different compiler for the ATLAS library than your Python is probably not going to work too well.
ATLAS uses different compilers in 8 or so different ways. You are suggesting I want to change all of them? Here is some info from ATLAS's install guide:

3.2 Changing the compilers and flags that ATLAS uses for the build

ATLAS defines eight different compilers and associated flag macros in its Make.inc which are used to compile various files during the install process. ATLAS's configure provides flags for changing both the compiler and flags for each of these macros. In the following list, the macro name is given first, and the configure flag abbreviation is in parentheses:

1. XCC (xc): C compiler used to compile ATLAS's build harness routines (these never appear in any user-callable library)
2. GOODGCC (gc): gcc with any required architectural flags (eg. -m64), which will be used to assemble cpp-enabled assembly and to compile certain multiple implementation routines that specifically request gcc
3. F77 (if): FORTRAN compiler used to compile ATLAS's FORTRAN77 API interface routines.
4. ICC (ic): C compiler used to compile ATLAS's C API interface routines.
5. DMC (dm): C compiler used to compile ATLAS's generated double precision (real and complex) matmul kernels
6. SMC (sm): C compiler used to compile ATLAS's generated single precision (real and complex) matmul kernels
7. DKC (dk): C compiler used to compile all other double precision routines (mainly used for other kernels, thus the K)
8. SKC (sk): C compiler used to compile all other single precision routines (mainly used for other kernels, thus the K)

It is almost never a good idea to change DMC or SMC, and it is only very rarely a good idea to change DKC or SKC. For ATLAS 3.8.0, all architectural defaults are set using gcc 4.2 only (the one exception is MIPS/IRIX, where SGI's compiler is used). In most cases, switching these compilers will get you worse performance and accuracy, even when you are absolutely sure it is a better compiler and flag combination!
In particular we tried the Intel compiler icc (called icl on Windows) on Intel x86 platforms, and overall performance was lower than gcc. Even worse, from the documentation icc does not seem to have any firm IEEE floating point compliance unless you want to run so slow that you could compute it by hand faster. This means that whenever icc achieves reasonable performance, I have no idea if the error will be bounded or not. I could not obtain access to icc on the Itaniums, where icc has historically been much faster than gcc, but I note that the performance of gcc4.2 is much better than gcc3 for most routines, so gcc may be the best compiler there now as well.

There is almost never a need to change XCC, since it doesn't affect the output libraries in any way, and we have seen that changing the kernel compilers is a bad idea. However, if you yourself use a non-gnu compiler, like Intel's icc or ifort, then what you need to do is tell ATLAS to compile its interface routines with your compilers, which is discussed in Section 3.2.1.

Another common problem is that your OS has been built with an older gcc whose libraries are incompatible with gcc 4.2. In this case, creating an executable with gcc4.2 can cause problems, and so what you want to do is keep gcc3 as your default compiler (compiling ATLAS interface routines with it, as well as using it for all linking) but compile the ATLAS kernel routines with gcc4. This case is discussed in Section 3.2.2.

For those who insist on monkeying with other compilers, Section 3.2.3 gives some guidance. Finally, installing ATLAS without a FORTRAN compiler is discussed in Section 3.2.4.
On Tue, Mar 4, 2008 at 6:30 AM, John Reid <j.reid@mail.cryst.bbk.ac.uk> wrote:
Robert Kern wrote:
My guess is that I need to compile LAPACK and ATLAS with a combination of f77 and cc. The documentation for ATLAS strongly suggests using gcc though. Should I ignore this?
I don't know any specific details about building on Solaris, but using a different compiler for the ATLAS library than your Python is probably not going to work too well.
ATLAS uses different compilers in 8 or so different ways. You are suggesting I want to change all of them?
3-8 probably. But like I said, I know very few details about building ATLAS or scipy on Solaris. I'm out of my depth.
John Reid wrote:
ATLAS uses different compilers in 8 or so different ways. You are suggesting I want to change all of them? Here is some info from ATLAS's install guide:
I would first try building with gcc, and only changing the fortran compiler. But the question is: why are you trying to build atlas on solaris? The sun performance libraries are faster in general, and you do not need to build them.

cheers,

David
David Cournapeau wrote:
I would first try building with gcc, and only changing the fortran compiler. But the question is: why are you trying to build atlas on solaris? The sun performance libraries are faster in general, and you do not need to build them.
Thanks. I had seen some postings that people had trouble using sun performance libraries with scipy. Perhaps those were old posts? Is that all resolved? Also I was just trying to follow what documentation there is for building scipy. I could not find any that mentioned Solaris.
John Reid wrote:
Thanks. I had seen some postings that people had trouble using sun performance libraries with scipy. Perhaps those were old posts? Is that all resolved? Also I was just trying to follow what documentation there is for building scipy. I could not find any that mentioned Solaris.
You have two solutions:

- you add libraries and library paths in site.cfg under the sections blas and lapack. To find the options, you should use the command "suncc -xlic_lib=sunperf -#" to see which libraries are linked when sunperf is used (-# is for verbose link with sun studio).

- you can also try numscons, an alternative build system I am currently working on. This one explicitly supports sunperf. Numpy is buildable, and works with sunperf (on indiana with sunstudio 12, at least). scipy is almost done, but there are still some issues, mostly related to recent changes in the sparse module.

Unfortunately, the trick I am using in numscons is not easily 'backportable' to the current default scipy build system, because of what looks like a bug in the sun linker (the -xlic_lib=sunperf flag is ignored by the sun linker when building a shared library, that is when -G is used).

cheers,

David
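For the site.cfg route, a fragment along these lines should work; the library name and path here are placeholders, and the real values should be taken from the "suncc -xlic_lib=sunperf -#" output David describes:

```ini
# site.cfg next to numpy/scipy's setup.py -- illustrative values only
[blas]
library_dirs = /opt/SUNWspro/lib
libraries = sunperf

[lapack]
library_dirs = /opt/SUNWspro/lib
libraries = sunperf
```

The section names blas and lapack are the ones numpy.distutils looks for; everything after the = signs depends on the local Sun Studio installation.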
Robert Kern wrote:
For C, it should (in both the descriptive and normative sense) use whatever compiler that built Python. Look in $PREFIX/lib/python2.x/config/Makefile for the CC variable.
For Fortran, the build will probably use f77 if you don't pick anything else using --fcompiler. You may or may not need to change that.
I noticed that atlas and scipy often do not pick the same fortran compiler by default. If you have both gcc and the sun compilers on solaris, I would not be surprised if this were the case. With ATLAS, you can change the fortran compiler with the option "-C if fortran_compiler" during the configure stage.

cheers,

David
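Concretely, the -C syntax matches the macro list quoted earlier from the ATLAS install guide (if selects the Fortran interface compiler, ic the C interface compiler). This is an illustrative invocation only, runnable just inside an ATLAS build directory, and the compiler names are examples:

```
# From an ATLAS build subdirectory: compile the Fortran77 API with f77
# and the C API with cc, so they match what scipy and Python were built with.
../configure -C if f77 -C ic cc
```

Keeping the interface compilers aligned with the ones that built Python is the point of the exercise; the kernel compilers (dm, sm, dk, sk) are best left at their gcc defaults, per the guide.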
participants (3)

- David Cournapeau
- John Reid
- Robert Kern