Thanks for the reply. Well, I built my Python stack, including NumPy, before I changed to the higher GCC version. Do you know if there's an option I can set that will specify Apple's GCC to be used?
$ CC=/usr/bin/gcc python setup.py build
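If you need the setting to persist across several commands (a sketch, assuming a Bourne-style shell such as bash), exporting it once should work as well:

$ export CC=/usr/bin/gcc
$ python setup.py build
$ python setup.py install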
--
Robert Kern
"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco _______________________________________________ Numpy-discussion mailing list Numpy-discussion@scipy.org http://projects.scipy.org/mailman/listinfo/numpy-discussion
Thanks, it worked perfectly. I hate to go a little off topic on the NumPy list and keep bugging you, but when I try to compile the latest scipy, I get:

g++: unrecognized option '-no-cpp-precomp'
cc1plus: error: unrecognized command line option "-arch"
cc1plus: error: unrecognized command line option "-arch"
cc1plus: error: unrecognized command line option "-Wno-long-double"
g++: unrecognized option '-no-cpp-precomp'
cc1plus: error: unrecognized command line option "-arch"
cc1plus: error: unrecognized command line option "-arch"
cc1plus: error: unrecognized command line option "-Wno-long-double"

From the look of it, I assume this has a similar cause as my previous compile problem. I tried adding CCplus=/usr/bin/g++, but it doesn't seem to do the trick. Again, sorry to keep bugging you about an issue I created in my own build environment, but is there something different I should be typing?
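My best guess, assuming the C++ analogue of CC is the CXX environment variable (I'm not sure numpy.distutils actually reads it), would be something like:

$ CC=/usr/bin/gcc CXX=/usr/bin/g++ python setup.py build

but I haven't been able to confirm that's the right variable.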