This is particularly aimed at Matt since he has experience with
installing mpi4py on an Altix :-) I saw your post on the mpi4py list (http://tinyurl.com/ybnb6p7), but I'm running into another problem.
I'm trying to install the latest svn version with the stock sgimpi
setup in mpi.cfg. I've also tried with Intel's v10.1 compilers (v9.1
was the default).
It compiles and installs just fine, but when I try to load mpi4py.MPI
it can't find the MPI libraries.
he:jwise>python
Python 2.6.3 (r263:75183, Oct 19 2009, 12:15:19)
[GCC 3.3.3 (SuSE Linux)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import mpi4py.MPI
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: /home/astro/jwise/local/lib/python2.6/site-packages/mpi4py/MPI.so: undefined symbol: MPI_Comm_get_name
libmpi and libmpi++ are in /usr/lib, which is in my library search path
(just in case... it should be included by default).
If you haven't run into this problem before, maybe you (or someone
else) could give me some tips on tracking down the mis-config. I
haven't debugged something like this before...
> This is particularly aimed at Matt since he has experience with installing mpi4py on an Altix :-) I saw your post on the mpi4py list (http://tinyurl.com/ybnb6p7), but I'm running into another problem.
Ahh, I remember writing those from a very lonely, dark room in Garching... :)
> ImportError: /home/astro/jwise/local/lib/python2.6/site-packages/mpi4py/MPI.so: undefined symbol: MPI_Comm_get_name
What's ldd /home/astro/jwise/local/lib/python2.6/site-packages/mpi4py/MPI.so report? And can you try running with the python2.6-mpi executable, see if that helps?
> If you haven't run into this problem before, maybe you (or someone else) could give me some tips on tracking down the mis-config. I haven't debugged something like this before...
This is a bit new to me, as well... maybe it's turning down some library loads because of conflicts? What happens if you set LD_DEBUG (maybe to 'files' or 'bindings' -- but 'help' should clear up what's available)?
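A sketch of that, for the record (this assumes a glibc dynamic linker, which the Altix's SuSE install uses; the log file name is made up):

```shell
# Print the dynamic linker's available debug categories ('libs',
# 'bindings', 'files', ...); with LD_DEBUG=help the program itself
# never actually runs.
LD_DEBUG=help /bin/true 2>&1

# Then trace the library lookups for the failing import, e.g.:
#   LD_DEBUG=libs python -c 'import mpi4py.MPI' 2>ld_debug.log
#   grep -i libmpi ld_debug.log
```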
> What's ldd /home/astro/jwise/local/lib/python2.6/site-packages/mpi4py/MPI.so report? And can you try running with the python2.6-mpi executable, see if that helps?
Ah, ldd was what I was thinking about! I did some more digging, and
ldd showed that MPI.so was correctly linked to /usr/lib/libmpi.so. I
found out what's going on.
I did an "nm -o /usr/lib/libmpi.so", and MPI_Comm_get_name wasn't in
there. SGI's libmpi (Propack 5, I think ... based on kernel 2.6.5) is still
an MPI-1 implementation. The current svn version of mpi4py has config
headers in src/config, and the sgimpi.h there is probably set up for a
newer, MPI-2 capable release of SGI's MPI. So I had to modify it to flag the routines missing from SGI's MPI.
  MPI_Comm_get_name
  MPI_Comm_set_name
  MPI_MAX_OBJECT_NAME
  MPI_MAX_PORT_NAME
  MPI_Open_port
  MPI_Type_get_name
  MPI_Win_get_name
  MPI_Close_port
(plus some more that I may have forgotten)
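For anyone else checking their own libmpi, the nm step can be wrapped up as a small helper (a sketch; the library path and symbols in the example are the ones from my system):

```shell
# missing_syms LIB SYM...: print each symbol absent from LIB's dynamic
# symbol table (nm -D); a nonexistent library reports everything missing.
missing_syms() {
  lib="$1"; shift
  for sym in "$@"; do
    nm -D "$lib" 2>/dev/null | grep -q " $sym\$" || echo "missing: $sym"
  done
}

# e.g. on the Altix:
#   missing_syms /usr/lib/libmpi.so MPI_Comm_get_name MPI_Open_port
```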
I discovered these by adding the "#define PyMPI_MISSING_XXX 1" for
whichever routine was causing the ImportError, recompiling, and trying
to import mpi4py.MPI again. If another ImportError occurred, I repeated
the process until it imported correctly.
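Concretely, the edits ended up looking something like this (a sketch; the exact macro names follow the "#define PyMPI_MISSING_XXX 1" pattern above, but double-check them against the src/config headers in your own mpi4py checkout):

```shell
# Append "missing routine" markers to mpi4py's SGI config header.
# (mkdir -p is only so this sketch is self-contained; the directory
# already exists in an mpi4py svn checkout.)
mkdir -p src/config
cat >> src/config/sgimpi.h <<'EOF'
#define PyMPI_MISSING_MPI_Comm_get_name 1
#define PyMPI_MISSING_MPI_Comm_set_name 1
#define PyMPI_MISSING_MPI_Type_get_name 1
#define PyMPI_MISSING_MPI_Win_get_name 1
#define PyMPI_MISSING_MPI_Open_port 1
#define PyMPI_MISSING_MPI_Close_port 1
EOF
```

Then rebuild (python setup.py build --force), reinstall, and retry the import.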
Once I accounted for all of the missing routines, I ran into the same
error you described in your mpi4py posts, where it segfaults in
MPI_Finalized(). So using the python2.6-mpi executable completed the fix.
I'm not sure if it will help anyone else, but I wanted to let people know
about the solution!