So it looks like mpi4py either isn't able to link properly against the MPI installation on Blue Waters or isn't initializing itself properly at runtime.
In either case, this points to a problem with mpi4py, not yt.
Three things to try:
* check which version of mpi4py is installed in your yt-2.x environment and install that same version in your yt-3.x environment. You may have found a bug in mpi4py 2.0, which made major changes to the mpi4py library.
* contact the Blue Waters sysadmins.
* contact the mpi4py developers.
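For the first suggestion, a quick way to compare the two environments is a short check like the following. This is only a sketch, not something from the thread; it uses just the standard library plus, when available, mpi4py's own `__version__` and `get_config()` (the latter reports the compiler wrappers mpi4py was built with, which hints at whether it linked against the system MPI):

```python
# Hedged sketch: report the mpi4py version and build configuration in the
# current environment, without assuming mpi4py is installed.  Run it once in
# the yt-2.x environment and once in the yt-3.x environment and compare.
import importlib.util

spec = importlib.util.find_spec("mpi4py")
if spec is None:
    print("mpi4py is not installed in this environment")
else:
    import mpi4py
    print("mpi4py", mpi4py.__version__, "from", spec.origin)
    # get_config() reports the compiler wrappers mpi4py was built with,
    # which shows whether it was built against the system MPI.
    print(mpi4py.get_config())
```

If the two environments report different versions, or the yt-3.x build was compiled against the wrong compiler wrappers, that would explain ranks always coming back as 0.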
Hope that helps,
On Thursday, November 19, 2015, Pengfei Chen firstname.lastname@example.org wrote:
Thank you very much for your reply! When I run that test script in parallel with 16 cores, I get 16 zeros. Any suggestions for fixing this?
Can you try to run the following test script in parallel? This will determine if the issue is on the yt side of things or the mpi4py side of things:
    from mpi4py import MPI
    print(MPI.COMM_WORLD.rank)
I'm having trouble running yt in parallel on Blue Waters. I installed yt using miniconda; the version of yt is:

    ~/miniconda/lib $ yt version
    yt module located at:
        /u/sciteam/madcpf/miniconda/lib/python2.7/site-packages/yt-3.3.dev0-py2.7-linux-x86_64.egg
    The current version and changeset for the code is:
    Version = 3.3-dev
    Changeset = 90f900be7a36+
Then, with miniconda/bin in PATH, I installed mpi4py-2.0.0. But when I run the following simple script in parallel, this is what I get:
    import yt
    yt.enable_parallelism()
    from yt.utilities.parallel_tools.parallel_analysis_interface import \
        parallel_objects, communication_system

    comm = communication_system.communicators[-1]
    print comm.rank, comm.size
    0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1 ...
When I run similar code with yt-2.x, also on Blue Waters, I get what I expect:
    7 16 15 16 6 16 9 16 11 16 8 16 0 16 4 16 ...
I'm confused by this. Could anyone give me some suggestions, please?