parallel volume render returns blank image
Hi,

I'm doing a pretty simple volume render that uses the Camera.snapshot function. In serial, everything seems to be working fine, but when I run in parallel, the image comes out completely blank. I have gone into the snapshot routine and verified that the image array is all zeros on every processor after the loop:

    for brick in self.volume.traverse(self.back_center, self.front_center, image):

This is on the tip of the development version of yt.

Does anyone know what the issue is here?

Thanks,
Britton
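For anyone following along, the setup is roughly the standard yt 2.x cookbook pattern; the dataset name, transfer-function bounds, and camera parameters below are placeholders, not the actual script:

    from yt.mods import *  # also sets up parallelism when run under mpirun with --parallel

    pf = load("DD0010/DD0010")  # placeholder dataset
    c = 0.5 * (pf.domain_left_edge + pf.domain_right_edge)
    L = [1.0, 0.0, 0.0]   # viewing direction
    W = 1.0               # image width in domain units
    N = 512               # image resolution

    # placeholder bounds in log10 of the rendered field
    tf = ColorTransferFunction((-30.0, -22.0))
    tf.add_layers(5)

    cam = pf.h.camera(c, L, W, N, tf)
    im = cam.snapshot("render.png")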
Hi again,

Just an update on this. I was incorrect when I stated that the image arrays were all zeros in parallel. What actually seems to be the case is that with two processors, processor 0, which does the image writing, only has a fraction of the total image, which to me looked like all zeros. Processor 1 seems to have the entire image, though: it has the same number of non-zero cells as the image array does in serial mode.

I still don't know what's happening, but I'm looking at it.

Britton
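For anyone reproducing this, a quick per-processor check along these lines is what produced the counts above (a sketch; it assumes mpi4py is available, as yt's parallel mode uses it, and that `image` is the array from the snapshot loop):

    import numpy as np
    from mpi4py import MPI

    rank = MPI.COMM_WORLD.Get_rank()
    # count how much of the image this rank actually holds
    nonzero = int((image != 0).sum())
    print("rank %d: %d non-zero values out of %d" % (rank, nonzero, image.size))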
Hi Britton,

Would you mind pasting your script and mpirun command line? Processor 0 should be the only processor that has the entire image at the end. If anything, processor 1 would have half an image, which is why this confuses me.

Thanks,
Sam
Hi Sam,

Here is the script I'm using for the render:

http://paste.yt-project.org/show/1899/

The command I'm using is:

    mpirun -np 2 python render_Z.py --parallel

Also, I seem to be having trouble reading today. Processor zero does have the right number of non-zero pixels, but the image still comes out all black.

Britton
Just to put some sort of resolution to this, I removed this import line, as Sam suggested:

    import yt.visualization.volume_rendering.api as vr

allowing the yt.mods line to import the volume renderer and such. I then removed all the vr. prefixes from the script, and now everything seems to work. It is not clear why, but that seems to have done it.
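For the archives, the change amounts to something like the following; the camera call here is hypothetical, since the real script is at the paste link above:

    # before: explicit sub-module import alongside yt.mods
    from yt.mods import *
    import yt.visualization.volume_rendering.api as vr
    cam = vr.Camera(c, L, W, N, tf, pf=pf)   # blank image in parallel

    # after: use what yt.mods already imported
    from yt.mods import *
    cam = pf.h.camera(c, L, W, N, tf)        # works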
Hi Sam and Britton,

This is likely due to the initialization of parallelism. If you import from mods, the communicator stack sets itself up correctly at startup, which creates the ParallelAnalysisInterface class. When methods are bound to the class, the parallel state is read in order to do things like calculate which processors execute which tasks.

This sounds like a bug, though. If there are "works" and "doesn't work" scripts, I can take a look at it; file it on the issue tracker?

-Matt
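A loose illustration of that failure mode (a toy analogy, not yt's actual code): if a class captures parallel state when its module is imported, importing it before the communicator is configured bakes in the serial defaults:

    _comm_size = 1  # module-level default before parallelism is set up

    def enable_parallelism(size):
        global _comm_size
        _comm_size = size

    class Renderer(object):
        # read once, at class-definition (import) time
        size = _comm_size

    enable_parallelism(2)
    print(Renderer.size)  # still 1: the class saw the pre-init state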
Hi Matt,

I just filed this as a bug and included links to nearly identical working and non-working test cases. Let me know if I can help out in any way.

Britton
participants (3)
- Britton Smith
- Matthew Turk
- Sam Skillman