I was able to reproduce the missing-halo problem for HOP in parallel compared to serial, but I wasn't able to reproduce the parallelHF problem.  I'm running this on a 64^3 test dataset, so it's not as big as the 512^3.  This happens with both my old yt and the version after Matt's pull request was merged, so the HOP-in-parallel bug existed before the memory update.

From
G.S.

On Fri, Feb 24, 2012 at 3:36 PM, Michael Kuhlen <mqk@astro.berkeley.edu> wrote:
Sorry to be the bearer of bad news again, but ParallelHF failed for me.

The installation of Forthon, etc. went smoothly. Only took 5 minutes.
Note that in addition to replacing HaloFinder with
parallelHF, you also have to add "from
yt.analysis_modules.halo_finding.api import *" to the top of the
script.
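
For concreteness, the whole script is just a few lines; a minimal
sketch (the dataset path is a placeholder, not my actual run):

  from yt.mods import *
  # parallelHF is not pulled in by yt.mods, so this import is required:
  from yt.analysis_modules.halo_finding.api import *

  pf = load("DD0000/DD0000")  # placeholder dataset
  halo_list = parallelHF(pf)
  halo_list.write_out("parallel_halos.txt")
  # launched with something like: mpirun -np 8 python test.py --parallel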

The error I encounter seems to be related to the large grid count that
was also giving me problems yesterday. Here's the traceback.

 File "test.py", line 6, in <module>
   halo_list = parallelHF(pf)
 File "/home/mqk/local/lib/python2.7/site-packages/yt-2.4dev-py2.7-linux-x86_64.egg/yt/analysis_modules/halo_finding/halo_objects.py",
line 2047, in __init__
   total_mass =
self.comm.mpi_allreduce((self._data_source["ParticleMassMsun"].astype('float64')).sum(),
 File "/home/mqk/local/lib/python2.7/site-packages/yt-2.4dev-py2.7-linux-x86_64.egg/yt/data_objects/data_containers.py",
line 321, in __getitem__
   self.get_data(key)
 File "/home/mqk/local/lib/python2.7/site-packages/yt-2.4dev-py2.7-linux-x86_64.egg/yt/data_objects/data_containers.py",
line 2423, in get_data
   self.particles.get_data(field)
 File "/home/mqk/local/lib/python2.7/site-packages/yt-2.4dev-py2.7-linux-x86_64.egg/yt/data_objects/particle_io.py",
line 99, in get_data
   count=len(grid_list), dtype='float64'))
ValueError: iterator too short

Again, it's the total_mass calculation that's causing the problem.
Note that this yt install includes the changes Matt brought in
yesterday to fix the memory consumption.

When I specify total_mass manually, everything works and I get the
correct number of halos in the final catalog. The number of halos and
their properties are not exactly identical to serial Hop's, but they're
close enough. I got a speed-up of about a factor of 3 running
parallelHF on 8 cores vs. HaloFinder on 1 core.
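
For the record, the workaround is roughly this (the hard-coded number
is a placeholder, not my real total mass):

  # Pass total_mass explicitly so parallelHF skips the failing
  # mpi_allreduce over "ParticleMassMsun". I computed the value once
  # beforehand, along the lines of:
  #   dd = pf.h.all_data()
  #   total_mass = dd.quantities["TotalQuantity"]("ParticleMassMsun")[0]
  halo_list = parallelHF(pf, total_mass=1.0e16)  # placeholder value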

Mike


On Fri, Feb 24, 2012 at 2:37 PM, Michael Kuhlen <mqk@astro.berkeley.edu> wrote:
> No, I haven't yet. From the documentation, ParallelHop seems to require
> installing Forthon, and I was lazy. ;) I'll give that a shot. Thanks
> for the suggestion.
>
> Mike
>
> On Fri, Feb 24, 2012 at 2:31 PM, Britton Smith <brittonsmith@gmail.com> wrote:
>> Hi Mike,
>>
>> Perhaps I missed something from the previous discussion; I've been locked
>> away working on proposals all week.  Out of curiosity, have you tried
>> ParallelHop, Stephen's specially designed parallel Hop?  It is entirely
>> distinct from simply running Hop in parallel.  Stephen, please
>> correct me if I'm wrong.
>>
>> You should be able to use that simply by replacing HaloFinder with
>> parallelHF in your script.
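>>
>> That is, something like this (untested sketch):
>>
>>   # halo_list = HaloFinder(pf)    # before
>>   halo_list = parallelHF(pf)      # after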
>>
>> Britton
>>
>> On Fri, Feb 24, 2012 at 5:23 PM, Stephen Skory <s@skory.us> wrote:
>>>
>>> Hi Mike,
>>>
>>> > Is running Hop in parallel currently broken? I get way fewer halos in
>>> > my catalog when I run Hop with 8 processors than with 1. Am I doing
>>> > something wrong?
>>>
>>> Why do you always have to start trouble? I'm seeing this too. I'll dig
>>> into it.
>>>
>>> --
>>> Stephen Skory
>>> s@skory.us
>>> http://stephenskory.com/
>>> 510.621.3687 (google voice)



--
*********************************************************************
*                                                                   *
*  Dr. Michael Kuhlen              Theoretical Astrophysics Center  *
*  email: mqk@astro.berkeley.edu   UC Berkeley                      *
*  cell phone: (831) 588-1468      B-116 Hearst Field Annex # 3411  *
*  skype username: mikekuhlen      Berkeley, CA 94720               *
*                                                                   *
*********************************************************************
_______________________________________________
yt-users mailing list
yt-users@lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org