I've recently been trying to get parallelHOP integrated into the current
version of yt-3. (I need halo finding with star particles which I think
rules out Rockstar, and I eventually need to work with grids up to 1536^3
which would be problematic with regular HOP.)
I currently have it working properly for single core runs, and it produces
output that matches the yt-2.x version. It also runs to completion under
MPI and produces mostly sensible output.
However, when run on multiple cores it has problems with the halos near the
region boundaries. In particular it produces a large number (~10% of total)
of typically small "bad halos" which have 0 values for the halo maximum
density/location (i.e. [0, 0, 0, 0]), and at least one negative value for
the center of mass location (due to incorrect handling of periodicity). In
addition, some of the "real" halos near the region boundaries have fewer
particles in parallel runs. Comparing halo particle lists between serial
and MPI runs shows that some of these missing particles are located in the
"bad halos". Running with premerge=False improves the situation slightly
(slightly fewer "bad halos", better agreement in some of the real ones).
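As a quick diagnostic, the "bad halos" can be flagged programmatically from their catalog values. A minimal sketch (NumPy only; it assumes the maximum-density locations and centers of mass have already been gathered into (N, 3) arrays, and the helper name is hypothetical):

```python
import numpy as np

def flag_bad_halos(max_dens_locs, centers_of_mass):
    """Return indices of suspect halos: an all-zero maximum-density
    location, or a negative center-of-mass coordinate (a sign that
    periodicity was mishandled when joining chains across processors)."""
    max_dens_locs = np.asarray(max_dens_locs)
    centers_of_mass = np.asarray(centers_of_mass)
    zero_peak = np.all(max_dens_locs == 0.0, axis=1)
    negative_com = np.any(centers_of_mass < 0.0, axis=1)
    return np.flatnonzero(zero_peak | negative_com)
```

Comparing the particle lists of the flagged halos against the serial catalog should narrow down which chains failed to merge across processor boundaries.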
I think this means that the halo finder isn't correctly joining up some
chains across processors, but I'm not familiar enough with the parallelHOP
code to know where to look first. The only changes I've made to the code
are in halo_objects.py, updating the old self.hierarchy.region_strict and
similar calls as well as fixing units in a few places. I haven't made any
changes in parallel_hop_interface.py.
Has anyone run into a similar problem before? If integrating parallelHOP is
known to be unworkable (or extremely difficult) I'll start looking at other
options. Otherwise, if anyone has some insight into where the problem might
lie or what to look at first, I'd appreciate any advice I can get. Thanks.
New issue 1205: default disparity in StereoPerspective lens doesn't show much difference between left and right
Depending on what you are rendering, a volume rendering with lens="stereo-perspective" can produce left and right images that look nearly identical.
To fix this, you need to increase the `disparity` parameter, like:
sc.camera.lens.disparity = sc.camera.width / 10.0
This setting is 50x greater than the default and gives a noticeable left-right difference.
We need to figure out what a reasonable default is for `disparity`, and we should also document this better in the volume rendering docs.
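Until a better default lands, one stopgap is to scale the disparity with the camera width, mirroring the workaround above. A minimal sketch (the helper name is hypothetical, not part of the yt API):

```python
def stereo_disparity(camera_width, factor=10.0):
    """Return an eye separation proportional to the camera width.

    factor=10.0 reproduces the workaround above, i.e. roughly
    50x the current default."""
    return camera_width / factor

# With a stereo-perspective scene this would be applied as:
#   sc.camera.lens.disparity = stereo_disparity(sc.camera.width)
```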
I'm almost finished updating the install script so that, by default, it
creates a conda-based installation, but it can also optionally do a
from-source bootstrapped installation.
Before the pull request can be merged, it needs testing on a variety of
platforms as well as some sort of test script we can run as part of the
test suite. If any of you have some time and interest in helping me out,
I'd appreciate it if you could try. You should read the description at the
top of the pull request and modify any variables you would like to change
to customize your installation.
You should be able to get the latest version of the script by doing:
$ curl -O
If you run into any problems, either reply here or on the pull request with
a description of the issue or a solution and I will try to fix them.
I am proud to announce the release of xs 1.0: the next generation of yt.
xs is a new package, designed from the ground up to take advantage of
emerging architectures and to address challenges presented by the next
generation of computing hardware. In keeping with this vision, xs
facilitates the deployment of complex analysis pipelines defined at
compile time, rather than at run time as is common in most yt usage.
For more information, including compilation instructions, contribution
guidelines, and download information, please see the homepage at: