Thanks for all your help over the last couple of days. One more question:
- Can I plot particles on a volume rendered image?
I have stars and I want to show where they are!
Elizabeth Harper-Clark MA MSci
PhD Candidate, Canadian Institute for Theoretical Astrophysics, UofT
Sciences and Engineering Coordinator, Teaching Assistants' Training Program,
Astronomy office phone: +1-416-978-5759
Does anyone out there have a technique for getting the variance out of
a profile object? A profile object is good at getting <X> vs. B, I'd
then like to get < (X - <X>)^2 > vs. B. Matt and I had spitballed the
possibility some time ago, but I was wondering if anyone out there had
successfully done it.
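For what it's worth, the quantity you want follows from expanding the square: <(X - <X>)^2> = <X^2> - <X>^2, so two profiles over the same bins and weights (one of X and one of X^2) are enough. A minimal numpy sketch of that arithmetic (illustrative names, not the yt profile API):

```python
import numpy as np

def binned_variance(b, x, weights, bins):
    """Weighted per-bin mean and variance of x, binned by b."""
    # Accumulate sum(w), sum(w*x), and sum(w*x^2) in each bin of b.
    w_sum, _ = np.histogram(b, bins=bins, weights=weights)
    x_sum, _ = np.histogram(b, bins=bins, weights=weights * x)
    x2_sum, _ = np.histogram(b, bins=bins, weights=weights * x**2)
    mean = x_sum / w_sum                # <X> per bin
    var = x2_sum / w_sum - mean**2      # <X^2> - <X>^2 per bin
    return mean, var
```

With the weight set to cell mass and x the field of interest, the mean column reproduces what a weighted profile gives for <X>, and the second column is the variance you're after.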
I did some simple volume rendering with the following script:
volume2 = AMRKDTree(pf, fields=["Dark_Matter_Density"], ...)
cam = pf.h.camera(c, L, W, N, tf, volume=volume2, no_ghost=False, ...)
cam.snapshot(fn="%s_iso-DMdensity-%3.3d.png" % (filenameTHIS, j))
I got rather strange results in that the pictures look symmetric, which I am
pretty sure cannot be true.
I attach the obtained plot.
Note that I am using the KD tree and 32 cores.
Your help at your earliest convenience is appreciated.
I'm attempting to visualize multiple variables in the same volume
rendering. For instance, I'd like to show some density surfaces of a star
and then get temperature overlayed to show hotspots. I found the
MultiVariateTransferFunction, but it is unfortunately not particularly well
documented yet, and I'm being repeatedly thwarted. I'm not sure if I've set
it up correctly, or how to go from the transfer function to an image.
Here's my current script, using just one variable for now:
from yt.mods import *
pf = load('plt00100')
mv = MultiVariateTransferFunction()
tf = TransferFunction((mi-1, ma+1), nbins=1.0e6)
tf.add_gaussian(np.log10(9.0e5), 0.01, 1.0)
c = [5.0e9, 5.0e9, 5.0e9]
L = [0.15, 1.0, 0.40]
W = wi*0.7
Nvec = 1024
cam = pf.h.camera(c, L, W, (Nvec,Nvec), transfer_function = mv,
fields=['density'], pf=pf, no_ghost=True)
im = cam.snapshot(num_threads=4)
I checked to make sure density is indeed my 0-index field. I get the
following error after ray casting:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "multivol.py", line 153, in <module>
im = cam.snapshot(num_threads=4)
line 742, in snapshot
line 632, in _render
image = self.finalize_image(image)
line 611, in finalize_image
if self.transfer_function.grey_opacity is False:
AttributeError: 'MultiVariateTransferFunction' object has no attribute 'grey_opacity'
Can I not use snapshot in this case, or is something else the matter?
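In case it helps others hitting the same wall, here is how the pieces are apparently meant to fit together, based on the documented MultiVariateTransferFunction methods (an untested sketch, with mi/ma the field extrema as in the script above): the 1D TransferFunction must be registered as a field table on the multivariate object and then linked to the image channels, rather than kept separate as in the script.

```python
# Untested sketch against the yt 2.x MultiVariateTransferFunction API.
mv = MultiVariateTransferFunction()
tf = TransferFunction((mi, ma))             # mi, ma: log-field extrema
tf.add_gaussian(np.log10(9.0e5), 0.01, 1.0)
mv.add_field_table(tf, 0)                   # table 0 samples field index 0
mv.link_channels(0, [0, 1, 2, 3])           # drive R, G, B, alpha from table 0
```

As a stopgap for the AttributeError itself, setting mv.grey_opacity = False by hand before calling snapshot may get past the check in finalize_image.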
I've just issued a PR that will hopefully fix a whole class of buggy
behavior that both new and experienced yt users commonly run into.
Specifically, I'd like it if we could turn off data serialization by
default. This changes a long-lived default value in yt's configuration, so
I wanted to bring this change to the attention of both the yt user and
developer communities.
What is data serialization? Currently, yt will save the result of certain
expensive calculations, including projections, the structure of the grid
hierarchy, and the list of fields present in the data. While this does
have the beneficial effect of saving time when a user needs to repeatedly
calculate these quantities on the same dataset, it has a number of features
that lead to buggy, annoying behavior.
Specifically, if I am developing my simulation code or repeatedly
restarting it, searching for a way to grind past a crash, I will
quite often regenerate the same simulation output file over and over,
changing a line of code or switching out the value of a parameter each time.
If yt's data serialization is turned on, it's likely that yt's
visualizations will correspond to old versions of the data file. Since
only certain operations are serialized, it's also possible for yt to get
into an inconsistent state - one operation will show the current data file,
while another operation will show an old version.
It's possible to fix a bug in your code, but because yt is still loading
the old data, you won't be able to tell that the bug is fixed until you
realize that you have .yt and .harrays files littering your filesystem.
I've personally wasted a lot of time due to yt's serialization 'feature'
and denizens of our IRC channel and mailing list can attest to how often
new users run into this behavior as well.
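Until the default changes, the manual fix described above amounts to deleting the stale sidecar files sitting next to the data before re-running:

```shell
# Delete yt's serialized sidecar files so everything is recomputed fresh.
rm -f *.yt *.harrays
```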
My pull request only turns off serialization by default; it doesn't
disable the capability completely. Once the pull request is merged in, you
can turn on serialization either by adding an entry to your config file:
$ cat ~/.yt/config
[yt]
serialize = True
Or on a per-script basis:
from yt.config import ytcfg
ytcfg['yt', 'serialize'] = 'True'
from yt.mods import *
The pull request is here:
I know several of you are big fans of this feature, so if you object to
this change please leave a comment on the pull request so we can figure out
a way forward.
I'm currently attempting to build yt on Scinet GPC cluster, where the
Python module can only be loaded alongside Intel and gcc combined.
Compiling with the Intel modules leads to a Mercurial compile error. I've
attempted to build yt with only the gcc module active, and have achieved a
successful build, and yt sources without any errors. Attempting
to run ipython or yt serve, however, gives me this error:
ImportError: libimf.so: cannot open shared object file: No such file or directory
Is there a workaround to this issue? Assistance would be greatly appreciated.
After defining a disk
my_disk=pf.h.disk(sc, [0,0,1], 450.0*pc, 30.0*pc)
I asked for the minimum and maximum values of these fields and got:
0.00390625 0.99609375 0.00390625 0.99609375 0.00390625 0.99609375
So I really don't know what these fields actually are.
I'm trying to access the density for a cell and its six neighbors (in the
x,y,z directions), but I can't figure out how I can get the information for
the neighboring cells. The problem is that dd[ 'NumberDensity' ] is a
one-dimensional array, and I don't know the original 3d shape of the data,
so I can't reshape the array into three dimensions. Is there a way to get
the indices of a cell's neighbors?
Thanks for any help!
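One way around the flattened dd array is to first pull the field out as a true 3D block (for example through a covering grid at a fixed level), after which the six face neighbors are just index offsets. A small numpy sketch of that indexing step (the function name is mine, not yt's):

```python
import numpy as np

def face_neighbors(arr, i, j, k):
    """Return the values of the six face-adjacent cells of arr[i, j, k]."""
    # One index step along each of the x, y, z axes, in both directions.
    offsets = [(-1, 0, 0), (1, 0, 0), (0, -1, 0),
               (0, 1, 0), (0, 0, -1), (0, 0, 1)]
    return [arr[i + di, j + dj, k + dk] for di, dj, dk in offsets]

# Tiny demonstration on a synthetic 3x3x3 block.
data = np.arange(27.0).reshape(3, 3, 3)
print(face_neighbors(data, 1, 1, 1))
```

Cells on the domain boundary need extra care (the offsets above would wrap around with negative indices), so clamp or mask the edges for real data.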
I'm getting this error during the installation and I'm not sure exactly why.
Could you take a look at this? I just downloaded the 3.0a2 package
publicly available on Bitbucket to try visualizing my SPH snaps. Any help
would be greatly appreciated.
Error compiling Cython file:
cdef void *tree_root
cdef int num_root
cdef int max_root
cdef extern from "search.h" nogil:
void *tsearch(const void *key, const void **rootp,
              int (*compar)(const void *, const void *))
yt/geometry/oct_container.pxd:81:29: Expected ')', found '*'
error: 2 errors while compiling 'yt/geometry/oct_container.pyx' with Cython