If I read in a dataset like:
ds = yt.load("ActiveParticleCosmology/DD0046/DD0046")
(that file is from the data page)
and then check ds.max_level, it gives 99. This happens for the boxlib frontend too.
Is there any meaning to the ds.max_level parameter? Should we be setting
it somewhere, or else remove it? As it is, seeing it there is confusing
(and tempting to use).
Dept. of Physics & Astronomy • Stony Brook University • Stony Brook, NY
I am trying to do something that feels like it ought to be simple, but I
simply can't figure it out. I am using yt-2.x, and I want to create a
derived field based on a given center and a radial profile, which I have
pre-calculated and have sent to the derived field using
set_field_parameter. This does not seem to work particularly well - I
can't seem to get these field parameters, and even my simplest example of a
derived field breaks down. Here's a copy of my script:
which dies with this traceback:
It seems that I can't even access the center that I pass in, much less work
with the radial profile. I suspect I'm doing something dumb, but reading
the documentation on creating derived fields does not seem to be giving me
the clues I need to figure out what I'm doing wrong.
I would like to use yt's halo finder to select out spatially coherent
hotspots in my data. In other words, I would like the halo-finding
algorithm to select based on temperature instead of density. Is this
possible? I couldn't find any obvious way to do this in the documentation.
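Nothing below touches the halo finder; it is just a hypothetical sketch of what "spatially-coherent hotspot selection" means - threshold the temperature, then group contiguous cells - to make the goal concrete. The cutoff and the toy array are invented:

```python
import numpy as np
from collections import deque

def label_hotspots(temp, cutoff):
    """Group contiguous cells with temp > cutoff into numbered regions
    via a simple flood fill; label 0 means 'not a hotspot'."""
    hot = temp > cutoff
    labels = np.zeros(temp.shape, dtype=int)
    current = 0
    for start in zip(*np.nonzero(hot)):
        if labels[start]:
            continue  # already assigned to a region
        current += 1
        labels[start] = current
        queue = deque([start])
        while queue:
            i, j = queue.popleft()
            # Visit the four grid neighbours that are hot and unlabeled.
            for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                if (0 <= ni < temp.shape[0] and 0 <= nj < temp.shape[1]
                        and hot[ni, nj] and not labels[ni, nj]):
                    labels[ni, nj] = current
                    queue.append((ni, nj))
    return labels

# Two separate hotspots in a toy 2D temperature field:
temp = np.array([[1., 9., 1.],
                 [1., 9., 1.],
                 [1., 1., 9.]])
print(label_hotspots(temp, 5.0))
```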
Thanks for your assistance!
Department of Physics and Astronomy, PhD Candidate
Stony Brook University
As the title states, I am confused - specifically about units. I want to
create a derived field where parts of the field are different values,
dependent upon some cutoff criteria like so:
def _dene2(field, data):
    tl = data['temperature']
    dl = data['density']
    print(tl)
    aux = np.where(tl.in_cgs() > 5000.)
    dene = dl.copy()
    # ... set dene[aux] to the other value ...
    return dene

ds.add_field("dene", units="g/cm**3", function=_dene2)
This produces a very worrisome error: "RuntimeError:
Something has gone terribly wrong, _function is NullFunc for ('flash',
On top of that, the print statement in there shows a bunch of values close
to 1, which is why the "aux" mask never returns anything. But why is data a
bunch of 1's in here? How do I access or convert to the "real" units?
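Setting yt's unit machinery aside for a moment, the logic the derived field is presumably after can be written in plain numpy (the 5000 cutoff is from the question; the replacement value 0.0 is just a placeholder):

```python
import numpy as np

# Conditional field in plain numpy: where the temperature exceeds the
# cutoff, replace the density with another value (0.0 here, purely as
# an illustrative choice).
temperature = np.array([3000.0, 7000.0, 4500.0, 9000.0])  # K (cgs)
density = np.array([1e-24, 2e-24, 3e-24, 4e-24])          # g/cm**3

dene = np.where(temperature > 5000.0, 0.0, density)
print(dene)  # [1e-24, 0.0, 3e-24, 0.0]
```

This sketch sidesteps yt's unit system entirely; it is only meant to pin down the intended behavior.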
Any help is appreciated,
I have a slice plot that I make with the simple:
slc = SlicePlot(pf, 'z', ('gas', 'temperature'))
which produces this (it's super ugly, but that's not the point :) ):
I can't figure out how to get the colorbar to reveal the numbers that the
colors correspond to. Is there an obvious solution?
I'm trying to mask out several spheres in my domain before making a phase
plot. Is there a way to do this? I've tried making a derived field, using
the boolean mask objects, and checking the "masking domain" section of
the cookbook, all with little success.
The regions I want to mask out come from a text file of x/y/z coordinates.
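For the masking step itself, here is a minimal numpy sketch. The file layout (one sphere per row as "x y z radius") and all names are assumptions, since the actual text file isn't shown:

```python
import io
import numpy as np

def outside_spheres(pos, centers, radii):
    """True for points not inside any of the given spheres."""
    keep = np.ones(len(pos), dtype=bool)
    for c, r in zip(centers, radii):
        # Exclude points within radius r of center c.
        keep &= np.sum((pos - c) ** 2, axis=1) > r ** 2
    return keep

# Hypothetical text file: one sphere per row as "x y z radius".
sphere_file = io.StringIO("0.5 0.5 0.5 0.1\n0.2 0.2 0.2 0.05\n")
rows = np.loadtxt(sphere_file)
centers, radii = rows[:, :3], rows[:, 3]

points = np.array([[0.5, 0.5, 0.5],   # inside the first sphere
                   [0.9, 0.9, 0.9]])  # outside both
print(outside_spheres(points, centers, radii))  # [False  True]
```

A boolean mask of this sort could then be applied by indexing the flattened field arrays before histogramming them into a phase diagram.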
Thanks for any insights!
Hi YT Users,
I have a question about speed and caching that I’m hoping someone can help with.
I have a fairly large simulation from which I’d like to make projection plots in all three cardinal directions and save the results in a flat numpy array for later incorporation into an animation.
One of the quantities I’d like to project is density-weighted temperature, and this is fairly expensive to compute because temperature is a derived field whose construction requires reading 8 other fields from disk. The computation is that temperature = constant times internal energy, and internal energy = total energy (1 field) - magnetic energy (3 fields: Bx, By, and Bz) - kinetic energy (4 fields: density, px, py, pz).
I want to do the projection over the entire simulation volume, so it seems like the sensible way to do the computation would be to read all the data once, construct the temperature, store it in memory, and then project it into the three cardinal directions. The size of the data set is not so large that it won’t fit in memory on the largest memory nodes to which I have access. However, what yt seems to be doing instead is to read the data and compute the temperature for the first projection, and then repeat the entire process, including the reading and computation, for each of the other two directions. This makes the computation a factor of 3 slower than it should be (since IO dominates the cost), and, given the size of the data sets involved, this is a significant annoyance.
So here’s the question: is there a way to force yt to cache a derived field rather than reconstructing it on the fly every time it is needed? Or is there some other strategy that can be used to avoid this inefficiency? There’s a sample script pasted at https://bpaste.net/show/a58f7ebea9bd that demonstrates what I’m trying to do.
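Stripped of yt, the read-once strategy described above amounts to the following (the arrays stand in for fields read from disk; shapes and values are arbitrary):

```python
import numpy as np

# Read-once / project-three-ways strategy, sketched on a uniform grid.
rng = np.random.default_rng(0)
density = rng.random((8, 8, 8)) + 0.1
temperature = rng.random((8, 8, 8)) * 1e6  # pretend this cost 8 reads

# Compute the expensive derived field once and keep it in memory...
weighted = density * temperature

# ...then project along each cardinal axis without re-reading anything.
# Each projection is the density-weighted average temperature.
projections = [weighted.sum(axis=ax) / density.sum(axis=ax)
               for ax in range(3)]
for p in projections:
    print(p.shape)  # (8, 8) each
```

This only pins down the target behavior; whether yt can be persuaded to cache the intermediate field this way is exactly the open question above.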
Hi--I've got a question I'm hoping someone can help me with. I'm trying to
do a volume rendering including 2 different fields whose scales differ by
~8 orders of magnitude. I'd like to be able to specify isocontours for
specific values for each of the fields. I've been able to achieve this with
just one field, but how can I do it for multiple fields?
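One possible workaround, sketched abstractly here (this is not a yt API, just arithmetic): rescale each field into [0, 1] in log space, so contour values for both fields can be specified on a common scale despite the ~8-dex spread. The field values and bounds are invented:

```python
import numpy as np

def log_rescale(field, vmin, vmax):
    """Map field values to [0, 1] in log space between vmin and vmax."""
    logf = np.log10(np.clip(field, vmin, vmax))
    return (logf - np.log10(vmin)) / (np.log10(vmax) - np.log10(vmin))

density = np.array([1e-28, 1e-24, 1e-20])
temperature = np.array([1e2, 1e4, 1e6])

# After rescaling, a contour at 0.5 means "the log-midpoint of the
# field's range" for either field, despite the difference in scales.
print(log_rescale(density, 1e-28, 1e-20))  # [0.  0.5 1. ]
print(log_rescale(temperature, 1e2, 1e6))  # [0.  0.5 1. ]
```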
I was wondering about the behaviour of create_profile in 3.0.
If I do something like
yt.create_profile(sphere, 'particle_radius', 'all_density')
it will fail, but
yt.create_profile(sphere, 'radius', 'all_density')
will be fine. Likewise,
yt.create_profile(sphere, 'particle_radius', 'particle_mass', weight_field=None)
will also be OK, but only as long as I set weight_field = None; otherwise it
fails as well.
I'm able to plot enclosed dark-matter profiles, but when it comes to density
profiles the above difficulties are proving troublesome :)
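As a point of reference for what weight_field = None asks for, the unweighted accumulation can be sketched with np.histogram (the particle values and bins below are invented):

```python
import numpy as np

# What an unweighted (weight_field=None) profile accumulates: the sum
# of particle_mass in each radial bin, rather than a weighted average.
radius = np.array([0.1, 0.2, 0.6, 0.7, 0.9])
mass = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

bins = np.linspace(0.0, 1.0, 3)  # two bins: [0, 0.5) and [0.5, 1.0]
mass_profile, _ = np.histogram(radius, bins=bins, weights=mass)
print(mass_profile)  # [ 3. 12.]
```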
Any tips would be appreciated! Simple script attached.
I'm trying to do the simple volume rendering from the cookbook. The data is
FLASH HDF5 data, and I'm using version 3.0.2 of yt. The script I'm running is:
However, I'm getting the following error:
I guess it has something to do with the AMR tree, but unfortunately I don't
know a lot about the specifics of that.
Could anyone offer any insight into this problem?