Hi, Everybody!
Does anyone out there have a technique for getting the variance out of
a profile object? A profile object is good at getting <X> vs. B; I'd
then like to get <(X - <X>)^2> vs. B. Matt and I had spitballed the
possibility some time ago, but I was wondering if anyone out there had
successfully done it.
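For concreteness, here is a minimal sketch of the two-profile trick we had discussed, using <(X - <X>)^2> = <X^2> - <X>^2 (the dataset and field names below are placeholders; I believe recent yt also exposes this directly as a standard_deviation/variance attribute on weighted profiles):
```
import yt

ds = yt.load("galaxy0030/galaxy0030")  # hypothetical dataset
ad = ds.all_data()

# A derived field for X**2 so both <X> and <X**2> can be profiled vs. B.
def _temp_sq(field, data):
    return data["gas", "temperature"]**2

ds.add_field(("gas", "temperature_squared"), function=_temp_sq,
             units="K**2", sampling_type="cell")

prof = yt.create_profile(
    ad, ("gas", "density"),
    [("gas", "temperature"), ("gas", "temperature_squared")],
    weight_field=("gas", "cell_mass"))

# <(X - <X>)^2> = <X^2> - <X>^2 in each bin of B.
variance = (prof["gas", "temperature_squared"]
            - prof["gas", "temperature"]**2)
```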
Thanks,
d.
--
Sent from my computer.
Hi guys,
I am very new to Ytini's mailing list, to mailing lists in general, and to
Houdini, so please bear with me while I pick up the etiquette around this medium.
With that out of the way here's my first question:
After having succeeded in installing Ytini for Houdini on Mac OS X 10.14, I
would now like to achieve the same with my Houdini install on Windows 10.
I have tried to reproduce the steps described here
http://www.ytini.com/getstarted.html#collapseTwoOut (Houdini 17.5, tested on
Mac OS X) on my Windows machine, but wasn't surprised to learn that of
course Houdini could not find the ytini module when I tried to import it in
the Python shell.
Could somebody point me in the right direction for installing Ytini for
Houdini on a Windows 10 environment?
I'm not afraid of reading, so even hints are more than welcome.
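For reference, the furthest I can sketch on my own (the path here is just my guess, not an official install location): making the package importable from Houdini's Python shell by hand:
```
# From Houdini's Python shell on Windows: put the folder that contains
# the ytini module on sys.path, then import it.  C:\ytini is a
# hypothetical unpack location, not an official one.
import sys
sys.path.append(r"C:\ytini")
import ytini
```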
Many Thanks in advance
Ur4n
Dear yt
Can current yt calculate 3-D mass power spectra? I checked the website but
didn't find any information. Calculating 3-D mass power spectra is very
useful for cosmological simulations, so I'm guessing maybe yt
supports this now?
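In case it clarifies what I'm after, here is the do-it-yourself approach I would otherwise fall back on (dataset name and grid size are placeholders): resample the density onto a uniform grid with a covering_grid, FFT the overdensity, and shell-average |delta_k|^2:
```
import numpy as np
import yt

ds = yt.load("output_00080/info_00080.txt")  # hypothetical dataset

# Resample the density onto a uniform grid at root-grid resolution.
N = 128  # example grid size
cg = ds.covering_grid(0, ds.domain_left_edge, dims=[N, N, N])
rho = cg["gas", "density"].d
delta = rho / rho.mean() - 1.0  # overdensity field

# 3-D FFT, then average |delta_k|^2 in spherical shells of k
# (k in units of the fundamental mode of the box).
dk = np.fft.fftn(delta) / N**3
power = (np.abs(dk)**2).ravel()
kfreq = np.fft.fftfreq(N) * N
kx, ky, kz = np.meshgrid(kfreq, kfreq, kfreq, indexing="ij")
knorm = np.sqrt(kx**2 + ky**2 + kz**2).ravel()
kbins = np.arange(0.5, N // 2, 1.0)
psum, _ = np.histogram(knorm, bins=kbins, weights=power)
counts, _ = np.histogram(knorm, bins=kbins)
pk = psum / np.maximum(counts, 1)  # shell-averaged P(k)
```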
Thanks in advance
Hi,
I am trying to overlay two different projection plots (particles and
density) with two different color bars. I couldn't figure out a way to do
that, but I found I could overplot particles on top of fluids.
My RAMSES data can't distinguish between dark matter and stars through
particle types, only with a mass cutoff. So I create a filter for the
particles accordingly and supply it as the 'data_source' argument to
'annotate_particles'. So, I do the following:
> import yt
>
> def stars(pfilter, data):
>     # age = (data.ds.current_time - data[pfilter.filtered_type,
>     #        "particle_birth_time"]).in_units('Myr')
>     stars_only = data[pfilter.filtered_type, "particle_mass"].in_units('Msun')
>     filter = (stars_only < 4e6)
>     return filter
>
> yt.add_particle_filter("stars", function=stars, filtered_type='io',
>                        requires=["particle_mass"])
>
> ini = 1
> fin = 2
>
> for x in range(ini, fin):
>     filename = "/lunarc/nobackup/users/samvad/SQ-8-5/output/output_%0*d/info_%0*d.txt" % (5, x, 5, x)
>     ds = yt.load(filename)
>     ds.define_unit("H", (1.674*10**(-24), "g"))
>
>     def density_threshold(field, data):
>         dens = data["gas", "density"].in_units('H/cm**3')
>         dens[dens < 0] = 0
>         return dens
>
>     ds.add_field(name=("gas", "density_threshold"),
>                  function=density_threshold, units="H/cm**3")
>
>     ad = ds.all_data()
>     denst = ad["gas", "density_threshold"].in_units('H/cm**3')
>
>     # plot = yt.SlicePlot(ds, 'z', 'density_threshold', width=(400, "kpc"))
>     plot = yt.ProjectionPlot(ds, 'z', 'density', weight_field='density',
>                              width=(800, "kpc"), center="c")
>
>     ds.add_particle_filter('stars')
>     dd = ds.all_data()
>     plot.annotate_particles(1.0, dd)
>
>     plot.set_cmap(field='density', cmap='Blues')
>     plot.set_unit(("gas", "density"), "H/cm**3")
>     plot.set_zlim(("gas", "density"), 1e-4, 1e-1)
>     plot.annotate_timestamp()
>
>     fil = "density_%0*d.png" % (5, x)
>     plot.save(fil)
>
But I end up with the error:
Traceback (most recent call last):
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/yt/visualization/plot_window.py",
line 999, in run_callbacks
callback(cbw)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/yt/visualization/plot_modifications.py",
line 53, in _check_geometry
return func(self, plot)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/yt/visualization/plot_modifications.py",
line 1563, in __call__
s=self.p_size, c=self.color,alpha=self.alpha)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/matplotlib/__init__.py",
line 1717, in inner
return func(ax, *args, **kwargs)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/matplotlib/axes/_axes.py",
line 4032, in scatter
alpha=alpha
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/matplotlib/collections.py",
line 880, in __init__
self.set_sizes(sizes)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/matplotlib/collections.py",
line 853, in set_sizes
scale = np.sqrt(self._sizes) * dpi / 72.0 * self._factor
AttributeError: 'YTRegion' object has no attribute 'sqrt'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "density.py", line 57, in <module>
plot.save(fil)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/yt/visualization/plot_container.py",
line 94, in newfunc
args[0]._setup_plots()
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/yt/visualization/plot_window.py",
line 948, in _setup_plots
self.run_callbacks()
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/yt/visualization/plot_window.py",
line 1005, in run_callbacks
sys.exc_info()[2])
File "/home/samvad/yt-conda/lib/python3.6/site-packages/six.py", line
692, in reraise
raise value.with_traceback(tb)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/yt/visualization/plot_window.py",
line 999, in run_callbacks
callback(cbw)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/yt/visualization/plot_modifications.py",
line 53, in _check_geometry
return func(self, plot)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/yt/visualization/plot_modifications.py",
line 1563, in __call__
s=self.p_size, c=self.color,alpha=self.alpha)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/matplotlib/__init__.py",
line 1717, in inner
return func(ax, *args, **kwargs)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/matplotlib/axes/_axes.py",
line 4032, in scatter
alpha=alpha
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/matplotlib/collections.py",
line 880, in __init__
self.set_sizes(sizes)
File
"/home/samvad/yt-conda/lib/python3.6/site-packages/matplotlib/collections.py",
line 853, in set_sizes
scale = np.sqrt(self._sizes) * dpi / 72.0 * self._factor
yt.utilities.exceptions.YTPlotCallbackError: annotate_particles callback
failed with the following error: 'YTRegion' object has no attribute 'sqrt'
I don't seem to understand the issue here. Is there a workaround to get a
particle plot over a density plot in a simpler way? Thank you.
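For what it's worth, reading the traceback it looks to me like the YTRegion is being consumed as the point size: the first positional argument of annotate_particles is the slab width and p_size comes second, so dd ends up inside matplotlib's np.sqrt(self._sizes). A sketch of what I will try instead (passing the region by keyword):
```
# Width of the slab to select particles from; the filtered region now
# goes in explicitly as data_source instead of the second positional slot.
plot.annotate_particles((800, "kpc"), p_size=1.0, data_source=dd)
```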
Dear All,
We are trying to reproduce the results of the AGORA comparison isolated galaxy (Kim et al. 2016). We have re-run the simulation using RAMSES with help from Oscar Agertz, and we believe it ran correctly based on our analysis/plots (using yt and other tools). We are doing the analysis using the official yt scripts (see below for details). We are able to get most plots correct, except the temperature-density phase plot, which looks strange (stopping at densities around 1e-24) and is not consistent with the density histogram (which continues up to 1e-21). The density histogram and other plots look like the paper's, so we think the issue is the plot, not the simulation.
The paper's footnotes say what to use. We are using the monolithic analysis script from footnote 68:
"The website is http://bitbucket.org/mornkr/agora-analysis-script/."
We started using yt 3.5.1. However, the paper indicates a version of yt to use in footnote 69:
"yt version 3.3 or later is required for the script to reproduce our analyses.
For the figures and plots in Section 6, the yt-dev changeset d7f213e1752e
is used."
We have not tried an older yt -- would this help?
Is there a specific Python version required as well for this to work correctly?
We cannot find yt-dev, or anything on the GitHub repository for yt that seems to correspond to the changeset d7f213e1752e.
The paper does not state anything regarding a specific Python version to use.
Any help would be appreciated.
Thanks,
James
Hi,
I am trying to add the values of a particle field to a gas field (same
units) by accessing a data region of fixed resolution. I expect the fields
to have the same sizes, but that doesn't seem to be the case. Could someone
please shed some light on this? Thank you.
> DIM = 256
>
> dd = ds.arbitrary_grid(
>     left_edge=ds.arr([(com_x_1.value-10), (com_y_1.value-10),
>                       (com_z_1.value-10)], 'kpc'),
>     right_edge=ds.arr([(com_x_1.value+10), (com_y_1.value+10),
>                        (com_z_1.value+10)], 'kpc'),
>     dims=[DIM, DIM, DIM])
>
> m = dd["gas", "cell_mass"].in_units('Msun')
> starmass = dd[("new_stars", "particle_mass")].in_units('Msun')
> b = m.value + starmass.value
>
Error:
Traceback (most recent call last):
File "gasinflow.py", line 95, in <module>
b=m.value+starmass.value
ValueError: operands could not be broadcast together with shapes
(16777216,) (8978,)
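In case it points somewhere: the shapes suggest the grid field comes back with one value per cell (256^3 = 16,777,216) while the particle field comes back with one value per particle (8,978). A sketch of what I think I need instead, depositing the particle mass onto the same grid first (assuming the 'new_stars' filter from my earlier message):
```
# Deposited field: total new_stars mass per cell divided by cell volume.
star_rho = dd["deposit", "new_stars_density"]

# The arbitrary grid is uniform, so the per-cell volume follows directly.
dx = (dd.right_edge - dd.left_edge) / DIM
cell_vol = dx[0] * dx[1] * dx[2]

star_mass = (star_rho * cell_vol).in_units("Msun")

# Both arrays now describe the same 256^3 cells.
b = m + star_mass.reshape(m.shape)
```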
Hi everybody, I'd like to remind you all about the 2019 John Hunter
Excellence in Plotting Contest. My apologies to those of you getting this
on multiple lists.
In memory of John Hunter, we are pleased to be reviving the SciPy John
Hunter Excellence in Plotting Competition for 2019. This open competition
aims to highlight the importance of data visualization to scientific
progress and showcase the capabilities of open source software.
Participants are invited to submit scientific plots to be judged by a
panel. The winning entries will be announced and displayed at the
conference.
John Hunter’s family and NumFOCUS are graciously sponsoring cash prizes for
the winners in the following amounts:
- 1st prize: $1000
- 2nd prize: $750
- 3rd prize: $500
- Entries must be submitted by June 8th to the form at
https://goo.gl/forms/cFTB3FUBrMPfQ7Vz1
- Winners will be announced at SciPy 2019 in Austin, TX.
- Participants do not need to attend the SciPy conference.
- Entries may take the definition of “visualization” rather broadly.
Entries may be, for example, a traditional printed plot, an interactive
visualization for the web, or an animation.
- Source code for the plot must be provided, in the form of Python code
and/or a Jupyter notebook, along with a rendering of the plot in a widely
used format. This may be, for example, PDF for print, standalone HTML and
JavaScript for an interactive plot, or MPEG-4 for a video. If the original
data cannot be shared for reasons of size or licensing, "fake" data may be
substituted, along with an image of the plot using real data.
- Each entry must include a 300-500 word abstract describing the plot and
its importance for a general scientific audience.
- Entries will be judged on their clarity, innovation, and aesthetics, but
most importantly for their effectiveness in communicating a real-world
problem. Entrants are encouraged to submit plots that were used during the
course of research or work, rather than merely being hypothetical.
- SciPy reserves the right to display any and all entries, whether
prize-winning or not, at the conference and to use them in any materials or
on its website, with attribution to the original author(s).
SciPy John Hunter Excellence in Plotting Competition Co-Chairs
Hannah Aizenman
Thomas Caswell
Madicken Munk
Nelle Varoquaux
Hello,
I’m trying to get yt to work with Enzo datasets with ghost zones. I’m having an Enzo crash in a large simulation I’m trying to debug and it would be helpful to inspect the ghost zones in yt. I see there has been significant work to build up the framework in the Enzo frontend to allow for this but much of it is inconsistent and not working yet. Sorry for this being long and overly verbose. I want to make sure I explain things in enough detail since there are a lot of moving parts. It also helps me learn how the frontend works if I explain everything. There's a TLDR at the end. I've opened a yt issue but thought I should post a message here too.
There are 3 main problems I’ve identified. One is trivial to fix, the other 2 require more work and a well thought out plan so that current working parts of the code are not broken in the process.
In `yt/frontends/enzo/data_structures.py` there’s a class `EnzoGridGZ` derived from `EnzoGrid`. It adds a member function `retrieve_ghost_zones` which is supposed to make a covering_grid including the ghost zones. This is exactly what I need.
The first problem is the call
`self.index.covering_grid(level, new_left_edge, **kwargs)`
should be
`self.ds.covering_grid(level, new_left_edge, **kwargs)`
Also, the kwargs dictionary is not created properly. This is an easy fix.
It was
```
kwargs = {'dims': self.ActiveDimensions + 2*n_zones,
          'num_ghost_zones': n_zones,
          'use_pbar': False}
kwargs.update(self.field_parameters)
```
and it should be
```
kwargs = {'dims': self.ActiveDimensions + 2*n_zones,
          'fields': fields,
          'ds': self.ds,
          'num_ghost_zones': n_zones,
          'use_pbar': False,
          'field_parameters': self.field_parameters}
```
Next are the real problems. Here’s the call stack when yt fails. I’ll walk through it, explaining the flow of control and where the problems arise. I'm using the `<enzo_dir>/run/Cosmology/AdiabaticExpansion` simulation with a 16^3 root grid and 3 ghost zones.
```
/Users/coreybrummel-smith/simulations/testing/WriteGhostZones/ghost.py(21)<module>()
-> gz = grid_gz.retrieve_ghost_zones(3, ['density'])
yt/frontends/enzo/data_structures.py(146)retrieve_ghost_zones()
-> level, new_left_edge, **kwargs
yt/data_objects/construction_data_containers.py(555)__init__()
-> self.get_data(fields)
yt/data_objects/construction_data_containers.py(642)get_data()
-> if len(fill) > 0: self._fill_fields(fill)
yt/data_objects/construction_data_containers.py(698)_fill_fields()
-> for chunk in parallel_objects(self._data_source.chunks(fields, "io")):
yt/utilities/parallel_tools/parallel_analysis_interface.py(507)parallel_objects()
-> for result_id, obj in oiter:
yt/data_objects/data_containers.py(1276)chunks()
-> self.get_data(fields)
yt/data_objects/data_containers.py(1370)get_data()
-> (fluids, self, self._current_chunk)
yt/geometry/geometry_handler.py(245)_read_fluid_fields()
-> chunk_size
/utilities/io_handler.py(148)_read_fluid_selection()
-> ind[field] += obj.select(selector, data, rv[field], ind[field])
yt/data_objects/grid_patch.py(423)select()
-> dest[offset:offset+count, i] = source[sl][np.squeeze(mask)]
```
Things start out okay: the covering grid has the correct dimensions, including ghost zones, and the correct edges
```
ipdb> self.ActiveDimensions
array([22, 22, 22], dtype=int32)
ipdb> self.left_edge
YTArray([-0.1875, -0.1875, -0.1875]) code_length
ipdb> self.right_edge
YTArray([1.1875, 1.1875, 1.1875]) code_length
```
Now `_fill_fields()` is called from `get_data()`. Here we do
```
output_fields = [np.zeros(self.ActiveDimensions, dtype="float64")
                 for field in fields]
domain_dims = self.ds.domain_dimensions.astype("int64") \
              * self.ds.relative_refinement(0, self.level)
```
`output_fields[0].shape` is correct (22,22,22), but `domain_dims` is (16,16,16) because it comes from TopGridDimensions in the Enzo parameter file.
Next, `_fill_fields()` loops over chunks while calling `fill_region()`.
As soon as `self._data_source.chunks()` is called, the YTRegion (`self._data_source`) associated with the covering grid calls `get_data()`. Since there is no `_current_chunk` yet, `self.index._identify_base_chunk()` gets called. In doing so, `dobj.selector.select_grids()` gets called, which modifies the YTRegion’s left edge and right edge.
```
ipdb> dobj.left_edge
YTArray([0.8125, 0.8125, 0.8125]) code_length
ipdb> dobj.right_edge
YTArray([2.1875, 2.1875, 2.1875]) code_length
```
Then the size is computed in `_count_selection()`:
```
dobj.size = self._count_selection(dobj, fast_index=fast_index)
```
The size ultimately comes from the ActiveDimensions of the EnzoGrid in the YTRegion's base chunk, which is (16,16,16), not (22,22,22).
Now back in `self._chunk()`, where `self` is the EnzoHierarchy of the YTCoveringGrid, we call `self._chunk_io()` and yield the chunk back to the loop in `YTRegion.chunks()`.
Now that we've prepared the chunk, we try to load the density field with
`self.get_data(fields)`, which calls `self.index._read_fluid_fields(fluids, self, self._current_chunk)`, which in turn calls `self.io._read_fluid_selection(self._chunk_io(dobj), selector, fields_to_read, chunk_size)`.
Here a dictionary named `rv` is created which will store the field data:
`rv[field] = np.empty(size, dtype="=f8")`
But `size` is determined by the chunk size, which is 16^3 (because the grid's ActiveDimensions are 16^3). If it is to include the ghost zones, it should be 22^3 instead.
Now we do
```
for field, obj, data in self.io_iter(chunks, fields):
    ...
    ind[field] += obj.select(selector, data, rv[field], ind[field])
```
`self.io_iter(chunks, fields)` loops over the chunks, grids, and fields, opens the appropriate Enzo hdf5 dataset, then does
```
dims = obj.ActiveDimensions[::-1] + nodal_flag[::-1]
data = np.empty(dims, dtype=h5_dtype)
yield field, obj, self._read_obj_field(
    obj, field, (fid, data))
```
`data` is again a 16^3 ndarray because the grid's ActiveDimensions are 16^3.
`self._read_obj_field` reads the data from the hdf5 dataset into the `data` array and returns its transpose to the `io_iter` function. Here's the code:
```
dg = h5py.h5d.open(fid, b(node))
dg.read(h5py.h5s.ALL, h5py.h5s.ALL, data)
return data.T
```
Interestingly, this doesn't fail even though the dataset is 22^3, not 16^3. The `h5py.h5d.read()` doc says "It is your responsibility to ensure that the memory dataspace provided is compatible with the shape of the Numpy array. ... You can easily crash Python by reading in data from too large a dataspace." I'm not sure why it doesn't crash, since the two don't match in this case.
Anyway, here's the second key problem. Because `WriteGhostZones = 1` in the Enzo parameter file, `self` is an `IOHandlerPackedHDF5GhostZones` object, and its version of `_read_obj_field()` returns the data with the ghost zones trimmed off. Here's the class definition so you can see what I mean.
```
class IOHandlerPackedHDF5GhostZones(IOHandlerPackedHDF5):
    _dataset_type = "enzo_packed_3d_gz"

    def __init__(self, *args, **kwargs):
        super(IOHandlerPackedHDF5GhostZones, self).__init__(*args, **kwargs)
        NGZ = self.ds.parameters.get("NumberOfGhostZones", 3)
        self._base = (slice(NGZ, -NGZ),
                      slice(NGZ, -NGZ),
                      slice(NGZ, -NGZ))

    def _read_obj_field(self, *args, **kwargs):
        return super(IOHandlerPackedHDF5GhostZones, self)._read_obj_field(
            *args, **kwargs)[self._base]
```
Obviously this is intended for datasets written with ghost zones where, most of the time, you don't want to include them when inspecting data with yt. For instance, since my root grid is 16^3 with 3 ghost zones, instead of returning a 22^3 data cube it returns the 16^3 data cube. But here we want to use the ghost zones, and furthermore the dimensions weren't set properly, so it returns a 10^3 array.
Now we yield the 10^3 data array back to the loop in `_read_fluid_selection`. Then we do
```
ind[field] += obj.select(selector, data, rv[field], ind[field])
```
which is supposed to copy `data` into `rv`. Here we encounter another problem. Here's the code for `obj.select()`
```
def select(self, selector, source, dest, offset):
    mask = self._get_selector_mask(selector)
    count = self.count(selector)
    if count == 0: return 0
    dim = np.squeeze(self.ds.dimensionality)
    nodal_flag = source.shape[:dim] - self.ActiveDimensions[:dim]
    # DEBUG NOTE:
    #   source.shape = [10,10,10]
    #   self.ActiveDimensions = [16,16,16]
    if sum(nodal_flag) == 0:
        dest[offset:offset+count] = source[mask]
    else:
        slices = get_nodal_slices(source.shape, nodal_flag, dim)
        for i, sl in enumerate(slices):
            dest[offset:offset+count, i] = source[sl][np.squeeze(mask)]
            # DEBUG NOTE: source size originally comes from chunk size and
            # was then modified to not include ghost zones
    return count
```
`nodal_flag` should be `[0,0,0]` for the density field, but since `source.shape` is incorrect, `nodal_flag = [-6,-6,-6]`.
Thus we enter the `else:` block, which should not happen for the density field. Finally we execute the
`dest[offset:offset+count, i] = source[sl][np.squeeze(mask)]` line and fail, because `source[sl].shape` is `(9,9,9)` while `mask.shape` is `(16,16,16)`.
TLDR:
In summary, there are two main problems with trying to create a YTCoveringGrid of an Enzo grid from a dataset with ghost zones. When trying to load the field data, yt prepares numpy arrays whose shape is determined by the EnzoGrid's ActiveDimensions, which does not include the ghost zones. This is compounded by the second problem: yt assumes we have loaded in the full grid including ghost zones and then trims the edges of the array to eliminate the would-be ghost cells.
I'm not sure of the best way to go about solving this. I don't think it is wise or elegant to just throw if statements everywhere to check whether we are trying to create a covering grid including ghost zones.
I think there should be two ways to handle Enzo datasets with ghost zones.
1) Most common: Load in an Enzo dataset that has ghost zones but treat it like a regular dataset and exclude the ghost zones when making a CoveringGrid. This is the idea behind the `IOHandlerPackedHDF5GhostZones` class.
2) My interest: Make a covering grid of an individual Enzo grid including its ghost zones. I think this was the idea for the `EnzoGridGZ.retrieve_ghost_zones()` function.
If anyone has any ideas, I'd appreciate your input.
Thanks,
Corey
Hi everyone,
is there a yt-native way to map between grids with different geometries?
To be a bit more precise: I have a cartesian dataset (MAESTRO) and would
like to map it onto a spherical grid to make some analysis easier.
So far I have tried to use the _find_field_values_at_points function to
extract the necessary data points, which works fine for most fields but
fails as soon as the radius gets involved. Somewhere in the process the
point objects (or grid objects, for the faster version in
grid_geometry_handler) lose the information about the centre of my
simulation, which messes up the radius calculation and all other fields
that rely on that field parameter.
Also, the set_field_parameter function does not seem to have any effect
on the grid objects.
To summarize: is there an easier method to do what I'm trying to do, and
if not, is there a way to get consistent radius calculations throughout
the different data objects?
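One workaround sketch, in case it is useful (the plotfile name and resolutions here are placeholder assumptions, and I believe the underscored function above is exposed publicly as ds.find_field_values_at_points in recent yt): build the spherical sample points yourself and keep the radius from the construction, so nothing depends on the 'center' field parameter surviving:
```
import numpy as np
import yt

ds = yt.load("plt00000")  # hypothetical MAESTRO plotfile
center = ds.domain_center.in_units("code_length").d

# Spherical sample grid (example resolution), converted to cartesian
# points in code_length units.
rmax = 0.5 * float(ds.domain_width.in_units("code_length").d.min())
r = np.linspace(0.0, rmax, 64)
theta = np.linspace(0.0, np.pi, 32)
phi = np.linspace(0.0, 2.0 * np.pi, 64)
R, T, P = np.meshgrid(r, theta, phi, indexing="ij")
x = center[0] + R * np.sin(T) * np.cos(P)
y = center[1] + R * np.sin(T) * np.sin(P)
z = center[2] + R * np.cos(T)
coords = np.stack([x.ravel(), y.ravel(), z.ravel()], axis=-1)

# R is already the radius about the true centre, so no derived 'radius'
# field (and no field parameter) is needed downstream.
vals = ds.find_field_values_at_points([("gas", "density")], coords)
```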
Best
Johann
Dear yt users,
I want to convert the default IllustrisTNG outputs into a mesh grid. I
basically want to convert the individual Voronoi cells into a 3-D
coordinate grid and assign temperature, density, and velocities to each
grid point.
I am looking forward to using the yt 4.0 demeshening technique. Let me
know if anybody has done something similar in the past or knows how to do it.
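A minimal sketch of how I'd expect this to look with yt 4.0's particle-like treatment of the cells (the snapshot path and grid size are placeholder assumptions): an arbitrary_grid samples the cell-based gas fields onto a uniform mesh:
```
import yt

ds = yt.load("snapdir_099/snap_099.0.hdf5")  # hypothetical TNG snapshot

# Sample the gas fields onto a uniform 128^3 mesh spanning the domain;
# each grid point then carries density, temperature, and velocity values.
ag = ds.arbitrary_grid(ds.domain_left_edge, ds.domain_right_edge,
                       dims=[128, 128, 128])
density = ag["gas", "density"]
temperature = ag["gas", "temperature"]
velocity_x = ag["gas", "velocity_x"]
```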
Thank you.
-Vikram