New issue 1263: Plugins file is still supposed to be in ~/.yt folder
https://bitbucket.org/yt_analysis/yt/issues/1263/plugins-file-is-still-supp…
Nathan Goldbaum:
We moved the config file's location to reflect the value of XDG_CONFIG_HOME (defaulting to ~/.config/yt); we should make the `my_plugins.py` file live in the same place.
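A rough sketch (not yt's current implementation) of resolving the plugin file the same way the config file is resolved, honoring XDG_CONFIG_HOME and falling back to ~/.config/yt:
```
import os

# Hypothetical helper: locate my_plugins.py next to the config file,
# honoring XDG_CONFIG_HOME and defaulting to ~/.config/yt.
def plugin_file_path(filename="my_plugins.py"):
    config_root = os.environ.get(
        "XDG_CONFIG_HOME", os.path.join(os.path.expanduser("~"), ".config"))
    return os.path.join(config_root, "yt", filename)

print(plugin_file_path())  # e.g. /home/user/.config/yt/my_plugins.py
```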
New issue 1262: LightRay masking out all zero-valued density fields in constructor
https://bitbucket.org/yt_analysis/yt/issues/1262/lightray-masking-out-all-z…
Cameron Hummels:
Recently, there was a problem with LightRay objects being created from zero-density trajectories, which broke the `AbsorptionSpectrum` functionality because it expects non-zero density fields in order to work correctly (Issue #1232).
This issue was addressed by PR #2300, which searched the fields passed in the `fields` kwarg of the `make_light_ray()` function, identified which of them had units of `'K', 'g/cm**3', or 'cm**-3'`, and applied a mask so that only non-zero values of those fields were included in the resulting LightRay.
The problem I've encountered is that when you have several ion number density fields in a dataset and you want to include them in the LightRay object, you end up masking out all array elements that have zero values for *any* of the ion fields. For example, if my simulation keeps track of Oxygen VI number density and H I number density (i.e. `O_p5_number_density` and `H_p0_number_density`), and my LightRay passes through a region with zero metallicity (and thus zero Oxygen VI, but non-zero H I), all of the cells with zero Oxygen VI will get masked out of the LightRay, even though they contain valid gas densities and H I densities. I've created an example script below:
```
#!python
import yt
from yt.analysis_modules.cosmological_observation.api import LightRay

ds = yt.load('IsolatedGalaxy/galaxy0030/galaxy0030')

# Add a field that is zero everywhere
def _zeros(field, data):
    return data["density"] * 0.0
ds.add_field("zeros", function=_zeros, units="g/cm**3")

# Create a LightRay object and include the zeros field
lr = LightRay(ds)
ray_start = [0, 0, 0]
ray_end = [1, 1, 1]
lr.make_light_ray(start_position=ray_start, end_position=ray_end,
                  fields=['temperature', 'density', 'H_number_density', 'zeros'],
                  data_filename='lightray.h5')
```
make_light_ray fails because it tries to mask out every cell along the ray that has a zero value in *any* probed field. Here is the STDERR/STDOUT:
```
#!python
[cambot:~/scratch] chummels% python test_zero_dens.py
yt : [INFO ] 2016-08-16 13:30:58,968 Parameters: current_time = 0.00600002000283
yt : [INFO ] 2016-08-16 13:30:58,969 Parameters: domain_dimensions = [32 32 32]
yt : [INFO ] 2016-08-16 13:30:58,969 Parameters: domain_left_edge = [ 0. 0. 0.]
yt : [INFO ] 2016-08-16 13:30:58,969 Parameters: domain_right_edge = [ 1. 1. 1.]
yt : [INFO ] 2016-08-16 13:30:58,970 Parameters: cosmological_simulation = 0.0
Parsing Hierarchy : 100%|█████████████████████████████████████████████████████████| 173/173 [00:00<00:00, 35560.63it/s]
yt : [INFO ] 2016-08-16 13:30:58,983 Gathering a field list (this may take a moment.)
yt : [INFO ] 2016-08-16 13:31:00,080 Getting segment at z = 0.0: [ 0. 0. 0.] unitary to [ 0. 0. 1.] unitary.
/Users/chummels/src/yt/yt/units/yt_array.py:973: RuntimeWarning: invalid value encountered in divide
return super(YTArray, self).__div__(ro)
yt : [INFO ] 2016-08-16 13:31:00,082 Getting subsegment: [0.0, 0.0, 0.0] to [0.0, 0.0, 1.0].
Traceback (most recent call last):
File "test_zero_dens.py", line 17, in <module>
data_filename='lightray.h5')
File "/Users/chummels/src/yt/yt/analysis_modules/cosmological_observation/light_ray/light_ray.py", line 575, in make_light_ray
self._write_light_ray(data_filename, all_data)
File "/Users/chummels/src/yt/yt/utilities/parallel_tools/parallel_analysis_interface.py", line 320, in root_only
return func(*args, **kwargs)
File "/Users/chummels/src/yt/yt/analysis_modules/cosmological_observation/light_ray/light_ray.py", line 617, in _write_light_ray
"Please modify your light ray trajectory." % (f,))
RuntimeError: No zones along light ray with nonzero zeros. Please modify your light ray trajectory.
```
I'm not sure what the ideal solution is. One option would be to require users to include the `density` field when they make a LightRay, and then use just that field to mask out zero-density cells. Alternatively, we could accept zero-valued cells and make AbsorptionSpectrum smarter about what it tries to do with these data. Ideas?
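For reference, a minimal sketch of the first option (a hypothetical helper, not the actual make_light_ray code), masking on the density field alone so zero-metallicity cells with valid gas density survive:
```
import numpy as np

def density_only_mask(field_data):
    """field_data: dict of field name -> array of values along the ray.
    Keep cells with non-zero gas density, regardless of the ion fields."""
    mask = field_data["density"] > 0.0
    return {name: values[mask] for name, values in field_data.items()}

# Example: a zero-valued ion field no longer wipes out the whole ray.
sample = {
    "density": np.array([1e-26, 0.0, 2e-26]),
    "zeros": np.zeros(3),
}
print(density_only_mask(sample)["zeros"].size)  # 2: only the zero-density cell is dropped
```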
Hi folks,
As a heads up, we're moving the domain registration for yt-project.org to
NumFOCUS. This may result in some intermittent resolution failures over
the next couple days. Mailing lists should be unaffected.
-Matt
Hey all,
Looking at the current docs, generating a single figure from multiple plots is currently a bit of a hassle for the user. This is primarily because the method for doing so builds the figure from the bottom up: all of the axes and the figure are instantiated first, and then the user manipulates them further. I would like to add a multi-plot container class that reaches the same end but takes a top-down approach. Essentially, users would pass in a list of yt plots, or a nested list of yt plots, which simultaneously gives the container all of the plots and tells it how to lay them out on the figure. From there the class would pull out the axes for each plot and deposit them onto the single figure. Before I start writing the class I am looking for support as well as input. The main downside I see to this object is that the input needed to instantiate it will either need to be extremely specific or we will need to accept blank spaces in the resulting figure. We would also need to define a default behavior for laying out plots that display information from multiple fields. Thoughts and suggestions?
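For concreteness, here is a rough sketch of the kind of interface this could take (all names are hypothetical, and the actual depositing of yt axes onto the figure is left as a placeholder):
```
import matplotlib.pyplot as plt

class MultiPlotGrid:
    """Lay out existing yt plots on a single figure, top down."""
    def __init__(self, plot_rows):
        # plot_rows is a nested list of yt plots; None (or a short row)
        # leaves a blank space in the grid.
        nrows = len(plot_rows)
        ncols = max(len(row) for row in plot_rows)
        self.figure, self.axes = plt.subplots(nrows, ncols, squeeze=False)
        for i, row in enumerate(plot_rows):
            for j in range(ncols):
                ax = self.axes[i][j]
                plot = row[j] if j < len(row) else None
                if plot is None:
                    ax.axis("off")  # accept a blank cell in the final figure
                else:
                    # Placeholder: the real class would pull the axes out of
                    # the yt plot and redraw it onto `ax` here.
                    ax.set_title(type(plot).__name__)

# Usage (hypothetical):
# grid = MultiPlotGrid([[slice_plot, projection_plot], [phase_plot]])
# grid.figure.savefig("multi_plot.png")
```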
Thanks,
Austin Gilbert
Hi all,
Just a heads-up, Kacper recently updated our Jenkins installation and is
working through issues that have come up from the upgrade.
Right now there are two major issues left:
1. Docs builds are broken
2. Tests need to be manually triggered in Slack by saying "Test PR XXXX" in
the general or testing channel
Kacper is working on fixing this situation but it might take a few days.
Thanks for your patience.
-Nathan
New issue 1261: ProfilePlot should fail more gracefully when used with particle fields
https://bitbucket.org/yt_analysis/yt/issues/1261/profileplot-should-fail-mo…
Nathan Goldbaum:
Right now ProfilePlot will fail when trying to plot particle fields:
```
import yt
ds = yt.load('output_00080/info_00080.txt')
yt.ProfilePlot(ds.all_data(), 'radius', 'particle_mass')
```
This produces a confusing error:
```
IndexError Traceback (most recent call last)
/Users/goldbaum/Documents/yt-hg/yt/mods.pyc in <module>()
----> 1 yt.ProfilePlot(ds.all_data(), 'radius', 'particle_mass')
/Users/goldbaum/Documents/yt-hg/yt/visualization/profile_plotter.pyc in __init__(self, data_source, x_field, y_fields, weight_field, n_bins, accumulation, fractional, label, plot_spec, x_log, y_log)
228 accumulation=accumulation,
229 fractional=fractional,
--> 230 logs=logs)]
231
232 if plot_spec is None:
/Users/goldbaum/Documents/yt-hg/yt/data_objects/profiles.pyc in create_profile(data_source, bin_fields, fields, n_bins, extrema, logs, units, weight_field, accumulation, fractional, deposition)
1017 setattr(obj, "fractional", fractional)
1018 if fields is not None:
-> 1019 obj.add_fields([field for field in fields])
1020 for field in fields:
1021 if fractional:
/Users/goldbaum/Documents/yt-hg/yt/data_objects/profiles.pyc in add_fields(self, fields)
111 citer = self.data_source.chunks([], "io")
112 for chunk in parallel_objects(citer):
--> 113 self._bin_chunk(chunk, fields, temp_storage)
114 self._finalize_storage(fields, temp_storage)
115
/Users/goldbaum/Documents/yt-hg/yt/data_objects/profiles.pyc in _bin_chunk(self, chunk, fields, storage)
427
428 def _bin_chunk(self, chunk, fields, storage):
--> 429 rv = self._get_data(chunk, fields)
430 if rv is None: return
431 fdata, wdata, (bf_x,) = rv
/Users/goldbaum/Documents/yt-hg/yt/data_objects/profiles.pyc in _get_data(self, chunk, fields)
239 for i, field in enumerate(fields):
240 units = chunk.ds.field_info[field].units
--> 241 arr[:,i] = chunk[field][filter].in_units(units)
242 if self.weight_field is not None:
243 units = chunk.ds.field_info[self.weight_field].units
/Users/goldbaum/Documents/yt-hg/yt/units/yt_array.pyc in __getitem__(self, item)
1159
1160 def __getitem__(self, item):
-> 1161 ret = super(YTArray, self).__getitem__(item)
1162 if ret.shape == ():
1163 return YTQuantity(ret, self.units, bypass_validation=True)
IndexError: index 94462 is out of bounds for axis 1 with size 94462
```
In addition, even if I use particle fields for the bin and binned fields, I still get a similar error:
```
import yt
ds = yt.load('output_00080/info_00080.txt')
yt.ProfilePlot(ds.all_data(), 'particle_radius', 'particle_mass')
```
```
IndexError Traceback (most recent call last)
/Users/goldbaum/Documents/yt-hg/yt/mods.pyc in <module>()
----> 1 yt.ProfilePlot(ds.all_data(), 'particle_radius', 'particle_mass')
/Users/goldbaum/Documents/yt-hg/yt/visualization/profile_plotter.pyc in __init__(self, data_source, x_field, y_fields, weight_field, n_bins, accumulation, fractional, label, plot_spec, x_log, y_log)
228 accumulation=accumulation,
229 fractional=fractional,
--> 230 logs=logs)]
231
232 if plot_spec is None:
/Users/goldbaum/Documents/yt-hg/yt/data_objects/profiles.pyc in create_profile(data_source, bin_fields, fields, n_bins, extrema, logs, units, weight_field, accumulation, fractional, deposition)
1017 setattr(obj, "fractional", fractional)
1018 if fields is not None:
-> 1019 obj.add_fields([field for field in fields])
1020 for field in fields:
1021 if fractional:
/Users/goldbaum/Documents/yt-hg/yt/data_objects/profiles.pyc in add_fields(self, fields)
111 citer = self.data_source.chunks([], "io")
112 for chunk in parallel_objects(citer):
--> 113 self._bin_chunk(chunk, fields, temp_storage)
114 self._finalize_storage(fields, temp_storage)
115
/Users/goldbaum/Documents/yt-hg/yt/data_objects/profiles.pyc in _bin_chunk(self, chunk, fields, storage)
427
428 def _bin_chunk(self, chunk, fields, storage):
--> 429 rv = self._get_data(chunk, fields)
430 if rv is None: return
431 fdata, wdata, (bf_x,) = rv
/Users/goldbaum/Documents/yt-hg/yt/data_objects/profiles.pyc in _get_data(self, chunk, fields)
245 else:
246 weight_data = np.ones(filter.size, dtype="float64")
--> 247 weight_data = weight_data[filter]
248 # So that we can pass these into
249 return arr, weight_data, bin_fields
/Users/goldbaum/Documents/yt-hg/yt/units/yt_array.pyc in __getitem__(self, item)
1159
1160 def __getitem__(self, item):
-> 1161 ret = super(YTArray, self).__getitem__(item)
1162 if ret.shape == ():
1163 return YTQuantity(ret, self.units, bypass_validation=True)
IndexError: index 191131 is out of bounds for axis 1 with size 191131
```
I think the best route here is to fail with a nicer error message suggesting the use of ParticlePlot.
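For what it's worth, a minimal sketch of the kind of up-front check this could be (a hypothetical helper, assuming the `particle_type` flag on DerivedField in yt 3.x):
```
def check_profile_fields(data_source, fields):
    # Hypothetical check: refuse particle fields before profiling starts.
    ds = data_source.ds
    for field in fields:
        if ds._get_field_info(field).particle_type:
            raise RuntimeError(
                "%s is a particle field; ProfilePlot only supports mesh "
                "fields. Use ParticlePlot instead." % (field,))
```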
New issue 1260: make spectrum with "use_peculiar_velocity = False"
https://bitbucket.org/yt_analysis/yt/issues/1260/make-spectrum-with-use_pec…
Pengfei Chen:
Hi all,
I was trying to make a light ray with "use_peculiar_velocity=False", then trying to make a spectrum also with "use_peculiar_velocity=False", but I got an error.
The code I use is here:
http://paste.yt-project.org/show/6763/
I should set "field = 'H_number_density'" but since there's no such field in that dataset, I choose "field = 'density'" just to trigger the error.
The error message is here:
http://paste.yt-project.org/show/6764/
Thanks for your attention,
Pengfei
New issue 1259: magnetic field units for athena frontend
https://bitbucket.org/yt_analysis/yt/issues/1259/magnetic-field-units-for-a…
Chang-Goo Kim:
I set unit conversion factors with the `units_override` keyword. These are the units I'm using:
>>> print ya.unit_base
{'length_unit': (1.0, 'pc'),
'mass_unit': (2.38858753789e-24, 'g/cm**3*pc**3'),
'time_unit': (1.0, 's*pc/km')}
This should set the magnetic field unit as follows:
>>> mu=yt.YTQuantity(ya.unit_base['mass_unit'][0],ya.unit_base['mass_unit'][1])
>>> lu=yt.YTQuantity(ya.unit_base['length_unit'][0],ya.unit_base['length_unit'][1])
>>> tu=yt.YTQuantity(ya.unit_base['time_unit'][0],ya.unit_base['time_unit'][1])
>>> mag_unit=(np.sqrt(4*np.pi*mu/lu)/tu)
>>> print mag_unit.convert_to_units('gauss')
5.4786746797e-07 gauss
But what I found in the unit_registry is a rather strange number:
>>> print ds.unit_registry['code_magnetic']
(3.5449077018110318,
sqrt((mass))/(sqrt((length))*(time)),
0.0,
'\\rm{code\\ magnetic}')
It seems that "magnetic_unit" was set correctly in yt/frontends/athena/data_structures.py:
self.magnetic_unit = np.sqrt(4*np.pi * self.mass_unit /
                             (self.time_unit**2 * self.length_unit))
self.magnetic_unit.convert_to_units("gauss")
New issue 1258: potential unit error in cosmology.py
https://bitbucket.org/yt_analysis/yt/issues/1258/potential-unit-error-in-co…
Pengfei Chen:
I was trying to generate light rays using yt 3.4-dev, but found that the light rays have the wrong redshift intervals. The error happens when I make light rays from a single dataset. I reproduced the error with the dataset under src/yt-hg/tests. The code I used is here:
http://paste.yt-project.org/show/6751/
After running the code, I did
h5ls -d lightray.h5/grid/redshift
and found the last redshift was 9.98999107782409. However, the correct value should be 9.97888260334447.
When I call LightRay this way, it calls the `_deltaz_forward` function in cosmology_splice.py, which in turn calls `comoving_radial_distance` in cosmology.py:
distance2 = self.cosmology.comoving_radial_distance(z2, z)
where distance2 is not in comoving units. It then calculates z2:
z2 = ((target_distance - distance2) / m.in_units("Mpccm / h")) + z2
I think the subtraction here is the problem: it tries to convert distance2 to comoving units even though the numerical value of distance2 is already comoving, so a wrong factor of (1+z) is multiplied in.
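To illustrate the suspected error numerically, here is a minimal sketch with made-up values: a distance whose value is already comoving in magnitude, but which is labeled as proper, picks up a spurious factor of (1+z) when converted to comoving units.
```
z = 9.99

distance2_comoving = 100.0                   # correct value, already in Mpccm/h
mislabeled_as_proper = distance2_comoving    # same number, but tagged as proper Mpc/h

# Converting the mislabeled value to comoving units multiplies by (1+z) again.
wrongly_converted = mislabeled_as_proper * (1.0 + z)

print(wrongly_converted / distance2_comoving)  # ~10.99 instead of 1.0
```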