New issue 1308: profile variance vs standard deviation
https://bitbucket.org/yt_analysis/yt/issues/1308/profile-variance-vs-stan...
Andrew Cunningham:
yt/data_objects/profiles.py :: _finalize_storage()
sets:
all_var = np.sqrt(all_var)
which is later used to construct "self.variance". So, after this line, both "all_var" and the public-facing member "variance" are really a standard deviation.
The cookbook example "profile_with_variance.py" correctly labels prof.variance as "standard deviation" in the generated plot.
The consensus arising from the mailing list discussion (http://lists.spacepope.org/pipermail/yt-users-spacepope.org/2017-January/...) was that public facing members of profile instances should be correctly named and the cookbook example should be adjusted to follow whatever convention is chosen.
Perhaps add a profile.standard_deviation member and raise a deprecation warning on profile.variance ... ?
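A minimal sketch of that deprecation path (the class and attribute names below are illustrative stand-ins, not yt's actual implementation):
```
#!python
import math
import warnings

class ProfileSketch:
    """Illustrative stand-in for a yt profile object (hypothetical)."""
    def __init__(self, binned_variance):
        # store the square root of the binned variance under its
        # correct name, as proposed above
        self.standard_deviation = [math.sqrt(v) for v in binned_variance]

    @property
    def variance(self):
        # keep the old attribute alive, but tell users it has
        # always really returned a standard deviation
        warnings.warn(
            "profile.variance actually holds a standard deviation; "
            "use profile.standard_deviation instead",
            DeprecationWarning, stacklevel=2)
        return self.standard_deviation
```
Old scripts keep working, but every access to .variance points users at the correctly named member.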
New issue 1307: Add capability to visualize a single variable on multiple meshes
https://bitbucket.org/yt_analysis/yt/issues/1307/add-capability-to-visual...
Alexander Lindsay:
- Starting with SlicePlot initially, it would be great to be able to visualize a variable that exists on multiple (most often geometrically and physically connected) mesh blocks.
- This is probably unique to UnstructuredMesh (at least for now)
I currently have a hacky solution and will refine it over the next couple of days.
Hi yt-devs!
I have a quick user-question (sorry) and a devel question:
First, how do I convert to a unit like microns (10^-6 meters) or
picoseconds? I can't find an example in [1]. I tried
.in_units("mum")
.in_units("mu m")
.in_units("1.e-6 m")
.in_units("microns")
.in_units("micro meter")
.in_units("micro meters")
.in_units("micro m")
.in_units("μm")
and only
.in_units("1.e-6 * m")
does not fail, but it looks ugly.
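As an aside, I believe yt's unit parser accepts SI prefix abbreviations on base symbols, so "um" (micrometers) and "ps" (picoseconds) may already work. The snippet below is only a self-contained sketch of that prefix-lookup idea; the registry and function are hypothetical, not yt's actual parser:
```
#!python
# hypothetical prefix registry: prefix letter -> scale to SI
SI_PREFIXES = {"p": 1e-12, "n": 1e-9, "u": 1e-6, "m": 1e-3}
BASE_UNITS = {"m": "meter", "s": "second", "g": "gram"}

def resolve_unit(symbol):
    """Return (scale_to_base, base_unit_name) for a symbol like 'um'."""
    if symbol in BASE_UNITS:
        # bare base unit, e.g. "m"
        return 1.0, BASE_UNITS[symbol]
    prefix, base = symbol[0], symbol[1:]
    if prefix in SI_PREFIXES and base in BASE_UNITS:
        # prefixed unit, e.g. "um" -> (1e-6, "meter")
        return SI_PREFIXES[prefix], BASE_UNITS[base]
    raise ValueError("unknown unit symbol: %s" % symbol)
```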
An orthogonal question: in our domain (laser-plasma physics) unit
systems are scaled by something like the applied laser wavelength or the
plasma density (and more to make a full set).
Following [1] again, I would implement something like .in_base('plasma')
or .in_base('laserplasma'), but those would not be scaled to "fixed" reference
quantities like the other base systems; they depend on either user input
[ length = ("$\lambda_0$", "800 nm"),
time = ("$\omega_\text{pe}^{-1}$", "1.e-15 s"),
...]
or, alternatively, on scalings the data file already provides. (OK, the
symbol is always fixed per "in_base", but the value to scale by needs
to be provided.)
Is that already possible or any ideas how we could implement that?
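One possible shape for such a system, as a sketch only: it assumes each base system fixes its display symbols while the scale values come from user input or the data file. None of the names below are existing yt API:
```
#!python
class ScaledUnitSystem:
    """Hypothetical unit system whose base units are reference
    quantities (laser wavelength, inverse plasma frequency, ...)."""
    def __init__(self, name, **references):
        # references maps dimension -> (display symbol, value in SI)
        self.name = name
        self.references = dict(references)

    def to_base(self, value_si, dimension):
        """Express an SI value in units of the reference quantity."""
        symbol, ref_si = self.references[dimension]
        return value_si / ref_si, symbol

# the symbols are fixed for the system; the values come from the user
laserplasma = ScaledUnitSystem(
    "laserplasma",
    length=(r"$\lambda_0$", 800e-9),       # laser wavelength in meters
    time=(r"$\omega_{pe}^{-1}$", 1e-15),   # inverse plasma frequency in s
)
```
A data-file frontend could fill in the reference values automatically when the file already carries them.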
Thanks,
Axel
[1]
http://yt-project.org/docs/dev/analyzing/units/fields_and_unit_conversion...
--
Axel Huebl
Phone +49 351 260 3582
https://www.hzdr.de/crp
Computational Radiation Physics
Laser Particle Acceleration Division
Helmholtz-Zentrum Dresden - Rossendorf e.V.
Bautzner Landstrasse 400, 01328 Dresden
POB 510119, D-01314 Dresden
Vorstand: Prof. Dr.Dr.h.c. R. Sauerbrey
Prof. Dr.Dr.h.c. P. Joehnk
VR 1693 beim Amtsgericht Dresden
Hi all,
our cloud provider "will have a brief network outage starting at 10AM
(CST/UTC-6) on Friday, January 13th in order to upgrade its network
hardware. The upgrade will install new hardware and software and will
double the bandwidth.
Before the outage the new hardware will already be configured and
connected to the network so the interruption should be brief, hopefully
about 5 minutes."
This affects: hub (only running notebooks), jenkins, blog, pastebin,
slackbots (yt-fido, irc bridge).
In theory it should go unnoticed. On the other hand, what could possibly
go wrong on Friday 13th...?
Cheers,
Kacper
New issue 1305: Performance Issues with large BoxLib datasets
https://bitbucket.org/yt_analysis/yt/issues/1305/performance-issues-with-...
Chris Byrohl:
When loading a larger BoxLib dataset (~270 GB), yt is stuck for hours without making any progress.
Interrupting points to _reconstruct_parent_child(self):
```
#!python
---------------------------------------------------------------------------
KeyboardInterrupt Traceback (most recent call last)
<ipython-input-14-fa86699093be> in <module>()
----> 1 box['density']
/home/uni09/cosmo/cbyrohl/anaconda3/envs/py35/lib/python3.5/site-packages/yt/data_objects/data_containers.py in __getitem__(self, key)
263 Returns a single field. Will add if necessary.
264 """
--> 265 f = self._determine_fields([key])[0]
266 if f not in self.field_data and key not in self.field_data:
267 if f in self._container_fields:
/home/uni09/cosmo/cbyrohl/anaconda3/envs/py35/lib/python3.5/site-packages/yt/data_objects/data_containers.py in _determine_fields(self, fields)
993 else:
994 fname = field
--> 995 finfo = self.ds._get_field_info("unknown", fname)
996 if finfo.particle_type:
997 ftype = self._current_particle_type
/home/uni09/cosmo/cbyrohl/anaconda3/envs/py35/lib/python3.5/site-packages/yt/data_objects/static_output.py in _get_field_info(self, ftype, fname)
622 _last_finfo = None
623 def _get_field_info(self, ftype, fname = None):
--> 624 self.index
625 if fname is None:
626 if isinstance(ftype, DerivedField):
/home/uni09/cosmo/cbyrohl/anaconda3/envs/py35/lib/python3.5/site-packages/yt/data_objects/static_output.py in index(self)
417 raise RuntimeError("You should not instantiate Dataset.")
418 self._instantiated_index = self._index_class(
--> 419 self, dataset_type=self.dataset_type)
420 # Now we do things that we need an instantiated index for
421 # ...first off, we create our field_info now.
/home/uni09/cosmo/cbyrohl/anaconda3/envs/py35/lib/python3.5/site-packages/yt/frontends/boxlib/data_structures.py in __init__(self, ds, dataset_type)
144 self.directory = ds.output_dir
145
--> 146 GridIndex.__init__(self, ds, dataset_type)
147 self._cache_endianness(self.grids[-1])
148
/home/uni09/cosmo/cbyrohl/anaconda3/envs/py35/lib/python3.5/site-packages/yt/geometry/geometry_handler.py in __init__(self, ds, dataset_type)
48
49 mylog.debug("Setting up domain geometry.")
---> 50 self._setup_geometry()
51
52 mylog.debug("Initializing data grid data IO")
/home/uni09/cosmo/cbyrohl/anaconda3/envs/py35/lib/python3.5/site-packages/yt/geometry/grid_geometry_handler.py in _setup_geometry(self)
52
53 mylog.debug("Constructing grid objects.")
---> 54 self._populate_grid_objects()
55
56 mylog.debug("Re-examining index")
/home/uni09/cosmo/cbyrohl/anaconda3/envs/py35/lib/python3.5/site-packages/yt/frontends/boxlib/data_structures.py in _populate_grid_objects(self)
295 mylog.debug("Creating grid objects")
296 self.grids = np.array(self.grids, dtype='object')
--> 297 self._reconstruct_parent_child()
298 for i, grid in enumerate(self.grids):
299 if (i % 1e4) == 0: mylog.debug("Prepared % 7i / % 7i grids", i,
/home/uni09/cosmo/cbyrohl/anaconda3/envs/py35/lib/python3.5/site-packages/yt/frontends/boxlib/data_structures.py in _reconstruct_parent_child(self)
311 self.grid_levels[i] + 1,
312 self.grid_left_edge, self.grid_right_edge,
--> 313 self.grid_levels, mask)
314 ids = np.where(mask.astype("bool")) # where is a tuple
315 grid._children_ids = ids[0] + grid._id_offset
```
The result of
```
#!python
np.savez('data.npz', left_edge=self.grid_left_edge, right_edge=self.grid_right_edge, levels=self.grid_levels)
```
for the beginning of that routine can be found here: http://use.yt/upload/87b007b1
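For context, the pattern in _reconstruct_parent_child appears to be an all-pairs overlap search: for each grid, every other grid is scanned for overlapping grids one level finer, so the whole pass scales roughly as O(N^2) in the number of grids, which would explain the stall on a dataset this large. Below is a self-contained 1D sketch of that pattern with synthetic data; the names are illustrative, and the real inner loop is, I believe, yt's Cython get_box_grids_level:
```
#!python
def find_children(i, left, right, levels):
    """Return indices of grids one level finer that overlap grid i."""
    children = []
    target = levels[i] + 1
    for j in range(len(levels)):      # scans every grid: O(N)
        if levels[j] != target:
            continue
        # 1D interval-overlap test
        if left[j] < right[i] and right[j] > left[i]:
            children.append(j)
    return children

# calling find_children for every grid makes the whole pass O(N^2)
left   = [0.0, 0.0, 0.5]
right  = [1.0, 0.5, 1.0]
levels = [0, 1, 1]
```
Sorting grids by level and spatial bin (or using a tree over the edge arrays) would cut the per-grid scan down from O(N), which may be worth profiling against the uploaded arrays.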
Hi all,
we had an HDD failure on a machine hosting yt's Jenkins and blog
instances. PRs are not being tested at the moment. I don't have an ETA
for a replacement yet; I will keep you posted.
Cheers,
Kacper