After looking at this some more I've discovered a couple of other wrinkles.

My first thought was to set the dataset units (length, time, mass, etc.) for backup GDF files equal to those of the parent dataset, so there's consistency there. However, I then realized that most fields written to backup files would be in cgs, not code units. If any of these fields have aliases (like "density", "velocity_x", etc.), then they will get converted "to cgs" using the conversion factors even when they don't need to be (e.g., when they are already in cgs).

So, there are two options: the backup file could declare its code units to be equal to cgs, which would relieve this problem but would leave its code units inconsistent with those of the parent file; or we could find some way to keep the underlying code units of the two files the same but prevent this conversion from happening (either by removing the aliases from the GDF frontend altogether or by some other procedure).
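To make the hazard concrete, here is a minimal, purely illustrative sketch of the double-conversion problem and the two options above. The field name, conversion factor, and helper function are hypothetical stand-ins, not the actual yt/GDF internals.

```python
# Hypothetical code->cgs factor inherited from the parent dataset
# (made-up value, for illustration only).
CODE_DENSITY_TO_CGS = 6.77e-23

def read_aliased_field(raw_value, to_cgs_factor, already_cgs):
    """Mimic what a frontend does when reading an aliased field like
    "density": multiply by the code->cgs factor, unless told the stored
    value is already in cgs (option b below)."""
    if already_cgs:
        return raw_value
    return raw_value * to_cgs_factor

# A density of 1e-24 g/cm**3 written to the backup file *in cgs*:
stored = 1e-24

# The bug: the alias machinery applies the parent's factor anyway,
# wrongly rescaling a value that is already in cgs.
wrong = stored * CODE_DENSITY_TO_CGS

# Option (a): declare the backup file's code units to *be* cgs, so the
# factor is exactly 1 and the unconditional conversion is harmless.
option_a = read_aliased_field(stored, 1.0, already_cgs=False)

# Option (b): keep the parent's factors but suppress the conversion
# for fields stored in cgs.
option_b = read_aliased_field(stored, CODE_DENSITY_TO_CGS, already_cgs=True)

print(option_a == stored, option_b == stored, wrong == stored)  # True True False
```

Option (a) keeps the reader simple but breaks unit consistency with the parent file; option (b) preserves consistency at the cost of special-casing aliased fields in the frontend.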

Would appreciate feedback on this. 

On Jul 14, 2014, at 1:49 PM, Matthew Turk <> wrote:

Hi John,

On Sat, Jul 12, 2014 at 1:48 PM, John Zuhone <> wrote:
Hi all,

I have a work-in-progress PR in the hopper regarding GDF reading and writing:

What it does is implement direct writing of covering grids to GDF
files (something I've been using in my research, to work with the
resulting files directly in yt) and clobbering existing files with the
same path if the user allows for it.

However, when fiddling around with this I determined that it was
necessary to make GDF more unit-aware. Ultimately, I determined that
changes were needed that would break the standard that we had
determined for the files (

The two main changes are:

1) Remove "field_to_cgs", and allow the fields to be in whatever units
we wish them to be in the file (which are specified as HDF5 attributes).

2) Add a new top-level group containing the information for
"length","mass", and "time" units. These will be used when the file is
opened up in yt to determine the units.
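The two changes above can be sketched roughly as follows, using plain Python dicts to stand in for HDF5 groups and attributes. The group name "dataset_units" and the attribute names here are assumptions for the sake of the example, not the finalized 1.1 standard.

```python
# Illustrative layout of the proposed GDF 1.1 metadata. In a real file
# these dicts would be HDF5 groups and attributes; names are assumed.
gdf_1_1 = {
    # Change 2: a new top-level group giving length/mass/time units,
    # stored as (value, unit string) pairs (values are made up here).
    "dataset_units": {
        "length_unit": (3.0857e24, "cm"),
        "mass_unit":   (1.989e33, "g"),
        "time_unit":   (3.1557e16, "s"),
    },
    # Change 1: per-field unit strings replace the old "field_to_cgs"
    # conversion factor.
    "field_types": {
        "density":    {"field_units": "g/cm**3"},
        "velocity_x": {"field_units": "cm/s"},
    },
}

# When the file is opened in yt, a reader would build the dataset's
# unit registry from these entries instead of applying field_to_cgs:
length_value, length_unit = gdf_1_1["dataset_units"]["length_unit"]
print(length_value, length_unit)  # 3.0857e+24 cm
```

Storing unit strings per field means a file can legitimately hold a mix of code-unit and cgs fields, since each dataset carries its own units.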

I'm in favor of both of these things, along with MikeZ's suggestion
about refine_by.  I think we should call this a 1.1 standard, and then
we can evaluate a few other big holes with a 2.0 in the future.

We should also probably allow units for fields like the B-field, velocity,
etc., as these have had odd, non-conforming properties in the past.

Since (for now) we do automatic conversion to cgs, that part is still
left unimplemented, but otherwise I think this is all there is to do
for now.

I'm writing this email in case anyone has any suggestions or objections
to the format changes--particularly Jeff or Kacper. We'll obviously
need to document them if they are accepted.



John ZuHone

Postdoctoral Researcher
NASA/Goddard Space Flight Center
yt-dev mailing list