On Wed, Jan 16, 2013 at 5:01 PM, Marcel Haas firstname.lastname@example.org wrote:
Hi Matt et al.,
I removed most of the convo, as it got a bit long. I hope it's still clear what the different parts are about :)
I agree, it definitely would. Do you think you'd be at all interested in trying this out from the binary reader I wrote into yt? It might be possible to set up a flow of data into an OWLS-style format.
Yes, I think that would be worth my time for a bit. One issue is that not even close to all of the metadata for those simulations is stored in the binary output (that is at least true for the Oppenheimer & Dave simulations; I am not sure about others). Also, I bet different groups have stored different numbers of arrays and may have dealt with that in a non-standardized way.
I had heard that incomplete sets of metadata were stored, but I had not realized that not even enough information to continue the simulation was kept.
Maybe the main pro of having one routine that converts data into a useful format is for others to copy and adapt to their own output standards… Writing an HDF5 file with all metadata included also makes it super-easy to check whether all has gone right in the data conversion.
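A quick sketch of what that conversion-plus-check could look like (all group, field, and attribute names here are made up for illustration, not any existing standard): write the particle arrays and the metadata into one HDF5 file, so a single read-back of the attributes verifies the conversion went right.

```python
# Hypothetical converter sketch: particle arrays + metadata -> one HDF5
# file.  Storing the metadata as attributes next to the data makes the
# file self-describing, so checking the conversion is just reading it back.
import h5py
import numpy as np

def write_snapshot(filename, positions, masses, metadata):
    """Write particle arrays plus simulation metadata to one HDF5 file."""
    with h5py.File(filename, "w") as f:
        grp = f.create_group("PartType0")
        grp.create_dataset("Coordinates", data=positions)
        grp.create_dataset("Masses", data=masses)
        hdr = f.create_group("Header")
        for key, val in metadata.items():
            hdr.attrs[key] = val  # metadata lives right next to the data

pos = np.random.random((100, 3))
masses = np.ones(100)
write_snapshot("snap.hdf5", pos, masses, {"BoxSize": 1.0, "Redshift": 0.0})
```

Other groups could then adapt just the array names and the metadata dictionary to their own output standard.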
Yes, totally. (A few people here have been working on a data format, too ...)
Do you, or anyone in this community, have access to AREPO? It seems that code is still fairly non-public. I am sure some of us (possibly me included) could get access for some specific science case if you ask Volker or Lars. I in fact don't quite know what they store. They could store particle-like data, grid-like data, or both. If they store particle-like data, I'm sure they output all the hydro quantities necessary to estimate the density everywhere in space, also where there is no particle (which in a Voronoi tessellation of space would just be the density of the nearest Voronoi cell center, i.e. particle, right?). So the estimate always requires knowledge of zero neighbors for the particles themselves, or one nearest neighbor for points in space other than particle positions.
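To make the zero/one-neighbor point concrete, here is a minimal sketch (function and variable names are my own, not yt's): in a Voronoi tessellation the density at any point is just the density of the nearest cell generator, which a k-d tree gives you directly.

```python
# Sketch of the nearest-neighbor density estimate described above:
# density at an arbitrary point = density of the nearest Voronoi cell
# center, i.e. the nearest particle.  Names are illustrative only.
import numpy as np
from scipy.spatial import cKDTree

def voronoi_density(query_points, particle_pos, particle_density):
    """Density at arbitrary points = density of the nearest particle."""
    tree = cKDTree(particle_pos)
    _, nearest = tree.query(query_points)  # index of nearest generator
    return particle_density[nearest]

pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
rho = np.array([10.0, 2.0])
print(voronoi_density(np.array([[0.1, 0.0, 0.0]]), pos, rho))  # -> [10.]
```

At a particle's own position this reduces to the zero-neighbor case: the nearest generator is the particle itself.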
Let me back up and explain why I mentioned AREPO. I don't have access to it; I know that at least one member of the list does, but I don't. My understanding was similar to Nathan's -- that it's in the same format as Gadget, and mostly Gadget analysis tools were used. Since we already have tessellation support in yt via voro++, supporting tessellation-based density estimates should be possible. Anyway, the takeaway is that we should keep flexibility in supporting different density estimators or methods of calculating primitive quantities. So once we can support the Gadget format, this would enable someone interested in using yt for AREPO data to do so. But yes, I don't think we can do anything other than keep flexibility in mind for supporting things like that.
Yay! Unfortunately, the week of the yt developers thing in SC, I am already occupied elsewhere… I do have funds myself nowadays, so any future meeting like that (or smaller/shorter…) should be no problem!
Ah, congrats on the funding, bummer on not being able to come. We should consider a mini-sprint at some other time during the year.
I think, when a grid cell somewhere wants to know the total mass contained in it, or something like that, the grids should be the same, right? So creating the octree for the hydro and collisionless components seems to make some sense to me. I am not sure if that wouldn't give issues in regions where the density of one type of particles is much larger than that of another type (again, oct tree n00b here)…
This is a good point. I think we should probably do it in a similar way, or else we end up essentially having to merge contributions between mesh structures.
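Here's roughly what I mean by not merging mesh structures, as a toy sketch (a flat uniform grid stands in for the octree, and all names are made up): deposit every particle species onto the *same* cell structure, so a cell can ask for its total mass without reconciling two different meshes.

```python
# Illustrative sketch: deposit gas and dark-matter particles onto one
# shared cell structure, so "total mass in this cell" is a single sum
# rather than a merge of two independently built meshes.
import numpy as np

def deposit_total_mass(cells_per_side, boxsize, species):
    """Sum the mass of every particle species into one shared grid."""
    edges = np.linspace(0.0, boxsize, cells_per_side + 1)
    total = np.zeros((cells_per_side,) * 3)
    for pos, mass in species:
        grid, _ = np.histogramdd(pos, bins=(edges, edges, edges),
                                 weights=mass)
        total += grid  # both components land in identical cells
    return total

gas = (np.random.random((500, 3)), np.full(500, 0.2))
dm = (np.random.random((500, 3)), np.full(500, 1.0))
m = deposit_total_mass(4, 1.0, [gas, dm])  # m.sum() is ~600 (100 + 500)
```

The refinement-imbalance worry still applies, of course: a real octree built this way would refine wherever *either* species is dense.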
I've gotten much of the way through Tipsy file format support, although the row- vs. column-ordered storage of the data caused some tricky bits. Hopefully I will start next week on the kernel evaluations.
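For anyone following along, the row-vs-column issue looks roughly like this (the record layout below is illustrative, not the exact Tipsy gas record): Tipsy-style binaries store one full record per particle, so pulling out a single field means striding through records, which numpy structured dtypes handle directly.

```python
# Sketch of reading row-ordered (one-record-per-particle) binary data
# with a numpy structured dtype.  The field layout here is made up for
# illustration; real Tipsy records differ per particle type.
import numpy as np

# One record per particle: mass, position, velocity (row-major on disk,
# big-endian floats as in Tipsy's standard format).
record = np.dtype([("mass", ">f4"), ("pos", ">f4", 3), ("vel", ">f4", 3)])

# Write a fake row-ordered snapshot...
particles = np.zeros(4, dtype=record)
particles["mass"] = [1.0, 2.0, 3.0, 4.0]
particles.tofile("fake_tipsy.bin")

# ...and pull one "column" out of the interleaved rows in a single call.
data = np.fromfile("fake_tipsy.bin", dtype=record)
print(data["mass"])  # -> [1. 2. 3. 4.]
```

Column-ordered formats are the opposite: each field is a contiguous block, so a single field is one sequential read but whole-record access strides instead.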