Re: [yt-users] can yt load file non-recursively?
Hi Yuxiang,

Sorry for the late reply; I dropped the ball on this. Thank you very much for making the sample dataset available. I have downloaded it and will use it when I start working on adding support for loading halos. I am pretty busy with other things at the moment, but I will try to get to this as soon as I can.

As to your second question, I believe yt creates the ("gas", "density") field by depositing the PartType0 particles onto a mesh using a cloud-in-cell deposit. The mesh itself is an octree.

Britton

On Thu, Jan 29, 2015 at 1:34 AM, Yuxiang Qin <yuxiangq@student.unimelb.edu.au> wrote:
Hi Britton,
I asked the data owner, and he said it's okay to provide the data for you. Here it is: https://www.dropbox.com/s/jc2gou0frajk9dw/NOSN_NOZCOOL_L010N0128_subhalos_103.tar?dl=0
It's an OWLS-format dataset (just like the sample data, owls_fof_halos, from the yt website) with a 10 Mpc box size and 128^3 particles.
Also, it would be great if I could get involved in the development, although I don't think I'm able to at the moment. But I will try :)
One more question: how does yt construct its mesh fields, e.g. the gas field data, from the original PartType0 data? I think the former is a mesh field while the latter is a particle field, right? But in the snapshot data, e.g. the sample data snapshot_033 from the yt website, there are only PartType0, 1, 4 (gas, dm, star) data. So how does yt generate the gas fields, e.g. ('gas', 'density')?
Cheers, Yuxiang
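As an aside on Britton's answer above: the cloud-in-cell deposit can be illustrated with a minimal, self-contained 1D sketch in plain numpy. This is not yt's actual octree implementation (yt deposits in 3D onto octree cells); the function name and values below are made up for illustration.

```python
import numpy as np

def cic_deposit_1d(positions, masses, n_cells, box_size):
    """Deposit particle masses onto a 1D periodic grid with cloud-in-cell weighting.

    Each particle's mass is shared between the two nearest cell centers,
    weighted linearly by distance.
    """
    dx = box_size / n_cells
    field = np.zeros(n_cells)
    # Position in "cell units", shifted so cell centers sit at integer coordinates.
    x = positions / dx - 0.5
    left = np.floor(x).astype(int)
    frac = x - left
    for l, f, m in zip(left, frac, masses):
        field[l % n_cells] += m * (1.0 - f)        # left cell gets (1 - frac)
        field[(l + 1) % n_cells] += m * f          # right cell gets frac
    return field / dx  # convert deposited mass per cell into a density

# One unit-mass particle sitting exactly at the center of cell 0
# (box_size = 1, two cells, so dx = 0.5): all its mass lands in cell 0.
rho = cic_deposit_1d(np.array([0.25]), np.array([1.0]), n_cells=2, box_size=1.0)
```

Mass is conserved by construction: summing `rho * dx` over cells recovers the total particle mass.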
On 28 Jan 2015, at 01:42, Britton Smith wrote:

Hi Yuxiang,
The issue is that the PartTypeN fields are not simply fields of one value per halo, but are values for the individual member particles of each halo. Loading these in is not supported yet, but it could be done.
Currently, there are two obstacles to implementing this. Firstly, I'm not entirely sure how to do it. There isn't a framework within yt for fields corresponding to the particles of a halo. I can see a couple ways this might be implemented, and I would greatly appreciate feedback from other devs on these:
1. Allow PartTypeN particles to be their own particle type and attempt to load them in similarly to the rest of the particles. This might be tricky because they live within the FOF and SUBFIND groups. Perhaps the particle type could be FOF/PartType0 and SUBFIND/PartType0, etc. Then, use the information given in the file to create a halo membership field which returns the ID of the halo each particle belongs to.
2. Create some sort of load_particles function that accepts a halo id and just grabs all particles for that halo and returns a data container. I'm not sure where this would hang off of, but maybe the OWLS_Subfind dataset object itself.
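To make option 2 concrete, here is a hypothetical sketch of slicing out one halo's member particles using the Length and Offset datasets that appear in the subhalo files (the dataset names come from the h5ls listing Yuxiang posted; the helper function and the values are made up):

```python
import numpy as np

# Arrays as they would be read from FOF/PartType0 in the subhalo file
# (names from the file layout; the values here are invented for the demo).
density = np.arange(10.0)       # one value per member particle, all halos concatenated
length = np.array([4, 6])       # number of member particles in each halo
offset = np.array([0, 4])       # starting index of each halo's members

def halo_members(field, halo_id):
    """Hypothetical helper: return one halo's member-particle values,
    assuming members are stored contiguously and indexed by Offset/Length."""
    start = offset[halo_id]
    return field[start:start + length[halo_id]]

dens_halo1 = halo_members(density, 1)  # the 6 particles belonging to halo 1
```

A real implementation would read these arrays with h5py and return a yt data container, but the Offset/Length slicing is the core of the idea.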
The second obstacle is that, at this time, we don't have any sample data containing these types of particles. Yuxiang, if you are able to provide us with a sample dataset that contains all of the complexity of the one you're working with, but at a much smaller size (no more than a few GB), that would be a great help. Additionally, if you're interested in spearheading this development yourself, we would gladly provide assistance and feedback.
Other devs, what sounds like the best way to implement this feature?
Finally, in the future, it would be better to start a new thread for a different question. I completely missed this question because I thought it was part of a discussion that others had well in hand.
Britton
On Mon, Jan 26, 2015 at 11:33 PM, Yuxiang Qin <yuxiangq@student.unimelb.edu.au> wrote:
Hi guys,
Can I continue my question? It's still about loading gadget/OWLS data...
When I try to load subhalo files, yt cannot load the FOF/PartTypeN data. Neither ds.field_list nor ds.derived_field_list shows those PartTypeN fields; they only contain 'SF', 'NSF', and 'Star' fields. Is there any way I can load those PartTypeN fields?
[yqin@gstar001 subhalos_103]$ h5ls subhalo_103.0.hdf5
Constants                Group
FOF                      Group
Header                   Group
Parameters               Group
SUBFIND                  Group
Units                    Group
[yqin@gstar001 subhalos_103]$ h5ls subhalo_103.0.hdf5/FOF
CenterOfMass             Dataset {1920}
CenterOfMassVelocity     Dataset {1920}
Mass                     Dataset {640}
MassType                 Dataset {3840}
NSF                      Group
PartType0                Group
PartType1                Group
PartType4                Group
SF                       Group
StarFormationRate        Dataset {640}
Stars                    Group
[yqin@gstar001 subhalos_103]$ h5ls subhalo_103.0.hdf5/FOF/PartType0
AExpMaximumTemperature   Dataset {15607}
Coordinates              Dataset {46821}
Density                  Dataset {15607}
ElementAbundance         Group
IronFromSNIa             Dataset {15607}
Length                   Dataset {530}
Mass                     Dataset {15607}
MaximumTemperature       Dataset {15607}
Metallicity              Dataset {15607}
MetallicityWeightedRedshift Dataset {15607}
Offset                   Dataset {530}
OnEquationOfState        Dataset {15607}
ParticleIDs              Dataset {15607}
SUB_Length               Dataset {410}
SUB_Offset               Dataset {410}
SmoothedMetallicity      Dataset {15607}
SmoothingLength          Dataset {15607}
StarFormationRate        Dataset {15607}
Temperature              Dataset {15607}
Velocity                 Dataset {46821}
WindFlag                 Dataset {15607}
Cheers, Yuxiang
On 24 Jan 2015, at 22:38, Yuxiang Qin wrote:

I figured it out. Thanks for fixing this bug :)
Yuxiang
On 24 Jan 2015, at 17:39, Nathan Goldbaum wrote:

I opened a PR for this:
https://bitbucket.org/yt_analysis/yt/pull-request/1424/use-the-correct-filen...

On Fri, Jan 23, 2015 at 10:35 PM, Gabriel Altay wrote:

We could definitely do that. When I first worked on the OWLS frontend, I considered only the actual OWLS snapshots, which all have at least 8 files in their output. I think it would be a minor change, but glancing through the code base, I see that someone has gone to the trouble of organizing the frontend folder. That is awesome, but I can't immediately see where the filename construction is happening.
On Fri, Jan 23, 2015 at 9:47 PM, Yuxiang Qin <yuxiangq@student.unimelb.edu.au> wrote:
Hi Nathan,
That’s exactly what I would suggest :)
Cheers, Yuxiang
On 24 Jan 2015, at 14:44, Nathan Goldbaum wrote:

Perhaps we should adjust the OWLS frontend to allow for datasets that fit in a single file? I believe when Gabe originally wrote the frontend, he assumed the dataset would be split into multiple files.

On Fri, Jan 23, 2015 at 7:43 PM, Yuxiang Qin <yuxiangq@student.unimelb.edu.au> wrote:
Hi Matthew,
Thanks! Your method works. Actually, I think it's because this simulation doesn't have many particles (the box size is 10 Mpc and the particle number is 128^3), so it didn't need many processes to run, and the number of output files is just one.
Cheers, Yuxiang
On 24 Jan 2015, at 14:28, Matthew Turk wrote:

Hi Yuxiang,
For vanilla Gadget, yt chooses the filename template based on the value of NumFiles; it doesn't add a zero if NumFiles == 1. Looking over the code, it looks like this behavior is not the same if it thinks it's an EAGLE or OWLS simulation.
I believe that if you do this:
ds = yt.load(...)
ds.filename_template = ds.parameter_filename
it will force the behavior you're looking for. But I don't know why we don't make the same assumption about the file naming convention for OWLS and EAGLE.
-Matt
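The NumFiles-based template selection Matt describes might look roughly like this. This is a hypothetical sketch, not yt's actual code; the function name is made up, and the `%(num)s` placeholder is just one way to write a per-file template:

```python
def filename_template(basename, num_files):
    """Sketch of the behavior described for vanilla Gadget: single-file
    outputs get no ".N" suffix, while multi-file outputs get a numbered one."""
    if num_files == 1:
        return basename + ".hdf5"
    return basename + ".%(num)s.hdf5"

# A single-file snapshot stays snap_103.hdf5; a 32-file snapshot expands to
# snap_103.0.hdf5 ... snap_103.31.hdf5 when the placeholder is filled in.
single = filename_template("snap_103", 1)
multi = filename_template("snap_103", 32) % {"num": 0}
```

The reported bug is effectively that the OWLS/EAGLE path skips the `num_files == 1` branch and always emits the numbered template.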
On Fri, Jan 23, 2015 at 8:38 PM, Yuxiang Qin <yuxiangq@student.unimelb.edu.au> wrote:
Hi Guys,
I understand those warnings are not a big problem, and I've decided to just ignore them for now.
I still have the problem loading only one file.
Here is an example. In the directory snapshot_103, I have just one file, snap_103.hdf5:
[yqin@gstar001 snapshot_103]$ ls
snap_103.hdf5
Then when I try to use this file, yt gives me this:
In [2]: fname='../../Smaug/REF_L010N0128/data/snapshot_103/snap_103.hdf5'
In [3]: ds=yt.load(fname)
yt : [INFO ] 2015-01-24 13:19:41,119 Parameters: current_time = 3.79536737148e+16 s
yt : [INFO ] 2015-01-24 13:19:41,119 Parameters: domain_dimensions = [2 2 2]
yt : [INFO ] 2015-01-24 13:19:41,119 Parameters: domain_left_edge = [ 0.  0.  0.]
yt : [INFO ] 2015-01-24 13:19:41,120 Parameters: domain_right_edge = [ 10.  10.  10.]
yt : [INFO ] 2015-01-24 13:19:41,120 Parameters: cosmological_simulation = 1
yt : [INFO ] 2015-01-24 13:19:41,120 Parameters: current_redshift = 5.0000024
yt : [INFO ] 2015-01-24 13:19:41,121 Parameters: omega_lambda = 0.725
yt : [INFO ] 2015-01-24 13:19:41,121 Parameters: omega_matter = 0.275
yt : [INFO ] 2015-01-24 13:19:41,121 Parameters: hubble_constant = 0.702
In [4]: ds.field_list
ERROR OPENING /lustre/projects/p071_swin/Smaug/REF_L010N0128/data/snapshot_103/snap_103.0.hdf5
FILENAME DOES NOT EXIST
---------------------------------------------------------------------------
IOError Traceback (most recent call last) <ipython-input-4-c4d73f2b23ef> in <module>()
----> 1 ds.field_list
Loading the file is fine, but using the information hits a bug here. I think the problem is that yt automatically adds a '.0' to the filename and tries to load everything from snap_103.0.hdf5 up to snap_103.MAX_NUMBER.hdf5.
And Gabriel, when I said making a symbolic link, I meant this: I create a link named snap_103.0.hdf5 pointing to snap_103.hdf5, so that when yt automatically adds a '.0' to the filename, it finds the real file through the link. It's just a trick; I could also rename snap_103.hdf5 to snap_103.0.hdf5.
[yqin@gstar001 snapshot_103]$ ln -s snap_103.hdf5 snap_103.0.hdf5
[yqin@gstar001 snapshot_103]$ ls
snap_103.0.hdf5  snap_103.hdf5
Then when I do this in yt:
In [5]: ds.field_list
yt : [INFO ] 2015-01-24 13:30:47,423 Allocating for 4.194e+06 particles
yt : [INFO ] 2015-01-24 13:30:48,233 Identified 3.000e+05 octs
yt : [WARNING ] 2015-01-24 13:30:48,366 Field ('deposit', 'PartType4_smoothed_O_fraction') already exists. To override use force_override=True.
yt : [WARNING ] 2015-01-24 13:30:48,367 Field ('deposit', 'PartType4_smoothed_Mg_fraction') already exists. To override use force_override=True.
yt : [WARNING ] 2015-01-24 13:30:48,367 Field ('deposit', 'PartType4_smoothed_C_fraction') already exists. To override use force_override=True.
yt : [WARNING ] 2015-01-24 13:30:48,962 Field ('gas', 'Fe_mass') already exists. To override use force_override=True.
BLA…BLA…BLA… A LOT OF WARNINGS
yt : [INFO ] 2015-01-24 13:30:55,017 Loading field plugins.
yt : [INFO ] 2015-01-24 13:30:55,019 Loaded angular_momentum (8 new fields)
yt : [INFO ] 2015-01-24 13:30:55,019 Loaded astro (15 new fields)
yt : [INFO ] 2015-01-24 13:30:55,020 Loaded cosmology (22 new fields)
yt : [WARNING ] 2015-01-24 13:30:55,021 Field ('gas', 'metallicity') already exists. To override use force_override=True.
yt : [INFO ] 2015-01-24 13:30:55,021 Loaded fluid (63 new fields)
yt : [INFO ] 2015-01-24 13:30:55,023 Loaded fluid_vector (95 new fields)
yt : [INFO ] 2015-01-24 13:30:55,024 Loaded geometric (111 new fields)
yt : [INFO ] 2015-01-24 13:30:55,024 Loaded local (111 new fields)
yt : [INFO ] 2015-01-24 13:30:55,025 Loaded magnetic_field (119 new fields)
yt : [INFO ] 2015-01-24 13:30:55,026 Loaded my_plugins (119 new fields)
yt : [INFO ] 2015-01-24 13:30:55,027 Loaded species (121 new fields)
Out[5]:
[('PartType0', 'Coordinates'),
('PartType0', 'Density'),
('PartType0', 'Carbon'),
('PartType0', 'Helium'),
('PartType0', 'Hydrogen'),
('PartType0', 'Iron'),
('PartType0', 'Magnesium'),
('PartType0', 'Neon'),
('PartType0', 'Nitrogen'),
('PartType0', 'Oxygen'),
yt still reports those warnings, but it actually succeeds in reading the file.

So my question is: can yt load a file non-recursively? For example, could I say ds = yt.load(fname, recursively=False), so that I don't need to create the symbolic links every time?
Again, thanks for helping me, guys :)
Cheers, Yuxiang
On 24 Jan 2015, at 04:40, Gabriel Altay wrote:

Yuxiang, are you still trying to read a single file? Are you making symlinks between files? Can you explain in more detail?
On Thu, Jan 22, 2015 at 9:16 PM, Yuxiang Qin
wrote:

> Hi Desika,
>
> I tried to reinstall yt with the script, but it failed, so I reinstalled it from source (hg clone). It still gives me those warnings…
>
> Yuxiang
>
> On 23 Jan 2015, at 02:56, Desika Narayanan
> wrote:
>
> Hi Yuxiang,
>
> Are you running anaconda python?
>
> One thing I've noticed this morning for the first time is that when I run yt installed via anaconda on gadget datasets (albeit not owls), I get similar warnings, i.e.:
>
> yt : [WARNING ] 2015-01-22 10:44:54,090 Field ('deposit', 'PartType0_smoothed_Ne_fraction') already exists. To override use force_override=True.
>
> whereas I didn't receive them when running yt installed via the install script (and using the python that ships with yt). Not a solution, just another symptom.
>
> -d
>
> On Wed, Jan 21, 2015 at 7:06 PM, Yuxiang Qin <yuxiangq@student.unimelb.edu.au> wrote:
>
>> Hi Andrew,
>>
>> I had a look at those files and I think I get what you mean. By the way, about my second question ("cannot load in PartTypeN fields"), I think it's probably because there is no definition of those fields in yt's source code.
>>
>> Cheers,
>> Yuxiang
>>
>> On 22 Jan 2015, at 10:42, Andrew Myers wrote:
>>
>> Hi Yuxiang,
>>
>> There is a standard list of fields that yt automatically defines whenever you load an OWLS dataset (you can see the list in yt/frontends/owls/fields.py). You don't need to explicitly define any new fields yourself for this to happen; it all takes place under the hood. I think the code that does this is adding all the fields twice for some reason. This is something that needs to be fixed in the OWLS frontend in yt's source code, not anything you are doing in your scripts.
>>
>> -Andrew
>>
>> On Wed, Jan 21, 2015 at 3:30 PM, Yuxiang Qin <yuxiangq@student.unimelb.edu.au> wrote:
>>
>>> Hi,
>>>
>>> Thanks for your help.
>>>
>>> Nathan, I am using OWLS, a Gadget-format dataset.
>>>
>>> Andrew, what do you mean by "add a field that has already been defined"? I am not defining a new field. Does it mean there is already a new field in my data? If so, why does yt say it's a new field if it's already there?
>>> I am kind of confused about it.
>>>
>>> Cheers,
>>> Yuxiang
>>>
>>> On 22 Jan 2015, at 09:43, Nathan Goldbaum wrote:
>>>
>>> Hi Yuxiang,
>>>
>>> What simulation code produced these files? Judging by the naming, I'd have to guess some flavor of gadget, but it's not clear from your e-mail.
>>>
>>> -Nathan
>>>
>>> On Tue, Jan 20, 2015 at 4:29 PM, Yuxiang Qin <yuxiangq@student.unimelb.edu.au> wrote:
>>>
>>>> Dear yt-users,
>>>>
>>>> I have a tiny question about loading files.
>>>>
>>>> I understand that when yt loads files, it loads all the files in the directory. For example, if I do ds = yt.load('snap_103.0.hdf5') and then ds.field_list, yt opens all the files:
>>>>
>>>> snap_103.0.hdf5*   snap_103.1.hdf5*   snap_103.2.hdf5*   snap_103.3.hdf5*
>>>> snap_103.4.hdf5*   snap_103.5.hdf5*   snap_103.6.hdf5*   snap_103.7.hdf5*
>>>> snap_103.8.hdf5*   snap_103.9.hdf5*   snap_103.10.hdf5*  snap_103.11.hdf5*
>>>> snap_103.12.hdf5*  snap_103.13.hdf5*  snap_103.14.hdf5*  snap_103.15.hdf5*
>>>> snap_103.16.hdf5*  snap_103.17.hdf5*  snap_103.18.hdf5*  snap_103.19.hdf5*
>>>> snap_103.20.hdf5*  snap_103.21.hdf5*  snap_103.22.hdf5*  snap_103.23.hdf5*
>>>> snap_103.24.hdf5*  snap_103.25.hdf5*  snap_103.26.hdf5*  snap_103.27.hdf5*
>>>> snap_103.28.hdf5*  snap_103.29.hdf5*  snap_103.30.hdf5*  snap_103.31.hdf5*
>>>>
>>>> However, if our data has only one file, snap_103.hdf5, then if I do ds = yt.load('snap_103.hdf5') and then ds.field_list, yt still tries to open all the files starting from snap_103.0.hdf5; since there is no snap_103.0.hdf5, it reports an error:
>>>>
>>>> IOError: Unable to open file (Unable to open file: name = '/lustre/projects/p071_swin/yqin/smaug/ref_eff_l010n0128/data/snapshot_102/snap_102.0.hdf5', errno = 2, error message = 'no such file or directory', flags = 0, o_flags = 0)
>>>>
>>>> Is there an argument saying I don't want to load recursively?
>>>> I can make a link named snap_103.0.hdf5 pointing to snap_103.hdf5, but it's kind of inconvenient.
>>>>
>>>> Cheers,
>>>> --
>>>> Yuxiang Qin
>>>> PhD Student
>>>> School of Physics
>>>> The University of Melbourne
>>>> VIC, Australia, 3010

_______________________________________________
yt-users mailing list
yt-users@lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org