Hi all, I'm hitting a bug when I try to use the 'fig_size' argument with add_projection: http://paste.yt-project.org/show/1998/ Could this be related to the recent fields refactoring? I'm using the current tip, rev c21592d52977. Thanks! -- Stephen Skory s@skory.us http://stephenskory.com/ 510.621.3687 (google voice)
Hi again, as always, I should have tried removing the .yt file. And now it works. Sorry for the noise. -- Stephen Skory s@skory.us http://stephenskory.com/ 510.621.3687 (google voice)
Hi, I'm using the following script to analyze my data:

from yt.config import ytcfg; ytcfg["yt","serialize"] = "False"
from yt.mods import *
from yt.analysis_modules.star_analysis.api import *

pf = load("DD0088/g1e10s1e11dm1e12_0088")
my_disk = pf.h.disk([0.5, 0.5, 0.5], [0, 0, 1], 50/pf["kpc"], 10/pf["kpc"])
sfr = StarFormationRate(pf, data_source=my_disk)

and I'm getting this error: http://paste.yt-project.org/show/2004/

However, yt is actually reading star particles: if I ask for the number of particles or their masses there's no problem (http://paste.yt-project.org/show/2005/), so I suppose the problem only appears when reading "creation_time"... or maybe the file is corrupted? I'm using yt-2.2, but with yt-2.3 a similar error occurs. If necessary I can upload the data, but it's ~250 MB. Regards, Fernando.
Hello Fernando,
I'm using the following script to analyze my data:
from yt.config import ytcfg; ytcfg["yt","serialize"] = "False"
from yt.mods import *
from yt.analysis_modules.star_analysis.api import *

pf = load("DD0088/g1e10s1e11dm1e12_0088")
my_disk = pf.h.disk([0.5, 0.5, 0.5], [0, 0, 1], 50/pf["kpc"], 10/pf["kpc"])
sfr = StarFormationRate(pf, data_source=my_disk)
If you want to select stars by particle_type, the easiest way is to hand-feed the stars you want considered to the analysis machinery:

from yt.config import ytcfg; ytcfg["yt","serialize"] = "False"
from yt.mods import *
from yt.analysis_modules.star_analysis.api import *

pf = load("DD0088/g1e10s1e11dm1e12_0088")
my_disk = pf.h.disk([0.5, 0.5, 0.5], [0, 0, 1], 50/pf["kpc"], 10/pf["kpc"])
sel = (my_disk["particle_type"] == 2)
st_ct = my_disk["creation_time"][sel]
st_mass = my_disk["ParticleMassMsun"][sel]
sfr = StarFormationRate(pf, star_mass=st_mass, star_creation_time=st_ct)

However! The error message you sent shows that the "creation_time" field is problematic in your dataset, so even the example above won't work. Clearly, the creation time of the stars is needed if you want to graph the star formation rate as a function of time.

This looks like an Enzo dataset; can you run this command and tell me what the output is?

h5ls DD0088/g1e10s1e11dm1e12_0088.cpu0000/Grid00000001

-- Stephen Skory s@skory.us http://stephenskory.com/ 510.621.3687 (google voice)
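[Editor's note: for readers unfamiliar with what StarFormationRate computes from star_mass and star_creation_time, here is a minimal standalone sketch of the underlying idea — bin stellar mass by formation time and divide by the bin width. This is not yt's implementation, just an illustration in plain NumPy with made-up toy data.]

```python
import numpy as np

def sfr_from_stars(star_mass_msun, creation_time_yr, n_bins=50):
    """Bin stellar mass by formation time; dividing the mass formed
    in each time bin by the bin width gives an SFR in Msun/yr."""
    edges = np.linspace(creation_time_yr.min(), creation_time_yr.max(),
                        n_bins + 1)
    mass_per_bin, _ = np.histogram(creation_time_yr, bins=edges,
                                   weights=star_mass_msun)
    dt = np.diff(edges)                     # bin widths in years
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, mass_per_bin / dt       # (time, SFR in Msun/yr)

# Toy data: 1000 stars of 1e4 Msun each, formed uniformly over ~1 Gyr,
# so the mean SFR should come out near 0.01 Msun/yr.
rng = np.random.default_rng(0)
times = rng.uniform(0.0, 1.0e9, 1000)
masses = np.full(1000, 1.0e4)
t, sfr = sfr_from_stars(masses, times)
```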
Hi Stephen,
from yt.config import ytcfg; ytcfg["yt","serialize"] = "False"
from yt.mods import *
from yt.analysis_modules.star_analysis.api import *

pf = load("DD0088/g1e10s1e11dm1e12_0088")
my_disk = pf.h.disk([0.5, 0.5, 0.5], [0, 0, 1], 50/pf["kpc"], 10/pf["kpc"])
sel = (my_disk["particle_type"] == 2)
st_ct = my_disk["creation_time"][sel]
st_mass = my_disk["ParticleMassMsun"][sel]
sfr = StarFormationRate(pf, star_mass=st_mass, star_creation_time=st_ct)
However! The error message you sent shows that the "creation_time" field is problematic in your dataset, so even the example above won't work.
I also tried selecting particles by particle_type, as you suggest, and it didn't work.
This looks like an enzo dataset, can you try this command and tell me what the output is?
h5ls DD0088/g1e10s1e11dm1e12_0088.cpu0000/Grid00000001
Sorry, I forgot to mention it, but you're right, this is an Enzo dataset. Here is the output:

$ h5ls DD0088/g1e10s1e11dm1e12_0088.cpu0000/Grid00000001
Density                  Dataset {128, 128, 128}
Metal_Density            Dataset {128, 128, 128}
Temperature              Dataset {128, 128, 128}
TotalEnergy              Dataset {128, 128, 128}
x-velocity               Dataset {128, 128, 128}
y-velocity               Dataset {128, 128, 128}
z-velocity               Dataset {128, 128, 128}

Thanks! Fernando
Hi Fernando,
$ h5ls DD0088/g1e10s1e11dm1e12_0088.cpu0000/Grid00000001
Density                  Dataset {128, 128, 128}
Metal_Density            Dataset {128, 128, 128}
Temperature              Dataset {128, 128, 128}
TotalEnergy              Dataset {128, 128, 128}
x-velocity               Dataset {128, 128, 128}
y-velocity               Dataset {128, 128, 128}
z-velocity               Dataset {128, 128, 128}
It looks like your dataset doesn't have particles. If you had particles, your output would look more like this:

Dark_Matter_Density      Dataset {64, 64, 128}
Density                  Dataset {64, 64, 128}
GasEnergy                Dataset {64, 64, 128}
Temperature              Dataset {64, 64, 128}
TotalEnergy              Dataset {64, 64, 128}
creation_time            Dataset {324823}
dynamical_time           Dataset {324823}
metallicity_fraction     Dataset {324823}
particle_index           Dataset {324823}
particle_mass            Dataset {324823}
particle_position_x      Dataset {324823}
particle_position_y      Dataset {324823}
particle_position_z      Dataset {324823}
particle_type            Dataset {324823}
particle_velocity_x      Dataset {324823}
particle_velocity_y      Dataset {324823}
particle_velocity_z      Dataset {324823}
x-velocity               Dataset {64, 64, 128}
y-velocity               Dataset {64, 64, 128}
z-velocity               Dataset {64, 64, 128}

If you intended to have particles in this simulation and you cannot figure out what went wrong, perhaps you should email the enzo-users list and ask there. Good luck!

-- Stephen Skory s@skory.us http://stephenskory.com/ 510.621.3687 (google voice)
Hi Fernando, On Tue, Jan 3, 2012 at 12:35 PM, Stephen Skory <s@skory.us> wrote:
It looks like your dataset doesn't have particles. If you had particles, your output would look more like this:
Oops, I forgot your earlier paste #2005. I'll think more about your problem, but it may not be an enzo problem. -- Stephen Skory s@skory.us http://stephenskory.com/ 510.621.3687 (google voice)
Hi Fernando, Matt Turk and I have been exchanging messages in the background. We are a bit puzzled by your problem, but we have some ideas of where to ask you to look. Could you run this script and give us the output (perhaps in a paste if it's long)? We're wondering if there are grids with particles in your disk, but without the star-specific fields.

from yt.config import ytcfg; ytcfg["yt","serialize"] = "False"
from yt.mods import *

pf = load("DD0088/g1e10s1e11dm1e12_0088")
my_disk = pf.h.disk([0.5, 0.5, 0.5], [0, 0, 1], 50/pf["kpc"], 10/pf["kpc"])
for grid in my_disk._grids:
    print grid.id, grid.NumberOfParticles
    try:
        ct = grid['creation_time']
        print ct.size
    except:
        print 'no creation_time'

Also, this command will uniquely pull out all the fields in your hdf5 files. It should be run in the same directory as the *cpu* files in DD0088. We're pretty sure that I was wrong about you missing particles, but this is a good double-check. It should include all the particle fields, including creation_time.

h5ls -r *.cpu* | cut -f4 -d\/|cut -f1 -d\ |sort -u

-- Stephen Skory s@skory.us http://stephenskory.com/ 510.621.3687 (google voice)
Hi Stephen,
Could you run this script, and give us the output (perhaps in a paste if it's long)? We're wondering if there are grids with particles in your disk, but without the star-specific fields.
http://paste.yt-project.org/show/2007/ None of the grids has creation_time, even though some of them have NumberOfParticles > 0 (e.g. grid.id = 143).
Also, this command will uniquely pull out all the fields in your hdf5 files.
$ h5ls -r *.cpu* | cut -f4 -d\/|cut -f1 -d\ |sort -u
Dark_Matter_Density
Density
Metal_Density
Temperature
TotalEnergy
creation_time
dynamical_time
metallicity_fraction
particle_index
particle_mass
particle_position_x
particle_position_y
particle_position_z
particle_type
particle_velocity_x
particle_velocity_y
particle_velocity_z
x-velocity
y-velocity
z-velocity

I also realized that with yt-2.3 instead of yt-2.2 I can get "particle_index" and "particle_mass", but if I try "particle_type" I get the same error as with "creation_time". Thanks! Fernando.
Fernando,
http://paste.yt-project.org/show/2007/ None of the grids has creation_time, even though some of them have NumberOfParticles > 0 (e.g. grid.id = 143).
There is some super weird stuff going on here, and I'd like to take a look at it. Is it possible for you to upload the dataset somewhere? You can send me the location off-list. -- Stephen Skory s@skory.us http://stephenskory.com/ 510.621.3687 (google voice)
Fernando and Matt,

Fernando sent me one of his datasets and I think I know what's going on. The dataset I'm looking at (g1e10s1e11dm1e12_0088) has 1449 grids, but only 21 of them have (star) particles in them. Normally, when yt first opens an Enzo dataset and there is no .yt file in the directory, it opens 20 semi-randomly chosen grids to see what fields exist in the dataset. The assumption is that if these 20 grids are well-spaced over all the levels, they should collectively contain all the fields in the dataset. From this, yt builds a list of fields (pf.h.field_list) and derived fields (pf.h.derived_field_list).

If I force yt to check all the grids for fields (by modifying yt/frontends/enzo/data_structures.py), the particle fields work on Fernando's dataset, and I got Fernando's SFR script at the very top to work (modulo a bug I discovered in the cylinder object's volume calculation; I just pushed a fix, and it should be in the development copy of yt soon, perhaps at the same time as a fix for this field list issue).

Matt - what do you think is the best solution for this? Add a ytcfg parameter so users with datasets like this can simply tell yt to check all grids for fields?

-- Stephen Skory s@skory.us http://stephenskory.com/ 510.621.3687 (google voice)
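[Editor's note: the sampling behavior Stephen describes can be sketched in a few lines. The function and variable names below are hypothetical, not yt's actual code; read_fields(grid) stands in for opening a grid's HDF5 file. The sketch also shows why this particular dataset fails so often: 20 grids sampled from 1449, of which only 21 carry particle fields, miss all of them with probability C(1428, 20)/C(1449, 20), roughly 0.75.]

```python
import random

def detect_fields(grids, read_fields, sample_size=20, seed=None):
    """Guess a dataset's field list by reading a small random sample
    of grids, as yt does when no .yt file is present (sketch only)."""
    rng = random.Random(seed)
    sample = rng.sample(grids, min(sample_size, len(grids)))
    fields = set()
    for grid in sample:
        fields.update(read_fields(grid))
    return fields

# Empirical check of the miss rate for this dataset's shape:
# 1449 grids, 21 of which carry particle fields like creation_time.
grids = list(range(1449))
particle_grids = set(range(21))
fields_of = lambda g: ["Density"] + (["creation_time"] if g in particle_grids else [])
misses = sum(
    "creation_time" not in detect_fields(grids, fields_of, seed=s)
    for s in range(200)
)
# misses / 200 should land near 0.75 -- failure three times out of four
```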
Thank you very much, Stephen. I'll be waiting for Matt's reply. Also, thanks for fixing the bug affecting the SFR calculation - actually, that was the next question I was going to ask once I solved the "creation_time" problem. Regards, Fernando. On Jan 3, 2012, at 8:50 PM, Stephen Skory wrote:
Hi Fernando and Stephen,

I've pushed a change that I believe should fix this: https://bitbucket.org/yt_analysis/yt/changeset/74e0a0d08c10

The issue is as Stephen noted -- to identify which fields are in a file, yt will randomly sample the grids. If during this random sample it misses the grids that have particles, it won't know about all the fields. I believe that for the current generation of Enzo formats (however, *not* for the provisional Enzo 3.0 format, which is still under development), if you have star particles on, *all* grids with star particles will have fields like creation_time and the other attributes belonging to the particular star particle type that is enabled. So all we need to do is find a single grid that has particles and make sure it participates in the random sampling. That's what the patch does.

Fernando, if you are running on an installation you manage, running "yt update" should bring you up to date with this change. You *may* have to wipe out existing .yt or .harrays files. Let me know if it does/doesn't work!

-Matt

On Tue, Jan 3, 2012 at 7:57 PM, Fernando Becerra <becerrafernando@gmail.com> wrote:
Thank you very much, Stephen. I'll be waiting for Matt's reply. Also, thanks for fixing the bug affecting the SFR calculation - actually, that was the next question I was going to ask once I solved the "creation_time" problem.
Regards, Fernando.
On Jan 3, 2012, at 8:50 PM, Stephen Skory wrote:
_______________________________________________ yt-users mailing list yt-users@lists.spacepope.org http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
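[Editor's note: the idea behind Matt's patch can be sketched roughly as follows. These names are hypothetical, not yt's actual code -- the real change is in the changeset linked above -- but the logic is the same: take the usual random sample of grids, and if no sampled grid carries particles, force one particle-bearing grid into the sample so fields like creation_time are always discovered.]

```python
import random

def sample_grids_for_fields(grids, num_particles, sample_size=20, seed=None):
    """Random sample of grids for field detection, with at least one
    particle-bearing grid guaranteed to participate (sketch only)."""
    rng = random.Random(seed)
    sample = rng.sample(grids, min(sample_size, len(grids)))
    with_particles = [g for g in grids if num_particles[g] > 0]
    if with_particles and all(num_particles[g] == 0 for g in sample):
        # Swap one sampled grid for a grid that has particles.
        sample[0] = rng.choice(with_particles)
    return sample
```

With this guarantee, even a dataset where only 21 of 1449 grids carry star particles will always expose its particle fields, regardless of how the random sample falls.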
Fernando, a brief follow-up. Everything should now be working for you for the SFR calculation if you update yt again. The fix for cylinder volume calculation (my_disk.volume('mpc')) which the SFR module uses has been made. Let us know if you run into any problems. -- Stephen Skory s@skory.us http://stephenskory.com/ 510.621.3687 (google voice)
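[Editor's note: for anyone sanity-checking the fixed my_disk.volume('mpc') value, the analytic volume of a cylinder is just pi * r^2 * h. A quick check outside yt, assuming the disk's 10 kpc argument is the full height -- whether yt treats the disk height argument as a full height or a half-height is an assumption worth verifying against the docs, and if it is a half-height the expected volume doubles.]

```python
import math

KPC_PER_MPC = 1000.0

def cylinder_volume_mpc3(radius_kpc, height_kpc):
    """Analytic cylinder volume, pi * r^2 * h, converted from
    kpc^3 to Mpc^3."""
    r = radius_kpc / KPC_PER_MPC
    h = height_kpc / KPC_PER_MPC
    return math.pi * r * r * h

# The 50 kpc radius, 10 kpc height disk from the thread:
vol = cylinder_volume_mpc3(50.0, 10.0)  # ~7.854e-5 Mpc^3
```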
Everything's working fine now! Thanks Stephen and Matt! On Jan 4, 2012, at 1:07 PM, Stephen Skory wrote:
participants (3)
- Fernando Becerra
- Matthew Turk
- Stephen Skory