Hi, folks,
I have a small problem with my halo analysis. I'm making a halo catalog with
virial quantities, but some of the halos don't produce a virial profile, and
when I load the catalog together with the virial profiles, yt crashes because
the catalog contains halos that have no profile file.
What is the easiest way to add a filter so that only halos for which the
virial profiles can be calculated are saved?
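For context, here is roughly the kind of filter I have in mind (an untested
sketch of the add_filter machinery from the docs; the filter name and the
placeholder test are mine, and the real test should fail exactly for the
halos whose virial profile cannot be computed):

from yt.analysis_modules.halo_analysis.api import HaloCatalog, add_filter

def has_virial_profile(halo):
    # Placeholder test: the real check should return False for halos
    # whose virial profile cannot be calculated.
    return halo.quantities["particle_mass"] > 0

add_filter("has_virial_profile", has_virial_profile)

# halos_ds and ds stand for the already-loaded halo and data datasets.
hc = HaloCatalog(halos_ds=halos_ds, data_ds=ds, output_dir="halo_catalog")
hc.add_filter("has_virial_profile")
hc.create()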
Thanks for your attention!
Hi everyone.
I'm fairly new to yt and I'm having some problems.
I'm trying to plot radial density profiles for some halos in a RAMSES
simulation with periodic boundary conditions.
The problem arises when dealing with a halo close to the edge.
The relevant part of the code looks like this:
import yt

ds = yt.load(datapath)
cen = halos[0] # numpy array [x,y,z] in code units
rad = haloradius[0] # float in code units
sph = ds.sphere(cen, (10.*rad, 'Mpc/h'))
plot = yt.ProfilePlot(sph, "radius", 'particle_mass')
This returns
IndexError: index 3665 is out of bounds for axis 1 with size 3665
If I understand this correctly, the problem is in the sphere object.
It works fine if I keep the sphere inside the edge of the simulation.
Is there a way to deal with this problem?
PS: I know particle_mass is not the density of the dark matter, but I
haven't found a field for that, so this will have to do until I find one or
learn how to make one myself.
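An untested sketch of the kind of thing I mean, in case someone can confirm:
yt seems to have generic "deposit" fields (particle mass deposited onto the
grid and divided by cell volume), so maybe something like

plot = yt.ProfilePlot(sph, "radius", ("deposit", "all_density"))

would give an actual density profile, assuming that field is defined for
RAMSES data.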
Best,
Andreas Ellewsen
Thanks, there's no hurry for this specific case.
I have another problem, with magnetic field units. I set unit conversion
factors with the 'units_override' keyword. These are the units I'm using:
>>> print ya.unit_base
{'length_unit': (1.0, 'pc'),
'mass_unit': (2.38858753789e-24, 'g/cm**3*pc**3'),
'time_unit': (1.0, 's*pc/km')}
This should set the magnetic field unit as follows:
>>> mu = yt.YTQuantity(ya.unit_base['mass_unit'][0], ya.unit_base['mass_unit'][1])
>>> lu = yt.YTQuantity(ya.unit_base['length_unit'][0], ya.unit_base['length_unit'][1])
>>> tu = yt.YTQuantity(ya.unit_base['time_unit'][0], ya.unit_base['time_unit'][1])
>>> mag_unit=(np.sqrt(4*np.pi*mu/lu)/tu)
>>> print mag_unit.convert_to_units('gauss')
5.4786746797e-07 gauss
But what I found in the unit_registry is a strange number:
>>> print ds.unit_registry['code_magnetic']
(3.5449077018110318,
sqrt((mass))/(sqrt((length))*(time)),
0.0,
'\\rm{code\\ magnetic}')
It seems that "magnetic_unit" was set correctly in
yt/frontends/athena/data_structures.py:
self.magnetic_unit = np.sqrt(4*np.pi * self.mass_unit /
(self.time_unit**2 * self.length_unit))
self.magnetic_unit.convert_to_units("gauss")
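For completeness, this is the comparison I am making, as a minimal
self-contained script (the filename is a placeholder for my actual dump):

import numpy as np
import yt

unit_base = {'length_unit': (1.0, 'pc'),
             'mass_unit': (2.38858753789e-24, 'g/cm**3*pc**3'),
             'time_unit': (1.0, 's*pc/km')}
ds = yt.load("athena_dump.0000.vtk", units_override=unit_base)
# The dataset-level magnetic unit, converted to gauss ...
print ds.magnetic_unit.in_units("gauss")
# ... versus the manual calculation from the same base units.
mu = yt.YTQuantity(*unit_base["mass_unit"])
lu = yt.YTQuantity(*unit_base["length_unit"])
tu = yt.YTQuantity(*unit_base["time_unit"])
print (np.sqrt(4 * np.pi * mu / lu) / tu).in_units("gauss")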
I don't know what's wrong here. Do you have any idea?
Thanks,
Chang-Goo
On Thu, Aug 11, 2016 at 12:12 PM, John Zuhone <jzuhone(a)gmail.com> wrote:
> We can add in a particle_file keyword argument like we do for FLASH data.
>
> I can help out with this, but I might not be able to get to it until early
> next week.
>
> On Aug 11, 2016, at 12:11 PM, Nathan Goldbaum <nathan12343(a)gmail.com>
> wrote:
>
>
>
> On Thu, Aug 11, 2016 at 11:03 AM, Chang-Goo Kim <cgkim(a)astro.princeton.edu
> > wrote:
>
>> Hi Nathan,
>>
>> Thanks for the reply. Adding particles to the athena frontend will not be
>> easy since there is no unified (and well settled) data dump for particles.
>> Star particles are not in the public release. Let me find other ways at
>> this point.
>>
>>
> Ah, in that case we should probably add a way for you to supply your own
> particle data. You won't get chunked particle IO but if that's sufficient
> it should probably be straightforward to load the data in as a keyword
> argument for the load() command.
>
> John might have some insight on what's best to do here, he's the
> maintainer of the athena frontend.
>
> -Nathan
>
>
>> Thanks,
>> Chang-Goo
>>
>>
>> On Thu, Aug 11, 2016 at 11:35 AM, Nathan Goldbaum <nathan12343(a)gmail.com>
>> wrote:
>>
>>> I think we'd need to add support for reading particle data to the athena
>>> frontend. I think the main reason this hasn't been done yet is lack of
>>> available test datasets. If you'd like to make one or more test datasets
>>> available to us (they will live on yt-project.org/data so we can use
>>> them for testing and debugging going forward), someone could try to add
>>> support for reading particle data to the Athena frontend.
>>>
>>> To answer your question, no, there isn't a way to add particle fields to
>>> an athena dataset without modifying the athena frontend.
>>>
>>> That said, you might be able to cheat and load in the particle fields
>>> using the stream frontend to reload your data as an in-memory dataset - in
>>> particular load_uniform_grid or load_amr_grids. Of course this is sort of a
>>> hack, ideally we'd just add particle support to the athena frontend.
>>>
>>> -Nathan
>>>
>>> On Thu, Aug 11, 2016 at 10:32 AM, Chang-Goo Kim <
>>> cgkim(a)astro.princeton.edu> wrote:
>>>
>>>> Sorry for the confusion.
>>>>
>>>> I have a separate vtk file for star particles. I just wonder whether I
>>>> can add this to the original dataset as fields, to handle them together.
>>>>
>>>> Thanks,
>>>> Chang-Goo
>>>>
>>>> On Wed, Aug 10, 2016 at 3:19 PM, John Zuhone <jzuhone(a)gmail.com> wrote:
>>>>
>>>>> Hi Chang-Goo,
>>>>>
>>>>> > When I load the dataset from Athena and I have star particle mass,
>>>>> position, velocity, age information separately,
>>>>>
>>>>> Sorry, I’m a bit confused as to what you mean by this, because as far
>>>>> as I know we don’t have particle support for Athena data yet. Do you
>>>>> actually see the particle data when you load the dataset? Also, is this the
>>>>> old Athena or Athena++?
>>>>>
>>>>> > how can I add particle information to yt dataset.
>>>>>
>>>>> I think by this you mean getting the particle fields into yt, correct?
>>>>> Just making sure.
>>>>>
>>>>> Is the data part of the Athena file itself or is it in a separate file?
>>>>>
>>>>> Best,
>>>>>
>>>>> John
>>>>>
>>>>> > On Aug 10, 2016, at 2:24 PM, Chang-Goo Kim <changgoo(a)princeton.edu>
>>>>> wrote:
>>>>> >
>>>>> > Hi all,
>>>>> >
>>>>> > I'm analysing data from Athena simulations with star particles. When
>>>>> > I load the dataset from Athena and I have star particle mass, position,
>>>>> > velocity, age information separately, how can I add particle information
>>>>> > to the yt dataset? I tried to look up the help, but I cannot find any
>>>>> > good reference for this issue.
>>>>> >
>>>>> > Thanks,
>>>>> > Chang-Goo
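PS: To make sure I understand Nathan's stream-frontend suggestion above, here
is my sketch of what it would look like (untested; the grid shape, particle
count, and random arrays are placeholders for my actual Athena gas fields and
the star particle arrays from the separate vtk file):

import numpy as np
import yt

shape = (64, 64, 64)
n_star = 1000
data = {
    "density": (np.random.random(shape), "g/cm**3"),
    # Stream datasets treat fields named particle_* as particle fields;
    # number_of_particles tells the loader how many there are.
    "number_of_particles": n_star,
    "particle_position_x": (np.random.random(n_star), "code_length"),
    "particle_position_y": (np.random.random(n_star), "code_length"),
    "particle_position_z": (np.random.random(n_star), "code_length"),
    "particle_mass": (np.random.random(n_star), "g"),
}
bbox = np.array([[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]])
ds = yt.load_uniform_grid(data, shape, length_unit="pc", bbox=bbox)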
Hi all,
I'm analysing data from Athena simulations with star particles. When I load
the dataset from Athena and I have star particle mass, position, velocity,
age information separately, how can I add particle information to the yt
dataset? I tried to look up the help, but I cannot find any good reference
for this issue.
Thanks,
Chang-Goo
Hi all,
I was trying to generate light rays using yt 3.4-dev, but found that the
light rays have the wrong redshift intervals. For example, my simulation box
is 80 Mpccm/h per side, and I expect a light ray generated from a data dump
at z=2 with length = 1 simulation unit (i.e. 80 Mpccm/h) to have a redshift
interval from z=2 to z=1.92. However, the light ray generated by yt spans
from z=2 to z=1.97.
Looking into this, I found that the function comoving_radial_distance in
cosmology.py may return the wrong unit. I think the return value should be
in comoving units instead of physical units. To see this directly, I made
the following change in cosmology_splice.py:
(root) ~/yt-conda/src/yt-hg/yt/analysis_modules/cosmological_observation
$diff cosmology_splice.py cosmology_splice.py0
373d372
< print target_distance, distance2, z2
And it shows:
80.0 Mpccm/h 4.65320708035e+26 cm 1.97387969592 dimensionless
80.0 Mpccm/h 1.19455177073e+26 cm 1.97343368065 dimensionless
80.0 Mpccm/h 1.21507519065e+26 cm 1.97342592996 dimensionless
Then I made the following change in cosmology.py:
(root) ~/yt-conda/src/yt-hg/yt/utilities $diff cosmology.py cosmology.py0
111,112c111,112
<         return self.quan((self.hubble_distance() *
<             trapzint(self.inverse_expansion_factor, z_i, z_f)).value, 'cmcm')
---
>         return (self.hubble_distance() *
>             trapzint(self.inverse_expansion_factor, z_i, z_f)).in_base(self.unit_system)
Then I get:
80.0 Mpccm/h 4.65320708035e+26 cmcm 1.92163908776 dimensionless
80.0 Mpccm/h 3.62771515661e+26 cmcm 1.92124702027 dimensionless
With the change of unit from cm to cmcm, the light rays have the right span.
Even though this solves my problem, I am not sure whether similar problems
exist elsewhere. For example, instead of changing comoving_radial_distance,
we might need to change hubble_distance into comoving units. Hopefully
someone familiar with the yt unit system can check this.
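For anyone who wants to check the numbers, here is a minimal sanity check of
the expected interval using yt's Cosmology object (the cosmological
parameters below are placeholders; substitute the simulation's values):

from yt.utilities.cosmology import Cosmology

co = Cosmology(hubble_constant=0.7, omega_matter=0.3, omega_lambda=0.7)
# The comoving radial distance between z = 1.92 and z = 2 should come out
# close to the 80 Mpccm/h box length if the units are handled correctly.
d = co.comoving_radial_distance(1.92, 2.0)
print d.in_units("Mpccm/h")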
Thanks,
Pengfei
Hi yt folks,
I’m happy to announce the first release of pyXSIM, the reincarnation of yt’s photon_simulator analysis module as a standalone package.
pyXSIM is a Python package for simulating X-ray observations from astrophysical sources.
X-rays probe the high-energy universe, from hot galaxy clusters to compact objects such as neutron stars and black holes, and many interesting sources in between. pyXSIM makes it possible to generate synthetic X-ray observations of these sources from a wide variety of models, whether from grid-based simulation codes such as FLASH, Enzo, and Athena, from particle-based codes such as Gadget and AREPO, or even from datasets that have been created "by hand", such as from NumPy arrays. pyXSIM also provides facilities for manipulating the synthetic observations it produces in various ways, as well as for exporting the simulated X-ray events to other software packages to simulate the end products of specific X-ray observatories.
Source repo lives here: http://github.com/jzuhone/pyxsim
Docs live here: http://hea-www.cfa.harvard.edu/~jzuhone/pyxsim
Google group mailing list is here: https://groups.google.com/forum/#!forum/pyxsim
The photon_simulator analysis module within yt will no longer be updated, though it will remain within yt for the time being. I encourage all current users of photon_simulator to switch over to pyXSIM: not only is it the future track for this code, but the version released today already contains many improvements over the version currently living in yt.
Best,
John ZuHone
Hi John,
Yes, you should be able to do this. I think clumps have an attribute that
records the contour value that defines them, and you can supply that to the
surface operator. I've gotten your bug report and will be looking into
supplying clumps to surfaces.
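In the meantime, a possible workaround for the AttributeError below (an
untested sketch; it passes the clump's underlying data container, which does
have chunks, instead of the Clump object itself, and the contour value is a
placeholder):

value = 1e-24  # placeholder: the isodensity value used for this clump
surf = clump.data.ds.surface(clump.data, "density", value)
triangles = surf.triangles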
On Mon, Jul 25, 2016 at 3:08 PM, John Regan <johnanthonyregan(a)gmail.com>
wrote:
> Hi Matt,
>
> Thanks for looking at this. Much appreciated! I'll file a bug report no
> problem.
>
> For calling the surface operator with a clump as a data source is it
> possible to extract the isodensity value of the clump from the Clump object
> and pass that to the surface operator?
>
> Cheers,
> John
>
> On 25 Jul 2016 19:58, "Matthew Turk" <matthewturk(a)gmail.com> wrote:
>
> Hi John,
>
> Sadly, we don't store information about which cells are on the boundary.
> This is potentially possible to compute (although the naive solution, which
> is to check the # of faces of a zone that have neighbors in the clump, will
> not work for clumps that have holes in them) but it's not stored just now.
> I don't have an alternate solution to present right now to that problem.
>
> For the second part, I'm not sure what's going on. Could you file a bug
> with a repro script?
>
> -Matt
>
> On Mon, Jul 25, 2016 at 8:33 AM, John Regan <johnanthonyregan(a)gmail.com>
> wrote:
>
>> Hi All,
>>
>> I'm interested in calculating some surface properties of clumps found
>> with the clump finder. Finding the clumps is straightforward via the clump
>> finder API. Once I find a clump, I apply the surface operator,
>>
>> e.g.
>> surf = clump.data.ds.surface(clump, "density", 1e-24)
>>
>> This works but isn't ideal. For one, I don't want to specify a value for
>> the density contour, since by definition this is part of the clump;
>> ideally I'd just be able to extract the surface cells from the clump with
>> something like
>>
>> surf2 = clump.surface()
>>
>> Does yt store any information on which cells reside on the surface of a
>> clump?
>>
>> Even pushing ahead with the YTSurface object doesn't get very far anyway,
>> as even trying to access the triangles variable gives the error
>>
>> triangles = surf.triangles
>>
>>   File "/cosma/home/dp004/dc-rega4/YT/yt-x86_64/src/yt-hg/yt/analysis_modules/level_sets/virial_validator.py", line 74, in _virial_validation
>>     triangles = surf.triangles
>>   File "/cosma/home/dp004/dc-rega4/YT/yt-x86_64/src/yt-hg/yt/data_objects/construction_data_containers.py", line 1223, in triangles
>>     self.get_data()
>>   File "/cosma/home/dp004/dc-rega4/YT/yt-x86_64/src/yt-hg/yt/data_objects/construction_data_containers.py", line 1108, in get_data
>>     for io_chunk in parallel_objects(self.data_source.chunks([], "io")):
>> AttributeError: 'Clump' object has no attribute 'chunks'
>>
>>
>> Any tips/ideas greatly appreciated!
>>
>> Cheers,
>> John
>>
>>
>>
Hi yt people,
I have an issue where I want to make slice plots along the z and x axes.
My x-axis slice plot is fine, but the slice plot along the z axis does not
show much information. I assume the yt slice plot along z is by default
taken through z = 0; can I make a slice plot that goes through z = 1, 2 or
z = -1, -2? What changes should I make to this command?
c = [0, 0, 0]
p1 = yt.SlicePlot(pf, 'z', field, center=c)
I tried changing c to [0, 0, 2], but it didn't change my plot.
Thanks-
Tazkera
Hi folks,
I have a dataset I’d like to add omega_baryon to (I want to use the baryon_overdensity derived field, but the dataset seems to only have omega_matter and omega_lambda, and I know what omega_baryon is). It’s not a derived field; it’s just a number (not sure what the yt lingo for it is!).
I tried doing ds.omega_baryon = 0.048, but I still get
/Users/molly/anaconda2/envs/astroconda/lib/python2.7/site-packages/yt/fields/derived_field.pyc in __call__(self, data)
277 doesnt_have.append(p)
278 if len(doesnt_have) > 0:
--> 279 raise NeedsParameter(doesnt_have)
280 return True
281
NeedsParameter: (['omega_baryon'])
when I try to plot using baryon_overdensity. Suggestions?
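Since the error is NeedsParameter, I also wondered whether it wants a field
parameter on a data object rather than an attribute on ds, something like the
sketch below (I haven't confirmed this is the intended mechanism):

ad = ds.all_data()
# Supply omega_baryon as a field parameter; the validation that raises
# NeedsParameter checks the field parameters of the data object.
ad.set_field_parameter("omega_baryon", 0.048)
print ad["baryon_overdensity"]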
Thanks,
—Molly
Dear yt users,
I want to save a slice plot in FITS or HDF5 format. How can I do this in yt
once I have made a slice plot of a derived field?
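For reference, this is the sort of thing I am hoping for: an untested sketch
based on the FITSSlice helper I found mentioned in the docs (the module path,
file, and field names are my guesses):

import yt
from yt.visualization.fits_image import FITSSlice

ds = yt.load("my_dataset")  # placeholder path
# Wrap a slice of the derived field in a FITS image and write it out.
fits_slc = FITSSlice(ds, "z", ("gas", "my_derived_field"))
fits_slc.writeto("slice.fits")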
Many thanks in advance
Regards
Prateek