Hi, Everybody!
Does anyone out there have a technique for getting the variance out of
a profile object? A profile object is good at getting <X> vs. B; I'd
then like to get <(X - <X>)^2> vs. B. Matt and I had spitballed the
possibility some time ago, but I was wondering if anyone out there had
successfully done it.
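(One possible route, sketched here only as an untested example with placeholder dataset, field, and weight names: build two weighted profiles over the same bins, <X> and <X^2> via a derived "squared" field, and form the variance as <X^2> - <X>^2.)

import yt

ds = yt.load("galaxy0030/galaxy0030")  # placeholder dataset
ad = ds.all_data()

# Derived field holding the square of the field of interest.
def _density_squared(field, data):
    return data["gas", "density"]**2

ds.add_field(("gas", "density_squared"), function=_density_squared,
             units="g**2/cm**6")

# Two weighted profiles over the same bins: <X> and <X^2>.
prof = yt.create_profile(ad, "radius",
                         fields=[("gas", "density"), ("gas", "density_squared")],
                         weight_field=("gas", "cell_mass"), n_bins=64)

variance = prof["gas", "density_squared"] - prof["gas", "density"]**2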
Thanks,
d.
--
Sent from my computer.
Hello everyone.
I was wondering if anyone has had experience with producing a profile plot
from a 2d projection object (FRB object).
Essentially, what I am trying to do is plot the stellar surface density of
a galaxy as a function of radius.
This is achieved by doing the following, starting from a yt disk object
called cylinder (where the original simulation dataset is called shot):
import yt
from yt import YTArray

center = cylinder.get_field_parameter("center")
normal = cylinder.get_field_parameter("normal")
image_width = (100, "kpc")
three_image_width = YTArray((image_width[0], image_width[0], image_width[0]),
                            image_width[1])
# a cube extending image_width on each side of the center, used as the
# data source for the projection
left = center - three_image_width
right = center + three_image_width
region = shot.region(center, left, right)
proj = yt.ProjectionPlot(cylinder.ds, "z", [("deposit", "stars_density")],
                         center=center, width=image_width,
                         data_source=region, axes_unit="kpc")
The error arises here:
prof = yt.create_profile(proj, bin_fields="cylindrical_r",
                         fields=[("deposit", "stars_density")],
                         n_bins=128, weight_field=None)
where I get the following error:
/gpfs/home/........./profiles.pyc in create_profile(data_source, bin_fields, fields, n_bins, extrema, logs, units, weight_field, accumulation, fractional)
1304 else:
1305 raise NotImplementedError
-> 1306 bin_fields = data_source._determine_fields(bin_fields)
1307 fields = data_source._determine_fields(fields)
1308 if units is not None:
AttributeError: 'FixedResolutionBuffer' object has no attribute '_determine_fields'
Any ideas how to get around this error?
A postscript on a related point: because of how the projection works, if I
pass weight_field=None to ProjectionPlot I get a surface density (g/cm^2),
but "cylindrical_r" comes out projected as well, in cm^2. I *think* the way
around this is to do a second projection with weight_field="ones", take the
radius values from that profile, and then in matplotlib combine the surface
density array from the former with the radius bin array from the latter to
plot surface density as a function of radius (I might check by hand
afterwards to see whether this does the trick). That seems rather
convoluted, so I am wondering whether there is an easier way.
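(For what it's worth, a numpy-only alternative is sketched below, untested: pull the projected surface density out of the plot's fixed-resolution buffer and bin it by pixel radius with np.histogram. It reuses proj and image_width from above, assumes the buffer is reachable as proj.frb and that the image is centred on the galaxy, and picks an arbitrary 64 radial bins.)

import numpy as np

# Surface density image from the projection (weight_field=None), in g/cm^2.
frb = proj.frb
sigma = np.array(frb[("deposit", "stars_density")])

# Radius of each pixel, in the same units as image_width (kpc here).
ny, nx = sigma.shape
half = 0.5 * image_width[0]
y, x = np.mgrid[-half:half:ny * 1j, -half:half:nx * 1j]
r = np.sqrt(x**2 + y**2)

# Mean surface density in each radial annulus.
r_bins = np.linspace(0.0, half, 65)
counts, _ = np.histogram(r, bins=r_bins)
totals, _ = np.histogram(r, bins=r_bins, weights=sigma)
sigma_of_r = totals / np.maximum(counts, 1)
r_centers = 0.5 * (r_bins[1:] + r_bins[:-1])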
_______________________________________________
yt-users mailing list
yt-users(a)lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
Dear yt-users,
I have a tiny question about loading files.
I understand that when yt loads a file, it loads all the related files in the directory. For example, if I do ds = yt.load('snap_103.0.hdf5') and then ds.field_list, yt opens all of these files:
snap_103.0.hdf5* snap_103.16.hdf5* snap_103.22.hdf5* snap_103.29.hdf5* snap_103.6.hdf5*
snap_103.10.hdf5* snap_103.17.hdf5* snap_103.23.hdf5* snap_103.2.hdf5* snap_103.7.hdf5*
snap_103.11.hdf5* snap_103.18.hdf5* snap_103.24.hdf5* snap_103.30.hdf5* snap_103.8.hdf5*
snap_103.12.hdf5* snap_103.19.hdf5* snap_103.25.hdf5* snap_103.31.hdf5* snap_103.9.hdf5*
snap_103.13.hdf5* snap_103.1.hdf5* snap_103.26.hdf5* snap_103.3.hdf5*
snap_103.14.hdf5* snap_103.20.hdf5* snap_103.27.hdf5* snap_103.4.hdf5*
snap_103.15.hdf5* snap_103.21.hdf5* snap_103.28.hdf5* snap_103.5.hdf5*
However, if our data has only one file, snap_103.hdf5, and I do ds = yt.load('snap_103.hdf5') followed by ds.field_list, yt still tries to open files starting from snap_103.0.hdf5. Since there is no snap_103.0.hdf5, it reports this error:
IOError: Unable to open file (Unable to open file: name = '/lustre/projects/p071_swin/yqin/smaug/ref_eff_l010n0128/data/snapshot_102/snap_102.0.hdf5', errno = 2, error message = 'no such file or directory', flags = 0, o_flags = 0)
Is there an argument to tell yt not to go looking for the other file pieces? I can make a link named snap_103.0.hdf5 pointing to snap_103.hdf5, but that's kind of inconvenient.
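(Not an answer about such an argument, but the symlink workaround can at least be scripted; a small sketch with placeholder paths, assuming the Gadget/OWLS-style file naming above:)

import os
import yt

snap = 'snap_103.hdf5'       # the actual single-file snapshot
alias = 'snap_103.0.hdf5'    # the ".0" name yt goes looking for
if not os.path.lexists(alias):
    os.symlink(os.path.abspath(snap), alias)

ds = yt.load(alias)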
Cheers,
--
Yuxiang Qin
PhD Student
School of Physics
The University of Melbourne
VIC, Australia, 3010
_______________________________________________
yt-users mailing list
yt-users(a)lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
Dear yt users,
I'm having some trouble installing SciPy. I can install yt just fine
using the install script, but if I turn the option for installing SciPy
on, I get the following error:
creating build/temp.linux-x86_64-2.7/scipy/fftpack/src/dfftpack
compile options:
'-I/data/users/borm/YT/yt-x86_64/lib/python2.7/site-packages/numpy/core/include
-c'
f77.sh:f77: scipy/fftpack/src/dfftpack/dsint.f
Cannot open file dsint.f
gcc: dsint.c: No such file or directory
gcc: no input files
Cannot open file dsint.f
gcc: dsint.c: No such file or directory
gcc: no input files
error: Command "/iraf/iraf/unix/hlib//f77.sh -Wall -ffixed-form
-fno-second-underscore -fPIC -O3 -funroll-loops
-I/data/users/borm/YT/yt-x86_64/lib/python2.7/site-packages/numpy/core/include
-c -c scipy/fftpack/src/dfftpack/dsint.f -o
build/temp.linux-x86_64-2.7/scipy/fftpack/src/dfftpack/dsint.o"
failed with exit status 1
I've tried uncommenting the NUMPY_ARGS lines one by one as suggested in
the script, but none of that seems to work.
I'm using yt version 2.6.1, changeset e0906fc5b6d5, if that helps.
Any ideas on how to fix this? Thanks!
Cheers,
Caroline
_______________________________________________
yt-users mailing list
yt-users(a)lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
Hi everyone,
Is there a straightforward way of creating a spherical data object by
specifying the total enclosed mass, rather than the radius?
If not, I suppose I could write an iterative algorithm to do it, but I'd
rather avoid that if yt provides an easier way.
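(For reference, a rough, untested sketch of what that iterative approach might look like: bisect on the sphere radius until the enclosed mass matches the target. Counting only gas cell_mass, the kpc search bounds, and the 1% tolerance are all placeholder assumptions.)

import yt

def sphere_enclosing_mass(ds, center, target_mass, r_max_kpc, tol=0.01, max_iter=50):
    """Bisect on radius until the sphere's gas mass is within tol of target_mass.

    target_mass should be a YTQuantity, e.g. ds.quan(1e12, "Msun").
    """
    r_lo = ds.quan(0.0, "kpc")
    r_hi = ds.quan(r_max_kpc, "kpc")
    for _ in range(max_iter):
        r_mid = 0.5 * (r_lo + r_hi)
        sp = ds.sphere(center, r_mid)
        m = sp.quantities.total_quantity(("gas", "cell_mass"))
        if abs(m - target_mass) < tol * target_mass:
            break
        if m < target_mass:
            r_lo = r_mid   # sphere too small, grow it
        else:
            r_hi = r_mid   # sphere too big, shrink it
    return sp

# usage sketch (values are placeholders):
# ds = yt.load("galaxy0030/galaxy0030")
# sp = sphere_enclosing_mass(ds, ds.domain_center, ds.quan(1e12, "Msun"), r_max_kpc=500.0)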
Thanks for your help.
Dan
_______________________________________________
yt-users mailing list
yt-users(a)lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
Hey Rick,
Can we keep this discussion on the mailing list? Just in case others in
the future have similar problems.
How did you install yt? If you used the install script, did you remember
to activate the yt environment before launching ipcluster? Is IPython
installed into yt's environment?
-Nathan
On Tue, Jan 20, 2015 at 3:31 PM, Rick Sarmento <rsarment(a)asu.edu> wrote:
> Hum… I used the ipcluster command on the vis node I’m using
>
> And I get through all your init steps:
>
> In [1]:
>
> from IPython.parallel import Client
> rc = Client()
> In [2]:
>
> rc.ids
> Out[2]:
> [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
> In [3]:
>
> %autopx
> %autopx enabled
>
> In [4]:
>
> from mpi4py import MPI
>
> MPI.COMM_WORLD.Get_rank()
>
> Out[0:1]: 8
> Out[1:1]: 1
> Out[2:1]: 14
> Out[3:1]: 7
> Out[4:1]: 9
> Out[5:1]: 15
> Out[6:1]: 6
> Out[7:1]: 12
> Out[8:1]: 4
> Out[9:1]: 0
> Out[10:1]: 5
> Out[11:1]: 10
> Out[12:1]: 2
> Out[13:1]: 11
> Out[14:1]: 3
> Out[15:1]: 13
> In [5]:
>
>
> But then it can’t find yt!!
>
> import yt
> yt.enable_parallelism()
> import numpy as np
> import os
> import shutil
> import h5py
> import gc
> import glob
>
>
> AttributeError                            Traceback (most recent call last)
> <ipython-input-3-915313e529f2> in <module>()
> ----> 1 import yt
>
> ...
>
>
> Weird though … because if I kill the cluster and try again, import yt works… *Can
> you think of anything obvious?* It's like my path to yt goes away once I
> start the cluster … and I'm getting some default (old) version of yt on Stampede.
>
> Thanks Nathan .. I’m watching the tutorial now …
>
> *Cheers!*
>
> *Rick Sarmento*
> SESE Astronomy/Astrophysics Grad Student
> rsarment(a)asu.edu
>
>
>
>
>
>
>
> On Jan 17, 2015, at 12:54 PM, Rick Sarmento <rsarment(a)asu.edu> wrote:
>
> Thanks for the info Nathan…
>
> I’m trying to use STAMPEDE out at Texas (http://vis.tacc.xsede.org/), and
> they have a way to kick-off access to their 16 core visualization nodes via
> the web… But that probably doesn’t help given the info below. I’ll try to
> work with support to see how I can get it to work…
>
> If I start the cluster using your instructions (via a ssh session), below,
> I’d have to login — but then I can’t get remote access to the notebook
> interface because of the firewalls, etc. Although there’s like a way to
> setup an ssh tunnel, I haven’t figured it out yet.
>
> Thx!
>
> *Cheers!*
>
> *Rick Sarmento*
> SESE Astronomy/Astrophysics Grad Student
> rsarment(a)asu.edu
>
> Hi Rick,
>
> You need to launch the notebook in parallel using an MPI parallel IPython
> cluster. In addition, you will need to configure the notebook to hook into
> IPython's built-in parallelism.
>
> I've uploaded a notebook that uses parallel IPython here:
>
> http://nbviewer.ipython.org/gist/ngoldbaum/32ca07cf4f5b3dd06add
>
> Note that the yt operations themselves aren't really important, but the first
> few cells, where I check that IPython's parallelism works and that MPI
> parallelism is working, are quite important.
>
> Note also that you will need to launch the IPython cluster separately from
> the notebook server using the "ipcluster" command.
>
> If you want to learn more about parallel IPython, I'd encourage you to
> take a look at Min Ragan-Kelley's tutorial from SciPy 2014:
> http://pyvideo.org/video/2738/interactive-parallel-computing-with-ipython-p…
>
> All that said, it does add a significant amount of semantic overhead to
> use the IPython notebook in parallel. It's generally much more
> straightforward to work with yt in parallel using regular python scripts.
>
> Hope that helps,
>
> Nathan
>
>
>
_______________________________________________
yt-users mailing list
yt-users(a)lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
Dear yt users,
I am trying to make a covering_grid from an Enzo simulation. The code is very
simple (somewhat sloppy, since it is being run in IPython):
ds=yt.load("data0201")
ad_c=ds.covering_grid(level=3, left_edge=0.4375, dims=[64,64,64])
ad_c["x"]
where ad_c["x"] gives an IndexError on my desktop. However, the same piece of
code runs fine on a different machine. The version of yt is the same on both
machines (3.0.2), and both installed yt through Anaconda. The error message I
got is below. Thanks for your help!
---------------------------------------------------------------------------
IndexError Traceback (most recent call last)
<ipython-input-6-ff78a8bccc54> in <module>()
----> 1 ad_c["x"]
/home/fxyang/software/anaconda/lib/python2.7/site-packages/yt/data_objects/data_containers.pyc in __getitem__(self, key)
243 if f in self._container_fields:
244 self.field_data[f] = \
--> 245 self.ds.arr(self._generate_container_field(f))
246 return self.field_data[f]
247 else:
/home/fxyang/software/anaconda/lib/python2.7/site-packages/yt/data_objects/construction_data_containers.pyc in _generate_container_field(self, field)
587 np.multiply(rv, self.dds[2], rv)
588 elif field == ("index", "x"):
--> 589 x = np.mgrid[self.left_edge[0] + 0.5*self.dds[0]:
590 self.right_edge[0] - 0.5*self.dds[0]:
591 self.ActiveDimensions[0] * 1j]
/home/fxyang/software/anaconda/lib/python2.7/site-packages/yt/units/yt_array.pyc in __getitem__(self, item)
981
982 def __getitem__(self, item):
--> 983 ret = super(YTArray, self).__getitem__(item)
984 if ret.shape == ():
985 return YTQuantity(ret, self.units)
IndexError: too many indices for array
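(Not a diagnosis of the machine-to-machine difference, but for comparison, covering_grid is normally given a three-element left_edge, one entry per axis; a minimal sketch using the same dataset name:)

import yt

ds = yt.load("data0201")
cg = ds.covering_grid(level=3,
                      left_edge=[0.4375, 0.4375, 0.4375],  # code units, per axis
                      dims=[64, 64, 64])
x = cg["index", "x"]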
--
Best,
Haifeng
Dear YT-Users,
I am new to the yt project and have a few questions about parallel computation with yt. May I ask for help?
I tried to do mpirun -np 4 myscript.py, but it always failed, even when I used a very simple script like:
#!/usr/bin/env python
import yt

# Turn on yt's MPI-based parallelism before doing any work.
yt.enable_parallelism()

fname = '../../Smaug/NOZCOOL_LateRe_L010N0512/data/snapshot_103/snap_103.0.hdf5'
ds = yt.load(fname)

# Density-weighted projection of gas temperature along the z axis.
pz = yt.ProjectionPlot(ds, 'z', ('gas', 'temperature'), weight_field='density')
pz.save()
I am using data from Dr. Alan Duffy; his data is similar to OWLS. When I ran a similar script on sample data downloaded from the yt website, it worked. Does anyone know what is going on here? Is anything wrong with my data format or script? How can I check?
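(One quick sanity check, sketched below under assumptions rather than as a diagnosis: confirm that the expected number of MPI ranks is seen and that the snapshot loads at all, to separate MPI problems from data-format problems. The launch command in the comment is an assumption.)

#!/usr/bin/env python
# run with, e.g.: mpirun -np 4 python check_parallel.py
from mpi4py import MPI
import yt

yt.enable_parallelism()

# Every rank reports itself; if the ranks or sizes look wrong, the problem is
# the MPI setup rather than yt or the data.
print("rank %d of %d" % (MPI.COMM_WORLD.Get_rank(), MPI.COMM_WORLD.Get_size()))

# If loading fails here, the problem is the data/frontend rather than parallelism.
ds = yt.load('../../Smaug/NOZCOOL_LateRe_L010N0512/data/snapshot_103/snap_103.0.hdf5')
if yt.is_root():
    print(ds.field_list)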
Warmest sincerely,
--
Yuxiang Qin
PhD Student
School of Physics
The University of Melbourne
VIC, Australia, 3010
_______________________________________________
yt-users mailing list
yt-users(a)lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
Oh, that sounds valuable. I ended up adding it at a low level: there is now a FilteredIOBrick, which takes a (list of) filter callbacks and runs in under a second. But a yt-ish way to do it would be nicer.
-------- Original message --------
From: Matthew Turk <matthewturk(a)gmail.com>
Date: 01/16/2015 18:24 (GMT-06:00)
To: Discussion of the yt analysis package <yt-users(a)lists.spacepope.org>
Subject: Re: [yt-users] cut_region: can I make an expression dimensionless?
Hi Stuart,
Actually the more I look at this, the more I think it's probably quite easy to turn this into a data selector itself, avoiding the need for a cut region. Should be straightforward to add it to selection_routines and selection_data_containers and have it work out of the box.
On Fri, Jan 16, 2015 at 6:20 PM, Matthew Turk <matthewturk(a)gmail.com> wrote:
On Wed, Jan 14, 2015 at 11:03 AM, Stuart Levy <salevy(a)illinois.edu> wrote:
Using one of Sam's past e-mails to answer my own question, it at least syntactically works to use the '.uq' unit-quantity attribute of a YT object to get something that has the units of the object, so e.g. 0.78 * obj['x'].uq is 0.78 in x's length units. So it actually seems to work to say:
cr = ds.cut_region( dd, ["obj['z']-(obj['x']**2 + obj['y']**2)/(2.0*obj['z'].uq) < 0.78*obj['z'].uq"])
Neato!
As it turns out, I'll look for another lower-level way to do this - with the above cut_region applied, 32GB of RAM isn't enough to process a single data field of about 512^3, and it takes several minutes of CPU time even to start. But it's good to know that the above is possible.
Glad you kind of got it working ... it is kind of ridiculous that it takes so much, though. What I think could cut it down considerably is to reduce the number of sequential numpy operations; right now it's computing them and storing temporary arrays like crazy. Unfortunately I don't know how one might do this with the current setup of cut_region.
Perhaps one way would be to make a derived field that computes the whole expression, doing each operation inline inside it, and then apply the conditional to that?
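(A rough, untested sketch of that idea; the field name "parab", the normalization by one code_length to make the field dimensionless, and the data-object form of cut_region are all assumptions.)

# Dimensionless paraboloid "height": (z - (x^2 + y^2) / (2 L)) / L, with
# L = one code_length, so the 0.78 threshold needs no units at all.
def _parab(field, data):
    L = data.ds.quan(1.0, "code_length")
    return (data["index", "z"]
            - (data["index", "x"]**2 + data["index", "y"]**2) / (2.0 * L)) / L

ds.add_field(("index", "parab"), function=_parab, units="")

cr = dd.cut_region(["obj['parab'] < 0.78"])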
On 1/14/15 10:26 AM, Stuart Levy wrote:
Hello yt people,
I'm hoping to write a cut_region expression that (a) depends on position and (b) does it in a nonlinear way - I'd like to do a sort of paraboloidal cut, like
ds.cut_region(dd, "obj['z'] - (obj['x']**2 + obj['y']**2)/2.0 < 0.78")
er, correction - cut_region()'s second argument is a list of strings rather than a single string, so I had tried:
ds.cut_region(dd, ["obj['z'] - (obj['x']**2 + obj['y']**2)/2.0 < 0.78"])
But this runs afoul of the unit-checking - obj['z'] doesn't have the same units as obj['x']**2.
Somehow I either need to cast all the obj[] terms to be dimensionless ("trust me, I promise it's right"), or else give dimensions of length to the constants. Should there be a way to do either one?
_______________________________________________
yt-users mailing list
yt-users(a)lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org