Hi all,
Registration is now open for the Learn yt Workshop at the University of
Illinois Urbana-Champaign, October 10-12, 2016. This workshop will
introduce attendees to the yt project (yt-project.org), a Python toolkit
for the analysis and visualization of volumetric data.
We have created a website for the workshop with more information:
http://yt-project.org/workshop2016/
In addition, if you would like to register for the workshop, please fill out
this online form:
https://goo.gl/forms/c6oIzNQywU1YWgOe2
The workshop will cover basic usage of yt, including the yt data model,
yt's field system, low-level data inspection, basic visualization
workflows, and data analysis and reduction tasks. We will also cover more
advanced usages like generating simulated observations, halo finding and
halo analysis, volume rendering, advanced 3D visualizations, and advanced
data analysis and reduction tasks. Finally, we will cover how to modify and
extend yt, as well as the development and contribution process.
In addition, there will be time set aside for exploring data you bring to
the workshop, along with opportunities to work directly with yt developers
to load and explore your data.
The workshop will take place at the National Center for Supercomputing
Applications building on the north end of the UIUC campus. The NCSA
building is about a block away from the conference hotel and is next door
to a parking structure that offers metered all-day parking. There are a
number of food trucks nearby, a university-run cafeteria about 2 blocks
away, and a university business district with many good lunch restaurants
about half a mile away.
We are planning to offer funding for hotel and travel for those requesting
support. If you request funding, you will be notified of available funds
by September 15. Travel awards will be made in the form of arranged
lodging and airfare, with reservations being made by the conference
organizers.
We hope to see you there.
On behalf of the organizing committee,
Matt Turk
Nathan Goldbaum
Jill Naiman
John ZuHone
Kandace Turner
Hi all,
Another phase plot question: is there a way to get the background to be white instead of this weird purpley color? FWIW, the colormap goes down to a very pale grey, not white.
[inline image: screenshot of the phase plot]
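In case it helps frame the question: that background is usually the
colormap's lowest colour being applied to empty bins, and the generic
matplotlib-level fix is to give masked (empty) bins an explicit "bad"
colour. A minimal sketch with hypothetical histogram data (yt's plot
containers also expose a set_background_color method, which may do the
same thing directly):

```python
import copy
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless rendering
import matplotlib.pyplot as plt

# Hypothetical 2D histogram standing in for the phase plot data;
# zero-count bins are masked out, much as yt does internally.
hist = np.ma.masked_equal(np.array([[0.0, 2.0],
                                    [3.0, 0.0]]), 0.0)

# Copy the colormap before modifying it, then tell it how to draw
# masked ("bad") entries: white instead of the colormap floor.
cmap = copy.copy(plt.get_cmap("viridis"))
cmap.set_bad("white")

plt.imshow(hist, cmap=cmap, origin="lower")
plt.colorbar()
plt.savefig("phase_sketch.png")
```

With set_bad left at its default, the two masked corners would pick up
the colormap's own background colour instead of white.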
Thanks,
—Molly
This year has been an amazing one! It was the first in which we've
participated as an independent organisation, covering many of the
astronomy open source efforts under the Open Astronomy umbrella
organisation.
http://openastronomy.org/members/
Under the Open Astronomy organisation we were accepted by Google
directly, while in previous years the different organisations had
participated under the Python Software Foundation or NumFOCUS. This is
great because it gives us higher visibility among students and in the
open source communities, and it brings other important advantages: we
receive some compensation from Google, and two of us will represent the
organisation at the Google Summer of Code mentor summit at the end of
October.
We got 8 slots from Google, of which 5 went to Astropy and 3 to SunPy.
Sadly the other sub-organisations (yt, ChiantiPy, JuliaAstro, ...) didn't
get any qualified student applicants this time, but surely that will
change in the future. Of these 8 slots, 7 passed the mid-term, and all 7
passed the final evaluation. The students, their projects and the mentors
are the following:
For the Astropy project:
Micky Costa: Astropy: Bridge Sherpa and astropy fitting
<http://myopensauceadventure.blogspot.co.uk/2016/08/dessert.html> (Tom
Aldcroft, Omar Laurino, and Moritz Guenther)
Olga Vorokh: Image processing and source detection in Gammapy
<https://github.com/search?p=1&q=author%3AOlgaVorokh+created%3A%222016-02-...>
(Johannes King and Christoph Deil)
Zé Vinícius: Implement PSF photometry for fitting several overlapping
objects at once <http://mirca.github.io/gsoc-astropy-final/> (Moritz
Guenther, Brigitta Sipocz, and Erik Tollerud)
Karl Vyhmeister: Scheduling capabilities for Astroplan
<http://kvyhastroplan.blogspot.com/2016/08/cleaning-up-and-documenting.html?>
(Brett Morris and Erik Tollerud)
For SunPy:
Tessa Wilkinson: Implementing AIA response function in Sunpy
<http://tdwilkinson.blogspot.co.uk/2016/08/aia-response-summary-this-is-my...>
(Drew Leonard and Will Barnes)
Punyaslok Pattnaik: Improvements to the SunPy Database
<https://punyaslokpattnaik.wordpress.com/2016/08/22/its-already-time/>
(Stuart Mumford and Simon Liedtke)
Sudarshan Konge: Real Time Data Access and Visualisation tools
<https://sudonymousblog.wordpress.com/2016/08/19/gsoc-summary-post/> (David
Perez-Suarez and Jack Ireland)
You can still follow the whole journey that brought them to the finish by
reading their blogs, available at:
http://openastronomy.org/Universe_OA/
If you would like to give us suggestions on how to improve the program (to
us as an umbrella, or to Google itself) or you want to get more involved
either as a mentor or as a sub-organisation, do not hesitate to contact
us. There's a lot that we can do better in the coming years!!
Thanks for all the summer!
David Perez-Suarez, Tom Aldcroft, Erik Tollerud, Stuart Mumford
Hi all,
I am currently trying to run Enzo with embedded, parallelized Python/yt
using the following user_script.py:
import yt

from yt.frontends.enzo.api import EnzoDatasetInMemory

from yt.fields.field_plugin_registry import \
    register_field_plugin
from yt.fields.fluid_fields import \
    setup_gradient_fields

@register_field_plugin
def setup_my_fields(registry, ftype="enzo", slice_info=None):
    setup_gradient_fields(registry, ('enzo', 'AxImaginary'), '',
                          slice_info)

yt.enable_parallelism()

def main():
    ds = EnzoDatasetInMemory()
    ds.index
    dd = ds.all_data()
    print dd.quantities.total_quantity('AxImaginary_gradient_z')
It works fine on a single processor (mpirun -n 1 ...) but gives me the
following error message on multiple processors:
Global Dir set to .
ENZO_layout 1 x 1 x 2
Successfully read in parameter file Bosonstar.enzo.
INITIALIZATION TIME = 3.75330448e-02
yt : [INFO ] 2016-08-30 15:02:40,868 Global parallel computation enabled: 0 / 2
yt : [INFO ] 2016-08-30 15:02:40,868 Global parallel computation enabled: 1 / 2
Continuation Flag = 1
TopGrid dt = 2.000000e-05 time = 0 cycle = 0
P000 yt : [INFO ] 2016-08-30 15:02:41,001 Parameters: current_time = 2e-05
P000 yt : [INFO ] 2016-08-30 15:02:41,001 Parameters: domain_dimensions = [32 32 32]
P000 yt : [INFO ] 2016-08-30 15:02:41,002 Parameters: domain_left_edge = [-0.256 -0.256 -0.256]
P000 yt : [INFO ] 2016-08-30 15:02:41,002 Parameters: domain_right_edge = [ 0.256 0.256 0.256]
P000 yt : [INFO ] 2016-08-30 15:02:41,003 Parameters: cosmological_simulation = 0.0
P000 yt : [INFO ] 2016-08-30 15:02:41,005 Gathering a field list (this may take a moment.)
  File "<string>", line 1, in <module>
  File "./user_script.py", line 21, in main
    print dd.quantities.total_quantity('AxImaginary_gradient_z')
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/derived_quantities.py", line 176, in __call__
    rv = super(TotalQuantity, self).__call__(fields)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/derived_quantities.py", line 67, in __call__
    sto.result = self.process_chunk(ds, *args, **kwargs)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/derived_quantities.py", line 182, in process_chunk
    for field in fields]
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 279, in __getitem__
  File "<string>", line 1, in <module>
  File "./user_script.py", line 21, in main
    print dd.quantities.total_quantity('AxImaginary_gradient_z')
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/derived_quantities.py", line 176, in __call__
    self.get_data(f)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 1291, in get_data
    self._generate_fields(fields_to_generate)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 1311, in _generate_fields
    fd = self._generate_field(field)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 316, in _generate_field
    tr = self._generate_fluid_field(field)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 334, in _generate_fluid_field
    rv = self._generate_spatial_fluid(field, ngt_exception.ghost_zones)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 364, in _generate_spatial_fluid
    gz[field][ngz:-ngz, ngz:-ngz, ngz:-ngz],
    rv = super(TotalQuantity, self).__call__(fields)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/derived_quantities.py", line 67, in __call__
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 279, in __getitem__
    sto.result = self.process_chunk(ds, *args, **kwargs)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/derived_quantities.py", line 182, in process_chunk
    for field in fields]
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 279, in __getitem__
    self.get_data(f)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/construction_data_containers.py", line 628, in get_data
    self.get_data(f)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 1291, in get_data
    self._generate_fields(fields_to_generate)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 1311, in _generate_fields
    fd = self._generate_field(field)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 316, in _generate_field
    tr = self._generate_fluid_field(field)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 334, in _generate_fluid_field
    rv = self._generate_spatial_fluid(field, ngt_exception.ghost_zones)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 364, in _generate_spatial_fluid
    if len(fill) > 0: self._fill_fields(fill)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/construction_data_containers.py", line 947, in _fill_fields
    gz[field][ngz:-ngz, ngz:-ngz, ngz:-ngz],
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 279, in __getitem__
    for chunk in ls.data_source.chunks(fields, "io"):
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 1190, in chunks
    self.get_data(f)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/construction_data_containers.py", line 628, in get_data
    self.get_data(fields)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 1279, in get_data
    fluids, self, self._current_chunk)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/geometry/geometry_handler.py", line 245, in _read_fluid_fields
    chunk_size)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/frontends/enzo/io.py", line 380, in _read_fluid_selection
    if len(fill) > 0: self._fill_fields(fill)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/construction_data_containers.py", line 947, in _fill_fields
    for chunk in ls.data_source.chunks(fields, "io"):
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 1190, in chunks
    self.get_data(fields)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/data_objects/data_containers.py", line 1279, in get_data
    fluids, self, self._current_chunk)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/geometry/geometry_handler.py", line 245, in _read_fluid_fields
    data_view = self.grids_in_memory[g.id][fname][self.my_slice].swapaxes(0,2)
P001 yt : [ERROR ] 2016-08-30 15:02:41,783 KeyError: 1
    chunk_size)
  File "/gfs2/work/nipbschw/yt-shared-smp/src/yt-hg/yt/frontends/enzo/io.py", line 380, in _read_fluid_selection
    data_view = self.grids_in_memory[g.id][fname][self.my_slice].swapaxes(0,2)
P000 yt : [ERROR ] 2016-08-30 15:02:41,784 KeyError: 2
[hsmp16:59147] 1 more process has sent help message help-mpi-btl-openib.txt / default subnet prefix
[hsmp16:59147] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[hsmp16:59147] 1 more process has sent help message help-mpi-runtime.txt / mpi_init:warn-fork
[hsmp16:59147] 1 more process has sent help message help-mpi-api.txt / mpi-abort
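Reading the two errors side by side (rank P000 failing on grid 2, rank
P001 failing on grid 1, i.e. each rank failing on the grid it does not
own), the symptom looks like each rank holding only its own grids in
grids_in_memory while the gradient's ghost-zone fill reaches for grids
owned by the other rank. A toy sketch of that failure mode, with entirely
hypothetical grid ids:

```python
# Toy illustration (hypothetical grid ids): each MPI rank keeps only its
# own in-memory grids, so a whole-domain read that reaches for another
# rank's grid raises exactly the KeyError seen in the log above.
rank0_grids_in_memory = {1: "data for grid 1"}  # rank 0 owns grid 1
rank1_grids_in_memory = {2: "data for grid 2"}  # rank 1 owns grid 2

def read_grid(grids_in_memory, grid_id):
    # Mirrors self.grids_in_memory[g.id][...] in the traceback.
    return grids_in_memory[grid_id]

try:
    read_grid(rank0_grids_in_memory, 2)  # rank 0 asks for rank 1's grid
except KeyError as err:
    print("KeyError:", err)  # mirrors "P000 ... KeyError: 2"
```

This is only an illustration of the symptom, not a claimed diagnosis of
the Enzo/yt coupling itself.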
Thank you very much for any helpful suggestions,
Bodo
Hi yt people,
I just updated to yt 3.3.1 to use all the new volume rendering features.
I am trying to render ejected density, which is a derived field, and to
draw the coordinate triads and domain boundary along with labels showing
the log range of ejected density; my code is attached. The code worked
fine with the default field 'density' (the attached figure was produced
for 'density'), but I get an error when the source field is set to
'ejected_density'. Also, the 'density' figure still does not show either
the coordinate triads or the domain boundary. I would appreciate your
help fixing the error, and thanks in advance.
Traceback (most recent call last):
  File "volume_render_test.py", line 117, in <module>
    text_annotate=[[(.1, 1.05), text_string]])
  File "/work/03858/thaque56/sw/yt-new-3.3/yt-conda/lib/python2.7/site-packages/yt/visualization/volume_rendering/scene.py", line 395, in save_annotated
    label = rs.data_source.ds._get_field_info(rs.field).get_label()
  File "/work/03858/thaque56/sw/yt-new-3.3/yt-conda/lib/python2.7/site-packages/yt/fields/derived_field.py", line 225, in get_label
    units = Unit(self.units, registry=self.ds.unit_registry)
AttributeError: 'NoneType' object has no attribute 'unit_registry'
Thanks,
Tazkera
Hi Nathan,
I have just updated to yt 3.3 and now I have a weird issue with my
scripts. I am uploading my script for slice plots, which should only
analyse the super3d_hdf5_plt_cnt files, but it also pulls in the
super3d_hdf5_part files located in the same path and produces the
following error.
yt : [INFO ] 2016-08-26 13:23:36,817 Particle file found: super3d_hdf5_part_0000
yt : [INFO ] 2016-08-26 13:23:36,840 integer runtime parameter checkpointfilenumber overwrites a simulation scalar of the same name
yt : [INFO ] 2016-08-26 13:23:36,862 Parameters: current_time = 0.0
yt : [INFO ] 2016-08-26 13:23:36,862 Parameters: domain_dimensions = [128 128 128]
yt : [INFO ] 2016-08-26 13:23:36,863 Parameters: domain_left_edge = [ -2.80000000e+10 -2.80000000e+10 -2.80000000e+10]
yt : [INFO ] 2016-08-26 13:23:36,863 Parameters: domain_right_edge = [ 2.80000000e+10 2.80000000e+10 2.80000000e+10]
yt : [INFO ] 2016-08-26 13:23:36,864 Parameters: cosmological_simulation = 0.0
yt : [INFO ] 2016-08-26 13:23:38,500 xlim = -28000000000.000000 28000000000.000000
yt : [INFO ] 2016-08-26 13:23:38,500 ylim = -28000000000.000000 28000000000.000000
yt : [INFO ] 2016-08-26 13:23:38,501 xlim = -28000000000.000000 28000000000.000000
yt : [INFO ] 2016-08-26 13:23:38,501 ylim = -28000000000.000000 28000000000.000000
yt : [INFO ] 2016-08-26 13:23:38,502 Making a fixed resolution buffer of (('flash', 'temp')) 800 by 800
yt : [INFO ] 2016-08-26 13:23:39,016 Making a fixed resolution buffer of (('flash', 'temp')) 800 by 800
yt : [INFO ] 2016-08-26 13:23:39,252 Saving plot Slice_z_temp/super3d_hdf5_plt_cnt_0000_Slice_z_temp.png
yt : [INFO ] 2016-08-26 13:23:39,885 Particle file found: super3d_hdf5_part_0001
Traceback (most recent call last):
  File "allSlices.py", line 115, in <module>
    ds=yt.load(filename)
  File "/work/03858/thaque56/sw/yt-new-3.3/yt-conda/lib/python2.7/site-packages/yt/convenience.py", line 86, in load
    return candidates[0](*args, **kwargs)
  File "/work/03858/thaque56/sw/yt-new-3.3/yt-conda/lib/python2.7/site-packages/yt/frontends/flash/data_structures.py", line 215, in __init__
    raise IOError('%s and %s are not at the same time.' % (self.particle_filename, filename))
IOError: /work/03858/thaque56/runs/run_176/super3d_hdf5_part_0001 and /work/03858/thaque56/runs/run_176/super3d_hdf5_plt_cnt_0001 are not at the same time.
[c558-502.stampede.tacc.utexas.edu:mpispawn_0][child_handler] MPI process (rank: 0, pid: 13788) exited with status 1
TACC: MPI job exited with code: 1
Please take a look at the error and let me know how to avoid the particle
files in this circumstance.
Best
Tazkera
Hi people,
I would like to upgrade yt to the latest version. I tried typing "yt
update" in a folder on the cluster (/work/03858/thaque56/sw/yt-3.1) and
it gives me the following error:
Traceback (most recent call last):
File "/work/03858/thaque56/sw/yt-3.1/yt-x86_64/bin/yt", line 9, in
<module>
load_entry_point('yt==3.2.3', 'console_scripts', 'yt')()
File "build/bdist.linux-x86_64/egg/pkg_resources/__init__.py", line 558,
in load_entry_point
File "build/bdist.linux-x86_64/egg/pkg_resources/__init__.py", line 2682,
in load_entry_point
File "build/bdist.linux-x86_64/egg/pkg_resources/__init__.py", line 2355,
in load
File "build/bdist.linux-x86_64/egg/pkg_resources/__init__.py", line 2361,
in resolve
File "/work/03858/thaque56/sw/yt-3.1/yt-x86_64/src/yt-hg/yt/__init__.py",
line 123, in <module>
from yt.data_objects.api import \
File
"/work/03858/thaque56/sw/yt-3.1/yt-x86_64/src/yt-hg/yt/data_objects/api.py",
line 16, in <module>
from .grid_patch import \
File
"/work/03858/thaque56/sw/yt-3.1/yt-x86_64/src/yt-hg/yt/data_objects/grid_patch.py",
line 19, in <module>
from yt.data_objects.data_containers import \
File
"/work/03858/thaque56/sw/yt-3.1/yt-x86_64/src/yt-hg/yt/data_objects/data_containers.py",
line 55, in <module>
from yt.utilities.parallel_tools.parallel_analysis_interface import \
File
"/work/03858/thaque56/sw/yt-3.1/yt-x86_64/src/yt-hg/yt/utilities/parallel_tools/parallel_analysis_interface.py",
line 33, in <module>
from yt.utilities.lib.quad_tree import \
ImportError: No module named quad_tree
and now I cannot execute any of my Python scripts there. Please help me
upgrade and fix the problem. Thanks,
Tazkera
Dear yt users,
I'm trying to load a set of particles (a large number of them, so I'm running it on the Hyades cluster). I have their positions and velocities. What I'm doing is defining a box as the domain of my simulation, building the data as a dictionary of velocities and positions, and calling yt.load_particles(data=data, bbox=bbox). This works for some of the particle files I'm using, but for others it crashes with the following error:
/pfs/sw/python/yt/yt-x86_64/src/yt-hg/yt/units/yt_array.py:793: RuntimeWarning: invalid value encountered in divide return YTArray(super(YTArray, self).__div__(ro))
/pfs/sw/python/yt/yt-x86_64/src/yt-hg/yt/utilities/math_utils.py:861: RuntimeWarning: invalid value encountered in arccos return np.arccos( JdotCoords / np.sqrt(np.sum(coords**2,axis=0)) )
I tried it on my computer with fewer particles, and it works perfectly. Am I loading the particles incorrectly, or is there a way to fix this error?
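For what it's worth, both warnings come from expressions that divide by
the norm of the particle coordinates (the arccos line is quoted from
math_utils.py above), so a particle sitting exactly at the origin of the
chosen frame, or a position containing a NaN, would trigger exactly these
messages. A small numpy sketch with hypothetical data:

```python
import numpy as np

# Hypothetical particle positions, shaped (3, n) with x, y, z rows:
# column 0 is an ordinary particle, column 1 sits exactly at the origin,
# and column 2 contains a NaN; the last two reproduce the warnings.
coords = np.array([[0.5, 0.0, np.nan],
                   [0.5, 0.0, 1.0],
                   [0.5, 0.0, 1.0]])
J = np.array([0.0, 0.0, 1.0])  # an assumed angular-momentum direction

# The same expression as in math_utils.py: arccos of J.r / |r|.
JdotCoords = np.tensordot(J, coords, axes=(0, 0))
angles = np.arccos(JdotCoords / np.sqrt(np.sum(coords**2, axis=0)))

print(np.isfinite(angles))  # [ True False False]
```

So a quick np.isfinite check on the positions (and on their norms) of the
crashing files might show whether the data, rather than the loading call,
is the problem.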
Thank you very much in advance
All the best,
Ariadna
----------------
Ariadna Murguia Berthier
University of California, Santa Cruz
Department of Astronomy and Astrophysics
Hi all,
I have a feeling that the answer to this is "nope, not easy or built-in,"
but I thought I should check before I go nuts staring at the source code. I
know that once I make a spectrum I can get the column density of all the
absorbers along a LightRay, and the b value. Is there a way to get the
size, or length, of each absorber? I see a figure doing this in the Egan
2014 paper, but don't see a clear way to ask for it in any of the calls,
nor do I see anything in Trident.
Thanks,
Stephanie
--
Dr. Stephanie Tonnesen
Alvin E. Nashman Postdoctoral Fellow
Carnegie Observatories, Pasadena, CA
stonnes(a)gmail.com