Howdy y'all,
I'm wondering if there is a system for deciding when something belongs in yt.lagos (like the HaloFinders) or in yt.extensions (like the HaloProfiler). For example, I'm thinking of adding a simple bit of code that will calculate the star formation history (Msol/year, for example) for a given set of stars. Would that go in extensions or in lagos? As best I can tell, extensions are more secondary, in the sense that they post-process already refined data, while lagos handles the raw data and refines it down.
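For concreteness, the sort of calculation I have in mind is just binning stellar mass by creation time; a minimal sketch in plain NumPy (the names here are placeholders, not anything already in yt):

import numpy as np

def star_formation_history(star_mass, creation_time, n_bins=50):
    # star_mass: star particle masses in Msol
    # creation_time: star particle creation times in years
    edges = np.linspace(creation_time.min(), creation_time.max(), n_bins + 1)
    mass_formed, _ = np.histogram(creation_time, bins=edges, weights=star_mass)
    dt = np.diff(edges)                  # width of each time bin, in years
    return edges[:-1], mass_formed / dt  # bin start times, SFR in Msol/yr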
Thanks!
_______________________________________________________
sskory(a)physics.ucsd.edu o__ Stephen Skory
http://physics.ucsd.edu/~sskory/ _.>/ _Graduate Student
________________________________(_)_\(_)_______________
Hi guys,
The slides from the SC09 Argonne Labs presentation on HPC Python are now up:
http://www.mcs.anl.gov/~wscullin/python/tut/sc09/Site/Introduction.html
It starts out basic, moves into some cool stuff they've done with GPAW
on BG/P machines, some profiling, mpi4py, and then into web
interfaces. It seems like a good overall introduction, though, and
there's a lot of overlap with what we do.
-Matt
Hi guys,
Just a heads up that we're going to push out a yt-1.6 point release
right around the New Year. It'll be taken from trunk plus a bit of hg,
which basically means:
* Bug fixes
* Stephen's new Parallel HOP
* Optimized Hierarchy
* (very inefficient) Direct ray caster
* Star particle analysis routines
* SEDs
* Fixes for plots, including normalized phase plots
* Collective communication in parallel routines
* amr_utils instead of the various Cython routines
(Have I missed anything else that's in trunk / hg but not 1.5?)
As you can see, most of the major, user-visible features for this
release are Stephen's, and he's already documented most of them.
Congratulations and thanks, Stephen! I'll work to document the
updates I'm responsible for, as well.
-Matt
Anybody at AAS might want to check this out.
---------- Forwarded message ----------
From: Robert Hurt <hurt(a)ipac.caltech.edu>
Date: Wed, Dec 16, 2009 at 5:20 PM
Subject: [astro-viz] AAS Splinter Meeting on Astro Visualization
Metadata/VAMP
To: astro-viz(a)yahoogroups.com
While it has been a bit quiet on the astronomy metadata front and the
Virtual Astronomy Multimedia Project (VAMP), the state of the art has
actually been progressing rapidly and today there are many new tools
to assist in adding astronomy-specific metadata to image galleries.
The January AAS meeting provides a perfect opportunity to bring
together anyone involved in astronomical visualization, imagery, and
outreach to discuss tools, techniques, and advancements in the use of
Astronomy Visualization Metadata (AVM).
So if you are going to be at the AAS, please do set aside time to join
us for the splinter session:
Tuesday, January 5th, 5:30-7:00 PM
Harding Room
The meeting will be an informal discussion between interested parties
with the goal of bringing everyone up to speed on what is being done,
which projects are starting to implement AVM, what kind of funding has
been granted, and what tools and workflows exist. If you have anything
to contribute, please come prepared to chat about any of these topics,
with or without slides. We will have a projector and screen for our
own use (we will not have to go through the AAS Speaker's Room).
As a quick teaser of some of the topics, here is an incomplete list of
some of the latest developments:
- CS4 compatible tagging tools
- Support in Microsoft's WorldWide Telescope (stand-alone and web
clients)
- New Spitzer website built on AVM specifications
- PinpointWCS tool from Chandra for recovering WCS data from reference
FITS images
For more about AVM/VAMP: www.virtualastronomy.org
Cheers,
Robert Hurt
| Dr. Robert L. Hurt | Spitzer Science Center/Public Affairs
| Caltech MS 220-6 | 626-395-1825 (office) 626-568-0673 (fax)
| Pasadena, CA 91125 | http://www.spitzer.caltech.edu
Hi all,
I've added code to yt.extensions.StarAnalysis (in hg, of course) to calculate the spectral flux and spectral energy distribution for a collection of stars. This is based on code from Ken Nagamine, and it uses the data tables provided by Bruzual & Charlot. You can pass it a data_source, or arrays containing the pertinent information (star mass, creation time and metallicity) and it will build you a spectrum.
The rub is in the B&C data tables. They are distributed as ASCII files from an unreliable foreign server. The data required to build a spectrum is at least 150 MB in ASCII format and takes a long time to download, and once you have it, reading the ASCII data in is very slow. I have converted the data to HDF5 files, which are nice and fast and only 72 MB in total. However, I'm not sure whether we can redistribute them as HDF5 files. I am going to email B&C and ask if we can, with proper acknowledgement, of course. Otherwise, I'll have to include my (currently ridiculously slow) conversion script along with instructions on how to use the spectrum code.
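For reference, the conversion boils down to something like this (just a sketch with placeholder file and dataset names, not my actual script):

import numpy as np
import h5py

def convert_ascii_to_hdf5(ascii_fn, hdf5_fn, dataset_name="bc_table"):
    # Parse the whitespace-delimited ASCII table once (slow)...
    data = np.loadtxt(ascii_fn)
    # ...and store it as a compressed HDF5 dataset, which reads back quickly.
    with h5py.File(hdf5_fn, "w") as f:
        f.create_dataset(dataset_name, data=data, compression="gzip")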
I'll let you guys know!
_______________________________________________________
sskory(a)physics.ucsd.edu o__ Stephen Skory
http://physics.ucsd.edu/~sskory/ _.>/ _Graduate Student
________________________________(_)_\(_)_______________
Hi everyone,
the proceedings for SciPy 2009 are out:
http://conference.scipy.org/proceedings/SciPy2009/
including a paper on FEMhub, a paper on Sherpa (used with Chandra),
and CorePy (an assembly language programming environment in Python).
There's also a Cython tutorial and a paper on fast computation with
Cython.
-Matt
Hi all,
I've changed the way haloes are sorted in HaloFinding.py in hg. Now they are sorted by mass, rather than by number of particles. I figure this makes more sense, especially when dealing with nested cosmology and different DM particle masses.
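Schematically, the only thing that changed is the sort key (the method names here are approximate; treat this as a sketch, not the actual diff):

# before: sort the halo list by number of particles
haloes.sort(key=lambda h: h.get_size())
# now: sort the halo list by total mass
haloes.sort(key=lambda h: h.total_mass())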
I hope no one finds this objectionable. Feel free to put me in my place if you feel it is!
_______________________________________________________
sskory(a)physics.ucsd.edu o__ Stephen Skory
http://physics.ucsd.edu/~sskory/ _.>/ _Graduate Student
________________________________(_)_\(_)_______________
Hi all,
I wanted to make you all aware of a few new features I just added to the devel repo.
For a given data_source, like 'sp = pf.h.sphere([0.5]*3, .2)', you can get the volume of the region in your favorite units with 'sp.volume('mpc')'. Calling it with no units, 'sp.volume()', returns the volume in code units.
I also added yt.extensions.StarAnalysis, which contains a StarFormationRate class; given a data_source with stars in it, it will calculate various star formation rates binned over time.
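A quick usage sketch of both, in case it helps; the dataset path is a placeholder, and the StarFormationRate arguments beyond the data_source are approximate, so check the source for the real signature:

from yt.mods import *
from yt.extensions.StarAnalysis import StarFormationRate

pf = EnzoStaticOutput("DD0040/DD0040")  # placeholder dataset path
sp = pf.h.sphere([0.5]*3, .2)           # sphere around the domain center

vol_mpc = sp.volume('mpc')              # volume in your favorite units (Mpc here)
vol_code = sp.volume()                  # the same volume in code units

sfr = StarFormationRate(pf, data_source=sp)  # star formation rates binned over time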
Let me know if you have any questions! And yes, I should eventually document the StarAnalysis stuff!
_______________________________________________________
sskory(a)physics.ucsd.edu o__ Stephen Skory
http://physics.ucsd.edu/~sskory/ _.>/ _Graduate Student
________________________________(_)_\(_)_______________
Hi all,
I wrote a simple script that finds the virial masses of DM-only haloes in yt. My thought is that this could easily be added to Britton's HaloProfiler and selected with a dm_only flag. What does everyone think?
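The basic idea is just to walk outward from the halo center until the mean enclosed density drops below some overdensity times the critical density. A stripped-down sketch (the overdensity value and argument names here are illustrative, not necessarily what the script uses):

import numpy as np

def virial_mass(radii, masses, rho_crit, overdensity=200.0):
    # radii: particle distances from the halo center
    # masses: the corresponding particle masses
    # rho_crit: critical density, in the same unit system
    order = np.argsort(radii)
    r = radii[order]
    m_enc = np.cumsum(masses[order])
    rho_mean = m_enc / (4.0 / 3.0 * np.pi * r**3)
    inside = rho_mean >= overdensity * rho_crit
    if not inside.any():
        return 0.0
    i = np.where(inside)[0][-1]   # outermost particle still above the threshold
    return m_enc[i]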
Thanks!
_______________________________________________________
sskory(a)physics.ucsd.edu o__ Stephen Skory
http://physics.ucsd.edu/~sskory/ _.>/ _Graduate Student
________________________________(_)_\(_)_______________
Hi everyone,
Stephen's latest update of his new parallel hop to check for uniqueness of
particle indices has shed light on another problem. In a recent simulation
run with the devel/trunk version of enzo, I have found the particle numbers
are no longer unique when star particles are generated. Originally, the star
particle indices were assigned by adding to the number of dark matter particles,
such that if you had N dm particles and M star particles, the dm particle
indices went from 0 to N-1 and the star particle indices went from N to N+M-1.
However, now it seems that the star particle indices simply go from 0 to M. Before
I go any further, I will mention that the star particles can be
differentiated from the dark matter particles by the particle_type data, so
we're not totally screwed. The point is that for the time being, hop no
longer works. The question is this: what is the best solution for dealing
with this? I am more than willing to write something to remap the particle
indices, but that doesn't fix the underlying problem. I have already done
something similar to this for a different application, so I can do it
quickly. Alternatively, hop could be modified to read in the particle type.
Clearly, this question is mainly for Stephen, but anyone should feel free to
weigh in. What is the best solution?
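To be concrete, the remapping I have in mind is just shifting the star indices up by the number of dm particles, keyed off of particle_type. A sketch (the star_type value here is a guess, so check it against the enzo source):

import numpy as np

def remap_particle_indices(indices, ptypes, star_type=2):
    # indices: the (currently non-unique) particle index array
    # ptypes: the particle_type array; star_type flags star particles
    new_indices = indices.copy()
    is_star = (ptypes == star_type)
    n_dm = (~is_star).sum()        # number of dark matter particles
    new_indices[is_star] += n_dm   # stars now occupy [N_dm, N_dm + M)
    return new_indices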
Britton