Hi all,
The yt module on Kraken and the one on Nautilus have both been updated
to yt 2.1. You can use these in lieu of an install script.
On Nautilus: just run "module load yt" and it should default to
yt/2.1. The 1.7 install is still there and can be loaded using module
load yt/1.7
On Kraken run:
module swap PrgEnv-pgi PrgEnv-gnu
module load yt
Thanks very much to Harinarayan Krishnan, who maintains these modules.
If you run into any problems, let me know!
Best,
Matt
Hi everyone,
I would like to better understand how the volume rendering works.
It is explained here
http://yt.spacepope.org/doc/visualizing/volume_rendering.html
that the user defines transfer functions in terms of RGB values.
From the description of the add_gaussian function, I understand
that these RGB values describe the color value in the interval
[0,1]. Now, in the radiative transfer equation on the above
website, the emissivity gets multiplied by the path length
delta s. I am now wondering how this works: Depending on how
big the step size is, one could get extremely large or extremely
small intensities that are essentially unrelated to the RGB
values that were previously specified. How is it possible that,
for example, the color of a density isosurface depends on the
density only and not on the cell size? I guess I am missing
something.
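To make the question concrete, here is a toy sketch (purely illustrative, not yt code) of the sum I have in mind when I picture the integration:

```python
# Toy sketch: accumulate emissivity * delta_s along a ray.
# For a fixed emissivity, this is a Riemann sum of the integral
# over the path, so refining the step size should not change
# the accumulated intensity much.
def integrate_emission(emissivity, length, n_steps):
    ds = length / float(n_steps)
    return sum(emissivity * ds for _ in range(n_steps))

coarse = integrate_emission(2.0, 1.0, 10)    # 10 large steps
fine = integrate_emission(2.0, 1.0, 1000)    # 1000 small steps
```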
Cheers,
Maike
Geoffrey,
I think we've stumbled upon a h5py bug! Without getting into gritty
details, the LoadHaloes() code actually opens, closes, and then
re-opens the h5 file when data is read in. For some reason, the
re-open wasn't working, and the file was still marked as closed, and
therefore trying to read data from it wasn't working. I've added a
band-aid of sorts which seems to fix the issue. Go ahead and update
and let me know if you continue to have problems.
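The failure mode can be illustrated with ordinary Python file handles (a toy sketch of the open/close/re-open pattern, not the actual h5py or LoadHaloes() code):

```python
import os
import tempfile

# Toy sketch: a handle that has been closed must actually be re-opened
# before reading; the band-aid simply checks the handle's state first.
path = os.path.join(tempfile.mkdtemp(), "halos.txt")
with open(path, "w") as f:
    f.write("halo data")

f = open(path)
f.close()           # the code closes the file here...
if f.closed:        # ...so before reading, check the state and
    f = open(path)  # re-open if the earlier re-open didn't take
data = f.read()
f.close()
```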
(For those of you who get this twice, sorry! I sent the reply to the
wrong yt list the first time.)
--
Stephen Skory
s(a)skory.us
http://stephenskory.com/
510.621.3687 (google voice)
Hi Stephen,
I was trying the two examples shown under "Loading Haloes Off Disk" in
the documentation. With both I can do:
haloes[0].center_of_mass()
to find the xyz center-of-mass coordinate of the 0th halo. But when I
tried to get the particle positions with
haloes[0]["particle_position_x"]
I succeeded with the first method, while the data is still in memory,
but failed with the second method, when it is loaded off disk. Can you
verify this? It would save a lot of time if I could get the particle
positions without having to run the profiler every time.
error pasted at:
http://paste.enzotools.org/show/NyLEwvGzl2RGBOcAcfKH
From
G.S.
Hi folks,
I'm looking at the spin of some objects and have a few questions about the
quantities BaryonSpinParameter, and ParticleSpinParameter. I've only measured
the spin of DM particles in the past, which is straightforward compared to the
baryons. In looking at the code (posted below from derived_quantities.py)
however, I don't think either of the two functions actually calculates the spin
parameter as usually defined: spin = |J| sqrt(|E_tot|) / (G M^(5/2)).
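Written out as code, the usual definition looks like this (a minimal sketch in cgs units; spin_parameter is a hypothetical helper, with E_tot being the total energy):

```python
import math

G = 6.67e-8  # gravitational constant, cm^3 g^-1 s^-2 (cgs)

def spin_parameter(j_mag, e_tot, mass):
    # lambda = |J| * sqrt(|E_tot|) / (G * M^(5/2)),
    # with J, E_tot, and M in cgs units
    return j_mag * math.sqrt(abs(e_tot)) / (G * mass ** 2.5)
```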
In both the Baryon and Particle Spin Parameter functions, the energy
calculated is only the kinetic energy (and it's missing a factor of 0.5),
whereas the definition of spin uses the total energy -- for DM this is
just the potential plus kinetic energy; for baryons, I'm assuming it is
potential + kinetic + internal energy. Neither function includes the
potential energy or the internal energy.
Also, it looks like the second function (ParticleSpinParameter) really
just calculates the spin parameter of the DM particles (or star
particles, I guess), so it may make more sense not to include the gas
mass in the first line, and then re-word the short description to say
that it really calculates only the particle spin, ignoring any gas.
If I'm wrong, please let me know -- otherwise, I'd be happy to submit a modified
version of the two functions.
Cheers,
Andrew
def _BaryonSpinParameter(data):
    """
    This function returns the spin parameter for the baryons, but it uses
    the particles in calculating enclosed mass.
    """
    m_enc = data["CellMassMsun"].sum() + data["ParticleMassMsun"].sum()
    am = data["SpecificAngularMomentum"] * data["CellMassMsun"]
    j_mag = am.sum(axis=1)
    e_term_pre = na.sum(data["CellMassMsun"] * data["VelocityMagnitude"]**2.0)
    weight = data["CellMassMsun"].sum()
    return j_mag, m_enc, e_term_pre, weight
def _combBaryonSpinParameter(data, j_mag, m_enc, e_term_pre, weight):
    # Because it's a vector field, we have to ensure we have enough dimensions
    if len(j_mag.shape) < 2: j_mag = na.expand_dims(j_mag, 0)
    W = weight.sum()
    M = m_enc.sum()
    J = na.sqrt(((j_mag.sum(axis=0))**2.0).sum()) / W
    E = na.sqrt(e_term_pre.sum() / W)
    G = 6.67e-8  # cm^3 g^-1 s^-2
    spin = J * E / (M * 1.989e33 * G)
    return spin

add_quantity("BaryonSpinParameter", function=_BaryonSpinParameter,
             combine_function=_combBaryonSpinParameter, n_ret=4)
def _ParticleSpinParameter(data):
    """
    This function returns the spin parameter for the baryons, but it uses
    the particles in calculating enclosed mass.
    """
    m_enc = data["CellMassMsun"].sum() + data["ParticleMassMsun"].sum()
    am = data["ParticleSpecificAngularMomentum"] * data["ParticleMassMsun"]
    if am.size == 0: return (na.zeros((3,), dtype='float64'), m_enc, 0, 0)
    j_mag = am.sum(axis=1)
    e_term_pre = na.sum(data["ParticleMassMsun"]
                        * data["ParticleVelocityMagnitude"]**2.0)
    weight = data["ParticleMassMsun"].sum()
    return j_mag, m_enc, e_term_pre, weight

add_quantity("ParticleSpinParameter", function=_ParticleSpinParameter,
             combine_function=_combBaryonSpinParameter, n_ret=4)
*************************************************************
** Andrew J. Davis andrew.davis(a)yale.edu **
** Dept. of Astronomy 203-432-5119 **
** Yale University www.astro.yale.edu/adavis **
*************************************************************
Hi all,
Does anyone know what the covering grid does if the cell size on the covering grid is larger than the cells on the AMR level? Is an average taken?
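In 1-D terms, the behavior I am asking about would look like this (a toy sketch of plain averaging, not yt code):

```python
# Toy 1-D sketch: if one coarse covering-grid cell spans four fine AMR
# cells of equal size, is its value just the average of the four?
fine_values = [1.0, 3.0, 5.0, 7.0]
coarse_value = sum(fine_values) / len(fine_values)
```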
Best,
John Z
Hi All,
I've just updated the common install of yt on Kraken to 2.1, and added
an install of the unstable branch. Both are full copies of yt kept on
Lustre in a non-volatile location. This means they are accessible from
both the login nodes and the compute nodes but, unlike most files on
Lustre, will not be deleted automatically.
If you wish to use yt 2.1, which will only see changes for bug fixes,
please set your environment using these settings:
YT_DEST => /lustre/scratch/proj/yt_common/2.1
PATH => /lustre/scratch/proj/yt_common/2.1/bin/
PYTHONPATH =>
/lustre/scratch/proj/yt_common/2.1/lib/python2.7/site-packages/
LD_LIBRARY_PATH => /lustre/scratch/proj/yt_common/2.1/lib/
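For bash users, one way to apply the 2.1 settings above (a sketch; adjust if your shell differs):

```shell
# apply the yt 2.1 common-install settings (bash syntax)
export YT_DEST=/lustre/scratch/proj/yt_common/2.1
export PATH=$YT_DEST/bin:$PATH
export PYTHONPATH=$YT_DEST/lib/python2.7/site-packages:$PYTHONPATH
export LD_LIBRARY_PATH=$YT_DEST/lib:$LD_LIBRARY_PATH
```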
If you wish to use the unstable branch, which I'll update on Kraken
when I remember to do it (or you can bug me if I haven't):
YT_DEST => /lustre/scratch/proj/yt_common/unstable
PATH => /lustre/scratch/proj/yt_common/unstable/bin/
PYTHONPATH =>
/lustre/scratch/proj/yt_common/unstable/lib/python2.7/site-packages/
LD_LIBRARY_PATH => /lustre/scratch/proj/yt_common/unstable/lib/
The older versions of yt at /lustre/scratch/proj/yt_common/2.0 and
/lustre/scratch/proj/yt_common/trunk will be deleted within a week or
so.
--
Stephen Skory
s(a)skory.us
http://stephenskory.com/
510.621.3687 (google voice)
Hi all,
I'm trying to run yt in parallel on ranger. I gather from previous
messages to this list that others have had issues in the past, but I
haven't been able to find something that works from those posts.
I'm trying to run a test script that does a simple projection.
First I tried this (from the yt docs):
ibrun mpirun -np 8 python2.7 test_parallel_yt.py --parallel
I got this output in the output file:
TACC: Setting memory limits for job 1896644 to unlimited KB
TACC: Dumping job script:
--------------------------------------------------------------------------------
# Submit this script using the "qsub" command.
# Use the "qstat" command to check the status of a job.
#
#$-l h_rt=00:05:00
#$-pe 8way 16
#$-N test_parallel_yt
#$-o $JOB_NAME.o$JOB_ID
#$-q development
#$-M csimpson(a)astro.columbia.edu
#$-m be
#$-V
#$-cwd
ibrun mpirun -np 8 python2.7 test_parallel_yt.py --parallel
--------------------------------------------------------------------------------
TACC: Done.
TACC: Starting up job 1896644
TACC: Setting up parallel environment for MVAPICH ssh-based mpirun.
TACC: Setup complete. Running job script.
TACC: starting parallel tasks...
Warning: Command line arguments for program should be given
after the program name. Assuming that test_parallel_yt.py is a
command line argument for the program.
[the warning above is repeated once for each of the 8 tasks]
Warning: Command line arguments for program should be given
after the program name. Assuming that --parallel is a
command line argument for the program.
Missing: program name
Program python2.7 either does not exist, is not
executable, or is an erroneous argument to mpirun.
Warning: Command line arguments for program should be given
after the program name. Assuming that --parallel is a
command line argument for the program.
TACC: MPI job exited with code: 1
TACC: Shutting down parallel environment.
TACC: Shutdown complete. Exiting.
TACC: Cleaning up after job: 1896644
TACC: Done.
I also tried this:
ibrun mpi4py -np 8 python2.7 test_parallel_yt.py --parallel
and got this output:
TACC: Setting memory limits for job 1896674 to unlimited KB
TACC: Dumping job script:
--------------------------------------------------------------------------------
#!/bin/sh
#
# Submit this script using the "qsub" command.
# Use the "qstat" command to check the status of a job.
#
#$-l h_rt=00:05:00
#$-pe 8way 16
#$-N test_parallel_yt
#$-o $JOB_NAME.o$JOB_ID
#$-q development
#$-M csimpson(a)astro.columbia.edu
#$-m be
#$-V
#$-cwd
ibrun mpi4py -np 8 python2.7 test_parallel_yt.py --parallel
--------------------------------------------------------------------------------
TACC: Done.
TACC: Starting up job 1896674
TACC: Setting up parallel environment for MVAPICH ssh-based mpirun.
TACC: Setup complete. Running job script.
TACC: starting parallel tasks...
TACC: MPI job exited with code: 1
TACC: Shutting down parallel environment.
TACC: Shutdown complete. Exiting.
TACC: Cleaning up after job: 1896674
TACC: Done.
Any ideas? I guess in the first instance it is not finding python, but
the test script I run works fine on the command line, and the -V flag
should import the same environment settings, right? I guess in the
second instance I'm using the wrong MPI call; I found that invocation in
some old posts to the email list.
Christine
We are proud to announce the release of yt version 2.1. This release
includes several new features, bug fixes, and numerous improvements to the
code base and documentation. At the yt homepage, http://yt.enzotools.org/ ,
an installation script, a cookbook, documentation and a guide to getting
involved can be found.
yt is an analysis and visualization toolkit for Adaptive Mesh Refinement
data. yt provides full support for Enzo, Orion, and FLASH codes, with
preliminary support for RAMSES, ART, Chombo, CASTRO and MAESTRO codes. It
can be used to create many common types of data products such as:
* Slices
* Projections
* Profiles
* Arbitrary Data Selection
* Cosmological Analysis
* Halo finding
* Parallel AMR Volume Rendering
* Gravitationally Bound Objects Analysis
There are a few major additions since yt-2.0 (Released January 17, 2011),
including:
* Streamlines for visualization and querying
* A treecode implementation to calculate binding energy
* Healpix / all-sky parallel volume rendering
* A development bootstrap script, for getting going with modifying and
contributing
* CASTRO particles
* Time series analysis
Documentation: http://yt.enzotools.org/doc/
Installation:
http://yt.enzotools.org/doc/advanced/installing.html#installing-yt
Cookbook: http://yt.enzotools.org/doc/cookbook/recipes.html
Get Involved:
http://yt.enzotools.org/doc/advanced/developing.html#contributing-code
If you can’t wait to get started, install with:
$ wget http://hg.enzotools.org/yt/raw/stable/doc/install_script.sh
$ bash install_script.sh
Development has been sponsored by NSF, DOE, and university funding. We
invite you to get involved with developing and using yt!
Please forward this announcement to interested parties.
Sincerely,
The yt development team:
David Collins
Oliver Hahn
Cameron Hummels
Stefan Klemer
Chris Malone
Christopher Moody
Andrew Myers
Jeff Oishi
Britton Smith
Devin Silvia
Sam Skillman
Stephen Skory
Matthew Turk
John Wise
John ZuHone
Hi yt users,
I would like to know how to make slice, projection, or volume rendering
plots of derived fields.
For example, my Enzo simulation output only has gas_energy (no
temperature field in the grid data), and I would like to make a
temperature map; simply writing add_slice("Temperature", 0) causes an
error.
Is there any way this visualization can be done with a derived field?
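What I have in mind, in plain Python (a toy sketch of the derived-field idea, NOT the yt API; the field names and the conversion factor are hypothetical):

```python
# Toy sketch: register a function under a field name, then compute the
# field on demand from fields that do exist in the output.
field_registry = {}

def add_field(name, function):
    field_registry[name] = function

def get_field(data, name):
    if name in data:
        return data[name]
    return field_registry[name](data)  # derived: computed when requested

# hypothetical conversion: temperature proportional to gas energy
add_field("Temperature",
          lambda d: [(2.0 / 3.0) * e for e in d["gas_energy"]])

cell_data = {"gas_energy": [1.5, 3.0]}
temps = get_field(cell_data, "Temperature")
```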
Thank you,
Junhwan
--
--------------------------------------------------------------
Jun-Hwan Choi, Ph.D.
Department of Physics and Astronomy, University of Kentucky
Tel: (859) 897-6737 Fax: (859) 323-2846
Email: jhchoi(a)pa.uky.edu URL: http://www.pa.uky.edu/~jhchoi
--------------------------------------------------------------