Does anyone out there have a technique for getting the variance out of
a profile object? A profile object is good at getting <X> vs. B, I'd
then like to get < (X - <X>)^2 > vs B. Matt and I had spitballed the
possibility some time ago, but I was wondering if anyone out there had
successfully done it.
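One workaround follows from the identity Var(X) = <X^2> - <X>^2 per bin: bin both X and X^2 with the same weight and combine. Here is a NumPy sketch of just that arithmetic (the profile object itself is left out; the function and variable names are illustrative, not yt API):

```python
import numpy as np

def binned_variance(b, x, weights, nbins=8):
    """Weighted per-bin mean and variance of x binned by b,
    using Var(X) = <X^2> - <X>^2 in each bin."""
    edges = np.linspace(b.min(), b.max(), nbins + 1)
    # digitize returns 1-based bin indices; clip keeps b.max() in the last bin
    idx = np.clip(np.digitize(b, edges) - 1, 0, nbins - 1)
    w = np.bincount(idx, weights=weights, minlength=nbins)
    wx = np.bincount(idx, weights=weights * x, minlength=nbins)
    wx2 = np.bincount(idx, weights=weights * x * x, minlength=nbins)
    mean = wx / w
    return mean, wx2 / w - mean ** 2
```

With a real profile, the same combination should work on a profile of the field and a profile of its square, taken with the same weight.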
Sent from my computer.
After a pretty long (few month) hiatus, I just tried to update my yt and
seem to have broken it.
Now my desktop is frozen so I can't cut and paste the long outputs, but it
boils down to:
The current version of the code is:
0be45301e0eb (yt) tip
This version CAN be automatically updated.
added 1740 changesets with 3520 changes to 355 files (+1 heads)
Updating the repository
126 files updated, 0 files merged, 0 files removed, 0 files unresolved
BROKEN: See /u/stonnes/Installs/yt-x86_64/src/yt-hg/yt_updater.log
So I looked at that
and the first thing it had was:
pulling from https://bitbucket.org/yt_analysis/yt/
(run 'hg update' to get a working copy)
So I ran hg update, and then it failed and said I had no module named
write_array.
When I start python2.7 and do
>>> from yt.mods import *
I get the same error (no module named write_array).
Any advice would be appreciated!
I am not even plotting using MATLAB or the like.
I just want to read in the data in Fortran to look at it in some detail,
so I think that my (j,i) switched way should work. Just to make sure.
On Aug 26, 2013, at 1:50 PM, yt-users-request(a)lists.spacepope.org wrote:
> Today's Topics:
> 1. Re: pc.add_phase_object(source,[x,y,z],?) (Geoffrey So)
> Message: 1
> Date: Sun, 25 Aug 2013 11:46:23 -0700
> From: Geoffrey So <gsiisg(a)gmail.com>
> To: Discussion of the yt analysis package
> Subject: Re: [yt-users] pc.add_phase_object(source,[x,y,z],?)
> Content-Type: text/plain; charset="iso-8859-1"
> I usually use something similar to this: bin the data with yt, then use
> matplotlib to make the plot of the 2D profile, for more customization
> options.
> If you compare the test1.....png and test2.png, you should see they are
> using the same data range; the 2D profile is transposed to switch x and y,
> with the origin set at (0,0).
> Hope it's something you can use.
> On Sun, Aug 25, 2013 at 9:34 AM, Renyue Cen <cen(a)astro.princeton.edu> wrote:
>> Hi Matt,
>> so if I just read in the data file filename like:
>> do j=1,50
>> do i=1,50
>> will that recover Z exactly?
>> On Aug 25, 2013, at 12:28 PM, Matthew Turk <matthewturk(a)gmail.com> wrote:
>>> Hi Renyue,
>>> Yup, it's column/row switched from what one might expect. But, the
>>> bins written out should correspond.
>>> On Sun, Aug 25, 2013 at 10:38 AM, Renyue Cen <cen(a)astro.princeton.edu>
>>>> When I do pc.add_phase_object(source, [x,y,z], ... = None, ..),
>>>> it seems that the QuantityZ written out in file filename is transposed
>>>> with respect to the x-y plane of the plot plotted by
>>>> pc.add_phase_object().
>>>> Has anyone encountered this?
>>>> If anyone wants script and actual file to help me, I can send to you.
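To illustrate the row/column switch discussed above, here is a small NumPy sketch using synthetic data only; `np.histogram2d` stands in for yt's 2D binning, and the shapes and names are illustrative, not the actual yt machinery:

```python
import numpy as np

# Synthetic stand-in for a yt 2D profile: histogram2d returns
# H[i, j] with i indexing x-bins and j indexing y-bins.
rng = np.random.default_rng(1)
x, y, mass = rng.random(5000), rng.random(5000), rng.random(5000)
H, xedges, yedges = np.histogram2d(x, y, bins=32, weights=mass)

# Image tools like imshow/pcolormesh treat the first axis as rows
# (vertical), so the array must be transposed to put x on the
# horizontal axis -- the same switch one sees when reading the
# written-out 2D array with nested (j, i) loops:
image = H.T

# With matplotlib (not required here), one would then do e.g.:
# import matplotlib.pyplot as plt
# plt.imshow(image, origin="lower",
#            extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]])
```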
I did some simple volume rendering with the following script:
volume2 = AMRKDTree(pf, fields=["Dark_Matter_Density"],
cam = pf.h.camera(c, L, W, N, tf, volume=volume2, no_ghost=False,
cam.snapshot(fn="%s_iso-DMdensity-%3.3d.png" % (filenameTHIS, j))
I got rather strange results in that the pictures look symmetric, which I am
pretty sure cannot be true.
I attach the obtained plot.
Note that I am using the KD-tree with 32 cores.
Your help at your earliest convenience is appreciated.
We're proud to release yt version 2.5.5. This is an unscheduled point
release that includes all bug fixes identified and fixed since the
release of 2.5.4 on July 2nd. We missed our scheduled release date a
bit, and have continued to push off a 2.6 as development on the 2.x
branch winds down.
Additions, changes and bug fixes include:
* New absorption spectrum analysis module with documentation
* Fix for volume rendering on the command line
* map_to_colormap will no longer return out-of-bounds errors
* Fixes for dds in covering grid calculations
* Library searching for build process is now more reliable
* Unit fix for "VorticityGrowthTimescale" field
* Pyflakes stylistic fixes
* Adding ability to draw lines with Grey Opacity in volume rendering
* Updated physical constants to reflect 2010 CODATA data
* Number density added to FLASH
* Dependency updates (including IPython 1.0)
* Many fixes for Athena frontend
* Radius and ParticleRadius now work for reduced-dimensionality datasets
* Source distributions now work again!
* Better notebook support for yt plots
* Athena data now 64 bits everywhere
* Grid displays on plots are now shaded to reflect the level of refinement
* show_colormaps() is a new function for displaying all known colormaps
* PhasePlotter by default now adds a colormap.
If you are using the stable branch of yt from an installation script,
you can upgrade using "yt update" or "yt update --all" to upgrade your
full dependency stack. If you are using the development branch, you
may already have these fixes. A tarball of this release has been
uploaded to the Python Package Index (PyPI).
yt releases often feature contributions from many individuals; this
release includes first time contributions from John Forbes, Noel
Scudder and Hilary Egan.
Documentation for this release can be found at:
Thanks very much,
Matt, on behalf of the yt development team
We're proud to announce the third ALPHA release of yt 3.0. yt has
recently transitioned to a time-based release plan (
https://ytep.readthedocs.org/en/latest/YTEPs/YTEP-0008.html ) and this
is the third scheduled alpha of 3.0, although we've grossly missed the
estimated date of July 15. No date for a "final" release has yet been
set, but there will likely be several more alphas before that time.
The yt 2.5 codebase, and further updates in the 2.x series, will be
supported for a considerable amount of time, and you do not need to
upgrade immediately.
= yt 3.0?! =
yt 3.0 represents a new direction forward for yt: getting rid of all
the underlying assumptions that data needs to be sectioned off into
nice little grid patches. This includes supporting Octree codes
natively (NMSU-ART and RAMSES), eventual support for SPH codes, and
even opaque data structures where the data is extremely large (ARTIO).
We're even planning support for natively handling cylindrical and
polar coordinates.
However, this *is* an alpha release. Not all of the existing codes
have been ported to 3.0.
Additionally, this release benefits from the technical and
non-technical contributions of many new people. yt is developed in
the context of a community of contributors, and with the push toward a
new architecture we aim to expand that community considerably.
= Getting It! =
To try out yt 3.0, you can now pull from the main yt repository,
update to the yt-3.0 branch, and rebuild your extensions. Or, if you
would like to create a new, safely sectioned off environment, simply
re-run the normal "development" install script after changing the
variable BRANCH to "yt-3.0".
If you would like to try out yt 3.0 and are having trouble, please
write to the yt-users mailing list for assistance.
The yt 3.0 install script, which can be obtained by executing these
commands, may also work:
= What's New? =
A demo notebook demonstrating much new functionality, and including
the full release notes, can be found here:
The bullet-pointed release notes can be found at the end of this
email, as well. The main improvements include *considerable* memory
usage reduction, ARTIO spatial data indexing support, better particle
support, an overhaul of the selection methods for data objects, and a
mechanism for on-the-fly definitions of particle types based on filters.
Additionally, this release was used in the production of the AGORA
project flagship paper. The analysis script used there can be seen at
this short URL:
and the paper can be found here: http://arxiv.org/abs/1308.2669
= Reporting Problems =
If you test out yt 3.0 we want to hear if it DID or DID NOT work!
Feedback is crucial at this time. yt-users and yt-dev are both good
forums for discussion, asking questions, and reporting problems. Lots
of things have changed on the backend, but we have attempted to
minimize the user-facing changes.
To report a bug please go here:
Note that you will not receive updates if you are not logged in when
you create the bug.
= What's Next? =
The next alpha release (3.0a4) will be released sometime this fall,
but development can be monitored either at
http://bitbucket.org/yt_analysis/yt-3.0 or in the main yt repository
under the named branch "yt-3.0". We hope to focus on generalizing
Octree support further, adding better non-Cartesian support, and
supporting generic smoothing kernel definitions.
If you'd like to participate in yt development, please stop by #yt on
irc.freenode.net ( http://yt-project.org/irc.html ) or yt-dev (
http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org ), or
submit a pull request on BitBucket.
Thanks very much,
Matt, on behalf of the yt development team
(Including 434 changesets from 7 contributors since 3.0a2)
* Fixes for volume rendering and porting of the Cython kD-tree. VR
now works for octree codes, albeit slowly.
* Major reorganization and simplification of selector code, including
* ARTIO now has spatial data support
* Major bug fix for FLASH IO. FLASH works again, following a
regression in 3.0a2.
* Merged with mainline yt development, bringing developments from yt
2.5.3 and 2.5.4 into 3.0.
* OWLS (Gadget-HDF5) data now updated to match functionality of other
frontends
* Creation of "arbitrary_grid" data selector for flexible particle
deposition operations. This enables creation of a grid of arbitrary
dimensions that can have particles smoothed or deposited within them.
* Conversion of all CIC operations to particle deposition operations.
This results in much more flexible particle type selection and speed
improvements.
* Refactoring of particle fields to enable simpler creation and
collection of particle fields. Particle fields defined for Eulerian
codes and Lagrangian codes are now interchangeable.
* New, currently unused parallel ring iteration. To be explored in
the future for smoothing kernels.
* Many improvements for NMSU-ART
* Fixed a crazy QuadTree deposition bug for projections that showed up
when projecting octree particle deposition results through a
restricted data object.
* Fixed a crazy off-by-one bug for binary Gadget IO.
* Enormous Octree diet.
* Oct leaf nodes now cost 256 bits each, which may eventually be
reduced to 192 bits. Previous versions were considerably larger.
* Octree traversal is strictly recursive, enabling
distributed-memory octrees to be implemented.
* RAMSES octrees are created by-domain instead of globally. This
will enable better parallelism in the future and faster ghost zone
generation when that is implemented.
* Particle octrees are now constructed via Z-curve generation.
This is considerably faster (10-20x) as well as much more
memory-conservative, and was designed to allow for parallel octree
construction in the future.
* Particle octrees are now indexed by coarse base-level indexing
of domain regions. This enables unsorted particle files to be indexed
for IO as well as spatial selection without mandating that leaf nodes
are fully-contained within a single file. Reduction in total oct leaf
count of ~8x for multi-file particle datasets.
* YTDataContainer no longer requires .shape and .size
* Only requested fields are plotted, not translated fields
* Reduce memory usage of spatially-chunked fields for patch datasets
* Added arbitrary particle loader, load_particles.
* Units fix for RAMSES boxlen!=1.0 datasets
* Particle filtering for dynamically-created particle types, similar
to derived fields
* Improve type consistency in TotalQuantity and other derived quantities.
* Fixed major error with ghost zone generation and uninitialized values
* Some resiliency for particle datasets that do not specify a domain
* Bugfix for 1- and 2-D Enzo datasets
* Improvements to the Tipsy frontend, including auto-detecting parameter files
* Enable specification of n_ref when creating particle dataset
* Enable particle deposition to back-act on particle fields
* Major fix for octree particle counting, resulting in smaller meshes
* Enable "source" specification for volume rendering
* Enable initial volume rendering of octree datasets
* Fixed error with RAMSES C/F ordering of data
* Many fixes related to field dependency calculation
* FLASH cylindrical & polar pixelization fix
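As background on the Z-curve item in the notes above: a Morton (Z-order) key interleaves the bits of integer cell coordinates, so sorting particles by the key groups them by octree cell. A minimal sketch of the idea (illustrative only, not yt's actual implementation):

```python
def morton_index(i, j, k, bits=21):
    """Interleave the bits of integer cell coordinates (i, j, k)
    into a single Z-order (Morton) key. Points that are close
    along the resulting 1D ordering tend to be close in 3D, which
    is what makes Z-curve-based octree construction fast and
    memory-friendly."""
    key = 0
    for b in range(bits):
        key |= ((i >> b) & 1) << (3 * b + 2)  # i in the high bit of each triple
        key |= ((j >> b) & 1) << (3 * b + 1)
        key |= ((k >> b) & 1) << (3 * b)
    return key
```

Sorting particles by `morton_index` of their coarse cell coordinates then yields the file-order-independent indexing described above.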
I'm new to yt. I have questions about making projections of a certain
region, and inspecting the data returned.
and inspecting the data returned.
I made a cube as follows:
import pylab as pl
center = [0.5, 0.5, 0.5]
halfwidth = 0.005
cube = pf.h.region(center, center-pl.ones(3)*halfwidth,
                   center+pl.ones(3)*halfwidth)
print min(cube['x']), max(cube['x'])
When I print cube['x'].shape, I get (10728,).
I tried to make a projection of this cube along z-axis as follows:
proj2 = pf.h.proj(2, 'Density', center=center, source=cube)
But I could not understand what is returned.
print min(proj2['px']), max(proj2['py']), proj2['px'].shape
0.00390625 0.99609375 (19648,)
[ 0. 0. 0. ..., 0. 0. 0.]
My questions are:
1. What exactly is returned in proj2? Why does it span from 0. to 1. when I
only selected a small region?
2. I also tried 'data_source=cube' instead of 'source=cube'. It did not
raise any error, and returned something different: an array of the same
shape, but with different data values. Is this keyword also used for the
projection object? Is there a consistent difference between the two
keywords in yt?
3. I could not find documentation on some fields like 'px', 'py' that seem
to be generated for certain data containers
(e.g., 't' for ray objects). Is there an exhaustive list of such fields in
the documentation?
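For what it's worth, my understanding (an assumption on my part, not from the docs) is that 'px'/'py' are the projected cell centers and 'pdx'/'pdy' their half-widths, and together with the field values they can be rasterized into an image. A pure-NumPy sketch of that idea, a simplified stand-in for yt's FixedResolutionBuffer:

```python
import numpy as np

def to_buffer(px, py, pdx, pdy, values, bounds, res):
    """Rasterize projected cells (centers px/py, half-widths
    pdx/pdy, one value per cell) into a res x res image buffer.
    The px/py/pdx/pdy convention assumed here is my reading of
    the projection fields, not documented yt API."""
    x0, x1, y0, y1 = bounds
    img = np.zeros((res, res))
    for cx, cy, hx, hy, v in zip(px, py, pdx, pdy, values):
        # map the cell's extent onto pixel index ranges, clipped to the buffer
        i0 = max(int((cx - hx - x0) / (x1 - x0) * res), 0)
        i1 = min(int(np.ceil((cx + hx - x0) / (x1 - x0) * res)), res)
        j0 = max(int((cy - hy - y0) / (y1 - y0) * res), 0)
        j1 = min(int(np.ceil((cy + hy - y0) / (y1 - y0) * res)), res)
        img[j0:j1, i0:i1] = v   # y indexes rows, x indexes columns
    return img
```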
Thanks in advance.
My name is Luki.
I'm new to yt.
I have a problem extracting an object from yt. I want to have a surface
object as an .obj or .ply file so that I can then transform it to .u3d for
a 3D PDF.
I have a data cube that I built this way:
from yt.mods import *
from numpy import *
Then, I would like to take a surface at a certain value in my data and
extract it to an .obj file:
sp = pf.h.sphere("center", (250, "pc"))
trans = 1.0
distf = 3.08e18
surf = pf.h.surface(sp, "Density", 0.0006)
surf.export_obj("myism.obj", transparency=trans, dist_fac = distf)
But I get a segmentation fault:
Segmentation fault: 11
I also tried export_ply, but I have the same problem :(
sp = pf.h.sphere("center", (250, "pc"))
surf = pf.h.surface(sp, "Density", 0.0006)
bounds = [(sp.center[i] - 125.0/pf['pc'], sp.center[i] + 125.0/pf['pc'])
          for i in range(3)]
Can you help me?
Thank you in advance
This past week I prepared some slides on how to deal with and report issues
that may come up in daily use of yt. The slides are available here:
The slides were based on a talk Cameron gave at the first yt workshop at
the FLASH center in January 2012 but I went ahead and updated them to
reflect current best practices. If you have a minute, please take a look.
PS: The slides for the yt-3.0 demo I gave are also available: