Hi, Everybody!
Does anyone out there have a technique for getting the variance out of
a profile object? A profile object is good at getting <X> vs. B; I'd
then like to get <(X - <X>)^2> vs. B. Matt and I had spitballed the
possibility some time ago, but I was wondering if anyone out there had
successfully done it.
Thanks,
d.
--
Sent from my computer.
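For reference, the quantity being asked about follows from the identity
<(X - <X>)^2> = <X^2> - <X>^2, so a variance profile can be assembled from
two binned averages. Here is a plain-NumPy sketch of that idea (illustrative
only, with made-up data; this is not yt's profile API):

```python
import numpy as np

# Binned variance via Var(X) = <X^2> - <X>^2, mimicking what a profile
# does internally: bin X by B, average X and X**2 in each bin, subtract.
rng = np.random.default_rng(0)
B = rng.uniform(0.0, 1.0, 10000)          # the binning field
X = B + 0.1 * rng.standard_normal(10000)  # the profiled field

bins = np.linspace(0.0, 1.0, 11)
idx = np.digitize(B, bins) - 1            # bin index of each sample
counts = np.bincount(idx, minlength=10)
mean_x  = np.bincount(idx, weights=X,    minlength=10) / counts
mean_x2 = np.bincount(idx, weights=X**2, minlength=10) / counts
var_x = mean_x2 - mean_x**2               # <(X - <X>)^2> per bin of B
```

In yt terms, one could profile a derived field holding X**2 alongside X and do
the same subtraction on the resulting arrays.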
Hi,
I did some simple volume rendering with the following script:
volume2 = AMRKDTree(pf, fields=["Dark_Matter_Density"],
                    no_ghost=False, tree_type="domain",
                    le=c-0.5*WW, re=c+0.5*WW)
cam = pf.h.camera(c, L, W, N, tf, volume=volume2, no_ghost=False,
                  north_vector=L, steady_north=True)
cam.snapshot(fn="%s_iso-DMdensity-%3.3d.png" % (filenameTHIS, j))
I got rather strange results: the pictures look symmetric, which I am pretty
sure cannot be true.
I attach the plot I obtained.
Note that I am using the KD tree with 32 cores.
Your help at your earliest convenience would be appreciated.
Best,
Renyue
Hi all,
I'm attempting to visualize multiple variables in the same volume
rendering. For instance, I'd like to show some density surfaces of a star
and then overlay temperature to show hotspots. I found
MultiVariateTransferFunction, but it is unfortunately not particularly well
documented yet, and I'm being repeatedly thwarted. I'm not sure whether I've
set it up correctly, or how to go from the transfer function to an image.
Here's my current script, using just one variable for now:
#----------------------------------
from yt.mods import *
pf = load('plt00100')
#<some constants>
pf.h  # instantiate the hierarchy
pf.field_info['density'].take_log = True
mv = MultiVariateTransferFunction()
tf = TransferFunction((mi-1, ma+1), nbins=1.0e6)
tf.add_gaussian(np.log10(9.0e5), 0.01, 1.0)
mv.add_field_table(tf, 0)
mv.link_channels(0, [0,1,2,3])
c = [5.0e9, 5.0e9, 5.0e9]
L = [0.15, 1.0, 0.40]
W = wi*0.7
Nvec = 1024
cam = pf.h.camera(c, L, W, (Nvec, Nvec), transfer_function=mv,
                  fields=['density'], pf=pf, no_ghost=True)
im = cam.snapshot(num_threads=4)
im.write_png('plt00100.png')
#----------------------------------
I checked to make sure density is indeed my 0-index field. I get the
following error after ray casting:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "multivol.py", line 153, in <module>
    im = cam.snapshot(num_threads=4)
  File "/home/noel/shocks/yt-x86_64/src/yt-hg/yt/visualization/volume_rendering/camera.py", line 742, in snapshot
    image, sampler),
  File "/home/noel/shocks/yt-x86_64/src/yt-hg/yt/visualization/volume_rendering/camera.py", line 632, in _render
    image = self.finalize_image(image)
  File "/home/noel/shocks/yt-x86_64/src/yt-hg/yt/visualization/volume_rendering/camera.py", line 611, in finalize_image
    if self.transfer_function.grey_opacity is False:
AttributeError: 'MultiVariateTransferFunction' object has no attribute 'grey_opacity'
Can I not use snapshot in this case, or is something else the matter
entirely?
Thanks,
Noel Scudder
Hi all,
I've just issued a PR that will hopefully fix a whole class of buggy
behavior that both new and experienced yt users commonly run into.
Specifically, I'd like it if we could turn off data serialization by
default. This changes a long-lived default value in yt's configuration, so
I wanted to bring this change to the attention of both the yt user and
developer community.
What is data serialization? Currently, yt will save the result of certain
expensive calculations, including projections, the structure of the grid
hierarchy, and the list of fields present in the data. While this does
have the beneficial effect of saving time when a user needs to repetitively
calculate these quantities on the same dataset, it has a number of features
which lead to buggy, annoying behavior.
Specifically, if I am developing my simulation code or repeatedly
restarting it while searching for a way to grind past a crash, I will
quite often regenerate the same simulation output file over and over,
changing a line of code or switching out the value of a parameter each
time.
If yt's data serialization is turned on, it's likely that yt's
visualizations will correspond to old versions of the data file. Since
only certain operations are serialized, it's also possible for yt to get
into an inconsistent state - one operation will show the current data file,
while another operation will show an old version.
It's possible to fix a bug in your code, but because yt is still loading
the old data, you won't be able to tell that your bug is fixed until you
realize that you have .yt and .harrays files littering your filesystem.
I've personally wasted a lot of time due to yt's serialization 'feature'
and denizens of our IRC channel and mailing list can attest to how often
new users run into this behavior as well.
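As a stopgap until the default changes, the stale serialization products
mentioned above can be removed by hand. A sketch, assuming they sit in the
current directory next to your data (double-check what matches before
deleting):

```shell
# Remove yt's cached serialization files (.yt and .harrays) from the
# current directory; yt will simply regenerate them on the next run.
find . -maxdepth 1 \( -name '*.yt' -o -name '*.harrays' \) -type f -delete
```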
My pull request only turns off serialization by default; it doesn't
disable the capability completely. Once the pull request is merged in, you
can turn serialization back on either by adding an entry to your config file:
$ cat ~/.yt/config
[yt]
serialize = True
Or on a per-script basis:
from yt.config import ytcfg
ytcfg['yt', 'serialize'] = 'True'
from yt.mods import *
The pull request is here:
https://bitbucket.org/yt_analysis/yt/pull-request/558
I know several of you are big fans of this feature, so if you object to
this change please leave a comment on the pull request so we can figure out
a way forward.
-Nathan
Hello yt-users,
I'm currently attempting to build yt on Scinet GPC cluster, where the
Python module can only be loaded alongside Intel and gcc combined.
Compiling with the Intel modules leads to a mercurial compile error. I've
attempted to build yt with only the gcc module active, and have achieved a
successful build, and yt sources without any errors. Attempting
to run ipython or yt serve, however, gives me this error:
ImportError: libimf.so: cannot open shared object file: No such file or
directory
Is there a workaround to this issue? Assistance would be greatly
appreciated!
Best,
Charles
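A commonly suggested workaround for this class of error is to make the Intel
compiler runtime visible to the loader, since libimf.so is Intel's math
library. The paths below are illustrative only and site-specific; adjust them
to your cluster's install:

```shell
# libimf.so ships with the Intel compiler runtime. Either load the Intel
# module in the shell where yt runs:
module load intel
# ...or point the dynamic loader at its lib directory directly
# (example path; substitute your site's actual Intel install location):
export LD_LIBRARY_PATH=/opt/intel/lib/intel64:$LD_LIBRARY_PATH
```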
After defining a disk
my_disk=pf.h.disk(sc, [0,0,1], 450.0*pc, 30.0*pc)
I asked for the minimum and maximum values of these fields
xmin = my_disk["x"].min()
xmax = my_disk["x"].max()
ymin = my_disk["y"].min()
ymax = my_disk["y"].max()
zmin = my_disk["z"].min()
zmax = my_disk["z"].max()
print xmin, xmax, ymin, ymax, zmin, zmax
and got
0.00390625 0.99609375 0.00390625 0.99609375 0.00390625 0.99609375
i.e. xmin + xmax = ymin + ymax = zmin + zmax = 1.
So I really don't know what these fields actually are.
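For what it's worth, those numbers are consistent with "x", "y", "z" being
cell-center coordinates in code units on the unit domain: for a grid of N
cells spanning [0, 1], the outermost cell centers sit half a cell width in
from each edge. A sketch (N = 128 is an assumption inferred from the printed
values, since 0.5/128 = 0.00390625):

```python
import numpy as np

# Cell-center coordinates of N equal cells spanning the unit domain.
# With N = 128 the first center is 0.5/128 = 0.00390625 and the last is
# 1 - 0.5/128 = 0.99609375 -- exactly the min/max values printed above.
N = 128
centers = (np.arange(N) + 0.5) / N
print(centers.min(), centers.max())  # 0.00390625 0.99609375
```

The symmetry xmin + xmax = 1 then just reflects the domain being centered on
0.5; the disk selector returns cells, and the coordinate fields report their
centers.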
Hello all,
I'm trying to access the density for a cell and its six neighbors (in the
x,y,z directions), but I can't figure out how I can get the information for
the neighboring cells. The problem is that dd[ 'NumberDensity' ] is a
one-dimensional array, and I don't know the original 3d shape of the data,
so I can't reshape the array into three dimensions. Is there a way to get
the indices of a cell's neighbors?
Thanks for any help!
Morgan
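If the grid dimensions can be recovered (for example, per grid rather than
from the flat `dd` container), the flat array can be reshaped and all six
face neighbors addressed at once with `np.roll`. A plain-NumPy sketch with a
hypothetical 4x4x4 grid (note `np.roll` wraps periodically at the domain
edges, which may or may not match your boundary conditions):

```python
import numpy as np

# Assumption: you know the 3D shape of the grid the flat field came from.
dims = (4, 4, 4)                       # hypothetical grid shape
dens = np.arange(np.prod(dims), dtype=float).reshape(dims)

# Six face neighbors of every cell, gathered with periodic wrap-around:
neighbors = {
    "+x": np.roll(dens, -1, axis=0),   # value at (i+1, j, k)
    "-x": np.roll(dens,  1, axis=0),   # value at (i-1, j, k)
    "+y": np.roll(dens, -1, axis=1),
    "-y": np.roll(dens,  1, axis=1),
    "+z": np.roll(dens, -1, axis=2),
    "-z": np.roll(dens,  1, axis=2),
}
```

So `neighbors["+x"][i, j, k]` is the density of the cell one step in +x from
cell (i, j, k).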
Hey Guys,
Quick question on off-axis slices: is the normal vector pointed into
the page as you look at the slice, or out of the page?
thanks!
Munier
--
Munier A. Salem // 845.489.6450
Hi,
I'm getting this error during the installation and I'm not sure exactly why.
Could you take a look at this? I just downloaded the 3.0a2 package
publicly available on Bitbucket to try visualizing my SPH snapshots. Any help
would be greatly appreciated.
Best,
Keita
Error compiling Cython file:
------------------------------------------------------------
...
cdef void *tree_root
cdef int num_root
cdef int max_root
cdef extern from "search.h" nogil:
void *tsearch(const void *key, const void **rootp,
^
------------------------------------------------------------
yt/geometry/oct_container.pxd:81:29: Expected ')', found '*'
error: 2 errors while compiling 'yt/geometry/oct_container.pyx' with Cython
Hi all,
I'm trying to run some simple analysis on some FLASH runs I've done on Kraken, and it looks like the yt/dev module is stuck somewhere between versions 2.3 and 2.4. Specifically, it looks like it predates the addition of SlicePlot, since SlicePlot appears in the 2.4 docs but not the 2.3 docs. Unsurprisingly, I get errors from my plotting script:
NameError: name 'SlicePlot' is not defined
"yt instinfo" gives me:
yt module located at:
/lustre/scratch/proj/sw/yt/dev/lib/python2.7/site-packages/yt-2.4dev-py2.7-linux-x86_64.egg
I haven't tried installing yt to my home directory since I'm worried about space issues (we only have 2G total). Is it possible to update the yt/dev module or add a yt/2.5 module?
Thanks,
Kevin