
Hi Dave,
If you wanted to add this, you would modify:
yt/data_objects/profiles.py
yt/utilities/data_point_utilities.c
You'd probably be able to copy/paste most of the existing code to pass in the additional arguments, but you'd want to pass extra arrays for the necessary storage to the functions Py_Bin1DProfile, Py_Bin2DProfile and Py_Bin3DProfile. (These could some day be turned into Cython routines.) Note that in the Python code these lose the "Py_" prefix. In the article linked below, the comments suggest that a different algorithm accumulates less numerical error; that algorithm isn't on the page itself but is described in a PDF it links to. Because we may be doing this over many thousands of individual sets of arrays, it's likely the right one to use.
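The PDF itself isn't reproduced here, but the error-resistant algorithm it describes is presumably the usual Welford-style running update: for each bin you carry a count, a running mean, and a running sum of squared deviations, and update them as each value arrives. As a rough Python sketch (the function name and signature are illustrative only, not the actual Py_Bin1DProfile interface):

import numpy as np

def bin_1d_with_variance(bin_indices, values, nbins):
    # Per-bin accumulators: count, running mean, and running sum of
    # squared deviations from the mean (often called M2).
    count = np.zeros(nbins)
    mean = np.zeros(nbins)
    m2 = np.zeros(nbins)
    for b, x in zip(bin_indices, values):
        count[b] += 1
        delta = x - mean[b]
        mean[b] += delta / count[b]
        m2[b] += delta * (x - mean[b])
    # Population variance per bin; bins that received no values stay at 0.
    variance = np.zeros(nbins)
    np.divide(m2, count, out=variance, where=count > 0)
    return mean, variance

Because it is a running update, the same accumulator arrays can be passed back in for each grid that gets binned, so the variance can be built up grid by grid without a second pass over the data.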
Since this will modify a piece of core functionality, contributing it will require detailed code review. So I would suggest that you fork the yt repository on Bitbucket:
https://bitbucket.org/yt_analysis/yt/fork
and then push your changes to your new repository. When they're ready to be reviewed, issue a pull request.
Thanks, and let me know if I can help with more information or assistance,
Matt
On Wed, Oct 5, 2011 at 1:36 PM, David Collins dcollins@physics.ucsd.edu wrote:
Matt and Britton--
Thanks for the input. I'll poke around with both of those ideas and let you know what I come up with.
Thanks! d.
On Wed, Oct 5, 2011 at 10:13 AM, Matthew Turk matthewturk@gmail.com wrote:
Hi Dave,
I've thought about this on and off, and there is an outstanding bug:
https://bitbucket.org/yt_analysis/yt/issue/277/standard-deviation-for-1d-pro...
One option is simply to do it yourself by not using lazy_reader and loading the data manually. (Cameron has done this, sort of.) Another is to use a running-stddev algorithm, like one of those listed here:
http://www.strchr.com/standard_deviation_in_one_pass
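For reference, the simplest one-pass formula on that page just accumulates a sum and a sum of squares and computes E[x^2] - E[x]^2 at the end; it's easy, but it can lose precision when the variance is small compared to the mean. A minimal sketch in plain NumPy (not yt code):

import numpy as np

def naive_one_pass_variance(x):
    # In a true streaming setting, s and sq would be accumulated element
    # by element; the subtraction of two large, nearly equal numbers is
    # the source of the cancellation error discussed on that page.
    x = np.asarray(x, dtype="float64")
    n = x.size
    s = x.sum()
    sq = (x * x).sum()
    return sq / n - (s / n) ** 2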
If you want to look at this more in depth, please feel free to either ping yt-dev or leave comments in the ticket. It'd be a great addition to have.
-Matt
On Wed, Oct 5, 2011 at 12:01 PM, Britton Smith brittonsmith@gmail.com wrote:
Hi Dave,
The best I have been able to do for this is to manually bin up a 3D data object with the cut_region function and then do the calculation myself, bin by bin. For example, if you have a sphere, you can do:

new_sphere = sphere.cut_region(['grid["Temperature"] > 1e5', 'grid["Temperature"] < 1e6'])

That'll give you access to all the cells within that bin for a variance calculation. Note the quotes around those expressions; the cuts are applied in yt using eval functions.
This will work, but it's slow and requires you to do a lot by hand.
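For concreteness, a rough sketch of that bin-by-bin loop, assuming a sphere object already exists; the bin edges and field names ("Density" as the profiled field, "CellMassMsun" as the weight) are purely illustrative:

import numpy as np

# Illustrative temperature bin edges; swap in whatever binning you need.
temp_bins = np.logspace(4, 7, 16)

means, variances = [], []
for t_lo, t_hi in zip(temp_bins[:-1], temp_bins[1:]):
    cut = sphere.cut_region(['grid["Temperature"] > %e' % t_lo,
                             'grid["Temperature"] < %e' % t_hi])
    x = cut["Density"]          # the field you want <X> and the variance of
    w = cut["CellMassMsun"]     # weight field, e.g. cell mass
    if x.size == 0:             # empty bin: nothing to average
        means.append(np.nan)
        variances.append(np.nan)
        continue
    mean = np.average(x, weights=w)
    means.append(mean)
    variances.append(np.average((x - mean) ** 2, weights=w))

Each pass through the loop re-selects the data, which is a big part of why this is slow compared to a single binning pass.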
Britton
On Wed, Oct 5, 2011 at 11:51 AM, david collins antpuncher@gmail.com wrote:
Hi, Everybody!
Does anyone out there have a technique for getting the variance out of a profile object? A profile object is good at getting <X> vs. B; I'd then like to get <(X - <X>)^2> vs. B. Matt and I had spitballed the possibility some time ago, but I was wondering if anyone out there had successfully done it.
Thanks, d.