Column Density derived field
Hi all,

I think I have the column density thingy working. I've got slices, projections and volume renders working (http://vimeo.com/31035296). I have a few questions.

- Where in yt should I put this? Keeping the file in my work directory, I use it like this:

    from ColumnDensity import *

    cdnumdens = ColumnDensity(pf, 'NumberDensity', [0.5, 0.5, 0.5])

    def _CDNumberDensity(field, data, cd=cdnumdens):
        return cd._build_derived_field(data)

    add_field('CDNumberDensity', _CDNumberDensity)

  yt/utilities? Better ideas?

- The function _build_derived_field, used directly above, is slightly odd. In it, I get the cell positions from data (data['x'], etc.) and feed them to the code that interpolates the column density. I get a field based on those positions, and that's what gets returned. However (and this shows what I don't understand), for projections (as opposed to slices), one of the times this function gets called as part of the add_field machinery, data is a FieldDetector, and calling data['x'] on it does not give me what I want. So I've added an "if" that simply returns na.ones when data is a FieldDetector. What would be a better option than that? The source as it stands is pasted here:

http://paste.enzotools.org/show/1892/

Thanks for your thoughts!

--
Stephen Skory
s@skory.us
http://stephenskory.com/
510.621.3687 (google voice)
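The FieldDetector guard described above can be sketched in a self-contained way. Note that `FieldDetector` and `ColumnDensity` here are minimal illustrative stand-ins, not yt's actual classes, and placeholder arithmetic stands in for the HEALPix interpolation:

```python
import numpy as np

class FieldDetector(dict):
    # Stand-in for yt's FieldDetector: the dummy data object that
    # add_field uses to probe a derived field's dependencies.
    pass

class ColumnDensity:
    # Stand-in for the class in the paste; only the guard logic is real.
    def _build_derived_field(self, data):
        if isinstance(data, FieldDetector):
            # Positions in a FieldDetector are not real cell centers,
            # so return a correctly shaped dummy field instead.
            return np.ones(len(data["x"]))
        # Real path: interpolate column density at the cell centers
        # (placeholder: distance from the origin stands in for the
        # HEALPix interpolation).
        return np.sqrt(data["x"]**2 + data["y"]**2 + data["z"]**2)

cd = ColumnDensity()
cells = {"x": np.array([0.3, 0.4]),
         "y": np.array([0.0, 0.3]),
         "z": np.array([0.0, 0.0])}
probe = FieldDetector(x=np.zeros(4), y=np.zeros(4), z=np.zeros(4))
print(cd._build_derived_field(cells))  # -> [0.3 0.5]
print(cd._build_derived_field(probe))  # -> [1. 1. 1. 1.]
```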
Hi Stephen,

I'm a bit confused. You're making a derived field called ColumnDensity, which is actually the column density of a ray of material between each field element in the volume and some sort of "center" position? If that's the case, I think naming this simply ColumnDensity may confuse some users (it would confuse me), since a column density is already calculated for a plane by projections and off-axis projections. I would rather this be called ColumnDensityToAPoint or RadialColumnDensity or something like that to avoid the confusion, and maybe include a line in the class docstrings stating explicitly what you're doing (i.e. calculating the column density with respect to a point). Is this right, or am I totally misunderstanding what you're doing here?

The video looks nice!

Cameron

On 10/24/2011 04:37 PM, Stephen Skory wrote:
--
Cameron Hummels
PhD Candidate, Astronomy Department of Columbia University
Public Outreach Director, Astronomy Department of Columbia University
NASA IYA New York State Student Ambassador
http://outreach.astro.columbia.edu
PGP: 0x06F886E3
Hi Stephen,

Cameron's points about naming aside, thank you very much for contributing this. I think it would go well in a new analysis_modules subdirectory, but utilities would work fine as well.

The issue about the field detector is an interesting one. I guess I don't understand why asking for 'x' is a problem. Your solution sounds good to me.

How fast is the code? It looks to me like it probably does quite a few expensive operations ... would you be willing, at some point in the future, to explore replacing it with an actual adaptive HEALPix operation?

-Matt

On Mon, Oct 24, 2011 at 4:37 PM, Stephen Skory <s@skory.us> wrote:
_______________________________________________
Yt-dev mailing list
Yt-dev@lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-dev-spacepope.org
Cameron & Matt, What it's called isn't a big concern to me, but I see what you're saying, Cameron.
The issue about field detector is an interesting one. I guess I don't understand why asking for 'x' is a problem. Your solution sounds good to me.
When I wasn't checking for the field detector as I am now, data['x'] (and 'y', 'z') would return values that weren't cell centers, which broke the interpolation when they were passed in. So asking for 'x' wasn't the problem; the values it returned were just not what I wanted.
How fast is the code? It looks to me like it probably does quite a few expensive operations
Running on a 40^3 dataset on my 2.0 GHz i7 laptop on battery power gives about 300,000 cells/second for the whole process (HEALPix with 5 surfaces plus interpolation). I think I'm close to making it parallel, but some weird stuff is popping up that I don't quite understand just yet.
... would you be willing at some point in the future to explore replacing it with an actual adaptive healpix operation?
Perhaps. I recall you said earlier that it would be quite a bit more difficult, and looking at the source, that's true. Everything in this current attempt uses numpy vectorization, so I don't know how much more speed can be squeezed out of this method.

--
Stephen Skory
s@skory.us
http://stephenskory.com/
510.621.3687 (google voice)
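As a rough illustration of the vectorized approach (not the actual code from the paste, and with a made-up density profile), the per-ray column density integral reduces to a single cumulative sum over samples along the ray:

```python
import numpy as np

# Sample the density at n midpoints along one radial ray from the
# center out to r = 1, then integrate cumulatively: column[i] is the
# column density from the center out to radius r[i].
n = 64
dr = 1.0 / n
r = (np.arange(n) + 0.5) * dr
density = 1.0 / (1.0 + 10.0 * r)   # toy radial profile

column = np.cumsum(density * dr)   # one vectorized op, no Python loop

# Analytic check: the integral of dr'/(1 + 10 r') from 0 to 1
# is ln(11)/10.
print(abs(column[-1] - np.log(11.0) / 10.0) < 5e-3)  # -> True
```

In the real tool this would be done per HEALPix pixel, with interpolation between the concentric surfaces rather than a fixed midpoint sampling.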
On Tue, Oct 25, 2011 at 9:23 AM, Stephen Skory <s@skory.us> wrote:
I have reconsidered my position. I wrote a set of adaptive HEALPix routines last spring that I wasn't able to get to work. Here's where I was at:

1) I created an "AdaptiveRaySource" which emanated "AdaptiveRayPackets".

2) Each ray packet walked a given grid. At the entry to a grid, it was checked to see if it needed to be refined; if so, it was.

3) At the end, the packets were collected into an array of value, pixel id, and NSide.

There were a couple of problems with this. The first is that rays got lost once in a while; this may be because of a bug in my implementation, or because they get refined into a grid that comes earlier in the kD-tree traversal. It should be fixable. There are two additional problems. The first is that step 2 is not correct: the rays should be checked at every single cell traversal to see if they need to be refined. The second is that right now it would be difficult to deposit the column density, although this is readily fixed.

I think the entire thing is tractable, and I think it should coincide with a refactoring of the current volume rendering. As it stands, volume rendering in yt uses the PartitionedGrid class, which walks rays and calls sample_values. I think it would be more efficient for what we're talking about doing to reuse this implementation of Amanatides & Woo, but supply a function pointer and a void* struct of data to it, which would allow us to use the same "cast_ray" function for both adaptive ray packets and normal packets. This would also allow us to refine rays mid-grid.

Anyway, if you're interested in this, let me know.

-Matt
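For reference, the refinement bookkeeping in step 2 above can be sketched abstractly: in the nested HEALPix scheme, refining pixel p at a given NSide replaces it with child pixels 4p through 4p+3 at twice the NSide. This is a toy model of the splitting logic, not the actual AdaptiveRaySource code:

```python
def refine_packets(packets, needs_refinement):
    # packets: list of (nside, ipix) ray packets in the nested scheme.
    # needs_refinement(nside, ipix) decides whether a packet splits.
    out = []
    for nside, ipix in packets:
        if needs_refinement(nside, ipix):
            # Nested-scheme children of pixel ipix live at 2 * nside.
            out.extend((2 * nside, 4 * ipix + k) for k in range(4))
        else:
            out.append((nside, ipix))
    return out

# Refine one NSide=1 packet until every packet reaches NSide=4:
packets = [(1, 0)]
while any(nside < 4 for nside, _ in packets):
    packets = refine_packets(packets, lambda n, p: n < 4)

print(len(packets))  # -> 16 (each split conserves solid angle, 4x)
```

In the real thing the predicate would compare the packet's pixel size against the local cell size during traversal, which is exactly the check Matt says must happen per cell rather than per grid.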
Hi Stephen,

For what it's worth, I also agree with Cameron that calling it "ColumnDensity" is a bit too broad; maybe it should be called something like "RadialColumnDensity" or something similarly indicative of its nature, to indicate it's not the same as a projection.

Can you also maybe take a minute to outline a couple use cases?

-Matt

On Tue, Oct 25, 2011 at 9:23 AM, Stephen Skory <s@skory.us> wrote:
Matt and everyone,

I'm also interested in a tool of this type. Stephen has a particular use in mind, but one could also imagine calculating an "all-sky" light-cone type of thing using similar code. Does the HEALPix stuff do that? We had discussed in the past doing a projection outward from a point, then stacking the higher-z outputs in shells around the z=0 set, to make a spherical light-cone projection. I'm not sure if the current tools are capable of the individual pieces of such an operation, but what Stephen's describing would certainly do part of it.

Eric

On Tue, Oct 25, 2011 at 11:22 AM, Matthew Turk <matthewturk@gmail.com> wrote:
Hi Eric,

On Tue, Oct 25, 2011 at 11:26 AM, Eric Hallman <hallman13@gmail.com> wrote:
Matt and everyone, I'm also interested in a tool of this type. Stephen has a particular use in mind, but also, one could imagine calculating an "all-sky" light-cone type of thing using the similar code tool. Does the HEALPIX stuff do that?
Yes, the existing HEALPix code will do this; what I believe Stephen is addressing is depositing the column density back onto the grids. The existing HEALPix code will calculate column density (and can also include star particle contributions) between two radii and return that to the user.
We had discussed in the past doing a projection outward from a point, then stacking the higher z outputs in shells around the z=0 set, to make a spherical lightcone projection.
I believe this is possible with existing infrastructure, but keep in mind that it will only currently work with a static-resolution HEALpix pixelization. Stephen's will deposit column density back on the grid so that it can then be treated as any other field in yt. -Matt
Not sure if the current tools are capable of the individual pieces of such an operation, but what Stephen's describing would certainly do part of it. Eric
Matt,

OK, thanks for clearing that up. The deposition of that quantity (or others like it) could be useful in a number of ways. It is a very specific type of need, but I think there could be others. For instance, one could use this to post-process simulations with radiative transfer: given a collection of points, one could construct a radiation field quantity on the grid in yt, based on distance from the points and an assumed luminosity. That is, if one did not do active radiative transfer in a run. Anyway, interesting. Thanks.

Eric

On Tue, Oct 25, 2011 at 11:30 AM, Matthew Turk <matthewturk@gmail.com> wrote:
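Eric's post-processing idea can be sketched with entirely made-up numbers: given a point source and a radial column density field (faked as linear in radius below; in practice it would be the field deposited by the tool discussed in this thread), build a grid-wide radiation field from geometric dilution plus attenuation. All constants here are assumptions for illustration:

```python
import numpy as np

N = 32
x, y, z = np.mgrid[0:N, 0:N, 0:N] / float(N)
source = np.array([0.5, 0.5, 0.5])   # source position (code units)
luminosity = 1.0                     # assumed luminosity
sigma = 0.1                          # assumed effective cross-section

r = np.sqrt((x - source[0])**2 + (y - source[1])**2
            + (z - source[2])**2)
r = np.maximum(r, 1.0 / N)           # soften inside the source cell

# Fake stand-in for a radial column density field from the source:
column = 5.0 * r

# Inverse-square dilution times optical-depth attenuation:
flux = luminosity / (4.0 * np.pi * r**2) * np.exp(-sigma * column)
```

With several sources, the fluxes would simply be summed per source, each with its own column density field.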
Matt,
something like "RadialColumnDensity" or something similarly indicative of its nature to indicate it's not the same as a projection.
Sure.
Can you also maybe take a minute to outline a couple use cases?
The main thing I wanted to do was to calculate some column density contours using the contour finder. I haven't really thought beyond that. I see that Eric has some ideas. Does anyone else have any other ideas?

--
Stephen Skory
s@skory.us
http://stephenskory.com/
510.621.3687 (google voice)
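As a toy version of that use case (pure Python/numpy; yt's actual contour finder works on the AMR hierarchy, not a flat array), finding contours amounts to labeling connected regions of the field above a column density threshold:

```python
import numpy as np
from collections import deque

def label_regions(field, threshold):
    # Flood-fill labeling of 4-connected regions where
    # field > threshold.
    mask = field > threshold
    labels = np.zeros(field.shape, dtype=int)
    n_regions = 0
    for start in zip(*np.nonzero(mask)):
        if labels[start]:
            continue
        n_regions += 1
        labels[start] = n_regions
        queue = deque([start])
        while queue:
            i, j = queue.popleft()
            for ni, nj in ((i + 1, j), (i - 1, j),
                           (i, j + 1), (i, j - 1)):
                if (0 <= ni < field.shape[0]
                        and 0 <= nj < field.shape[1]
                        and mask[ni, nj] and not labels[ni, nj]):
                    labels[ni, nj] = n_regions
                    queue.append((ni, nj))
    return labels, n_regions

field = np.zeros((8, 8))
field[1:3, 1:3] = 2.0   # two disjoint "clouds" above the threshold
field[5:7, 5:7] = 3.0
labels, n = label_regions(field, 1.0)
print(n)  # -> 2
```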
Matt,

With regards to your comments on how to complete/fix the AdaptiveRaySource stuff: I think my interest in trying that would be driven both by a possible need for something faster and, as you said, by a refactoring of the volume rendering code. Whether I need something faster remains to be seen.

--
Stephen Skory
s@skory.us
http://stephenskory.com/
510.621.3687 (google voice)
participants (4)
- Cameron Hummels
- Eric Hallman
- Matthew Turk
- Stephen Skory