Please forgive the subject line -- it makes sense to me; hopefully after reading the note it will make sense to me+1 or more....
Just last week I became aware of yt, so this is a question about its capabilities:
I'm working on a project to aid a researcher in visualizing derived data products (a set of ~5000 histograms) alongside volume renderings of simulations performed with Flash (flash.uchicago.edu). This would involve manipulating the rendered volume data (e.g., temperature and density fields) and presenting it, in an inset or a separate window, in coordination with the histogram set. The active camera view angles from the rendered volume display would be used to select from the histogram set (indexed, in this case, by 51 polar angle bins and 101 azimuthal angle bins). The key requirement is for the histogram selection to happen while the rendered display is being updated through user interaction -- so as the view is rotated, the displayed histogram updates without the user having to stop interacting.
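To make the selection logic concrete, here is a minimal sketch of what I have in mind, using plain NumPy. The binning convention (theta in [0, pi], phi in [0, 2*pi), uniform bins) is an assumption on my part -- the actual histogram set may be indexed differently -- and `select_histogram` is a hypothetical helper, not anything from yt:

```python
import numpy as np

N_POLAR, N_AZIMUTH = 51, 101  # bin counts of the precomputed histogram set

def select_histogram(theta, phi):
    """Map a camera orientation (radians) to a histogram index pair.

    Assumes theta in [0, pi] (polar) and phi wrapped into [0, 2*pi)
    (azimuthal), with uniform bins in each angle.
    """
    i = min(int(theta / np.pi * N_POLAR), N_POLAR - 1)
    j = min(int((phi % (2 * np.pi)) / (2 * np.pi) * N_AZIMUTH), N_AZIMUTH - 1)
    return i, j

# Hooked into whatever callback fires on each render update, e.g.:
#   i, j = select_histogram(camera_theta, camera_phi)
#   display(histograms[i][j])
```

The idea being that this lookup is cheap enough to run on every camera update without stalling the interaction.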
Right now, these histograms are created at the end of the simulation. One could imagine a more general case in which the histograms were derived from the volumetric data in real time.
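For that more general case, I picture something like the following: pull a field out as a flat array and rebin it on demand. In yt the array would come from a data object (e.g. a density field from `ds.all_data()` -- my assumption about the API); here I stand in a synthetic array so the snippet is self-contained:

```python
import numpy as np

# Stand-in for a field extracted from the simulation; in yt this would
# be something like ds.all_data()["density"] (field name assumed).
rng = np.random.default_rng(0)
density = rng.lognormal(mean=-2.0, sigma=1.0, size=100_000)

# Log-spaced bins, which seem natural for density in Flash-type runs.
bins = np.logspace(np.log10(density.min()), np.log10(density.max()), 65)
counts, edges = np.histogram(density, bins=bins)
```

Whether this kind of on-the-fly rebinning can keep up with interactive frame rates on the full volume is exactly the sort of thing I don't yet know about yt.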
Is this a task suited to the current state of yt development?