Thanks for writing. Right now, this isn't possible; I delayed writing back because I was experimenting with ways of augmenting the existing OpenGL volume rendering to do this, but I didn't come up with anything in time. I absolutely think this is where we would like to go with our interactive volume rendering, but I don't have a timetable for it right now.
On Tue, Apr 11, 2017 at 9:54 AM, E.M. Dragowsky firstname.lastname@example.org wrote:
Please forgive the subject line -- it makes sense to me; hopefully after reading the note it will make sense to me+1 or more....
Just last week I became aware of yt, and so this is a question about capabilities:
I'm working on a project to aid a researcher with visualizing derived data products (a set of ~5000 histograms) along with volume rendering of simulations performed with Flash (flash.uchicago.edu). This would involve manipulation of the rendered volume data (e.g. temperature and density fields) and coordinated presentation of the histograms in an inset or a separate window. The active camera view angles from the rendered volume display would be used to select from the histogram set (in this case indexed by 51 polar angle bins and 101 azimuthal angle bins). The key is for the histogram selection to take place as the rendered display is being updated through user interaction -- so as the view is rotated, the histogram updates without the user having to stop interacting.
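The angle-to-bin lookup itself is straightforward; here is a minimal sketch of mapping a camera's view direction to indices into the 51 x 101 histogram set. The function names, the angle conventions (theta in [0, pi], phi in [0, 2*pi)), and the assumption that bins are uniform are mine, not part of yt:

```python
import numpy as np

N_POLAR, N_AZIMUTHAL = 51, 101  # bin counts of the precomputed histogram set

def camera_angles(view_direction):
    """Convert a camera view direction (3-vector) to polar/azimuthal angles."""
    x, y, z = np.asarray(view_direction, dtype=float) / np.linalg.norm(view_direction)
    theta = np.arccos(z)                  # polar angle in [0, pi]
    phi = np.arctan2(y, x) % (2 * np.pi)  # azimuthal angle in [0, 2*pi)
    return theta, phi

def select_histogram_index(theta, phi):
    """Map view angles onto (polar, azimuthal) bin indices, assuming uniform bins."""
    i = min(int(theta / np.pi * N_POLAR), N_POLAR - 1)
    j = int((phi % (2 * np.pi)) / (2 * np.pi) * N_AZIMUTHAL)
    return i, j
```

In an interactive loop, one would call this from whatever camera-update callback the renderer exposes, then redraw the inset with the histogram at that index -- the lookup is cheap enough to run on every frame.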
Right now, these histograms are created at the end of the simulation. One could imagine a more general case in which the histograms were derived from the volumetric data in real time.
Is this a task suited to the current state of yt development?
Best regards, ~ Em
E.M. Dragowsky, Ph.D. Research Computing -- UTech Case Western Reserve University (216) 368-0082
yt-users mailing list email@example.com http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org