Quick note -- I'm attaching a screenshot of the VTK interface. I
tried to open up a bunch of the editor windows to show editing of the
contours, the plane location, the camera path, etc. The color map for
the plane and the color map for the contours are identical, which is
why they're not super-distinct. I didn't demonstrate drag-and-drop,
but you can drag items from the tree view on the left into the
interpreter to get access to them.
On Wed, Sep 9, 2009 at 9:56 AM, Matthew Turk
Hi everyone,
As per Sam's request (offline) to get some info about how to use the VTK interface, I'm writing an email to start a dialogue about this. Right now, the VTK interface is exposed through the next version of reason, which is stored inside this mercurial repository:
Here are the requirements:
* VTK
* ETS
If you install VTK, be sure to build the Python bindings. During the cmake phase, you will have to ensure that the Python interpreter it points to is the same one you used to install yt. You *may* have to manually edit the CMakeCache file to ensure that it does *NOT* install with a "--prefix" option, which will confuse the situation. (I'll speak briefly below about my feelings on the Raft and how it plays into all of this.) Once you have installed VTK, you should be able to install ETS (code.enthought.com) with
$ easy_install "ETS[nonets]"
but, if not, do a manual source install. This involves a couple more steps: getting ETSProjectTools, running "ets co ets", and then doing "ets develop" in the ets directory. Alternatively, the newest EPD (5.0) is reportedly far more stable and easier to install and get working on OS X.
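The VTK configure step above can be sketched roughly as follows. This is an example, not a prescription: the paths are placeholders, and the exact cache variable names depend on your VTK version. The key point is that the Python executable cmake records must match the interpreter yt was installed under.

```shell
# Out-of-source VTK build with Python wrapping (illustrative paths).
mkdir VTK-build && cd VTK-build
cmake ../VTK \
  -DVTK_WRAP_PYTHON=ON \
  -DPYTHON_EXECUTABLE=/path/to/yt/bin/python2.5
# If needed, edit CMakeCache.txt afterward to double-check the install
# prefix before building, so it does not install somewhere unexpected.
make && make install
```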
Check out the hg repository and update to the 'yt' branch. Now execute the command:
$ python2.5 yt/reason/reason_v2.py
and you will have access to the next-gen GUI. Open up a parameter file, and you then have access to the VTK interface by right-clicking on the parameter file in question. This is a developing project that I haven't really played with in a few months. However, it can do isosurfaces via marching cubes, cutting planes, box outlines, camera paths (which you can even save to a file and export to Amira format), and some other stuff. It's also stereo-enabled. With John Wise's work on a software volume renderer (and any future developments on that), hopefully this will also become a gateway for setting up renderings of datasets after prototyping. I know that Sam Skillman has also expressed interest in using this as a platform for rendering.
Okay, so, that was kind of a pain, right? Well, that's where the Raft came in. I began the Raft project when I saw how the FEMhub guys had undertaken the task of getting Mayavi2 and its dependencies (Traits, VTK, etc.) installed in a simple, cross-platform manner. Right now, it already installs VTK and Traits (not on OSX, but that should change by the end of the month according to Ondrej Certik and Prabhu Ramachandran), and I am working on getting wxPython to install as well, where available. If I can get wxPython to go, then we should be 100% set up for running this next-gen GUI with embedded VTK. However, an additional and awesome feature of the Raft (for which I cannot take any credit) is that it includes off-screen rendering, done via Mesa. If the VTK code I've written can be refactored so that the rendering is independent of the GUI, then this will also become a viable mechanism for rendering. I've already done this, via monkeypatching and hackery, with a Raft notebook -- I made some images of a cosmology dataset, rendered with Mesa, using VTK widgets, displayed through a web browser. If anyone is interested I can post that worksheet.
Sam, if you run into any problems, please feel free to reply to this message so we can figure them out together and move forward. I'm very excited about all this, and I'd really like to bring more people on board with developing the VTK interface and the GUI. I've been in touch with Prabhu from the MayaVi project, and he has sent me a patch to read MultiBlock data into MayaVi. We currently use the HierarchicalBoxDataSet, and I've modified the patch to support that as well. I'd like to continue working on that project, but I might need some help in that department from other people. If we can get our data into MayaVi, that would be a huge step forward in usability and exploration.
-Matt