Wow, awesome work, both of you!  I enjoyed your other renderings on your website, too.  And thanks for sharing your parallel script!

I looked through the ray tracing code and saw that it currently supports only one field.  Would it be easy to support separate fields for opacity and color, i.e. density for opacity and temperature for color?  Then we could re-create those "photo-realistic" renderings ...

http://www.astro.princeton.edu/~jwise/movies/FirstStarLighting_CLASSIC_HD_MONO720.mov
http://www.astro.princeton.edu/~jwise/movies/FirstStarLighting_RedBlue_MONO_HD720.mov

with yt!
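
To be concrete, what I have in mind is just standard two-field emission-absorption compositing along each ray.  Here is a rough NumPy sketch of the idea (this is not yt's actual transfer-function API; the colormap, opacity coefficient, and field handling are made up purely for illustration):

import numpy as np

def redblue(T, Tmin=1e2, Tmax=1e6):
    # Toy "red-blue" colormap keyed to log temperature (placeholder only).
    t = np.clip((np.log10(T) - np.log10(Tmin)) /
                (np.log10(Tmax) - np.log10(Tmin)), 0.0, 1.0)
    return np.column_stack([t, 0.2 * np.ones_like(t), 1.0 - t])

def composite_ray(density, temperature, dl, colormap=redblue, kappa=1.0):
    # Front-to-back emission-absorption compositing along one ray:
    # opacity comes from density, emitted color from temperature.
    emission = colormap(temperature)                 # (N, 3) RGB per sample
    alpha = 1.0 - np.exp(-kappa * density * dl)      # per-sample opacity
    rgb = np.zeros(3)
    transmittance = 1.0
    for a, e in zip(alpha, emission):
        rgb += transmittance * a * e
        transmittance *= 1.0 - a
        if transmittance < 1e-4:                     # early ray termination
            break
    return rgb

# e.g. 256 random samples along a single ray
rho = np.random.lognormal(mean=-2.0, size=256)
T = 10.0 ** np.random.uniform(2, 6, size=256)
print(composite_ray(rho, T, dl=1.0 / 256))
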

Cheers,
John

On 5 Feb 2010, at 19:27, Sam Skillman wrote:

Hi all,

Matt asked me to share some recent volume renderings that I've made, along with the scripts to go with them.  I've posted one such example here:

http://casa.colorado.edu/~skillman/research_and_codes/files/0c4c670cea1660aecbf0d0abdf3f3120-3.html

Beware: the movie is about 260 MB.

This is a fairly small simulation, but it's a good example of what the volume renderer can do.  I've implemented an "embarrassingly parallel" script that allocates one datadump/viewpoint to each processor.  Each of the 1717 frames can be partitioned and rendered in about a minute at 1024^2, so on 16 processors the whole movie took about 2 hours.  If you have questions about how the script works, please let me know.
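
The frame-distribution part boils down to something like the following (a rough mpi4py sketch of the round-robin pattern; render_frame and the output file pattern are placeholders rather than the exact code from the script):

from mpi4py import MPI

def render_frame(index, outname):
    # Placeholder: load datadump/viewpoint `index`, set up the camera and
    # transfer function, render a 1024^2 image, and save it to `outname`.
    pass

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

n_frames = 1717
for frame in range(rank, n_frames, size):   # each rank takes every size-th frame
    render_frame(frame, "frames/frame_%04d.png" % frame)

comm.Barrier()   # make sure all frames exist before encoding the movie
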

Cheers,
Sam

--
Samuel W. Skillman
DOE Computational Science Graduate Fellow
Center for Astrophysics and Space Astronomy
University of Colorado at Boulder
samuel.skillman[at]colorado.edu