Hi guys,

I’m Alex from Hull University. We’re hosting the UK’s National Astronomy Meeting in July, and I’ve been asked to run the hack day. I’ve managed to procure some VR headsets (HTC Vive, Oculus Rift and MS HoloLens) and want to see whether we could use them for visualising some of our simulation data. I hear you guys may have been playing with VR — is that true? Is there anything that we could use/play with at this point?

Thanks,
Alex

**************************************************
To view the terms under which this email is distributed, please go to http://www2.hull.ac.uk/legal/disclaimer.aspx
**************************************************
Hi Alex,
Yup, we have been — with the Vive and Oculus specifically. Right now, though, it never got past much of a tech-demo phase. There's code on my group's Bitbucket account (bitbucket.org/data-exp-lab/) that runs in Unity, and Nathan Goldbaum got some of the stuff that's in yt (the volume rendering) to work with the OpenVR python bindings directly (not sure where that code is), but it's all still pretty early on. Not sure it's quite shovel-ready. There are some students coming on soon who will take another look, but July might be a bit optimistic... That being said, it'd be an awesome hack or collaboration, and if you're interested in trying to push it at all, that'd be pretty rad.
-Matt
On Mon, May 22, 2017 at 11:07 AM, Matthew Turk wrote:
> Nathan Goldbaum got some of the stuff that's in yt (the volume rendering) to work with the OpenVR python bindings directly (not sure where that code is), but it's all still pretty early on.

This code isn't really usable yet. For it to actually work, we'll need a big overhaul of the OpenGL volume rendering in yt to use multiple passes. In addition, there were significant performance issues that I never tracked down. Unfortunately, I'm no expert on graphics programming, and that's probably what's needed here.
Hi All,
Just FYI - one can also generate points/surfaces with yt and upload them
somewhere like Sketchfab and use their VR mechanisms.
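For reference, that workflow looks roughly like this — a sketch only, since the dataset path, sphere radius, and contour value below are all illustrative placeholders:

```python
import yt

ds = yt.load("IsolatedGalaxy/galaxy0030/galaxy0030")  # placeholder dataset
sphere = ds.sphere("max", (1.0, "Mpc"))

# Extract a density isosurface; the contour value is illustrative.
surface = ds.surface(sphere, ("gas", "density"), 1e-27)

# Upload straight to Sketchfab (needs an API key from your account there);
# color_field paints the surface by a second field.
surface.export_sketchfab(
    title="Density isosurface",
    description="Exported from yt",
    api_key="YOUR_API_KEY",
    color_field=("gas", "temperature"),
)
```

Once uploaded, the model can be viewed through Sketchfab's in-browser VR mode on any of the headsets mentioned.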
Looking forward to what comes of it!
Cheers,
-Jill
Hi Alex,
Just to add one more point: yt can also generate pre-rendered virtual-reality movies with a stereo-spherical lens, which works perfectly with Google Cardboard.
To generate the movie, you could refer to this page: http://yt-project.org/docs/dev/visualizing/volume_rendering.html#spherical-a...
An example movie is at: https://www.youtube.com/watch?v=ZYWY53X7UQE . You could also put your smartphone into a Google Cardboard to watch the video with the YouTube app.
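Rendering a single stereo-spherical frame looks roughly like this — a sketch based on the linked docs, with a placeholder dataset path and an illustrative resolution:

```python
import yt

ds = yt.load("Enzo_64/DD0043/data0043")  # placeholder dataset

# A stereo-spherical lens renders a pair of 360-degree equirectangular
# images (one per eye), which players like YouTube treat as stereo VR.
sc = yt.create_scene(ds, lens_type="stereo-spherical")
sc.camera.resolution = (2048, 2048)  # resolution of the rendered frame

sc.save("frame_0000.png", sigma_clip=4.0)
```

Rotating the camera between frames and stitching the saved images together (e.g. with ffmpeg) gives a 360 movie suitable for Cardboard playback.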
Best wishes,
—
Suoqing Ji
Ph.D Candidate
Department of Physics
University of California, Santa Barbara
http://physics.ucsb.edu/~suoqing
_______________________________________________
yt-users mailing list
yt-users@lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
participants (5)
- Alexander Hamilton
- Matthew Turk
- Naiman, Jill
- Nathan Goldbaum
- Suoqing Ji