Real-time histogram presentation based on interactive volume rendered camera properties

Greetings --

Please forgive the subject line -- it makes sense to me; hopefully after reading the note it will make sense to me+1 or more....

Just last week I became aware of yt, so this is a question about capabilities. I'm working on a project to aid a researcher in visualizing derived data products (a set of ~5000 histograms) alongside volume renderings of simulations performed with Flash (flash.uchicago.edu). This would involve manipulating the rendered volume data (e.g. temperature and density fields) and presenting the matching histogram in a coordinated inset or separate window. The active camera view angles from the rendered volume display would be used to select from the histogram set (in this case indexed by 51 polar angle bins and 101 azimuthal angle bins). The key is for the histogram selection to take place while the rendered display is being updated through user interaction -- so as the view is rotated, the histogram updates without the user having to stop interacting.

Right now, these histograms are created at the end of the simulation. One could imagine a more general case in which the histograms were derived from the volumetric data in real time.

Is this a task suited to the current state of yt development?

Best regards,
~ Em

--
----------------------------------
E.M. Dragowsky, Ph.D.
Research Computing -- UTech
Case Western Reserve University
(216) 368-0082
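
For concreteness, a minimal sketch of the selection step described above, assuming the ~5000 precomputed histograms have been loaded into a single NumPy array laid out on the 51 x 101 grid of polar and azimuthal angle bins (the array name, the number of histogram bins, and the loading step are illustrative assumptions, not part of the FLASH output):

    import numpy as np

    N_POLAR, N_AZIMUTH, N_HIST_BINS = 51, 101, 64   # 64 histogram bins is an assumed value

    # Assumed layout: one precomputed histogram per (polar, azimuthal) view-angle bin,
    # filled from the FLASH-derived data products.
    histograms = np.zeros((N_POLAR, N_AZIMUTH, N_HIST_BINS))

    def select_histogram(theta, phi):
        """Return the precomputed histogram for polar angle theta in [0, pi]
        and azimuthal angle phi in [0, 2*pi)."""
        i = min(int(theta / np.pi * N_POLAR), N_POLAR - 1)
        j = min(int(phi / (2 * np.pi) * N_AZIMUTH), N_AZIMUTH - 1)
        return histograms[i, j]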

Hi Em,

Thanks for writing. Right now this isn't possible; I had delayed writing back because I was experimenting with ways of augmenting the existing OpenGL VR to do this, but I didn't come up with anything in time. I absolutely think this is where we would like to go with our interactive VR, but I don't have a timetable for it right now.

-Matt

Hi, Matt --

Thanks for offering this assessment of the current state, including the note that my interest aligns with future plans. To aid my report back to my sponsors, perhaps I can extend this conversation to determine whether there's anything I can contribute to this development effort? I've described myself as a "researcher who can code", definitely not to be confused with a practiced developer, and yet...these topics of volume rendering and coordinated views have become really interesting to me.

Please let me know,
~ Em

If you'd be interested in experimenting with the OpenGL volume rendering, take a look at these doc pages:

http://yt-project.org/docs/dev/visualizing/interactive_data_visualization.html
http://yt-project.org/docs/dev/cookbook/complex_plots.html#cookbook-opengl-vr

In addition, the place in the codebase to look to see how this is implemented is here:

https://bitbucket.org/yt_analysis/yt/src/416bc87fd064d8cd5d64a98922c00c1cc71a0f7d/yt/visualization/volume_rendering/interactive_vr.py

It makes use of cyglfw3, a Cython wrapper for the GLFW OpenGL library. If you're familiar with OpenGL, that will help.

-Nathan
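
For orientation, a rough sketch of the kind of script the cookbook recipe above contains. The class names (RenderingContext, SceneGraph, BlockCollection, TrackballCamera) come from the interactive_vr/interactive_loop modules linked above, but signatures may differ between yt versions, so treat the linked docs as authoritative; the dataset path below is a placeholder.

    import yt
    from yt.visualization.volume_rendering.interactive_vr import \
        SceneGraph, BlockCollection, TrackballCamera
    from yt.visualization.volume_rendering.interactive_loop import \
        RenderingContext

    rc = RenderingContext(1280, 960)        # opens a GLFW/OpenGL window

    scene = SceneGraph()
    collection = BlockCollection()

    ds = yt.load("flash_plt_cnt_0000")      # placeholder: any dataset yt can load
    dd = ds.all_data()
    collection.add_data(dd, "density")      # field to volume render
    scene.add_collection(collection)

    # The trackball camera's position and focus are the quantities a
    # histogram lookup would key on.
    c = TrackballCamera(position=(1.0, 1.0, 1.0), focus=ds.domain_center,
                        near_plane=0.1)

    rc.start_loop(scene, c)                 # interactive loop: mouse/keyboard events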

Thanks, Nathan -- will review in the next couple of days.

~ Em

Hi Em,

I think I have something that can get you going. If you could clone the following repository:

    hg clone https://bitbucket.org/xarthisius/reason
    cd reason
    hg update -C MPL_IDV

and then run:

    python reason.py

it should create a simple Qt widget with two frames: a matplotlib canvas on the left and yt's IDV on the right. Whenever you double click on the right panel, it should update the left one with the current position of the camera. It should be possible to compute the angles you need from the camera's position and focus. All that's left is adding your data.

This code, in addition to all the dependencies of IDV, requires PyQt4 with OpenGL support. Let me know if there are any issues.

Cheers,
Kacper
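
A minimal sketch of the angle computation Kacper mentions, assuming the camera's position and focus are available as 3-vectors and that the precomputed histogram grid from the original post is in hand; the helper names and the matplotlib axes are illustrative assumptions, not part of the reason demo:

    import numpy as np

    def view_angles(position, focus):
        """Polar/azimuthal angles of the line of sight from camera position to focus."""
        d = np.asarray(focus, dtype=float) - np.asarray(position, dtype=float)
        d /= np.linalg.norm(d)
        theta = np.arccos(np.clip(d[2], -1.0, 1.0))   # polar angle in [0, pi]
        phi = np.arctan2(d[1], d[0]) % (2 * np.pi)    # azimuthal angle in [0, 2*pi)
        return theta, phi

    def redraw_histogram(ax, hist):
        """Refresh a matplotlib axes with the histogram picked for the current view."""
        ax.clear()
        ax.bar(np.arange(len(hist)), hist)
        ax.figure.canvas.draw_idle()

    # In the demo's double-click handler one could then do, roughly:
    #   theta, phi = view_angles(camera_position, camera_focus)
    #   redraw_histogram(ax, select_histogram(theta, phi))   # lookup as sketched earlier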

Is glfw still available/recommended as cyglfw3? I've tried to apply the following, but the 'cyglfw3' package is not found. I also searched through the anaconda client, and do find the pypi package.

    conda install -c http://use.yt/with_conda/ cyglfw3 pyopengl

I did find the following from 'anaconda search glfw':

    auto/pyglfw                    | 0.1.0 | conda | linux-64 : https://bitbucket.org/pyglfw/pyglfw
    iandh/glfw                     | 3.1.2 | conda | linux-64
    insertinterestingnamehere/glfw | 3.1.2 | conda | linux-64
    menpo/glfw3                    | 3.2.1 | conda | linux-64, win-32, win-64, linux-32, osx-64
    pypi/cyglfw3                   | 0.0.5 | pypi  | : Python bindings for GLFW 3+ using Cython
    pypi/glfw                      | 1.1.0 | pypi  | : A ctypes-based wrapper for GLFW3.
    pypi/pyglfw                    | 0.2.2 | pypi  | : Python bindings for the GLFW library

Should I just obtain cyglfw3 through pypi? I liked the idea of maintaining an 'isolated' install through conda, as I'm trying to set up under my user account on our university cluster.

Thanks!

I think that's a bug - there should be a cyglfw3 package in that channel. There were some changes to that channel recently, so I guess it was dropped unintentionally.

For now you will need to install cyglfw3 and glfw3 from source, I think. You should be able to do that within your conda setup using e.g. pip, although you'll need access to a compilation environment.

Hi,

I uploaded the missing packages. Please try again.

Cheers,
Kacper

Thanks, Kacper & Nathan --

    UnsatisfiableError: The following specifications were found to be in conflict:
      - cyglfw3 -> python 3.5* -> openssl 1.0.1*
      - cyglfw3 -> python 3.5* -> xz 5.0.5
      - python 3.6*
    Use "conda info <package>" to see the dependencies for each package.

'conda info cyglfw3' finds the package missing from linux-64 channels.

From the unsatisfiable error, I don't really understand the best course of action. It would seem at this point that cyglfw3 has bindings for python3.5, but not yet for python3.6 -- is that the upshot? I don't know what's required to generate the updated bindings. The yt install was done using this same procedure:

    conda install -c http://use.yt/with-conda yt

The versions here are openssl 1.0.2 and xz 5.2.2-1.

Thanks,
~ E.m

I think this is happening because Kacper only made conda builds for python3.5, not python3.6. I *think* it will work if you downgrade python to python3.5 and set up your python environment once again.

We should probably set up a conda-forge package for glfw and cyglfw3 so we can automatically take care of these package updates.

-Nathan

Thanks, Nathan -- I'll follow up tomorrow.

~ E.m
participants (4)
- E.M. Dragowsky
- Kacper Kowalik
- Matthew Turk
- Nathan Goldbaum