On Thu, Nov 5, 2009 at 3:44 PM, SirVer <sir...@gmx.de> wrote:
On 5 Nov., 15:27, Chris Colbert <sccolb...@gmail.com> wrote:
On Thu, Nov 5, 2009 at 3:16 PM, SirVer <sir...@gmx.de> wrote:
Hi,
On 4 Nov., 15:04, Stéfan van der Walt <ste...@sun.ac.za> wrote:
2009/11/4 SirVer <sir...@gmx.de>:
> > Stefan, concerning my GUI branch: I played around with PyQt and QImages, and they just couldn't deliver what I needed: speed.
> I'd like to see some benchmarks that support this, because it should cost only two Python calls plus whatever time the GUI uses. QImage is fast when loading directly from a NumPy array. I'm not sure, given the copying you have to do into a texture, that OpenGL can do any better.

I did some benchmarks, but unfortunately I no longer have the code around. I created QImages and painted them directly in PyQt. It was reasonably fast, but I couldn't reach the performance I get with a QGLWidget, which easily delivers 300 fps, or 60 fps across 12 different windows. I'd rather not hack on this again, since the issue is somewhat settled for me and I'd prefer to spend my coding time on other itches I have.
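[For reference, one detail that matters when feeding NumPy arrays to QImage is scanline alignment: QImage expects each row of Format_RGB888 data to start on a 32-bit boundary. Below is a minimal NumPy-only sketch of that padding; the helper name `qimage_buffer` is hypothetical, not an existing scikits.image or PyQt API.]

```python
import numpy as np

def qimage_buffer(arr):
    """Pad each scanline of an (h, w, 3) uint8 RGB array to a 4-byte
    boundary, as QImage's Format_RGB888 requires 32-bit-aligned rows.

    Returns the padded buffer and the stride (bytes per line).
    """
    h, w, c = arr.shape
    assert c == 3 and arr.dtype == np.uint8
    stride = (w * 3 + 3) & ~3            # round bytes-per-line up to a multiple of 4
    buf = np.zeros((h, stride), dtype=np.uint8)
    buf[:, :w * 3] = arr.reshape(h, w * 3)  # copy pixel data; padding stays zero
    return buf, stride
```

[The padded buffer and stride could then be handed to PyQt's QImage constructor taking (data, width, height, bytesPerLine, format); widths that are already a multiple of four need no padding at all.]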
I still think that PyQt is a much bigger dependency than PyOpenGL. And even so: both are optional, and only needed when GUI work has to happen in real time.
> I think you're not really grasping the idea of "plugin".

Well, I understood the principle of a plugin quite well; BUT the plugin architecture does not allow real-time display of images right now, and it won't be possible to implement it that way. AND there are users who will need this in image-processing tasks. The only question is whether, and if so how, to implement it.
> I meant that you make the claim that Qt is a heavy dependency, when in fact it's not a dependency at all unless the individual wants to use the Qt plugins. The scikit, and all its image processing, still functions without having Qt installed.

I understood that, and frankly that is not in question. I only mean that IF they want to display anything in real time, PyOpenGL is not a heavy dependency compared to PyQt.

On 5 Nov., 15:54, Chris Colbert <sccolb...@gmail.com> wrote:
> Further, these imshow()-type widgets are primarily meant to be used from the interactive interpreter, an environment not best suited for real-time image acquisition and display.

I use live camera display plus annotated images in pylab/IPython every day; I couldn't do my job without it.
> That said, the plugin architecture can most certainly be used in the way you speak of. You simply have your imshow() function return the window object and implement an update() or similar method that the consumer can call to update the image.

I thought that this was not the way it should go. What I would need is a kind of update() functionality for each image. I would also need a kind of annotate() functionality, so that the user has the chance to draw things onto the image (the drawing would not be backend-independent, obviously).

That said, here comes a big but: as a user, I would then no longer see a reason to use the plugin architecture at all. The same holds for me with matplotlib: it offers a backend-independent signalling library (for mouse events etc.), but every time I tried to use it, I decided to choose the Qt backend (and no longer anything else) and use Qt directly for enhanced flexibility.

My point is: the current attempt at the plugin architecture is to make many backends for image display, and maybe modification, simple. My aim is in another direction: to offer the user a set of base classes that he really wants to extend in his own programs and which are NOT backend-independent. Maybe scikits.image is not the right place for this, but that's what I'm here to discuss.
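[The handle-with-update() idea described above could be sketched roughly as follows; `ImageWindow`, `update()`, and `annotate()` are hypothetical names, and a real Qt or OpenGL backend would push the pixels to a widget rather than just storing them.]

```python
class ImageWindow:
    """Hypothetical handle returned by imshow().

    A real Qt/OpenGL backend would repaint its widget whenever
    update() is called; this sketch only tracks the state.
    """

    def __init__(self, image):
        self.image = image        # currently displayed frame
        self.annotations = []     # overlays drawn on top of the image

    def update(self, image):
        # Replace the displayed frame, e.g. from a live camera loop.
        self.image = image

    def annotate(self, shape):
        # Record an overlay (line, box, text, ...) to draw over the image.
        self.annotations.append(shape)


def imshow(image):
    # Return the window object so the caller can keep updating it.
    return ImageWindow(image)
```

[A live camera loop would then simply call win.update(frame) on every grabbed frame, and win.annotate(...) for markers.]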
If my code and my attempts are not to be included in scikits.image, so be it. I will continue to use them anyway; I just think they are useful and WANT to contribute them to the public.
Cheers, Holger
Regards Stéfan