Number of bins in Entropy and Enhance_contrast for uint16
Hi

I'm new to the scikit-image library, so please excuse me if this question is silly. I'm trying to run contrast enhancement on thousands of 1000x1000 images of numpy.uint16 type. I expected this to be very computationally expensive, which was confirmed when running either entropy or enhance_contrast (from skimage.filter.rank) and getting the warning: "Bitdepth of 15 may result in bad rank filter performance due to large number of bins". Is it possible to reduce the number of bins via an option? I've looked for such a keyword in the documentation and source code, but I haven't been able to find one.

Kind regards
Pål
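For reference, a minimal sketch of the kind of call that produces this warning, assuming the 2014-era skimage.filter.rank module (renamed skimage.filters.rank in later releases), a synthetic uint16 array standing in for the real data, and an arbitrary disk(5) neighbourhood:

import numpy as np
from skimage.filter.rank import entropy, enhance_contrast  # skimage.filters.rank in newer versions
from skimage.morphology import disk

# Synthetic 1000x1000 uint16 image as a stand-in for the real data.
image16 = (np.random.rand(1000, 1000) * 65535).astype(np.uint16)

# The rank filters build a histogram with one bin per grey level of the
# input dtype, hence the bitdepth warning for 16-bit input.
ent = entropy(image16, disk(5))
enh = enhance_contrast(image16, disk(5))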
Hi Pål

On 2014-12-10 12:05:19, Pål Gunnar Ellingsen <paalge@gmail.com> wrote:
"Bitdepth of 15 may result in bad rank filter performance due to large number of bins". I'm wondering if it is possible to reduce the number of bins via an option? I've tried to find such a keyword in the documentation and source code, but I haven't been able to find it.
The easiest is to change the image dtype by using, e.g.,

from skimage import img_as_ubyte
image8 = img_as_ubyte(image16)

The algorithm should run much faster on image8 than on image16.

Regards
Stéfan
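In practice, assuming a uint16 array named image16, the conversion and a follow-up rank-filter call would look roughly like this (img_as_ubyte rescales from the full range of the input dtype, so all 65536 possible uint16 grey levels are collapsed onto 256):

from skimage import img_as_ubyte
from skimage.filter.rank import enhance_contrast  # skimage.filters.rank in newer versions
from skimage.morphology import disk

image8 = img_as_ubyte(image16)           # full uint16 range mapped onto 0-255
enh = enhance_contrast(image8, disk(5))  # only 256 histogram bins needed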
Hi

Thank you for the quick answer. I agree that converting to uint8 will speed it up a lot, and I have also tried this. However, it also removes so much information from my 16-bit grayscale image that the contrast I'm interested in isn't there anymore. This is why I think that reducing the binning from 1000 to 100 or even 50 bins, without changing the data type, would be a better choice.

Kind regards
Pål
Hi Pål,

Actually, the rank filters are fast up to 12 bits, so if you can (manually) compress your data to be in 0-2047 in a uint16 array, you might still get good performance.

Hope that helps!

Juan.
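A rough sketch of that manual compression, assuming the data actually spans most of the 16-bit range (the divisor would need adjusting otherwise), with image16 as the original array and disk(5) as an arbitrary example neighbourhood:

import numpy as np
from skimage.filter.rank import enhance_contrast  # skimage.filters.rank in newer versions
from skimage.morphology import disk

# Integer-divide by 32 so 0-65535 maps onto 0-2047, keeping the uint16 dtype.
image12 = (image16 // 32).astype(np.uint16)
enh = enhance_contrast(image12, disk(5))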
participants (3)
- Juan Nunez-Iglesias
- Pål Gunnar Ellingsen
- Stefan van der Walt