Thanks. I suspect, then, that ndimage.imread(..., flatten=True) is using the older standard, which is why the image values appeared the same. For testing purposes, I will use Matlab's conversion factors to see if I get the same output as the paper. Once I am happy it is the same, I will go back to using color.rgb2gray.
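For what it's worth, a minimal sketch of applying Matlab's coefficients by hand with NumPy (the function name `rgb2gray_matlab` is just for illustration, and a float image in [0, 1] is assumed):

```python
import numpy as np

def rgb2gray_matlab(rgb):
    """Convert an (H, W, 3) RGB array to grayscale using Matlab's
    coefficients: 0.2989 R + 0.5870 G + 0.1140 B."""
    weights = np.array([0.2989, 0.5870, 0.1140])
    # Weighted sum over the last (channel) axis
    return rgb @ weights

# A single pure-green pixel should map to the green weight, 0.5870
img = np.array([[[0.0, 1.0, 0.0]]])
print(rgb2gray_matlab(img)[0, 0])
```

Comparing this output against the paper's values should make it clear whether the discrepancy is down to the coefficients alone.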
Thanks for your help.
On 18/05/13 1:25 AM, Tony Yu wrote:
On Fri, May 17, 2013 at 12:15 PM, Brickle Macho <email@example.com mailto:firstname.lastname@example.org> wrote:
I am porting an 8-line Matlab script.
So what is the difference between converting an array to grayscale versus reading it in as grayscale? Have I done something wrong? Is there another way to convert a NumPy array to grayscale? Any help appreciated. Michael. --
They're just different color conversion factors. Based on http://www.mathworks.com/help/images/ref/rgb2gray.html, Matlab uses: 0.2989 R + 0.5870 G + 0.1140 B
Based on the docstring for `color.rgb2gray`: 0.2125 R + 0.7154 G + 0.0721 B
Wikipedia (http://en.wikipedia.org/wiki/Grayscale) seems to suggest that Matlab's is an older standard while the one in scikit-image is a more recent spec.
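To see the difference numerically, here is a quick sketch applying both sets of weights (as quoted above) to the same arbitrary pixel:

```python
import numpy as np

matlab_w = np.array([0.2989, 0.5870, 0.1140])   # Matlab's rgb2gray weights
skimage_w = np.array([0.2125, 0.7154, 0.0721])  # skimage color.rgb2gray weights

pixel = np.array([0.2, 0.5, 0.8])  # an arbitrary RGB value in [0, 1]
print(pixel @ matlab_w)   # 0.44448
print(pixel @ skimage_w)  # 0.45788
```

The two gray values differ by about 1%, which is small enough that images look similar by eye but large enough to matter when reproducing a paper's exact numbers.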
-- You received this message because you are subscribed to the Google Groups "scikit-image" group. To unsubscribe from this group and stop receiving emails from it, send an email to email@example.com. For more options, visit https://groups.google.com/groups/opt_out.