[Image-SIG] bit decoder - Cineon
Joe Connellan
joec@mill.co.uk
Mon, 22 Jul 2002 12:50:12 +0100
Hi,
I'm writing (or trying to write) a decoder for the 10-bit Cineon file
format. I've got it recognising the format and reading the width/height, etc.
from the header. Now for the image data.
The Cineon format packs the three channels (RGB) into 32 bits per pixel,
with two bits of waste, i.e.:
(R10bits)(G10bits)(B10bits)(2)
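For reference, this is roughly how I'd pull one pixel apart by hand (just a
sketch; I'm assuming R sits in the most significant bits, as written above):

def unpack_pixel(word):
    # bits 31-22 = R, 21-12 = G, 11-2 = B, bits 1-0 unused
    r = (word >> 22) & 0x3FF
    g = (word >> 12) & 0x3FF
    b = (word >> 2) & 0x3FF
    return r, g, b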
Is it possible to decode this using either the raw or bit decoders? The bit
decoder allows for bit padding at the end of each line, but as far as I can
tell not per-pixel bit padding (for the 2 bits at the end of each pixel).
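If the built-in decoders can't handle it, I suppose I could fall back to
unpacking each row in Python, something like the sketch below (untested;
assumes big-endian 32-bit words and a file object f already seeked to the
start of the image data):

import struct

def read_row(f, width):
    raw = f.read(width * 4)                     # one 32-bit word per pixel
    words = struct.unpack(">%dI" % width, raw)  # assuming big-endian data
    row = []
    for w in words:
        row.append(((w >> 22) & 0x3FF,          # R: top 10 bits
                    (w >> 12) & 0x3FF,          # G: next 10 bits
                    (w >> 2) & 0x3FF))          # B: next 10, last 2 unused
    return row

but that's obviously going to be slow, so I'd rather use a built-in decoder
if one fits.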
Also, if I set up the tile (in cineonImagePlugin.py, my decoder plugin) like
so:
self.tile = [
    ("bit", (0, 0) + self.size, offset, (32, 0, 1, 0, 1))
]
when I try to do a tostring:
self.img[0].tostring("bit", 32, 0, 1, 0, 1)
I get the following:
File "imageViewer.py", line 84, in loadImage
self.img[0].tostring("bit", 32, 0, 1, 0, 1)
File "C:\Python22\PIL\Image.py", line 351, in tostring
self.load()
File "C:\Python22\PIL\ImageFile.py", line 140, in load
d = Image._getdecoder(self.mode, d, a, self.decoderconfig)
File "C:\Python22\PIL\Image.py", line 241, in _getdecoder
return apply(decoder, (mode,) + args + extra)
ValueError: bad image mode
Any ideas?
Joe