Python 3.0 automatic decoding of UTF16

J Kenneth King james at
Fri Dec 5 11:51:17 EST 2008

Johannes Bauer <dfnsonfsduifb at> writes:

> Traceback (most recent call last):
>   File "./", line 12, in <module>
>     a = AddressBook("2008_11_05_Handy_Backup.txt")
>   File "./", line 7, in __init__
>     line = f.readline()
>   File "/usr/local/lib/python3.0/", line 1807, in readline
>     while self._read_chunk():
>   File "/usr/local/lib/python3.0/", line 1556, in _read_chunk
>     self._set_decoded_chars(self._decoder.decode(input_chunk, eof))
>   File "/usr/local/lib/python3.0/", line 1293, in decode
>     output = self.decoder.decode(input, final=final)
>   File "/usr/local/lib/python3.0/", line 300, in decode
>     (result, consumed) = self._buffer_decode(data, self.errors, final)
>   File "/usr/local/lib/python3.0/encodings/", line 69, in
> _buffer_decode
>     return self.decoder(input, self.errors, final)
> UnicodeDecodeError: 'utf16' codec can't decode bytes in position 74-75:
> illegal encoding

It probably means what it says: the input file contains bytes that
cannot be decoded using the specified encoding.

Are you generating the file from Python using a file object with the
same encoding? If not, you might want to inspect your input data and
find a way to handle the exception.
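A minimal sketch of what's going on (the sample bytes here are made up,
not taken from the original file): a UTF-16 stream with a stray trailing
byte raises the same UnicodeDecodeError, while passing errors="replace"
substitutes U+FFFD for the undecodable bytes instead of raising.

```python
# Build a valid UTF-16 byte string, then corrupt it with a lone byte
# so it is no longer a legal UTF-16 sequence.
data = "Name;Number\n".encode("utf-16")
data += b"\x00"  # odd length -> illegal UTF-16

# Strict decoding raises, just like f.readline() did in the traceback.
try:
    data.decode("utf-16")
except UnicodeDecodeError as e:
    print("strict decode failed:", e.reason)

# A tolerant decode keeps going, inserting U+FFFD replacement characters.
text = data.decode("utf-16", errors="replace")
print("tolerant decode:", repr(text))
```

The same errors="replace" (or errors="ignore") argument can be passed to
open() when reading the file, if losing the bad characters is acceptable.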

More information about the Python-list mailing list