[Python-Dev] "data".decode(encoding) ?!
Fredrik Lundh
fredrik@effbot.org
Fri, 11 May 2001 11:43:14 +0200
mal wrote:
> > I may be being dense, but can you explain what's going on here:
> >
> > >>> u'\u00e3'.encode("latin-1")
> > '\xe3'
> > >>> u'\u00e3'.encode("latin-1").decode("latin-1")
> > Traceback (most recent call last):
> >   File "<input>", line 1, in ?
> > UnicodeError: ASCII encoding error: ordinal not in range(128)
>
> The string.decode() method will try to reuse the Unicode
> codecs here. To do this, it first has to convert the string
> to Unicode, and that conversion fails because '\xe3' is not
> in the ASCII range.
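
if I understand you correctly, decode is effectively doing
something like this (a pure-Python sketch; "buggy_decode" is my
name for it, and step 2 is a guess -- the point is where step 1
fails):

    def buggy_decode(s, encoding):
        # step 1: coerce the 8-bit string to Unicode with the
        # default (ASCII) codec -- '\xe3' raises UnicodeError
        # right here, before the latin-1 codec is ever consulted
        u = unicode(s)
        # step 2 (guessed mechanics): reuse the Unicode codec
        # machinery on the result
        return u.encode(encoding)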
can you explain that again? shouldn't michael's example be
equivalent to:
unicode(u"\u00e3".encode("latin-1"), "latin-1")
if not, I'd argue that your "decode" design is broken, rather
than just buggy...
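
to make the equivalence concrete (Python 2.x -- the constructor
call succeeds where the method raises, exactly as in michael's
traceback above):

    s = u'\u00e3'.encode("latin-1")    # the byte string '\xe3'
    print repr(unicode(s, "latin-1"))  # u'\xe3' -- this works
    print repr(s.decode("latin-1"))    # raises UnicodeError instead
                                       # of returning the same u'\xe3'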
Cheers /F