[Python-Dev] unicode() and its error argument
Sun, 16 Jun 2002 12:48:49 +0200
Skip Montanaro wrote:
> The unicode() builtin accepts an optional third argument, errors, which
> defaults to "strict". According to the docs if errors is set to "ignore",
> decoding errors are silently ignored. I seem to still get the occasional
> UnicodeError exception, however. I'm still trying to track down an actual
> example (it doesn't happen often, and I hadn't wrapped unicode() in a
> try/except statement, so all I saw was the error raised, not the input
> string value).
The errors argument is passed through to the codec you request.
It's the codec, not the unicode() builtin, that decides how to
implement the error handling, so if you're still seeing UnicodeError
with 'ignore', it is most likely the result of a problem in the
codec itself.
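A minimal sketch of that forwarding, written in modern Python 3 terms
(where Python 2's unicode(data, encoding, errors) corresponds to
bytes.decode); the sample bytes are made up for illustration:

```python
import codecs

data = b"caf\xe9 \xff"  # bytes that are not valid UTF-8

# Python 2's unicode(data, "utf-8", "ignore") corresponds to
# data.decode("utf-8", "ignore") here; either way the errors string
# is handed to the codec unchanged.
print(data.decode("utf-8", "ignore"))  # invalid bytes silently dropped

# The builtin only looks the codec up; the codec's own decode
# function receives the errors argument and decides how to honour it.
decode = codecs.lookup("utf-8").decode
text, consumed = decode(data, "ignore")
print(text, consumed)
```

A codec whose decoder mishandles the errors argument can still raise
UnicodeError even when 'ignore' was requested, which would match the
behaviour Skip reported.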
CEO eGenix.com Software GmbH
Company & Consulting: http://www.egenix.com/
Python Software: http://www.egenix.com/files/python/
Meet us at EuroPython 2002: http://www.europython.org/