[Python-Dev] unicode() and its error argument

Skip Montanaro skip@pobox.com
Sat, 15 Jun 2002 09:51:25 -0500

The unicode() builtin accepts an optional third argument, errors, which
defaults to "strict".  According to the docs, if errors is set to "ignore",
decoding errors are silently ignored.  I still seem to get the occasional
UnicodeError exception, however.  I'm still trying to track down an actual
example (it doesn't happen often, and I hadn't wrapped the unicode() call
in a try/except statement, so all I saw was the error raised, not the
input string value).
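For reference, here is a small sketch of the documented behaviour, written
against Python 3's bytes.decode(), which is the modern equivalent of the
Python 2 unicode(s, encoding, errors) builtin discussed above (it does not
reproduce the intermittent failure, just the strict-vs-ignore contrast):

```python
# A byte string that is not valid UTF-8: \xe9 starts a multi-byte
# sequence that is never completed.
data = b"caf\xe9"

# errors="strict" (the default) raises UnicodeDecodeError.
try:
    data.decode("utf-8")
except UnicodeDecodeError as exc:
    print("strict raised:", type(exc).__name__)

# errors="ignore" silently drops the undecodable byte.
print(data.decode("utf-8", errors="ignore"))  # -> caf
```

If "ignore" really is in effect, the second call should never raise, which
is what makes the occasional UnicodeError surprising.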

This reminds me: it occurred to me the other day that a plain text version
of cgitb would be useful for non-web scripts.  You'd get a lot more
context about the environment in which the exception was raised.
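A minimal sketch of such a plain-text handler, using only the traceback
module (the hook name text_excepthook is made up here; this just dumps each
frame's locals after the usual traceback, roughly the extra context cgitb
provides for web scripts):

```python
import sys
import traceback

def text_excepthook(exc_type, exc_value, tb):
    # Print the standard traceback first.
    traceback.print_exception(exc_type, exc_value, tb, file=sys.stderr)
    # Then walk the frames and dump their local variables for context.
    frame_tb = tb
    while frame_tb is not None:
        frame = frame_tb.tb_frame
        print("Locals in %s:" % frame.f_code.co_name, file=sys.stderr)
        for name, value in frame.f_locals.items():
            print("    %s = %r" % (name, value), file=sys.stderr)
        frame_tb = frame_tb.tb_next

# Install it for uncaught exceptions in a non-web script.
sys.excepthook = text_excepthook
```

With this installed, an uncaught exception shows not just the traceback but
the value of every local in each frame, which is often enough to identify
the offending input string without wrapping calls in try/except.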