Python dict as unicode

Terry Reedy tjreedy at udel.edu
Thu Nov 25 00:42:55 EST 2010


On 11/24/2010 5:58 PM, Brendon wrote:
> Hi all,
>
> I am trying to convert a dictionary to a unicode string and it fails
> with an exception. I am awfully surprised but searching the web has
> not turned up anything useful. I understand why the exception occurs,
> but am not sure why this is the default behaviour of Python, and if
> there is anything I can do to fix the problem.
>
> I have a Python dictionary:
> d = { ......}
>
> It contains both primitive and complex objects. I want a unicode
> representation of that dict:
> s = unicode(d)
>
> Doing this I get an exception:
> UnicodeDecodeError: 'ascii' codec can't decode byte 0xe4 in position
> 71: ordinal not in range(128)
>
> Now, it seems that unicode(d) is the same as unicode(str(d)). I was
> expecting there to be a __unicode__ method in the dictionary that in
> turn calls unicode() on each of the keys/values in the dict, but
> apparently not. Instead it seems to call the equivalent of str() on
> each key/value and then, after joining them together, call unicode() on
> the resulting string.
>
> Is this really the default behaviour? If so, is there any way around
> it?

Use 3.x
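
In 3.x, str is the unicode text type, so the conversion just works. A
minimal sketch (the dict here is made up for illustration):

# Python 3: str(d) already produces text, non-ASCII values and all.
d = {'name': 'Järvinen', 'id': 1}
s = str(d)
print(type(s))   # <class 'str'>
print(s)         # {'name': 'Järvinen', 'id': 1} (insertion order on 3.7+)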

> I am using Python 2.6.6 on a Linux system.
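
If you must stay on 2.x, one possible workaround is to decode the repr
yourself instead of letting unicode() fall back to the ASCII codec. A
sketch only, assuming the non-ASCII bytes are Latin-1 (0xe4 is 'ä'
there); Person is just a stand-in for your "complex objects", and you
would substitute whatever encoding your data actually uses:

# Python 2 sketch of the failure and a workaround.
class Person(object):
    def __init__(self, name):
        self.name = name                  # a byte string, e.g. 'J\xe4rvinen'
    def __repr__(self):
        return 'Person(%s)' % self.name   # raw non-ASCII bytes end up in repr(d)

d = {'who': Person('J\xe4rvinen'), 'id': 1}

# unicode(d) is effectively unicode(str(d)) and dies on byte 0xe4.
# Decoding explicitly with the right codec gives a proper unicode object:
s = str(d).decode('latin-1')     # same as unicode(str(d), 'latin-1')
print type(s)                    # <type 'unicode'>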


-- 
Terry Jan Reedy



