[Python-Dev] Py_UNICODE madness

Nicholas Bastin nbastin at opnet.com
Wed May 4 05:15:38 CEST 2005


On May 3, 2005, at 6:44 PM, Guido van Rossum wrote:

> I think that documentation is wrong; AFAIK Py_UNICODE has always been
> allowed to be either 16 or 32 bits, and the source code goes to
> great lengths to make sure that you get a link error if you try to
> combine extensions built with different assumptions about its size.

That makes PyUnicode_FromUnicode() a lot less useful.  Well, really, 
not useful at all.
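
To illustrate, this is the sort of thing an extension author ends up 
writing (the names here are just made up for the example): it compiles 
on any build, but it only does the right thing when Python happens to 
be a narrow (UCS-2) build, because the cast silently assumes 
sizeof(Py_UNICODE) == 2.

#include <Python.h>

/* made-up example: an extension holding UTF-16 code units in an
   unsigned short buffer, assuming Py_UNICODE is 16 bits wide */
static PyObject *
unicode_from_utf16_units(const unsigned short *buf, int nunits)
{
    /* fine on a UCS-2 build; on a UCS-4 build the cast makes
       PyUnicode_FromUnicode read the data as 32-bit units */
    return PyUnicode_FromUnicode((const Py_UNICODE *)buf, nunits);
}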

You might suggest that PyUnicode_FromWideChar is more useful, but 
that's only true on platforms that support wchar_t.
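
(As far as I can tell it's also only compiled in under HAVE_WCHAR_H, 
and sizeof(wchar_t) is 2 on Windows but 4 on most Unixes, so you end 
up special-casing per platform anyway.  Roughly:)

#include <Python.h>

#ifdef HAVE_WCHAR_H
/* made-up example: only an option where wchar_t (and the Python
   build's wide-char support) exists at all */
static PyObject *
unicode_from_wchar(const wchar_t *buf, int len)
{
    return PyUnicode_FromWideChar(buf, len);
}
#endif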

Is there no universally supported way of moving buffers of Unicode data 
(as common data types like unsigned short, etc.) into Python from C?
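
The closest thing I've found that looks build-independent is to hand 
Python the raw bytes and let it decode them, along the lines of the 
sketch below (names made up again) -- but I don't know whether that's 
actually the intended answer, hence the question:

#include <Python.h>

/* made-up example: pass the buffer as raw bytes and let Python's
   UTF-16 codec produce the unicode object, whatever the width of
   Py_UNICODE in this particular build */
static PyObject *
unicode_from_utf16_bytes(const unsigned short *buf, int nunits)
{
    int byteorder = 0;  /* native order, unless a BOM says otherwise */
    return PyUnicode_DecodeUTF16((const char *)buf,
                                 nunits * (int)sizeof(unsigned short),
                                 "strict",
                                 &byteorder);
}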

--
Nick


