[Python-Dev] readd u'' literal support in 3.3?

Chris McDonough chrism at plope.com
Fri Dec 9 05:33:24 CET 2011


On Fri, 2011-12-09 at 03:50 +0100, Lennart Regebro wrote:
> "from future import unicode_literals" is my fault. I'm sorry. It's
> pretty useless. It was suggested by somebody and I then supported it's
> adding, instead of allowing u'' which I suggested. But it doesn't
> work.
> 
> One reason is that you need to be able to say "This should be str in
> Python 2, and binary in Python 3, that should be Unicode in Python 2
> and str in Python 3, and that over there should be str in both
> versions", and the future import doesn't support that.

This is also true.
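
To make that concrete (a quick sketch; the variable names are made up),
once the future import is in effect there is no plain-literal spelling
left for the "native str in both versions" case:

    from __future__ import unicode_literals

    data = b'GIF89a'       # str on Python 2, bytes on Python 3
    text = 'hello'         # unicode on Python 2, str on Python 3
    name = str('attr')     # native str on both versions; needs the
                           # str() wrapper because the import turned
                           # every bare literal into text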

But even so, b'' exists as a porting nicety.  The argument for
supporting u'' is the same as the one for b'', just pointing in the
opposite direction.  Since popular library code is going to need to
run on both Python 2 and Python 3 for the foreseeable future, anything
that makes this easier helps.
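
For comparison, without u'' on Python 3 the usual workaround for code
that has to straddle 2.x and 3.x is a little shim, roughly what six's
u() helper does (the helper name here is just illustrative):

    import sys

    if sys.version_info[0] >= 3:
        def u(s):
            # On Python 3 the bare literal is already text.
            return s
    else:
        def u(s):
            # On Python 2, turn the str literal back into unicode.
            return s.decode('unicode_escape')

    greeting = u("hello")   # instead of simply writing u"hello"

Every text literal in the codebase becomes a function call, purely to
keep the Python 3 parser happy.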

Supporting u'' in 3.3 will prevent me from needing to think about the
bytes/text distinction again while porting/straddling.  Every time I say
this to somebody who isn't listening closely, they say "AHA!  You're
*supposed* to think about bytes vs. text, that's the whole point,
stupid!"

They fail to hear the "again" in that sentence.  I've clearly already
thought about the distinction between bytes and text at least once:
that's *why* I'm using a u'' literal there.  I shouldn't have to think
about it again just to satisfy a syntax constraint.  Code that is more
explicit than strictly necessary should not be needlessly punished.

Continuing to not support u'' in Python 3 will be like having an
immigration station where folks who have a b'ritish' passport can get
through right away, but folks with a u'kranian' passport need to get
back on a plane that appears to come from the Ukraine before they
receive another tag that says they are indeed from the Ukraine.  It's
just pointless makework.

- C
