[Python-Dev] Deprecation warning on integer shifts and such

Guido van Rossum guido@python.org
Mon, 12 Aug 2002 17:31:26 -0400


> I think what it boils down to is what Python's model of the 
> world is: C or mathematics. It used to be C, which is probably 
> the one reason Python caught on initially (whereas ABC with its 
> mathematical model didn't, really). I can see the reason behind 
> moving towards a more consistent world view, where integers are 
> integers, be they 32 bits or more, where strings are strings, be 
> they unicode or ascii, and I even agree with it, up to a point.
> 
> The drawback is that it will make it more difficult to interface 
> Python to the real world, where integers have a size, characters 
> are 8 bits, binary data is "char *" too, unicode has funny APIs, 
> etc. And I happen to feel responsible for a lot of this 
> real-world interfacing code. :-)

The issue is not that the new approach makes it more difficult to
interface to the real world.  The issue is that you have to change how
you interface to the real world.  Writing something from scratch that
uses the new approach won't take any more work.  It's the backwards
compatibility that bites you.
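
To make the shift question concrete, here is a minimal sketch of the
two models (the shl32 helper below is hypothetical, just for
illustration, not part of any stdlib):

    # Old, C-like model on a 32-bit build: bits shifted past the
    # machine word were silently dropped, so 1 << 32 gave 0.
    # Mathematical model (post int/long unification, PEP 237): the
    # result is promoted to an arbitrary-precision integer and no
    # bits are ever lost.
    print(1 << 32)        # 4294967296

    # Interfacing code that needs fixed-width semantics must now
    # mask explicitly, e.g. to emulate a 32-bit unsigned shift:
    def shl32(x, n):
        # Hypothetical helper: C-style 32-bit unsigned left shift.
        return (x << n) & 0xFFFFFFFF

    print(shl32(1, 32))   # 0 -- overflowing bits are discarded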

--Guido van Rossum (home page: http://www.python.org/~guido/)