On Sun, Nov 28, 2010 at 7:04 PM, Antoine Pitrou <firstname.lastname@example.org> wrote:
> On Sun, 28 Nov 2010 15:58:33 -0500 Alexander Belopolsky <email@example.com> wrote:
>> On Sun, Nov 28, 2010 at 3:43 PM, Antoine Pitrou <firstname.lastname@example.org> wrote:
>> ..
>>>> For example, I don't think that supporting
>>>> is more important than assuring users that once their program has
>>>> accepted some text as a number, they can assume that the text is ASCII.
>>> Why would they assume the text is ASCII?
>> def deposit(self, amountstr):
>>     self.balance += float(amountstr)
>>     audit_log("Deposited: " + amountstr)
>>
>> $ cat numbered-account.log
>> Deposited: ?????.??
> I'm not sure that's how banking applications are written :)
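For context on what is being debated: in CPython, int() and float() accept any Unicode decimal digits (category Nd), not only ASCII 0-9, which is why the audit log above can contain a numeral that parsed successfully yet is not ASCII. A minimal sketch, using Arabic-Indic digits as the example input:

```python
import unicodedata

# ARABIC-INDIC DIGITS ONE..FOUR ("1234" in Arabic-Indic numerals)
arabic = "\u0661\u0662\u0663\u0664"

print(int(arabic))     # 1234
print(float(arabic))   # 1234.0

# Every character is a decimal digit (Unicode category Nd)...
print(all(unicodedata.category(c) == "Nd" for c in arabic))  # True
# ...yet the accepted text is not ASCII:
print(arabic.isascii())  # False
```

So "this string was accepted by float()" does not imply "this string is ASCII", which is exactly the gap Alexander's deposit() example illustrates.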
+1 for this being bogus. I see no reason whatsoever why numbers inside Unicode text would have to be "ASCII", now that we have surpassed all the technical barriers that once forced things to behave that way. ASCII is an oversimplification of human communication, needed only for computing devices not complex enough to represent it fully.

Let novice C programmers in English-speaking countries deal with the fact that one character is no longer one byte. We are past that point.
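For an application that genuinely needs the assumption Alexander describes, it can be enforced explicitly rather than relied upon. A minimal sketch (the helper name is my own, not from the thread):

```python
def parse_ascii_float(s: str) -> float:
    """Parse a float, but reject any numeral that is not pure ASCII."""
    if not s.isascii():
        raise ValueError(f"non-ASCII numeral rejected: {s!r}")
    return float(s)

print(parse_ascii_float("123.45"))  # 123.45

try:
    # Arabic-Indic "123": float() itself would accept this,
    # but the explicit check rejects it before parsing.
    parse_ascii_float("\u0661\u0662\u0663")
except ValueError:
    print("rejected")
```

The design point is that the restriction lives in the caller's validation, not in float() itself, so programs that want locale-wide numerals keep them.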
> Antoine.

_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/jsbueno%40python.org.br