On Sun, Mar 9, 2014 at 6:13 AM, Mark H. Harris firstname.lastname@example.org wrote:
> But, you are correct that what is "really" wanted --someday-- is to
> have the literal be decimal (rather than float) to begin with.
> Neither here nor there; a step at a time over time is better than the simple status quo.
> Please let me be clear: I think Guido's proposal is a very good first step. It makes sense for most users (especially naive ones) and does not interfere with advanced users.
It's probably time someone wrote up a PEP about all this. The most important points, as I see it, are:
1) Create a new Decimal literal notation - 1.234d seems to be the most popular syntax. This is reasonably uncontroversial, but it has consequences.
2) Create a new float literal notation - 1.234f or 1.234r or any of the other proposals.
3) Possibly change repr(float) to include the tag.
4) Introduce a "from __future__ import decimal_literals" directive (named to parallel unicode_literals - you can get a u"literal" without that directive, but the default literal type becomes unicode).
5) What about int/int? Should that now be Decimal? Should it be per-module?
6) Further cans of worms like #5.
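To see why point 1 matters, here's a quick sketch of what a Decimal literal would buy us; since 1.234d doesn't exist today, the examples spell the same values through the Decimal constructor (note that a string argument is required to keep the exact decimal value):

```python
from decimal import Decimal

# Binary floats cannot represent 0.1 exactly, hence the classic surprise:
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# Decimal preserves the value the user actually wrote:
print(Decimal('0.1') + Decimal('0.2'))                     # 0.3
print(Decimal('0.1') + Decimal('0.2') == Decimal('0.3'))   # True

# Constructing a Decimal *from a float* faithfully captures the binary
# representation error - which is exactly why a true decimal literal,
# rather than a conversion from a float, is what's wanted:
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625
```

A hypothetical 1.1d literal would behave like Decimal('1.1'), not like Decimal(1.1).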
Introducing points 1, 2, and 4 would let you stick the future directive into PYTHONSTARTUP and then run Python as an interactive decimal calculator.
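Until such a directive exists, the closest approximation is a PYTHONSTARTUP file that preloads the decimal module; this is only a sketch of the idea, and the short alias D is this sketch's own convention, not anything standard:

```python
# Contents of the file named by $PYTHONSTARTUP - makes the interactive
# prompt a passable decimal calculator with today's tools.
from decimal import Decimal as D, getcontext

getcontext().prec = 28  # the module's default precision; adjust to taste

# A session then allows:
#   >>> D('1.1') + D('2.2')
#   Decimal('3.3')
# which is as close as current Python gets to typing 1.1d + 2.2d.
```

With a real decimal_literals future statement, the alias would become unnecessary and plain literals would do the job.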
 Not it!