[Python-Dev] Cannot declare the largest integer literal.

Tim Peters tim_one@email.msn.com
Sat, 6 May 2000 15:13:46 -0400


[Tim]
> Python's grammar is such that negative integer literals don't
> exist; what you actually have there is the unary minus operator
> applied to positive integer literals; ...

[Christian Tismer]
> Well, knowing that there are more negatives than positives
> and then coding it this way appears in fact as a design flaw to me.

Don't know what you're saying here.  Python's grammar has nothing to do with
the relative number of positive vs negative entities; indeed, in a
2's-complement machine it's not even true that there are more negatives than
positives.  Python generates the unary minus for "negative literals"
because, again, negative literals *don't exist* in the grammar.
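The grammar point is easy to verify in a modern Python, where the `ast` module (which postdates this thread) exposes the parse tree directly; a minimal sketch:

```python
import ast

# Parse "-2147483648" as an expression and inspect the tree: there is
# no "negative literal" node in the grammar -- just the unary minus
# operator (USub) applied to the positive literal 2147483648.
tree = ast.parse("-2147483648", mode="eval")
node = tree.body

print(type(node).__name__)       # UnaryOp
print(type(node.op).__name__)    # USub
print(node.operand.value)        # 2147483648
```

Any constant folding of the unary minus happens later, at compile time, not in the grammar.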

> A simple solution could be to do the opposite:
> Always store a negative number and negate it
> for positive numbers.  ...

So long as negative literals don't exist in the grammar, "-2147483648" makes
no sense on a 2's-complement machine with 32-bit C longs: the positive
literal 2147483648 overflows a signed 32-bit long before the unary minus is
ever applied, even though -2147483648 itself is representable.  There isn't
"a problem" here worth fixing, although if there is <wink>, it will get fixed
by magic as soon as Python ints and longs are unified.
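For context, that unification did eventually happen (PEP 237); a sketch of why the issue then vanishes, assuming any modern Python:

```python
# With ints and longs unified, integers have arbitrary precision, so
# the positive literal 2147483648 is always representable and negating
# it is unproblematic on any platform, regardless of C long width.
n = -2147483648
print(n == -(2 ** 31))              # the most negative 32-bit value
print((2147483648).bit_length())    # 32 -- one bit too wide for a
                                    # signed 32-bit C long's positive range
```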