# Rationale for core Python numeric types

Grant Edwards grante at visi.com
Fri Jul 16 20:27:55 CEST 2004

```On 2004-07-16, Peter Hickman <peter at semantico.com> wrote:
> Grant Edwards wrote:
>> I disagree.
>>
>> Things like "eight-bit signed 2's complement binary number"
>> have completely rigorous definitions that are not dependent on
>> some particular hardware platform.
>
> What you say is true but the OP does not talk about "eight-bit signed 2's
> complement binary number", he talks about "e.g., single-precision
> float, short integers". It is these types that the OP was talking about and

You're right.  I read more into his post than I should have.
When he wrote "single-precision float", I assumed he meant
32-bit IEEE-754, and by "short int" I assumed he meant a 16-bit
signed 2's complement number.
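
If that *is* what he meant, the struct module already handles
exactly those external representations.  A quick sketch (the
format codes are real; the values are just examples of mine):

import struct

# 1.0 as a 32-bit IEEE-754 single, big-endian: 0x3F800000
raw = struct.pack('>f', 1.0)

# round-trip a 16-bit signed 2's complement value
(val,) = struct.unpack('>h', struct.pack('>h', -1))

# a raw 16-bit pattern of all ones reads back as -1
pattern = struct.unpack('>h', b'\xff\xff')[0]

The point being that the *external* definition (byte order,
width, encoding) lives in the format string, not in Python's
own numeric types.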

>> No, I think you're missing the point. We're talking about
>> writing a Python program whose purpose is to manipulate
>> externally defined data in externally defined ways.  The data
>> are binary numbers of specific lengths (typically 8, 16, and
>> 32 bits) and the operations are things like 1's complement,
>> 2's complement, 2's complement addition and subtraction, left
>> and right shift, bitwise AND and OR, etc.
>
> Again returning to the OP, I find no mention of "things like
> 1's complement, 2's complement, 2's complement addition and
> subtraction, left and right shift, bitwise AND and OR, etc". Am
> I missing a post or are you in private conversation with the
> OP?

No -- something about the language he used made me think he was
working in the context I described: where one needs to perform
operations on types of specific lengths.  You're right, though,
he didn't actually say that.
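
To make it concrete, the kind of manipulation I had in mind
looks like this in plain Python -- mask everything back to the
word width and the wrap-around falls out for free.  (The helper
names are mine, nothing standard:)

MASK16 = 0xFFFF   # 16-bit word

def ones_complement(x, mask=MASK16):
    # flip every bit within the word width
    return ~x & mask

def twos_complement(x, mask=MASK16):
    # negate within the word width (wraps like real hardware)
    return (~x + 1) & mask

def to_signed(x, bits=16):
    # reinterpret an unsigned word as a signed 2's complement value
    return x - (1 << bits) if x & (1 << (bits - 1)) else x

def add16(a, b):
    # 16-bit 2's complement addition, discarding the carry
    return (a + b) & MASK16

# shifts work the same way: shift, then mask
shifted = (0x1234 << 4) & MASK16    # 0x2340

The annoyance is that you have to remember the & MASK16
yourself on every operation, which is exactly why people keep
asking for fixed-width types.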

>> You seem obsessed with hardware. :)
>
> The type system, or at least the whole zoo of integers, came
> about from the hardware. First there were char and int, because
> there were only 8 bits anyway and that was the word. Then came
> 16 bits, and int grew to 16 bits but was no longer a synonym
> for char, so we got short int; and the CPUs started to
> implement signed arithmetic, so now we require signed and
> unsigned integers. The whole type system came into existence