[Tutor] difference between signed integers and unsigned integers
Alan Gauld
alan.gauld at blueyonder.co.uk
Wed Jan 21 14:19:51 EST 2004
> this list. i want to know what are signed and unsigned
> integers and what is the difference between them.
A signed integer is one with either a plus or minus sign
in front. That is, it can be either positive or negative.
An unsigned integer is assumed to be positive.
This is important in computing because the numbers are
stored (usually) as a fixed number of binary digits. For
a signed integer one bit is used to indicate the sign
- 1 for negative, zero for positive. Thus a 16 bit signed
integer only has 15 bits for data whereas a 16 bit unsigned
integer has all 16 bits available. This means unsigned
integers can have a value twice as high as signed integers
(but only positive values). On 16 bit computers this was
significant, since it translates to the difference between
a maximum value of ~32,000 or ~65,000. On 32 bit computers
it's far less significant, since we get roughly 2 billion
or 4 billion. And on 64 bit computers it becomes of academic
interest.
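You can work out those limits in Python itself. This is just a
sketch of the arithmetic implied by the bit layouts above (real
hardware stores negative values in two's complement, which is
where the extra negative value comes from):

```python
def int_range(bits, signed):
    """Return (min, max) for an integer of the given bit width."""
    if signed:
        # One bit goes to the sign, leaving bits-1 for the value.
        return (-(2 ** (bits - 1)), 2 ** (bits - 1) - 1)
    # All bits available for the value, but only non-negative.
    return (0, 2 ** bits - 1)

print(int_range(16, signed=True))   # (-32768, 32767)  - the ~32,000 limit
print(int_range(16, signed=False))  # (0, 65535)       - the ~65,000 limit
print(int_range(32, signed=True))   # max is ~2 billion
print(int_range(32, signed=False))  # max is ~4 billion
```

Note that the unsigned maximum is always (roughly) twice the
signed maximum, as described above.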
In Python we have long integers which are effectively
unlimited in size, so we don't really care much! :-)
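For example, Python integers simply grow as needed, so there is
no fixed-width overflow to worry about:

```python
# A value far beyond any 64 bit range - Python handles it natively.
big = 2 ** 100
print(big)            # 1267650600228229401496703205376

# No wraparound: multiplying huge values just gives a bigger integer.
print(big * big == 2 ** 200)  # True
```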
Alan G.