[Tutor] Bits operations [another version of binary() using hex()]
Jeff Shannon
jeff@ccvcorp.com
Mon Jul 14 14:38:02 2003
Thomas Clive Richards wrote:
>The small trouble that I have with using octal as an intermediate step
>between decimal and binary is that one octal digit represents 3 binary
>digits, and this is not easily divisible into an 8 or 16 bit binary
>number.
>
Once upon a time, there were a variety of mainframe/minicomputer systems
that were based on 18- or 27-bit words. For these machines, each 9-bit
byte would be exactly representable by a 3-digit octal number.
Nowadays, virtually every machine on the market uses a word length
that's some power of two (8, 16, 32, 64), so it makes more sense to use
hex, where each 8-bit byte is exactly representable by 2 hex digits. In
my opinion, support for octal is now a historical relic. (Keep in mind,
though, that some historical relics have a long and productive lifetime
-- COBOL comes to mind. Despite its "obsolescence", COBOL is still one
of the most-used computer languages.)
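Since each hex digit expands to exactly four bits, a `binary()` function can
use `hex()` as its intermediate step, much as the subject line suggests. A
minimal sketch (the name `binary()` and the lookup-table approach are my
illustration, not code from the original thread):

```python
# Map each hex digit to its 4-bit binary expansion.
# (This table is the whole trick: one hex digit <-> four bits.)
_HEX_TO_BITS = {
    '0': '0000', '1': '0001', '2': '0010', '3': '0011',
    '4': '0100', '5': '0101', '6': '0110', '7': '0111',
    '8': '1000', '9': '1001', 'a': '1010', 'b': '1011',
    'c': '1100', 'd': '1101', 'e': '1110', 'f': '1111',
}

def binary(n):
    """Return the binary representation of a non-negative integer,
    using hex() as the intermediate step."""
    digits = hex(n)[2:]            # strip the '0x' prefix
    bits = ''.join(_HEX_TO_BITS[d] for d in digits)
    return bits.lstrip('0') or '0'  # keep a single '0' for n == 0

print(binary(10))    # -> 1010
print(binary(255))   # -> 11111111
```

The same divisibility argument from the quoted message applies in reverse
here: because 8-, 16-, and 32-bit quantities split evenly into hex digits,
the table lookup never straddles a byte boundary the way 3-bit octal groups
do.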
Jeff Shannon
Technician/Programmer
Credit International