aleax at aleax.it
Tue Nov 11 16:13:41 CET 2003
John J. Lee wrote:
> Peter Hansen <peter at engcorp.com> writes:
>> Strange, but based on a relatively mundane thing: the frequency
>> (14.31818MHz) of the NTSC color sub-carrier which was used when
>> displaying computer output on a TV. This clock was divided by 3 to
>> produce the 4.77MHz CPU clock, and by 12 to produce the clock used
>> in time-keeping, which then counted on every edge using a 16-bit counter
>> which wrapped around every 65536 counts, producing one interrupt every
>> 65536/(14.31818*1000000/12) or about 54.92 ms, which is about 18.2 ticks
>> per second.
> That doesn't explain it AFAICS -- why not use a different (smaller)
> divisor? An eight bit counter would give about 0.2 ms resolution.
The original IBM PC (8088, 64KB of memory if you were lucky, and
two 160 KB floppies), which is where all of these numbers come from,
didn't exactly have all that much power to spare. Dealing with 18.2
clock interrupts a second was plenty -- dealing with way more was
probably considered out of the question by the original designers.
We _are_ talking about more than 20 years ago, after all (and I'm
sure none of those designers could possibly dream that their numbers
had to be chosen, not for ONE computer model, but for models that
would span 15 or more turns of Moore's Law's wheel...!).