Numeric Programming for Everybody (was Re: I had a thought ...)
Don Dwiggins
dwig at advancedmp.net
Mon Jun 4 16:18:02 EDT 2001
Tim Peters writes:
> [Don Dwiggins]
>> See http://support.microsoft.com/support/kb/articles/Q172/3/38.ASP for
>> one use of Currency having nothing to do with money.
> Cute! In an earlier, unrelated thread, we learned that the Currency type is
> really a 64-bit int, conceptually scaled by a factor of 10000. The
> performance counter API uses 64-bit ints directly. So they're the same
> thing, except that the VB code at the link has to fiddle around multiplying
> the Frequency value by a magic 10000 to get VB to *treat* the value like a
> real 64-bit int. OTOH, in (Ctr2 - Ctr1) / Freq, the 10000's cancel out so
> they dare not play with magic multipliers in *that* context. Oh ya -- I bet
> all the VB programmers understand this in sufficient detail to bet the
> business on <wink>.
This, and the various threads on numeric representation and computation,
triggered the following thoughts:
- Common number systems can't be completely and accurately represented in
computers -- some limitations and/or approximations must be accepted (see
the small example following this list).
- This is widely acknowledged among programmers, but not obvious to
newcomers.
- Many of those who acknowledge it don't have a complete understanding of
the "limitations and/or approximations" and the sometimes subtle
consequences for building software that handles numbers "reasonably". (As
evidence, I offer the discussions in this group. I also freely admit
that, if I were to undertake serious floating-point programming again, I'd
have to brush up on the details.)
- For many purposes, a relatively naive level of understanding is sufficient
-- provided it's accompanied by an appreciation that there's more to it,
and that one may be required to learn the details of floating point,
rational representations, etc.
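To make the first point above concrete, here's the standard illustration of
what "approximation" means for binary floating point. Nothing here is specific
to Python; any language using IEEE doubles behaves the same way:

    # 0.1 has no exact binary floating-point representation,
    # so tiny rounding errors creep in and can accumulate:
    print(0.1 + 0.2 == 0.3)            # False
    print(sum([0.1] * 10) == 1.0)      # False

    # The result is close to 1.0, but not equal to it:
    print(abs(sum([0.1] * 10) - 1.0))  # a tiny nonzero difference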
So, what are the implications for "Computer Programming for Everybody"? How
much and what knowledge should the newbie be expected to master, to avoid
falling into various traps? What conceptual model(s) of numbers should a
language (say, for example, Python) present to the learner to be generally
useful without making it too easy to go wrong? How can the language
accommodate the differing needs of the learner and dilettante on the one
hand, and the specialists in various kinds of numeric programming on the
other?
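For what it's worth, one answer Python itself eventually offered (well after
this post was written) is to present several numeric models side by side:
fast binary floats for the learner or dilettante who doesn't care about the
last digit, and exact rationals and decimals for those who do. A rough sketch
using the standard-library fractions and decimal modules (both added years
later):

    from fractions import Fraction
    from decimal import Decimal

    # One tenth under three different conceptual models of "number":
    as_float = 0.1                  # binary approximation, hardware speed
    as_fraction = Fraction(1, 10)   # exact rational arithmetic
    as_decimal = Decimal("0.1")     # exact decimal representation

    print(as_float * 3 == 0.3)                  # False
    print(as_fraction * 3 == Fraction(3, 10))   # True
    print(as_decimal * 3 == Decimal("0.3"))     # True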
--
Don Dwiggins "Solvitur Ambulando"
Advanced MP Technology
dwig at advancedmp.net