PEP0238 lament

Steve Horne sh at ttsoftware.co.uk
Mon Jul 23 14:15:35 EDT 2001


On Mon, 23 Jul 2001 09:34:36 -0500, <mcherm at destiny.com> wrote:

>Guido writes:
>> [...]
>> Because all Arthur does is tick me off, I'll leave it at this -- 
>> maybe someone else can explain it all to him.
>
>Well, I'll see what I can do. In the language there are (at least)
>two distinct things we might want to do when "dividing" A and B:
>
>  (1) split A up into B parts
>  (2) find how many B's fit in an A
>
>A good example of (1) is float division, and the canonical example
>of (2) is "int division", aka "div". These are DIFFERENT THINGS.

Nope - float division is simply a logical extension of int division,
just as float addition is a logical extension of int addition. The
fact that some people *see* them as different things is irrelevant,
given that others don't.

>So I'm hoping that everyone reading this can agree, BOTH (1) and
>(2) are USEFUL things, and a good language should allow BOTH. If
>not, then please stop reading: there's no way I can convince you.

Here I agree - division that implicitly coerces integers to floats is
likely to be useful, at least for some. It's *not* the only solution
for the stated problems, but it is a reasonable idea.
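
For concreteness, here's a minimal sketch of the two operations in
Python, written with the spellings PEP 238 proposes (/ as true
division, // as the new floor division operator) - the names and
numbers are just made up for illustration:

    cake, guests = 7, 2

    # (1) split A up into B parts - true division
    share = cake / guests           # 3.5 under the proposal
                                    # (3 today, where int/int truncates)

    # (2) find how many B's fit in an A - floor division
    whole_shares = cake // guests   # 3, using the proposed // spelling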

>Given that you want both (1) and (2), there is just one problem:
>the syntax "A / B" is the ideal syntax for BOTH of these purposes.
>It's ideal for (1) because the mathematical operation of division 
>is traditionally represented with a "/", and it's ideal for (2) 

The mathematical operation of division is traditionally represented
by ÷ in general - or by a horizontal line between the numerator and
denominator. Only rationals merit the / in mathematics - it's simply
the best compromise available.

But mathematics uses the same set of symbols equally for both integer
and real division - it's rarely made explicit which is in use, as it
is normally blatantly obvious. So I don't believe the argument for
(1) holds.


>because C and other languages have historically used "/" between 
>integers for (2). In my opinion (and those of quite a few others), 
>this is pretty lopsided: (1) is probably used more often, so it 

Most programs make heavy use of integer subscripts and indices.
Integer division is heavily used in many applications - including
numerics (even numeric algorithms have to partition their data sets
for some tasks). The first form of division is rarely done outside of
noddy teaching exercises and some specialist fields. *THAT* is the
rare case.
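
Just to illustrate how routinely this kind of division shows up,
here's a small sketch of two everyday index calculations (the helper
names are invented for the example, and I've used the proposed //
spelling - classic / on ints does the same job today):

    def chunk_bounds(n, k, i):
        # start and end of the i-th of k roughly equal chunks of n
        # items, e.g. for handing slices of a data set to k workers
        start = (i * n) // k
        end = ((i + 1) * n) // k
        return start, end

    def row_col(index, width):
        # map a flat index back to (row, column) in a row-major grid
        return index // width, index % width

    # chunk_bounds(10, 3, 0) -> (0, 3);  row_col(7, 4) -> (1, 3)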

>should get priority, PLUS just imitating C isn't very convincing 
>since other languages (eg: Pascal) use other syntax, PLUS the use 

Pascal is half dead - only Delphi is realistically keeping it alive,
and that is a single-vendor, non-standard implementation which has
never been a major language in the way C, C++ or Java have - why do
you think they made C++ Builder?

Let's see - on the '1/2 gives 0.5' front we have...

Perl         Completely untyped, so it only has one 'numeric type'
             anyway. Only popular as a scripting language - ie it
             is a competitor, not something lots of people will use
             concurrently with Python so confusion on changeover will
             be relatively limited.
LISP         Lots-Of-Infuriatingly-Stupid-Parentheses? - A very
             important language academically, but *not* a model of
             clarity or simplicity. To my knowledge, the biggest user
             base is the non-standard version used for scripting
             EMACS.
Pascal       Again a very *important* language, but mainly
             historical. Ada - a language very heavily influenced
             by Pascal and by Wirth in general - *REVERSED* this
             decision ON THE GROUNDS OF CLARITY, SIMPLICITY AND
             RELIABILITY - and believe me, Ada was *not* designed
             using guesswork and personal ideologies.
Modula 2/3 etc   Better than Pascal, but sadly not kept around to
                 the same extent. Definitely minority languages.
JavaScript   Hardly a model of cleanliness or simplicity - my
             experience is that half the time the interpreter
             is trying to guess what you mean and getting it
             wrong - just like HTML, I suppose. It's dumbed down
             to the point that you have to figure out just how
             stupid it's assuming you are to have any chance of
             getting it to work at all.


On the other side...

Visual BASIC   The most used language for at least three years
               running.
C              What need I say? Not a great model for clarity in
               general, but at least a *lot* of people are used to it.
C++            Ditto
Java           Ditto
Ada            See 'Pascal' above
VBA            The most used language by non-programmers who just want
               a quick script in a spreadsheet or whatever.


Has anyone noticed a pattern here? ;-)


Sorry for arguing against something that's only a minor part of the
issue - especially when you clearly understand my main worry - but I
want people to appreciate that even in twenty years, when every
Python script in use has been fixed and no-one remembers the
argument, it will *still* be a matter of opinion which is better.

-- 
Steve Horne
Home : steve at lurking.demon.co.uk
Work : sh at ttsoftware.co.uk


