# How sum() should really be done

Alex Martelli aleax at aleax.it
Fri Nov 14 13:37:24 CET 2003

Douglas Alan wrote:

> Erik Max Francis <max at alcyone.com> writes:
>
>> Douglas Alan wrote:
>
>>> Ah, that reminds me -- both sum() and reduce() can be removed from
>>> Python by extending operator.add so that it will take any number of
>>> arguments.
>
>> reduce can't, since reduce doesn't require the function passed to be
>
> Well, as I said, for this to be true, *all* binary operators (that it
> makes sense to) would have to be upgraded to take an arbitrary number
> of arguments, like they do in Lisp.

Your definition of "operator" appears to be widely at variance with
the normally used one; I've also noticed that in your comparisons of
reduce with APL's / , which DOES require specifically an OPERATOR (of
the binary persuasion) on its left.  reduce has no such helpful
constraints: not only does it allow any (callable-as-binary) FUNCTION
as its first argument, but any other CALLABLE at all.  Many of the
craziest, most obscure, and worst-performing examples of use of
reduce are indeed based on passing as the first argument some callable
whose behaviour is anything but "operator-like" except with respect to
the detail that it IS callable with two arguments and returns something
that may be passed back in again as the first argument on the next call.
[see note later].
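A minimal illustration of that looseness (the callable and the list here are invented for the example): reduce happily folds any two-argument callable, however un-operator-like, so long as each return value can be fed back in as the next first argument.

```python
from functools import reduce  # a builtin in the Python of 2003; in functools today

# A deliberately non-operator-like two-argument callable: all reduce
# demands is that it accept (running_value, next_item) and return
# something usable as the next running_value.
def weird(running, item):
    return running + [item * 2]

print(reduce(weird, [1, 2, 3], []))  # [2, 4, 6]
```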

Anyway, to remove 'reduce' by the trick of "upgrading to take an
arbitrary number of arguments", that "upgrade" should be applied to
EVERY callable that's currently susceptible to being called with
exactly two arguments, AND the semantics of "calling with N arguments"
for N != 2 would have to be patterned on what 'reduce' would do with
them -- this may be totally incompatible with what the callable does
now when called with N != 2 arguments, of course.  For example,
pow(2, 10, 100)
now returns 24, equal to (2**10) % 100; would you like it to return
10715086071862673209484250490600018105614048117055336074437503883703510511249361224931983788156958581275946729175531468251871452856923140435984577574698574803934567774824230985421074605062371141877954182153046474983581941267398767559165543946077062914571196477686542167660429831652624386837205668069376
(i.e., (2**10)**100 == 2**1000) instead?
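To check the arithmetic: three-argument pow performs modular exponentiation, while a pairwise, reduce-style reading of pow(2, 10, 100) would compute pow(pow(2, 10), 100), i.e. (2**10)**100 == 2**1000 -- the 302-digit monster just quoted.

```python
# Three-argument pow is modular exponentiation:
assert pow(2, 10, 100) == (2 ** 10) % 100 == 24

# A reduce-style, pairwise folding of the same three arguments instead:
folded = pow(pow(2, 10), 100)
assert folded == 2 ** 1000
print(len(str(folded)))  # 302 digits
```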

I doubt there would be any objection to upgrading the functions in
module operator in the way you request -- offer a patch, or make a
PEP for it first, I would hope it would be taken well (I can't speak
for Guido, of course).  But I don't think it would make much more of
a dent in the tiny set of reduce's current use cases.
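For what it's worth, such an upgrade might look like the following sketch -- vadd is a hypothetical name, not anything that exists in module operator -- folding operator.add pairwise, Lisp-style, over any number of arguments:

```python
import operator
from functools import reduce

def vadd(*args):
    # Hypothetical Lisp-style variadic add: fold operator.add pairwise,
    # left to right, over however many arguments were passed.
    if not args:
        return 0  # like Lisp's (+), return the additive identity
    return reduce(operator.add, args)

print(vadd(1, 2, 3, 4))    # 10
print(vadd('py', 'thon'))  # python
```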

[note on often-seen abuses of FP built-ins in Python]

Such a typical abuse, for example, is connected with the common idiom:

for item in seq: acc.umul(item)

which simply calls the same one-argument callable on each item of seq.
Clearly the idiom must rely on some side effect, since it ignores the
return values, and therefore it's totally unsuitable for shoehorning
into "functional-programming" idioms -- functional programming is
based on an ABSENCE of side effects.
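For concreteness (the Acc class and its one-argument umul method are invented here to match the idiom), a typical mutator works by side effect and, like most mutating methods in Python, returns None:

```python
class Acc:
    # Toy stand-in for 'acc' above: umul updates state by side effect
    # and returns None implicitly, as Python mutators conventionally do.
    def __init__(self):
        self.product = 1

    def umul(self, item):
        self.product *= item

acc = Acc()
for item in [2, 3, 4]:
    acc.umul(item)
print(acc.product)  # 24
```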

Of course, that something is totally inappropriate never stops fanatics
of functional programming who have totally misunderstood what FP is
all about from doing their favourite shoehorning exercises.  So you see:

map(acc.umul, seq)

based on ignoring the len(seq)-long list of results; or, relying on the
fact that acc.umul in fact returns None (which evaluates as false),

filter(acc.umul, seq)

which in this case just ignores an _empty_ list of results (I guess
that's not as bad as ignoring a long list of them...?); or, of course:

reduce(lambda x, y: x.umul(y) or x, seq, acc)

which does depend strictly on acc.umul returning a false result so
that the 'or' will let x (i.e., always acc) be returned; or just to
cover all bases

reduce(lambda x, y: (x.umul(y) or x) and x, seq, acc)

Out of all of these blood-curdling abuses, it seems to me that the
ones abusing 'reduce' are the very worst, because of the more
complicated signature of reduce's first argument, compared to the
first argument taken by filter, or map with just one sequence.
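With the same invented Acc/umul stand-in as before, one can verify that the reduce form above does "work", but only by exploiting umul's false (None) return value -- the lambda hands acc straight back on every step:

```python
from functools import reduce

class Acc:
    def __init__(self):
        self.product = 1

    def umul(self, item):
        self.product *= item  # returns None, which is false

acc = Acc()
result = reduce(lambda x, y: x.umul(y) or x, [2, 3, 4], acc)
# x.umul(y) is always None, so 'or' yields x (i.e. acc) every time
print(result is acc, acc.product)  # True 24
```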

To be sure, one also sees abuses of list comprehensions here:

[acc.umul(item) for item in seq]

which basically takes us right back to the "abuse of map" case.
List comprehension is also a rather FP-ish construct, in fact,
or we wouldn't have found it in Haskell to steal/copy it...;-).

Alex