[Python-ideas] Why does `sum` use a default for the `start` parameter?
rhamph at gmail.com
Sat Dec 5 20:03:02 CET 2009
On Sat, Dec 5, 2009 at 11:23, George Sakkis <george.sakkis at gmail.com> wrote:
> On Sat, Dec 5, 2009 at 8:10 PM, Stephen J. Turnbull <stephen at xemacs.org> wrote:
>> George Sakkis writes:
>> > On Sat, Dec 5, 2009 at 6:45 PM, Andre Engels <andreengels at gmail.com> wrote:
>> > > In your proposed implementation, sum() would be undefined.
>> > Which would make it consistent with min/max.
>> There's no justification for trying to make 'min' and 'sum'
>> consistent. The sum of an empty list of numbers is a well-defined
>> *number*, namely 0, but the max of an empty list of numbers is a
>> well-defined *non-number*, namely "minus infinity".
>> The real question is "what harm is done by preferring the
>> (well-defined) sum of an empty list of numbers over the (well-defined)
>> empty sums of lists and/or strings?" Then, if there is any harm, "can
>> the situation be improved by having no useful default for empty lists
>> of any type?" Finally, "is it worth breaking existing code to ensure
>> equal treatment of different types?"
>> My guess is that the answers are "very little", "hardly at all", and
>> "emphatically no."<wink>
> Agreed that there is little harm in preferring numbers over other
> types when it comes to empty sequences, but the more important
> question is "should the start argument be used even if the sequence is
> *not* empty?". The OP doesn't think so and I agree.
Having the start value kick in only sometimes is what makes it fragile. If
you have Foo() objects that aren't compatible with int and you call
sum([Foo(), Foo()]), you get a Foo back. If the sequence then happens to be
empty, sum() hands you an int back instead. That result is likely to flow
into a context that isn't compatible with int either. Better to always fail
on an empty sequence and require an explicit start if you need one.
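A minimal sketch of the fragility, with Foo as a hypothetical class that
only knows how to add itself to another Foo (its __radd__ accepts the 0
that sum() starts with, a common workaround for the default start):

```python
class Foo:
    """Hypothetical type: supports Foo + Foo, but not arithmetic with int."""

    def __init__(self, n=1):
        self.n = n

    def __add__(self, other):
        if isinstance(other, Foo):
            return Foo(self.n + other.n)
        return NotImplemented

    def __radd__(self, other):
        # Swallow the implicit 0 start so sum() works on non-empty input.
        if other == 0:
            return self
        return NotImplemented


# Non-empty sequence: the result is a Foo, as expected.
print(type(sum([Foo(), Foo()])).__name__)  # Foo

# Empty sequence: the default start leaks through as an int.
print(type(sum([])).__name__)              # int
```

The second call is the trap: code written against the first case silently
gets an int once the sequence is empty, and only fails later, somewhere
far from the sum() call.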
Adam Olsen, aka Rhamphoryncus