On Sat, Dec 5, 2009 at 11:23, George Sakkis firstname.lastname@example.org wrote:
On Sat, Dec 5, 2009 at 8:10 PM, Stephen J. Turnbull email@example.com wrote:
George Sakkis writes:
 > On Sat, Dec 5, 2009 at 6:45 PM, Andre Engels firstname.lastname@example.org wrote:
 > > In your proposed implementation, sum() would be undefined.
 >
 > Which would make it consistent with min/max.
There's no justification for trying to make 'min' and 'sum' consistent. The sum of an empty list of numbers is a well-defined *number*, namely 0, but the max of an empty list of numbers is a well-defined *non-number*, namely "minus infinity".
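The asymmetry is easy to see at the interpreter: sum() of an empty sequence falls back to its start value (0 by default), while min() and max() have no usable default and raise instead of returning an infinity:

```python
# sum() of an empty sequence returns its start value, which defaults to 0.
assert sum([]) == 0

# min()/max() of an empty sequence have no default and raise ValueError.
try:
    min([])
except ValueError:
    pass  # "min() arg is an empty sequence"
else:
    raise AssertionError("expected ValueError")
```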
The real question is "what harm is done by preferring the (well-defined) sum of an empty list of numbers over the (well-defined) empty sums of lists and/or strings?" Then, if there is any harm, "can the situation be improved by having no useful default for empty lists of any type?" Finally, "is it worth breaking existing code to ensure equal treatment of different types?"
My guess is that the answers are "very little", "hardly at all", and "emphatically no."<wink>
Agreed that there is little harm in preferring numbers over other types when it comes to empty sequences, but the more important question is "should the start argument be used even if the sequence is *not* empty?". The OP doesn't think so and I agree.
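For reference, current Python folds the start value in unconditionally, whether or not the sequence is empty:

```python
# The start argument is always included in the result,
# not just used as a fallback for empty sequences.
assert sum([1, 2, 3], 10) == 16   # 10 + 1 + 2 + 3
assert sum([], 10) == 10          # empty sequence: start alone
assert sum([]) == 0               # default start is 0
```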
Adding the start value only some of the time makes things more fragile. If you have Foo() objects that aren't compatible with int and you do sum([Foo(), Foo()]), you get a Foo() back. If your sequence then happens to be empty, sum() gives you an int back, and the result is likely to be used in a context that's not compatible with int either. Better to always fail and require an explicit start when one is needed.
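A sketch of the proposed "use start only when empty" semantics shows the trap (sum_v2 and Foo are made-up names for illustration, not real APIs):

```python
import functools

class Foo:
    """A type that supports Foo + Foo but is not compatible with int."""
    def __add__(self, other):
        if isinstance(other, Foo):
            return Foo()
        return NotImplemented

def sum_v2(seq, start=0):
    """Hypothetical sum() that applies start only to empty sequences."""
    seq = list(seq)
    if not seq:
        return start
    return functools.reduce(lambda a, b: a + b, seq)

# Non-empty sequence: works, result is a Foo.
assert type(sum_v2([Foo(), Foo()])) is Foo

# Empty sequence: the result type silently flips to int,
# and downstream code expecting a Foo breaks much later.
assert type(sum_v2([])) is int
```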