Python's "only one way to do it" philosophy isn't good?

Alexander Schmolck a.schmolck at
Wed Jun 13 02:42:51 CEST 2007

Steven D'Aprano <steve at> writes:

> On Mon, 11 Jun 2007 01:28:09 +0100, Alexander Schmolck wrote:
>> Steven D'Aprano <steve at> writes:
>>> On Sat, 09 Jun 2007 22:42:17 +0100, Alexander Schmolck wrote:
>>>>> As for why tail calls are not optimized out, it was decided that being able
>>>>> to have the stack traces (with variable information, etc.) was more useful
>>>>> than offering tail call optimization
>>>> I don't buy this. 
>>> Do you mean you don't believe the decision was made, or you don't agree
>>> with the decision?
>> Neither. I don't believe the rationale stated in this thread to be the true
>> reason.
> Don't keep us in suspense. What do you believe is the true reason?

It's easier to spot that some rationalization is bogus than to uncover the
true underlying causes; I'm pretty sure it's more a Gestalt thing than a
compelling technical reason (I guess Guido's distaste for Scheme also plays a
role). Not that I discount that out of hand -- maybe all that's great about
Python is due to Guido being exceptionally good at making such judgements.
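For what it's worth, the practical consequence is easy to demonstrate: since
CPython keeps a frame per call (preserving full tracebacks), even a trivially
tail-recursive function exhausts the stack. A minimal sketch (the function
name is just illustrative):

```python
import sys

def countdown(n):
    # The recursive call is in tail position: nothing happens after it
    # returns, so a TCO-ing implementation could reuse the current frame.
    if n == 0:
        return "done"
    return countdown(n - 1)

# CPython does not eliminate the tail call, so deep recursion hits the
# interpreter's recursion limit instead of running in constant stack space.
try:
    countdown(sys.getrecursionlimit() * 2)
except RecursionError:
    print("recursion limit reached")
```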

>>> Are you volunteering? If you are, I'm sure your suggestion will be welcomed
>>> gratefully.
>> I rather doubt it. Guido has stated quite clearly that he's not
>> interested in incorporating this feature.
> He's not the only one who gets to make these decisions. 

This is news to me. Who else does?

> But even if he uses his veto to prevent tail-recursion optimization from
> being put into the main branch, there are other options.

That don't involve abducting his kids?

>>>>> (do what I say).
>>>> Where did you say run out of memory and fail? More importantly how do
>>>> you say "don't run out of memory and fail"?
>>> If we can live with a certain amount of "arbitrary failures" in simple
>>> arithmetic,
>> I prefer not to, and thus when possible avoid using languages where
>> ``a + b`` is liable to fail arbitrarily (such as C, where the behavior
>> will often be undefined).
> That's not the sort of arbitrary failure I was discussing, but for that
> matter Python is one of those languages. 

Apart from floating point arithmetic, simple arithmetic doesn't tend to fail
arbitrarily in Python, as far as I'm aware. You can of course create your very
own classes specifically to get broken arithmetic, but that doesn't strike me
as "simple" arithmetic anymore.
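Concretely, Python's plain integers are arbitrary precision, so integer
addition never overflows -- values are silently promoted past the machine
word size. A quick sketch:

```python
import sys

# sys.maxsize is the largest native-word-sized int; adding 1 to it
# does not wrap around or invoke undefined behavior, it just yields
# a bigger (arbitrary-precision) integer.
a = sys.maxsize
b = a + 1
print(b > a)   # True
```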

> Perhaps Python is not the language for you?

Do you also happen to know what would be?

> Correct me if I'm wrong, but surely it is C++ that can have arbitrary
> behaviour for "a + b", not C?

``INT_MAX + 1`` can do precisely anything in C; signed integer overflow is
undefined behavior.

>>> You can always hand-optimize it yourself.
>> Not tail calls, in general, no.
> Sorry, how does that work? You're suggesting that there is an algorithm
> which the compiler could follow to optimize away tail-recursion, but human
> beings can't follow the same algorithm? Now I'm confused.

Does it also confuse you that if I give you a 500x500 matrix A you won't be
able to figure out a single element in A^-1 by doing mental arithmetic (or
using pen and paper), although my computer manages just fine and I'm happy to
give you the algorithm it uses?
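To make the point concrete: mutually tail-recursive functions have no single
body you can mechanically rewrite as a loop, which is why "hand-optimizing"
them usually means building a trampoline -- doing by hand, clumsily, what a
TCO-ing compiler does for free. A sketch (the function names are
illustrative):

```python
def trampoline(f, *args):
    # Instead of letting tail calls nest on the stack, each function
    # returns a zero-argument thunk; this driver loop unwinds them
    # iteratively, in constant stack space.
    result = f(*args)
    while callable(result):
        result = result()
    return result

def even(n):
    # The tail call ``odd(n - 1)`` is replaced by a thunk.
    return True if n == 0 else (lambda: odd(n - 1))

def odd(n):
    return False if n == 0 else (lambda: even(n - 1))

print(trampoline(even, 100_000))   # True, without blowing the stack
```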


More information about the Python-list mailing list