Possibly Pythonic Tail Call Optimization (TCO/TRE)

Joonas Liik liik.joonas at gmail.com
Thu Jul 16 19:34:20 CEST 2015


On 16 July 2015 at 20:03, Chris Angelico <rosuav at gmail.com> wrote:
>
> The trouble with that is that it can quickly run you out memory when
> you accidentally trigger infinite recursion. A classic example is a
> simple wrapper function...
>
> def print(msg):
>     print(ctime()+" "+msg)
>
> With the recursion limit at its default of 1000, this can't do more
> than 1000 iterations before the system detects that it's stuck. With
> an infinite stack, this could destroy your system memory and then
> finally terminate with MemoryError... if you're lucky. If you're not,
> the whole system could get wedged before this gets killed. (Imagine,
> for instance, a computer with a large hard disk devoted to virtual
> memory. Long before that's exhausted, the system will be practically
> unusable, because every little operation will require going back to
> the disk.)
>
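[For reference: the quoted wrapper recurses because the name `print` inside its body resolves back to the wrapper itself, not to the builtin. A minimal non-recursive variant has to capture the builtin first; a sketch, with the missing `ctime` import added:]

```python
import builtins
from time import ctime

def print(msg, _print=builtins.print):
    # Capture the original builtin as a default argument so the call
    # below does not resolve back to this wrapper and recurse.
    _print(ctime() + " " + msg)

print("hello")  # prints a timestamped line instead of blowing the stack
```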

That all sounds reasonable. However, it can be looked at another way.
Suppose you have some code that traverses a tree, a strangely
imbalanced tree (say, one parsed from XML).

It is, semantically at least, a reasonable approach to process such a
structure with a recursive function.
Let's say we have a function that counts all <li> elements in a
document, for example, and recursively traverses the element tree.
Because most XML is relatively flat (let's assume it's rare to find
more than 100 levels of nesting), this function would perform
perfectly well in most cases.
However, if someone sent you an XML document with 1000 levels of
nesting, your program would crash.
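[A minimal sketch of that scenario; the `count_li` helper and the nesting depths are illustrative, not from the original post:]

```python
import xml.etree.ElementTree as ET

def count_li(element):
    """Recursively count <li> elements under (and including) element."""
    total = 1 if element.tag == "li" else 0
    for child in element:
        total += count_li(child)
    return total

# A shallow, realistic document works fine...
shallow = ET.fromstring("<ul><li>a</li><li>b</li></ul>")
print(count_li(shallow))  # -> 2

# ...but a pathologically nested one exceeds the default stack limit.
deep = ET.fromstring("<li>" * 2000 + "</li>" * 2000)
try:
    count_li(deep)
except RecursionError:
    print("crashed on deep input")  # same code, same logic, weirder data
```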

Now suddenly you have perfectly good, functioning code that randomly
crashes because of some arbitrary limit.
It most distinctly reminds me of a certain programming language that
kills your thread after 30000 operations because you are
obviously-in-an-infinite-loop.
It leaves a very bad taste in my mouth.

That 30k limit (with a much lower limit on lines of source code, of
course) is the reason you need nasty hacks to do things like implement
BigInteger.
That 1k stack limit means you can't use perfectly good code just
because your input data has some weird quirk.

This puts Python on par with JASS2, and that deeply saddens me.

Now, I admit that it is possible to have infinite recursion, but it is
also possible to have infinite loops, and we don't kill your code
after 1000 iterations of a while loop, so why should we treat
recursion any differently?
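[The asymmetry is easy to demonstrate with two functions doing the same work; the `count_down_*` names are illustrative:]

```python
import sys

def count_down_loop(n):
    # The iterative form happily runs any number of steps.
    while n > 0:
        n -= 1
    return n

def count_down_rec(n):
    # The recursive form hits the interpreter's stack limit instead.
    if n <= 0:
        return n
    return count_down_rec(n - 1)

print(count_down_loop(100_000))  # -> 0, no complaints
try:
    count_down_rec(100_000)
except RecursionError:
    print(f"recursion capped near {sys.getrecursionlimit()} frames")
```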

Having a user-defined maximum stack limit might be a good idea, e.g.
if my stack takes up 100MB it's probably broken, but it should be my
duty as a programmer to specify such a limit, not have it inflicted
upon me (worse, in a manner that cannot be changed!).
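[In fairness, CPython does let the programmer move the ceiling via sys.setrecursionlimit(), though the default is imposed rather than opted into; a sketch of taking that responsibility explicitly (the `depth` helper and the chosen limit are illustrative, and raising the limit too far can still crash the C stack):]

```python
import sys

def depth(n):
    # One stack frame per level of nesting.
    return 1 if n == 0 else 1 + depth(n - 1)

print(sys.getrecursionlimit())  # -> 1000 on a stock CPython

sys.setrecursionlimit(5000)  # accept responsibility for a deeper stack
print(depth(2000))           # -> 2001, where the default limit would fail
```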


More information about the Python-list mailing list