Possibly Pythonic Tail Call Optimization (TCO/TRE)

Chris Angelico rosuav at gmail.com
Thu Jul 16 19:49:05 CEST 2015

On Fri, Jul 17, 2015 at 3:34 AM, Joonas Liik <liik.joonas at gmail.com> wrote:
> That all sounds reasonable. However, that can be looked at another way.
> Suppose you have some code that traverses some tree, a strange
> imbalanced tree (say from some XML).
> It is, semantically at least, a reasonable approach to process such a
> structure with a recursive function.
> Let's say we have a function that counts all <li> elements in a
> document, for example, and recursively traverses the element tree.
> Because most XML is relatively flat (let's assume it's rare to find
> more than 100 levels of nesting, say), this function would perform
> perfectly well for most cases.
> However, if some guy sent you an XML document with 1000 levels of
> nesting, your program would crash.

This sounds like a denial-of-service attack. If you can state that no
reasonable document will ever have more than 100 levels of nesting,
then you can equally state that cutting the parser off with a tidy
exception if it exceeds 100 levels is safe.

> Now I admit that it is possible to have infinite recursion, but it is
> also possible to have infinite loops, and we don't kill your code
> after 1000 iterations of a while loop, so why should we treat recursion
> any differently?

Partly because infinite recursion requires infinite storage too. If it
truly is tail calls, then you can turn it into a while loop and not
have the limit; otherwise, you run the risk of blowing out your RAM.
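To illustrate the point: when the recursive call is the last thing the function does, the transformation to a loop is mechanical. A sketch (function names are just for illustration):

```python
def count_down_recursive(n):
    if n == 0:
        return "done"
    return count_down_recursive(n - 1)  # tail call: one stack frame per step

def count_down_loop(n):
    # The tail call becomes a rebinding of the parameter; constant stack space.
    while n != 0:
        n -= 1
    return "done"

print(count_down_loop(10**6))  # fine at a depth that would crash the recursive form
```

This is exactly the rewrite that tail call optimization would perform automatically, which is why a true tail-recursive function never needs the limit in the first place.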

> Having a user-defined maximum stack limit might be a good idea; e.g. if
> my stack takes up 100MB it's probably broken. But it should be my duty as
> a programmer to specify such a limit, not have it inflicted upon me
> (worse, in a manner that cannot be changed!).

Actually, it is up to you. There's a default, but you can change it.

>>> import sys
>>> def recurse(n): return n and recurse(n-1)
>>> recurse(998)
0
>>> recurse(999)
(throws RuntimeError)
>>> sys.getrecursionlimit()
1000
>>> sys.setrecursionlimit(5)
>>> recurse(5)
>>> recurse(5)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 1, in recurse
  File "<stdin>", line 1, in recurse
  File "<stdin>", line 1, in recurse
  File "<stdin>", line 1, in recurse
RuntimeError: maximum recursion depth exceeded
>>> sys.setrecursionlimit(5000)
>>> recurse(5000)
(throws RuntimeError with a gigantic traceback)
>>> sys.setrecursionlimit(50000)
>>> recurse(50000)
Segmentation fault

Though as you can see, CPython does have other issues. If you crank
the recursion limit up far enough, you *will* blow out your C stack.
Other Pythons may behave differently, and different builds of CPython
may crash at different points. But within reason, you can expand this
limit, and you can certainly shrink it (e.g. to reduce the effect of a
tree-parser DoS attack).
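And for completeness: the tree-counting example from earlier need not recurse at all. A sketch of the same count done with an explicit stack, which trades call-stack depth for heap storage and so handles any nesting depth (count_li is again a hypothetical name):

```python
import xml.etree.ElementTree as ET

def count_li(root):
    count = 0
    stack = [root]          # explicit stack replaces the call stack
    while stack:
        elem = stack.pop()
        if elem.tag == "li":
            count += 1
        stack.extend(elem)  # push children; nesting depth no longer matters
    return count

root = ET.fromstring("<ul><li>x</li><li><ul><li>y</li></ul></li></ul>")
print(count_li(root))  # -> 3
```

This version is bounded only by available memory, not by sys.setrecursionlimit, so a 1000-level document is no more dangerous than a 1000-element flat one.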


More information about the Python-list mailing list