Aahz wrote:
On Mon, May 04, 2009, John Graham wrote:
I do have a question, though: if recursive algorithms are generally frowned upon (and I'd say that without TCO, anything too complex hits the stack quickly), why is recursion even supported? What is the 'Pythonic' use of recursion?
Who has said that recursive algorithms are "generally" frowned upon? What people have said is that Python is not focused on recursive algorithms the way languages with TCO are, but the same is true of, e.g., functional programming style in general.
An unfortunate result of "Python is not focused on recursive algorithms" is that such algorithms become much harder to work with. Yes, there is always a way to write it without recursion, and often the non-recursive version will run faster because of the slow function calls. However, the non-recursive version is usually much less readable/maintainable, which is really a shame in a language designed for readability.
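As a small illustration of the readability point (my own example, not from the thread): computing the nesting depth of a list-of-lists reads almost like its definition when written recursively, while the equivalent explicit-stack version has to thread bookkeeping state through a loop.

```python
def nesting_depth(obj):
    """Recursive version: a non-list has depth 0, a list is one
    deeper than its deepest element (an empty list has depth 1)."""
    if not isinstance(obj, list):
        return 0
    return 1 + max((nesting_depth(x) for x in obj), default=0)


def nesting_depth_iterative(obj):
    """Same result, but with an explicit stack of (node, depth-so-far)
    pairs instead of the call stack."""
    best = 0
    stack = [(obj, 0)]
    while stack:
        node, depth = stack.pop()
        if isinstance(node, list):
            best = max(best, depth + 1)
            stack.extend((child, depth + 1) for child in node)
    return best
```

The iterative version avoids RecursionError on pathologically deep input, but the recursive one states the algorithm directly.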
Recursive algorithms are natural for nested datastructures, but you will almost never find a nested datastructure that is a thousand levels deep.
Balanced trees are not the only data structures in the world. Depth-first traversal of a graph is most easily expressed using recursion, and graphs with thousands (or even millions) of vertices are not uncommon.

Also, there are many other types of problems where a recursive algorithm is the most natural. Most discrete optimization problems fall in this category. For toy examples, look at almost any type of puzzle-solving.

The proposed "continue <functioncall>" feature doesn't help these cases, because they don't use tail calls. For that you need something like Stackless. I'm -1 on the spelling anyway, because it would kill the idea of using "continue <expression>" in a for-loop as a way to send the value of <expression> to the generator being iterated over.

I'm +0 on the idea of using syntax to explicitly request tail-call optimization. I'm +1 on doing implicit tail-call optimization, even if it does make the stack trace less useful.

- Jacob
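A minimal sketch of the depth-first traversal point (my own code, not from the thread; the adjacency-dict graph representation is an assumption). Note that the recursive call sits inside a loop with iterations remaining, so it is not a tail call and a "continue <functioncall>" feature couldn't eliminate the stack frame; a graph containing one long path then exhausts the default recursion limit, while the explicit-stack version does not.

```python
def dfs_recursive(graph, start, visited=None):
    """Natural recursive DFS over an adjacency dict {vertex: [neighbors]}.
    The recursive call is NOT a tail call: the for-loop resumes after
    it returns, so each level needs its own frame."""
    if visited is None:
        visited = set()
    visited.add(start)
    for nbr in graph.get(start, ()):
        if nbr not in visited:
            dfs_recursive(graph, nbr, visited)
    return visited


def dfs_iterative(graph, start):
    """Equivalent traversal with an explicit stack; visits vertices in a
    different order but reaches the same set, and depth is bounded only
    by memory, not by sys.getrecursionlimit()."""
    visited = set()
    stack = [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.add(node)
        stack.extend(graph.get(node, ()))
    return visited
```

On a chain graph a few thousand vertices long, the recursive version raises RecursionError under CPython's default limit (around 1000 frames), while the iterative one completes.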