On Fri, Oct 19, 2012 at 9:50 AM, Christian Tismer <tismer@stackless.com> wrote:
On 19.10.12 18:07, Guido van Rossum wrote:
On Fri, Oct 19, 2012 at 5:05 AM, Christian Tismer <tismer@stackless.com> wrote:
On 19.10.12 07:15, Greg Ewing wrote:
Christian Tismer wrote:
- generators are able to free the stack when they yield, but while they are active they use the full stack, at least when I follow the pattern "generator is calling sub-generator". A deeply nested recursion is therefore something to avoid. :-(
Only if yield-from chains aren't optimised the way they used to be.
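(A minimal sketch, not from the original thread, of the "generator is calling sub-generator" pattern under discussion. With plain delegation, each item produced by the innermost generator is handed back up through every enclosing generator frame, which is the per-step cost an optimised yield-from chain would avoid.)

    # Sketch: a leaf generator wrapped in several delegating layers.
    def leaf():
        yield from range(3)

    def layer(sub):
        yield from sub              # delegate to the sub-generator

    def chain(depth):
        g = leaf()
        for _ in range(depth):
            g = layer(g)
        return g

    print(list(chain(5)))           # [0, 1, 2], routed through 5 extra frames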
Does that mean a very deep recursion would be efficient?
TBH, I am not interested in making very deep recursion work at all. If you need that, you're doing it wrong in my opinion.
A misunderstanding, I think. Of course I don't want to use deep recursion. But people might write things that happen several levels deep and then iterate over lots of stuff. A true generator would have no problem with that.
Okay, good. I agree that this use case should be as fast as possible -- as long as we still see every frame involved when a traceback is printed.
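(For context, a small sketch of what "see every frame involved" means in practice: an exception raised deep inside a yield-from chain still lists each delegating generator frame in its traceback.)

    # Sketch: the RuntimeError's traceback shows outer, middle and inner.
    def inner():
        yield 1
        raise RuntimeError("boom")

    def middle():
        yield from inner()

    def outer():
        yield from middle()

    for _ in outer():
        pass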
Assume just five layers of generators: having to re-invoke all of them for a tight yielding loop is quite some overhead that could be avoided.
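(A rough way to measure the overhead being described; the layer count and loop sizes below are arbitrary illustration values. If the delegation chain is not flattened by the interpreter, every item of the tight loop is routed through the extra frames.)

    # Sketch: a flat generator versus the same generator behind five
    # delegating layers, timed on a tight summing loop.
    import timeit

    def flat(n):
        yield from range(n)

    def wrap(sub):
        yield from sub

    def nested(n, layers=5):
        g = flat(n)
        for _ in range(layers):
            g = wrap(g)
        return g

    print(timeit.timeit(lambda: sum(flat(10000)), number=200))
    print(timeit.timeit(lambda: sum(nested(10000)), number=200))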
The reason I care is that existing implementations that use the greenlet style could be turned into pure Python, provided I manage to write the right support functions and replace all functions with generators that emulate functions with async behavior.
It would just be great if that worked at the same speed, independent of the stack level at which an iteration happens.
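(A minimal sketch of the kind of support function being described, under the assumption that a "function" is written as a generator and "calls" a sub-generator by yielding it. The names run, square and sum_of_squares are invented for illustration; this is not an existing library API. The trampoline keeps its own stack, so the interpreter stack stays flat no matter how deep the logical nesting is, and the innermost loop never has to re-enter the outer frames on each step.)

    import types

    def run(gen):
        # Drive a generator-based "call stack" explicitly.
        stack, result = [gen], None
        while stack:
            try:
                value = stack[-1].send(result)
            except StopIteration as e:
                stack.pop()
                result = e.value        # the sub-call's "return value"
                continue
            if isinstance(value, types.GeneratorType):
                stack.append(value)     # a nested "call": descend into it
                result = None
            else:
                result = value          # plain value: echo it back
        return result

    def square(x):
        return x * x
        yield                           # unreachable; marks this as a generator

    def sum_of_squares(n):
        total = 0
        for i in range(n):
            total += yield square(i)    # "call" square() via the trampoline
        return total

    print(run(sum_of_squares(5)))       # 30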
Yup.
Agreed that new code like that would be bad style.
Like "what"? -- --Guido van Rossum (python.org/~guido)