[Python-ideas] Yield-From: Finalization guarantees

Nick Coghlan ncoghlan at gmail.com
Wed Mar 25 13:17:54 CET 2009

Greg Ewing wrote:
> Nick Coghlan wrote:
>> Greg Ewing wrote:
>>> (1) In non-refcounting implementations, subiterators
>>> are finalized promptly when the delegating generator
>>> is explicitly closed.
>>> (2) Subiterators are not prematurely finalized when
>>> other references to them exist.
>> If you choose (2), then (1) is trivial to implement
>>   with contextlib.closing(make_subiter()) as subiter:
>>       yield from subiter
> That's a fairly horrendous thing to expect people to
> write around all their yield-froms, though. It also
> means we would have to say that the inlining principle
> only holds for refcounting implementations.
> Maybe we should just give up trying to accommodate
> shared subiterators. Is it worth complicating
> everything for the sake of something that's not
> really part of the intended set of use cases?

Consider what happens if you replace the 'yield from' with the basic
form of iterator delegation that exists now:

  for x in make_subiter():
    yield x

Is such code wrong in any way? No, it isn't. Failing to finalise the
object of iteration is the *normal* case. If for some reason it is
important in a given application to finalise it properly (e.g. the
subiter opens a database connection or file and we want to ensure they
are closed promptly no matter what else happens), only *then* does
deterministic finalisation come into play:

  with closing(make_subiter()) as subiter:
    for x in subiter:
      yield x

That is, I now believe the 'normal' case for 'yield from' should be
modelled on basic iteration, which means no implicit finalisation.

Now, keep in mind that, in parallel with this, I am saying that *all*
exceptions, *including GeneratorExit*, should be passed down to the
subiterator if it has a throw() method.

So even without implicit finalisation you can use "yield from" to nest
generators to your heart's content and an explicit close on the
outermost generator will be passed down to the innermost generator and
unwind the generator stack from there.
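A minimal sketch of that unwinding, written with the Python 3 'yield from'
syntax that was eventually adopted (the names here are illustrative):

```python
log = []

def inner():
    try:
        yield 1
        yield 2
    finally:
        log.append('inner closed')  # innermost frame unwinds first

def outer():
    try:
        yield from inner()
    finally:
        log.append('outer closed')

g = outer()
next(g)    # advance into inner()
g.close()  # GeneratorExit is delivered to the innermost generator
assert log == ['inner closed', 'outer closed']
```

The explicit close on the outermost generator reaches the innermost
suspended frame, and the finally clauses run from the inside out.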

Using your "no finally clause" version from earlier in this thread as
the base for the exact semantic description:

    _i = iter(EXPR)
    try:
        _u = _i.next()
    except StopIteration, _e:
        _r = _e.value
    else:
        while 1:
            try:
                _v = yield _u
            except BaseException, _e:
                _m = getattr(_i, 'throw', None)
                if _m is not None:
                    _u = _m(_e)
                else:
                    raise
            else:
                try:
                    if _v is None:
                        _u = _i.next()
                    else:
                        _u = _i.send(_v)
                except StopIteration, _e:
                    _r = _e.value
                    break
    RESULT = _r
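On the throw() delegation point, the semantics eventually adopted in Python
3.3 match the expansion above; a small sketch (illustrative names) showing
an exception thrown into the delegating generator reaching the subiterator:

```python
log = []

def inner():
    try:
        yield 'a'
    except ValueError:
        log.append('inner caught ValueError')
        yield 'b'

def outer():
    yield from inner()

g = outer()
assert next(g) == 'a'
# throw() on the delegating generator is forwarded to inner(),
# which handles the exception and keeps iterating
assert g.throw(ValueError) == 'b'
assert log == ['inner caught ValueError']
```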

With an expansion of that form, you can easily make arbitrary iterators
(including generators) shareable by wrapping them in an iterator with no
throw or send methods:

  class ShareableIterator(object):
    def __init__(self, itr):
      self.itr = itr
    def __iter__(self):
      return self
    def __next__(self):
      return self.itr.next()
    next = __next__ # Be 2.x friendly
    def close(self):
      # Still support explicit finalisation of the
      # shared iterator, just not throw() or send()
      try:
        close_itr = self.itr.close
      except AttributeError:
        pass
      else:
        close_itr()

  # Decorator to use the above on a generator function
  def shareable(g):
    def wrapper(*args, **kwds):
      return ShareableIterator(g(*args, **kwds))
    return wrapper
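To see the effect of the wrapper, here is a hedged sketch in the Python 3
spelling (__next__/next() rather than .next(); the names source and
consumer are illustrative): two delegating generators share one stream,
and neither exhausting a consumer nor the absence of throw()/send()
finalises the shared subiterator prematurely.

```python
log = []

def source():
    try:
        for i in range(5):
            yield i
    finally:
        log.append('source closed')

class ShareableIterator:
    """Python 3 spelling of the wrapper above: no throw() or send(),
    so a delegating generator cannot prematurely finalise us."""
    def __init__(self, itr):
        self.itr = itr
    def __iter__(self):
        return self
    def __next__(self):
        return next(self.itr)

shared = ShareableIterator(source())

def consumer(n):
    taken = 0
    for x in shared:  # basic delegation; 'yield from shared' behaves the same
        yield x
        taken += 1
        if taken == n:
            return

first = list(consumer(2))   # consumes 0, 1
second = list(consumer(2))  # resumes the same stream: 2, 3
assert first == [0, 1] and second == [2, 3]
assert log == []  # source() has not been finalised
```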

Iterators that need finalisation have two options: they can make
themselves implicitly closable in yield from expressions by defining a
throw() method that delegates to close() and then reraises the exception
appropriately, or they can recommend explicit closure regardless of the
means of iteration (a for loop, a generator expression or container
comprehension, manual iteration, or the new yield from expression).
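A hedged sketch of the first option (illustrative class name). Note that
the Python 3.3 semantics as finally adopted call the subiterator's close()
directly when GeneratorExit arrives, rather than routing it through
throw() as the expansion in this post does; either way, an iterator
written like this is finalised when the delegating generator is closed.

```python
log = []

class SelfClosingIterator:
    """Opts in to implicit finalisation by making throw()
    delegate to close() before letting the exception propagate."""
    def __init__(self, data):
        self.itr = iter(data)
    def __iter__(self):
        return self
    def __next__(self):
        return next(self.itr)
    def close(self):
        log.append('closed')
    def throw(self, typ, val=None, tb=None):
        # Finalise, then let the exception continue to propagate
        self.close()
        raise typ if val is None else val

def delegator():
    yield from SelfClosingIterator([1, 2, 3])

g = delegator()
assert next(g) == 1
g.close()  # finalises the subiterator (via close() in Python 3.3+;
           # via throw() under the expansion sketched in this post)
assert log == ['closed']
```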


Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
