[Python-ideas] Yield-From: Finalization guarantees

Nick Coghlan ncoghlan at gmail.com
Fri Mar 27 13:17:07 CET 2009


Greg Ewing wrote:
> Nick Coghlan wrote:
> 
>> As you say, it does make it easier to write a non-generator delegation
>> target, since implementing close() for finalisation means not having to
>> deal with the vagaries of correctly reraising exceptions.
> 
> It also means that existing things with a close
> method, such as files, can be used without change.
> 
> Having a close method is a fairly well-established
> way to make an iterator explicitly finalizable,
> whereas having a throw method isn't.

But then we're back to the point that if someone *wants* deterministic
finalisation, the with statement already exists for exactly that
purpose. The part that isn't clicking for me is that I still don't
understand *why* 'yield from' should include implicit finalisation as
part of its definition.

The full delegation of next(), send() and throw() I get completely
(since that's the whole point of the new expression). The fact that
this *also* ends up delegating the close() method of generators in
particular makes sense as well (it's a natural consequence of
delegating the first three methods).
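Part of why that comes for free: close() is itself defined in terms
of throw(). A simplified sketch of the PEP 342 behaviour (not the
exact implementation):

  def close(gen):
    # Simplified rendering of generator.close() per PEP 342
    try:
      gen.throw(GeneratorExit)
    except (GeneratorExit, StopIteration):
      return  # the generator finalised itself cleanly
    # The generator yielded a value instead of exiting
    raise RuntimeError("generator ignored GeneratorExit")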

It's the generalisation of that to all other iterators that happen to
offer a close() method that seems somewhat arbitrary. Other than the
fact that generators happen to provide a close() method that invokes
throw(), it appears to have nothing to do with generator delegation and
hence seems like a fairly random addition to the PEP.
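To spell out the fallback I mean - a rough sketch of the behaviour
being discussed, not the PEP's exact expansion text:

  def _finalise_subiterator(it):
    # Rough sketch only: prefer throw() for generators, fall
    # back to close() for any other iterator that offers one
    try:
      throw = it.throw
    except AttributeError:
      close = getattr(it, 'close', None)
      if close is not None:
        close()
    else:
      throw(GeneratorExit)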

Using a file as the subiterator is an interesting case in point (and
perhaps also an exploration of when a shareable subiterator may make
sense: if a subiterator offers separate reading and writing APIs,
those can be exposed as separate generators):

  class YieldingFile:
    # Mixing reads and writes with this strawman
    # version would be a rather bad idea :)
    EOF = object()  # sentinel marking end of input

    def __init__(self, f):
      self.f = f

    def read_all(self):
      self.f.seek(0)
      yield from self.f

    def append_lines(self):
      self.f.seek(0, 2)  # seek to end of file
      lines_written = 0
      while True:
        line = yield
        if line is self.EOF:
          break
        self.f.write(line)  # file objects have write(), not writeline()
        lines_written += 1
      return lines_written
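The write side would then be driven with send() - a hypothetical
session (f is assumed to be a file open for writing), relying on the
PEP's proposed behaviour of carrying the return value in
StopIteration:

  writer = YieldingFile(f).append_lines()
  next(writer)               # advance to the first yield
  writer.send('first\n')
  writer.send('second\n')
  try:
    writer.send(YieldingFile.EOF)
  except StopIteration as e:
    print(e.value)           # 2 - the lines_written return value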

The problem I see with the above is that under the current
specification in the PEP, the read_all() implementation is outright
broken rather than merely redundant. It is obviously wasteful, since
it could just return self.f instead of yielding from it - but it is
far from clear that it should be broken rather than just pointlessly
slow. The first use of read_all() will implicitly close the file when
it finishes, which seems completely non-obvious to me.
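Concretely, a hypothetical session under the draft semantics (the
filename is arbitrary):

  f = open('example.txt', 'r+')
  data = YieldingFile(f)
  lines = list(data.read_all())
  # The yield-from finalisation has now closed f as a side
  # effect, so a second pass fails:
  more = list(data.read_all())  # ValueError: I/O operation on closed file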

It strikes me as simpler all round to leave the deterministic
finalisation to the tool that was designed for the task, and let the new
expression focus solely on correct delegation to subgenerators without
worrying too much about other iterators.

Sure, there are plenty of ways to avoid the implicit finalisation if you
want to, but I'm still not convinced the "oh, you don't support throw()
so I will fall back to close() instead" fallback behaviour is a
particularly good idea. (It isn't a dealbreaker for me though - I still
support the PEP overall, even though I'm -0 on this particular aspect of
it).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


