[Python-Dev] Extended Function syntax

Oren Tirosh oren-py-d@hishome.net
Mon, 3 Feb 2003 12:21:16 -0500


On Mon, Feb 03, 2003 at 03:25:49PM +0000, Michael Hudson wrote:
> Well, I'd consider making the "var =" optional, but that would require
> hairy parser games (I think it would be possible, but very nasty).
> 
> Given that it (practically) can't be, I find having to make up a name
> for something that doesn't need one (e.g.
> 
> with dummy = autolock(lock):
>     ...
> 
> ) more unpleasant than having to bind to a name in a different line.

I agree. I don't see anything wrong with

f = open('spam', 'r')
with autoclose(f):
    ...

and I think that 

with autolock(obj):
   ...

looks better than any other alternative proposed so far. I don't
really need the autolock object for anything. I actually prefer not
to see it, just as I don't see the anonymous iterator created by
a for loop.


I have an idea how to implement this as an extension of generators.

Currently, when a generator yields and is never resumed, it just dies
silently. I suggest that it instead be resumed one last time, with an
AbortIteration exception raised at the yield statement. If the generator
does not catch this exception, it is simply ignored. If it does catch it,
the generator may either raise another exception or return, but not yield.
This is similar to a patch I submitted some time ago that allows
try/finally in a generator. The difference is that this exception-based
system does not make as strong a guarantee as try/finally: it will be
triggered when the iteration is aborted by break, return, or an
exception in the looping scope, but not by the __del__ of a generator
object whose refcount drops to 0, so it should be "Jythonially correct".
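AbortIteration does not exist; it is the exception this proposal would add. As a
rough sketch in present-day Python, a stand-in exception class plus
generator.throw() (a later addition to the language) can simulate the proposed
"resumed one last time" behaviour; all names below are mine:

```python
# Sketch only: AbortIteration is the proposed (hypothetical) exception;
# generator.throw() stands in for the interpreter resuming the
# abandoned generator one last time.

class AbortIteration(Exception):
    pass

events = []

def producer():
    try:
        yield 1
        yield 2
        events.append("ran to completion")
    except AbortIteration:
        # The generator may raise or return here, but not yield.
        events.append("iteration aborted!")

g = producer()
next(g)                        # consumer takes one value, then abandons the loop
try:
    g.throw(AbortIteration())  # the proposed "resumed one last time" step
except StopIteration:
    pass                       # generator returned after catching the exception
```

If the generator caught the exception and returned, the throw surfaces as a
plain StopIteration, which the loop machinery would swallow.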

So how does it solve the 'with' issue? The 'with' statement can be
defined as a very degenerate form of 'for': it takes an iterable object
as an argument but throws away the results and does not bind a loop
variable.

Here is a hypothetical implementation of autoclose:

def autoclose(f):
    try:
        yield None
        print >>log, "terminated normally!"
    except AbortIteration:
        print >>log, "iteration aborted!"
    f.close()
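To make the example concrete, here is a rough driver for the proposed 'with'
statement in present-day Python. The run_with helper and the FakeFile class are
my own stand-ins, and generator.throw() simulates the interpreter-level abort:

```python
class AbortIteration(Exception):  # stand-in for the proposed exception
    pass

log = []

def autoclose(f):
    try:
        yield None
        log.append("terminated normally!")
    except AbortIteration:
        log.append("iteration aborted!")
    f.close()

def run_with(gen, body):
    """Hypothetical expansion of:  with <gen>: <body>"""
    next(gen)                 # run to the yield: the setup part
    try:
        body()
    except BaseException:
        try:
            gen.throw(AbortIteration())  # abort: resume one last time
        except StopIteration:
            pass
        raise
    try:
        next(gen)             # normal exit: let cleanup run to StopIteration
    except StopIteration:
        pass

class FakeFile:               # stands in for open('spam', 'r')
    closed = False
    def close(self):
        self.closed = True

f = FakeFile()
run_with(autoclose(f), lambda: log.append("body ran"))
```

Either way the block ends, the code after the try/except in autoclose runs and
the file gets closed.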

Or, more generally:

def foo(args):
    # __enter__
    try:
        yield None
        # __leave__
        # normal termination of block
    except AbortIteration, exc:
        # __except__
        # exc.reason = exception that caused the iteration to abort,
        # or None for loops aborted with break or return statements
    # "finally" - all cases eventually reach this point
The implementation is relatively simple. It requires a 'for' or 'with' 
block to somehow inform its iterator that it has been terminated 
prematurely when popped off the block stack for any reason other than 
normal StopIteration. When the generator object gets this message it 
resumes the frame one last time with an AbortIteration exception raised.

This also adds a feature to generators and iterators that is generally
useful for cleaning up after loops, not only 'with' statements. With
this extension it's possible to write a database iterator that can tell
whether the loop using it was terminated by break or return, by an
exception, or normally, and decide whether to commit or roll back.
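Under the proposal that might look something like the sketch below. The
AbortIteration class and the explicit throw() call are my stand-ins for the
machinery described above, and the commit/rollback bookkeeping is just a list
for illustration:

```python
class AbortIteration(Exception):  # stand-in for the proposed exception
    def __init__(self, reason=None):
        self.reason = reason  # aborting exception, or None for break/return

outcome = []

def db_rows(rows):
    """Yield rows; commit on normal exhaustion, roll back on abort."""
    try:
        for row in rows:
            yield row
        outcome.append("commit")      # the loop consumed everything
    except AbortIteration:
        outcome.append("rollback")    # break, return or an exception

# Normal termination: the for loop exhausts the iterator, so we commit.
for row in db_rows([1, 2]):
    pass

# Aborted loop: the consumer stops early, and the interpreter (simulated
# here by throw()) resumes the generator with AbortIteration.
g = db_rows([1, 2, 3])
next(g)
try:
    g.throw(AbortIteration())
except StopIteration:
    pass
```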

A tricky part is aborting a loop by break or return. It should only
raise AbortIteration in this case if a new iterator was actually
created for the loop. It should not be triggered if "iter(x) is x",
since in that case the user may have intended to continue using that
iterator. In other words, this mechanism is guaranteed only when using
automatically created iterators. If you use explicit iterators, it's
your responsibility to ensure everything is cleaned up.
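The "iter(x) is x" test itself is easy to state in code; loop_owns_iterator is
a hypothetical name for the check the interpreter would make:

```python
def loop_owns_iterator(iterable):
    """True when the loop created a fresh iterator it is free to abort."""
    return iter(iterable) is not iterable

# A list gets a fresh iterator each time, so the loop may abort it.
fresh = loop_owns_iterator([1, 2, 3])      # True

# A generator is its own iterator (iter(g) is g), so the user may
# intend to keep consuming it after the loop - no AbortIteration.
def g():
    yield 1
shared = loop_owns_iterator(g())           # False
```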

	Oren