[Python-Dev] PEP 340 -- concept clarification

Guido van Rossum gvanrossum at gmail.com
Tue May 3 21:40:11 CEST 2005


> [Raymond Hettinger]
> >> Likewise, is it correct that "yield" is anti-parallel to the current
> >> meaning?  Inside a generator, it returns control upwards to the caller.
> >> But inside a block-iterator, it pushes control downwards (?inwards) to
> >> the block it controls.
> 
[Guido]
> > I have a hard time visualizing the difference.

[Jim Jewett]
> In a normal generator, someone makes a call to establish the
> generator, which then becomes a little island -- anyone can call
> the generator, and it returns control back to whoever made the last call.
> 
> With the block, every yield returns to a single designated callback.
> This callback had to be established at the same time the block was
> created, and must be textually inside it.  (An indented suite to the
> "block XXX:" line.)

Doesn't convince me. The common use for a regular generator is in a
for-loop, where every yield also returns to a single designated place
(calling it a callback is really deceptive!).

And with a block, you're free to put the generator call ahead of the
block so you can call next() on it manually:

    it = EXPR1
    block it:
        BLOCK1

is totally equivalent to

    block EXPR1:
        BLOCK1

but the first form lets you call next() on it as you please (until the
block is exited, of course).

> But are there plenty of other use cases for PEP 340?

Yes. Patterns like "do this little dance in a try/finally block" and
"perform this tune when you catch an XYZ exception" are pretty common
in larger systems and are effectively abstracted away using the
block-statement and an appropriate iterator. The try/finally use case
often also has some setup that needs to go right before the try (and
sometimes some more setup that needs to go *inside* the try). Being
able to write this once makes it a lot easier when the "little dance"
has to be changed everywhere it is performed.
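
For instance, here is what the "little dance" for locking looks like
when written once as a generator (this is essentially the locking()
example from the PEP; note how the setup lands right before the try):

    def locking(lock):
        lock.acquire()      # setup that must happen just before the try
        try:
            yield           # the body of the block-statement runs here
        finally:
            lock.release()  # the cleanup, written exactly once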

> If not, then why do we need PEP 340?  Are decorators not strong
> enough, or is it just that people aren't comfortable yet?  If it is a
> matter of comfort or recipes, then the new construct might have
> just as much trouble.  (So this one is not a loop, and you can tell
> the difference because ... uh, just skip that advanced stuff.)

PEP 340 and decorators are totally different things, and the only
vaguely common use case would be @synchronized, which is *not* a
proper use case for decorators, but "safe locking" is definitely a use
case for PEP 340.
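
To make that concrete: given the locking() generator above, safe
locking under the proposed syntax shrinks to one line of boilerplate
(a sketch; "block" is the keyword proposed by PEP 340):

    block locking(myLock):
        # Code here executes with myLock held.  The lock is
        # guaranteed to be released when the block is left, even
        # if that happens via return or an uncaught exception.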

> > Anyway, @synchronized was mostly a demonstration toy; whole
> > method calls are rarely the right granularity of locking.
> 
> That is an important difference -- though I'm not sure that the critical
> part *shouldn't* be broken out into a separate method.

I'll be the judge of that. I have plenty of examples where breaking it
out would create an entirely artificial helper method that takes
several arguments just because it needs to use stuff that its caller
has set up for it.

> > I presume you mentally discarded the resource allocation use
> > cases where the try/finally statement was the outermost statement
> > in the function body, since those would be helped by @synchronized;
> > but look more closely at Queue, and you'll find that the two such
> > methods use different locks!
> 
> qsize, empty, and full could be done with a lockself decorator.
> Effectively, they *are* lockself decorators for the _xxx functions
> that subclasses are told to override.

Actually you're pointing out a bug in the Queue module: these *should*
be using a try/finally clause to ensure the mutex is released even if
the inner call raises an exception. I hadn't noticed these before
because I was scanning only for "finally".

If a locking primitive had been available, I'm sure it would have been
used here.
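
The fix is the same four-line dance yet again (a sketch against
today's Queue module; self.mutex is the lock these three methods
share, and _qsize() is the hook subclasses override):

    def qsize(self):
        self.mutex.acquire()
        try:
            n = self._qsize()   # an overridden _qsize() could raise
        finally:
            self.mutex.release()
        return n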

> If you're talking about put and get, decorators don't help as much,
> but I'm not sure blocks are much better.
> 
> You can't replace the outermost try ... finally with a common decorator
> because the locks are self variables.  A block, by being inside a method,
> would be delayed until self exists -- but that outer lock is only a
> tiny fraction of the boilerplate.  It doesn't help with
> [...example deleted...]
> I wouldn't object to a helper method, but using a block just to get rid of four
> lines (two of which are the literals "try:" and "finally:") seems barely worth
> doing, let alone with special new syntax.

Well, to me it does; people have been requesting new syntax for this
specific case for a long time (that's where PEP 310 is coming from).
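
Note also that because the block-statement sits inside the method
body, the iterator argument can mention self freely -- something a
decorator applied at class-definition time cannot do.  A rough sketch
of put() (the real put() does more; self.not_full and self.not_empty
are Queue's condition variables, and Condition objects support
acquire/release):

    def put(self, item):
        block locking(self.not_full):
            self._put(item)
            self.not_empty.notify()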

> > Also the use case for closing a file upon leaving a block, while
> > clearly a resource allocation use case, doesn't work well with a
> > decorator.
> 
> def autoclose(fn):
>     def outer(filename, *args, **kwargs):
>         f = open(filename)
>         val = fn(f, *args, **kwargs)
>         f.close()
>         return val
>     return outer
> 
> @autoclose
> def f1(f):
>     for line in f:
>         print line

But the auto-closing file, even more than the self-releasing lock,
most often occurs in the middle of some code that would be unnatural
to turn into a helper method just so that you can use a decorator
pattern. In fact your example is so confusing that I can't figure out
whether it has a bug or whether I'm just confused. This is *not* a
good use case for decorators.
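
Compare the block-iterator version, which leaves the loop right where
the surrounding code needs it (essentially the opening() example from
the PEP, plus the proposed "block ... as ..." syntax):

    def opening(filename, mode="r"):
        f = open(filename, mode)
        try:
            yield f         # the block runs with f bound to the file
        finally:
            f.close()       # closed even if the block raises

    block opening(filename) as f:
        for line in f:
            print line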

> > I just came across another use case that is fairly common in the
> > standard library: redirecting sys.stdout. This is just a beauty (in
> > fact I'll add it to the PEP):
> 
> > def saving_stdout(f):
> >     save_stdout = sys.stdout
> >     try:
> >         sys.stdout = f
> >         yield
> >     finally:
> >         sys.stdout = save_stdout
> 
> Why does this need a yield?  Why not just a regular call to the
> function?

Because PEP 340 uses yield to pass control to the body of the
block-statement. (I have to resist the urge to add, ", dummy!" :-)
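
Concretely, the suite of the block-statement executes at the point of
that yield, so the caller just writes (again using the proposed
syntax):

    block saving_stdout(f):
        print "Hello"       # this goes to f, not the real stdout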

I can't tell whether you have totally not grasped PEP 340, or you are
proposing to solve all its use cases by defining an explicit function
or method representing the body of the block. The latter solution
leads to way too much ugly code -- all that function-definition
boilerplate is worse than the try/finally boilerplate we're trying to
hide!

> If you're trying to generalize the redirector, then this
> also works as a decorator.  The nested functions (and the *args,
> **kwargs, if you don't inherit from a standard decorator) are a
> bit of an annoyance, but I'm not sure the new iterator form will
> be any easier to explain.
> 
> def saving_stdout(f):
>     import sys   # Just in case...
>     def captured_stream(fn):
>         def redirect(*args, **kwargs):
>             save_stdout = sys.stdout
>             try:
>                 sys.stdout = f
>                 return fn(*args, **kwargs)
>             finally:
>                 sys.stdout = save_stdout
>         return redirect
>     return captured_stream
> 
> o=StringIO()
> @saving_stdout(o)
> ...

This has absolutely nothing to recommend it.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
