[Python-Dev] PEP 340 -- concept clarification

Jim Jewett jimjjewett at gmail.com
Tue May 3 21:07:37 CEST 2005


[Raymond Hettinger]
>> Likewise, is it correct that "yield" is anti-parallel to the current
>> meaning?  Inside a generator, it returns control upwards to the caller.
>> But inside a block-iterator, it pushes control downwards (?inwards) to
>> the block it controls.

Guido:
> I have a hard time visualizing the difference.

In a normal generator, someone makes a call to establish the 
generator, which then becomes a little island -- anyone holding a 
reference can resume it (by calling its next() method), and it yields 
control back to whoever made that call.
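
A toy illustration (mine, not from the PEP):

def countdown(n):
    while n:
        yield n
        n -= 1

it = countdown(3)
print it.next()        # 3 -- control returns to this call site

def elsewhere(gen):
    print gen.next()   # 2 -- this time control returns here

elsewhere(it)          # any caller may resume the island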

With the block, every yield returns to a single designated callback.
This callback has to be established at the same time the block is
created, and must be textually inside it (the indented suite under
the "block XXX:" line).
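
Concretely, the proposed statement is roughly a driver loop, with the
indented suite playing the callback.  A simplified sketch of my own
(it ignores PEP 340's exception plumbing, which also routes
exceptions from the suite back into the iterator):

def run_block(itr, suite):
    while True:
        try:
            value = itr.next()
        except StopIteration:
            break
        suite(value)   # every yield lands in this one designated suite

def pair():
    yield 1
    yield 2

def suite(value):
    print "suite got", value

run_block(pair(), suite)   # prints "suite got 1", then "suite got 2"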

>> Are there some good use cases that do not involve resource locking?

> Decorators don't need @synchronized as a motivating use case; 
> there are plenty of other use cases.

But are there plenty of other use cases for PEP 340?

If not, then why do we need PEP 340?  Are decorators not strong
enough, or is it just that people aren't comfortable with them yet?
If it is a matter of comfort or recipes, then the new construct might
have just as much trouble.  (So this one is not a loop, and you can
tell the difference because ... uh, just skip that advanced stuff.)

> Anyway, @synchronized was mostly a demonstration toy; whole
> method calls are rarely the right granularity of locking.

That is an important difference -- though I'm not sure that the critical
part *shouldn't* be broken out into a separate method.

>> I've scanned through the code base looking for some places
>> to apply the idea and have come up empty handed.

> I presume you mentally discarded the resource allocation use
> cases where the try/finally statement was the outermost statement
> in the function body, since those would be helped by @synchronized;
> but look more closely at Queue, and you'll find that the two such
> methods use different locks!

qsize, empty, and full could be done with a lockself decorator.
Effectively, they *are* lockself decorators for the _xxx functions 
that subclasses are told to override.
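
A sketch of that decorator (lockself is just the hypothetical name
from above; Queue's real lock attribute is self.mutex):

import threading

def lockself(fn):
    def locked(self, *args, **kwargs):
        self.mutex.acquire()    # serialize on the instance's own lock
        try:
            return fn(self, *args, **kwargs)
        finally:
            self.mutex.release()
    return locked

class SomeQueue:
    def __init__(self):
        self.mutex = threading.Lock()
        self.items = []
    @lockself
    def qsize(self):            # really self._qsize() in Queue
        return len(self.items)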

If you're talking about put and get, decorators don't help as much,
but I'm not sure blocks are much better.  

You can't replace the outermost try ... finally with a common
decorator because the locks are attributes of self.  A block, being
inside a method, would be delayed until self exists -- but that outer
lock is only a tiny fraction of the boilerplate.  It doesn't help
with:

if not block:
    if self._STATE():
        raise STATEException
elif timeout is None:
    while self._STATE():
        self.not_STATE.wait()
else:
    if timeout < 0:
        raise ValueError("'timeout' must be a positive number")
    endtime = _time() + timeout
    while self._STATE():
        remaining = endtime - _time()
        if remaining <= 0.0:
            raise STATEException
        self.not_STATE.wait(remaining)
val = self._RealMethod(item)  # OK, the put optimizes out this and the return
self.not_OTHERSTATE.notify()
return val

I wouldn't object to a helper method, but using a block just to get rid of four
lines (two of which are the literals "try:" and "finally:") seems barely worth
doing, let alone with special new syntax.
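
For reference, PEP 340's own locking template reads as follows --
note that a yield inside try/finally is a SyntaxError today, and
legalizing it is itself part of the proposal:

def locking(lock):
    lock.acquire()
    try:
        yield
    finally:
        lock.release()

# With that, put's wrapper
#     self.not_full.acquire()
#     try:
#         <body>
#     finally:
#         self.not_full.release()
# would become the proposed
#     block locking(self.not_full):
#         <body>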

> Also the use case for closing a file upon leaving a block, while
> clearly a resource allocation use case, doesn't work well with a
> decorator.

def autoclose(fn):
    def outer(filename, *args, **kwargs):
        f = open(filename)
        try:
            return fn(f, *args, **kwargs)
        finally:
            f.close()   # close even if fn raises
    return outer

@autoclose
def f1(f):
    for line in f:
        print line

> I just came across another use case that is fairly common in the
> standard library: redirecting sys.stdout. This is just a beauty (in
> fact I'll add it to the PEP):

> def saving_stdout(f):
>     save_stdout = sys.stdout
>     try:
>         sys.stdout = f
>         yield
>     finally:
>         sys.stdout = save_stdout

Why does this need a yield?  Why not just a regular call to the
function?  If you're trying to generalize the redirector, then this
also works as a decorator.  The nested functions (and the *args,
**kwargs, if you don't inherit from a standard decorator) are a
bit of an annoyance, but I'm not sure the new iterator form will
be any easier to explain.

def saving_stdout(f):
    import sys   # Just in case...
    def captured_stream(fn):
        def redirect(*args, **kwargs):
            save_stdout = sys.stdout
            try:
                sys.stdout = f
                return fn(*args, **kwargs)
            finally:
                sys.stdout = save_stdout
        return redirect
    return captured_stream

o = StringIO()
@saving_stdout(o)
...
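
Filled out into something runnable (the import, the decorated
function, and the check are my own illustration):

from StringIO import StringIO

o = StringIO()

@saving_stdout(o)
def f2():
    print "redirected"

f2()
assert o.getvalue() == "redirected\n"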

