[Python-Dev] PEP 340 -- concept clarification
Guido van Rossum
gvanrossum at gmail.com
Tue May 3 20:48:09 CEST 2005
> [Raymond]
> > > Likewise, is it correct that "yield" is anti-parallel to the current
> > > meaning? Inside a generator, it returns control upwards to the caller.
> > > But inside a block-iterator, it pushes control downwards (?inwards) to
> > > the block it controls.
>
> [Guido van Rossum]
> > I have a hard time visualizing the difference. They feel the same to
> > me, and the implementation (from the generator's POV) is identical:
> > yield suspends the current frame, returning to the previous frame from
> > the call to next() or __next__(), and the suspended frame can be
> > resumed by calling next() / __next__() again.
[Raymond]
> This concept ought to be highlighted in the PEP because it explains
> clearly what "yield" does and it may help transition from a non-Dutch
> mental model. I expect that many folks (me included) think in terms of
> caller vs callee with a parallel spatial concept of enclosing vs
> enclosed. In that model, the keywords "continue", "break", "yield", and
> "return" all imply a control transfer from the enclosed back to the
> encloser.
I'm still confused and surprised that you think I need to explain what
yield does, since the PEP doesn't change one bit about this.
The encloser/enclosed parallel to caller/callee doesn't make sense to
me; but that may just be because I'm Dutch.
> In contrast, the new use of yield differs in that the suspended frame
> transfers control from the encloser to the enclosed.
Why does your notion of who encloses whom suddenly reverse when you go
from a for-loop to a block-statement? This all feels very strange to
me.
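
To make that concrete, here is a minimal sketch (the generator and the
driver names are mine, not from the PEP) of the point above: the
generator cannot tell whether a for-loop or a hand-written block-style
driver is resuming it; both just call next() on the suspended frame.

    import sys

    def tracing():
        # Throwaway generator; the names here are mine, for illustration.
        sys.stdout.write("before yield\n")
        yield                    # suspend this frame, return to the caller of next()
        sys.stdout.write("after yield\n")

    # Driven by a for-loop:
    for _ in tracing():
        sys.stdout.write("loop body\n")

    # Driven by hand, the way a block-statement driver would:
    gen = tracing()
    gen.next()                   # run up to the yield (Python 2 spelling)
    sys.stdout.write("block body\n")
    try:
        gen.next()               # resume after the yield; ends with StopIteration
    except StopIteration:
        pass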
> > Anyway, @synchronized was mostly a demonstration toy; whole method
> > calls are rarely the right granularity of locking.
>
> Agreed. Since that is the case, there should be some effort to shift
> some of the examples towards real use cases where a block-iterator is
> the appropriate solution. It need not hold-up releasing the PEP to
> comp.lang.python, but it would go a long way towards improving the
> quality of the subsequent discussion.
Um? I thought I just showed that locking *is* a good use case for the
block-statement and you agreed; now why would I have to move away from
it?
I think I'm thoroughly confused by your critique of the PEP. Perhaps
you could suggest some concrete rewritings to knock me out of my
confusion?
> Personally, I find it amusing when there is an
> early focus on naming rather than on functionality, implementation
> issues, use cases, usability, and goodness-of-fit within the language.
Well, the name of a proposed concept does a lot to establish its first
impression. First impressions matter!
> > > It would be great if we could point to some code in the standard library
> > > or in a major Python application that would be better (cleaner, faster,
> > > or clearer) if re-written using blocks and block-iterators
>
> > look
> > more closely at Queue, and you'll find that the two such methods use
> > different locks!
>
> I don't follow this one. Tim's uses of not_empty and not_full are
> orthogonal (pertaining to pending gets at one end of the queue and to
> pending puts at the other end). The other use of the mutex is
> independent of either pending puts or gets; instead, it is a weak
> attempt to minimize what can happen to the queue during a size query.
I meant to use this as an example of the unsuitability of the
@synchronized decorator, since it implies that all synchronization is
on the same mutex, thereby providing a use case for the locking
block-statement.
I suspect we're violently in agreement though.
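
Concretely, the decorator I keep referring to is nothing more elaborate
than something like the sketch below (the attribute name self.mutex and
the Counter class are mine, purely for illustration); anything that,
like Queue, coordinates on more than one lock or needs finer
granularity simply doesn't fit it.

    import threading

    def synchronized(method):
        # Whole-method locking: every decorated method serializes on the
        # single lock the object exposes as self.mutex (an assumed name).
        def wrapper(self, *args, **kwds):
            self.mutex.acquire()
            try:
                return method(self, *args, **kwds)
            finally:
                self.mutex.release()
        return wrapper

    class Counter:
        def __init__(self):
            self.mutex = threading.Lock()
            self.count = 0
        @synchronized
        def incr(self):
            self.count += 1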
> While the try/finallys could get factored-out into separate blocks, I do
> not see how the code could be considered better off. There is a slight
> worsening of all metrics of merit: line counts, total number of
> function defs, number of calls, or number of steps executed outside the
> lock (important given that the value of a query result declines rapidly
> once the lock is released).
I don't see how the line count metric would lose: a single "locking()"
primitive exported by the threading module would be usable by all code
that currently uses try/finally to acquire and release a lock.
Performance needn't suffer either, if the locking() primitive is
implemented in C (it could be a straightforward translation of example
6 into C).
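
In pure Python that primitive is only a handful of lines, along the
lines of the locking example already in the PEP draft (exposing it as a
threading-module function is my assumption here); a C version would
just do the same dance:

    def locking(lock):
        # Acquire on entry, release however the block is left: the whole
        # try/finally pattern, written once.
        lock.acquire()
        try:
            yield
        finally:
            lock.release()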
> > I just came across another use case that is fairly common in the
> > standard library: redirecting sys.stdout. This is just a beauty (in
> > fact I'll add it to the PEP):
> >
> > def saving_stdout(f):
> >     save_stdout = sys.stdout
> >     try:
> >         sys.stdout = f
> >         yield
> >     finally:
> >         sys.stdout = save_stdout
>
> This is the strongest example so far. When adding it to the PEP, it
> would be useful to contrast the code with simpler alternatives like PEP
> 288's g.throw() or PEP 325's g.close(). On the plus side, the
> block-iterator approach factors out code common to multiple callers. On
> the minus side, the other PEPs involve simpler mechanisms and their
> learning curve would be nearly zero. These pluses and minuses are
> important because they apply equally to all examples using blocks for
> initialization/finalization.
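
For comparison, the pattern every such call site has to spell out today
looks like the sketch below (report() and its arguments are made-up
names for illustration); the block form merely moves the try/finally
into saving_stdout() once and for all.

    import sys

    def report(f, lines):
        # Today's spelling at each call site; with a block-statement the
        # try/finally would live once, inside saving_stdout().
        save_stdout = sys.stdout
        try:
            sys.stdout = f
            for line in lines:
                sys.stdout.write(line + "\n")
        finally:
            sys.stdout = save_stdout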
Where do you see a learning curve for blocks?
--
--Guido van Rossum (home page: http://www.python.org/~guido/)