[Python-ideas] PEP 380 close and contextmanagers?
Guido van Rossum
guido at python.org
Thu Oct 28 04:53:14 CEST 2010
On Wed, Oct 27, 2010 at 5:00 PM, Ron Adam <rrr at ronadam.com> wrote:
>
> On 10/27/2010 01:38 PM, Guido van Rossum wrote:
>>
>> On Wed, Oct 27, 2010 at 9:18 AM, Ron Adam<rrr at ronadam.com> wrote:
>>>
>>>
>>> On 10/27/2010 10:01 AM, Ron Adam wrote:
>>> It looks like no context managers return values in the finally or
>>> __exit__ part of a context manager. Is there a way to do that?
>>
>> How would that value be communicated to the code containing the
>> with-clause?
>
> I think that was what I was trying to figure out also.
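[Editorial note: one common workaround for getting a value out of the finally/__exit__ part is to yield a mutable holder object and fill it in during cleanup. A minimal sketch, not from the thread; `Result` and `totaling` are illustrative names:]

```python
from contextlib import contextmanager

class Result:
    value = None  # filled in by the cleanup code

@contextmanager
def totaling():
    nums = []
    r = Result()
    try:
        # hand both the workspace and the holder to the with-body
        yield (nums, r)
    finally:
        # this runs in the "finally part" of the context manager,
        # and the caller can still read r afterwards
        r.value = sum(nums)

with totaling() as (nums, r):
    nums.append(3)
    nums.append(4)
print(r.value)  # -> 7, computed on exit from the with-block
```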
>
>>> def reduce_i(f):
>>>     i = yield
>>>     while True:
>>>         i = f(i, (yield i))
>>
>> Unfortunately from here on till the end of your example my brain exploded.
>
> Mine did too, but I think it was a useful but strange experience. ;-)
>
> It forced me to take a break and think about the problem from a different
> viewpoint. Here's the conclusion I came to, but be forewarned: it's kind
> of anti-climactic. :-)
>
>
> The use of an exception to signal some bit of code is a way to reach over
> a wall that also protects that bit of code. This seems to be a more
> common need when using coroutines, because it's more common to have one
> bit of code indirectly direct another.
>
> Generators already have a nice .throw() method that returns the value at
> the next yield. But we either have to throw an existing exception that
> has some other purpose, or make up a new one. When it comes to making up
> new ones, lots of other programmers may each call theirs something
> different.
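[Editorial note: the behavior described above can be seen in a few lines. A sketch, not from the thread; this `ValueRequest` is defined locally just for illustration:]

```python
class ValueRequest(Exception):
    """Hypothetical 'send me a subtotal' signal, per the proposal above."""

def running_total():
    total = 0
    while True:
        try:
            total += yield      # normal path: accumulate sent values
        except ValueRequest:
            yield total         # .throw() returns this yielded value

g = running_total()
next(g)        # prime the generator up to its first yield
g.send(3)
g.send(4)
print(g.throw(ValueRequest))  # -> 7: throw resumes the generator and
                              # hands back the value at the next yield
```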
>
> That isn't a big problem, but it may be nice if we had a standard
> exception for saying, "Hey you! Send me a total or subtotal!", and that's
> all it does. For now let's call it a ValueRequest exception.
>
> ValueRequest makes sense if you are throwing an exception; I think
> ValueReturn may make more sense if you are raising one. Or maybe there is
> something that reads well both ways? These both fit nicely alongside
> ValueError, and it may make reading code easier if we make a distinction
> between a request and a return.
>
>
> Below is the previous example rewritten to do this. A ValueRequest
> doesn't stop anything or force anything to close, so it won't ever
> interfere with, confuse, or complicate code that uses other exceptions.
> You can always throw or catch one of these and raise something else if
> you need to.
>
> Since throwing it into a generator doesn't stop the generator, the generator
> can put the try-except into a larger loop and loop back to get more values
> and catch another ValueRequest at some later point. I feel that is a useful
> and handy thing to do.
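[Editorial note: a sketch of that outer-loop pattern, not from the thread; `ValueRequest` is again defined locally. The generator answers a subtotal request and then keeps accumulating, so it can be queried repeatedly:]

```python
class ValueRequest(Exception):
    """Hypothetical subtotal-request signal, per the proposal above."""

def subtotals():
    total = 0
    while True:             # outer loop: survive each ValueRequest
        try:
            while True:
                total += yield          # accumulate sent values
        except ValueRequest:
            # Answer the request; the send() that restarts us here
            # delivers the next value, which we fold in immediately.
            total += yield total

g = subtotals()
next(g)
g.send(1)
g.send(2)
print(g.throw(ValueRequest))  # -> 3 (first subtotal)
g.send(4)
print(g.throw(ValueRequest))  # -> 7 (still alive, still accumulating)
```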
>
>
> So here's the example again.
>
> The first version of this took advantage of yield's ability to send and
> get data at the same time to always send back an update (subtotal) to the
> parent routine. That's nearly free, since a yield always sends something
> back anyway. (None if you don't give it something else.) But it's not
> always easy to do, or easy to understand if you do it. I.e., brain-
> exploding stuff.
>
> In this version, data only flows into the coroutine until a ValueRequest
> exception is thrown at it, at which point it then yields back a total.
>
>
> *I can see where some routines may reverse the control, by throwing
> ValueReturns from the inside out, rather than ValueRequests from the outside
> in. Is it useful to distinguish between the two or should there be just
> one?
>
> *Yes this can be made to work with gclose() and return, but I feel that is
> more restrictive, and more complex, than it needs to be.
>
> *I still didn't figure out how to use the context managers to get rid of the
> try except. Oh well. ;-)
>
>
>
> from contextlib import contextmanager
>
> class ValueRequest(Exception):
>     pass
>
> @contextmanager
> def consumer(cofunc, result=True):
>     next(cofunc)
>     try:
>         yield cofunc
>     finally:
>         cofunc.close()
>
> @contextmanager
> def multiconsumer(cofuncs, result=True):
>     for c in cofuncs:
>         next(c)
>     try:
>         yield cofuncs
>     finally:
>         for c in cofuncs:
>             c.close()
>
> # Min/max coroutine example split into
> # nested coroutines for testing these ideas
> # in a more complex situation that may arise
> # when working with cofunctions and generators.
>
> def reduce_item(f):
>     try:
>         x = yield
>         while True:
>             x = f(x, (yield))
>     except ValueRequest:
>         yield x
>
> def reduce_group(funcs):
>     with multiconsumer([reduce_item(f) for f in funcs]) as mc:
>         try:
>             while True:
>                 x = yield
>                 for c in mc:
>                     c.send(x)
>         except ValueRequest:
>             yield [c.throw(ValueRequest) for c in mc]
>
> def get_reductions(funcs, iterable):
>     with consumer(reduce_group(funcs)) as c:
>         for x in iterable:
>             c.send(x)
>         return c.throw(ValueRequest)
>
> def main():
>     funcs = [min, max]
>     print(get_reductions(funcs, range(100)))
>     s = "Python is fun for play, and great for work too."
>     print(get_reductions(funcs, s))
>
> if __name__ == '__main__':
>     main()
Hm... Certainly interesting. My own (equally anti-climactic :-)
conclusions would be:
- Tastes differ
- There is a point where yield gets overused
- I am not convinced that using reduce as a paradigm here is right
--
--Guido van Rossum (python.org/~guido)