[Python-ideas] Modifying yield from's return value

Joseph Jevnik joejev at gmail.com
Tue Apr 12 19:20:12 EDT 2016


Changing the builtin map to do this would be questionable because there is
some performance overhead to dispatching to the inner coroutine's `send`
method. `__next__` is implemented as a tp slot; however, coroutine `send`
is just a normal method that needs to be looked up every time we call
`next`. While my implementation is not as optimized as it could be, it is
only about twice as slow as the builtin map. I think it would be better to
keep the library separate until there is a lot of demand for this feature.
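
For reference, here is a rough pure-Python sketch of the kind of forwarding
wrapper involved. This is illustrative only and not the cotoolz
implementation (which does the same dispatch in C); the class name is made
up:

```
class PyComap:
    """Map-like wrapper that forwards `send`, `throw` and `close`
    to the wrapped coroutine while transforming each yielded value."""

    def __init__(self, func, coro):
        self._func = func
        self._coro = coro

    def __iter__(self):
        return self

    def __next__(self):
        # Builtin map only ever takes this path; it advances the
        # underlying iterator through the tp_iternext slot.
        return self._func(next(self._coro))

    def send(self, value):
        # `send` is an ordinary attribute on the coroutine, so every
        # call pays an attribute lookup plus a normal method call;
        # this is the dispatch overhead described above.
        return self._func(self._coro.send(value))

    def throw(self, *exc_info):
        # `throw` returns the next yielded value if the coroutine
        # handles the exception, so it is transformed as well.
        return self._func(self._coro.throw(*exc_info))

    def close(self):
        return self._coro.close()
```

With something like this, `yield from PyComap(f, inner_coroutine())`
transforms every yielded value with `f` while still passing `send` and
`throw` through to the inner coroutine.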

On Tue, Apr 12, 2016 at 6:02 PM, Bar Harel <bzvi7919 at gmail.com> wrote:

> Perhaps something like this should go in the standard library? Or change
> the builtin map to forward everything? Changing the builtin map in this
> case would be backwards compatible and beneficial overall. If someone
> made a workaround, the workaround would still work, but now the builtin
> would do it natively.
>
> On Wed, Apr 13, 2016 at 12:55 AM Joseph Jevnik <joejev at gmail.com> wrote:
>
>> I have written a library `cotoolz` that provides the primitive pieces
>> needed for this. It implements a `comap` type which is like `map` but
>> properly forwards `send`, `throw`, and `close` to the underlying coroutine.
>> There is no special syntax needed, just write:
>>
>> ```
>> yield from comap(f, inner_coroutine())
>> ```
>>
>> This also supports `cozip` which lets you zip together multiple
>> coroutines, for example:
>>
>> ```
>> yield from cozip(inner_coroutine_a(), inner_coroutine_b(),
>> inner_coroutine_c(), ...)
>> ```
>>
>> This will fan out the sends to all of the coroutines and collect the
>> results into a single tuple to yield. Just like `zip`, this will be
>> exhausted when the first coroutine is exhausted.
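>>
>> A rough pure-Python sketch of that fan-out (illustrative only, not the
>> actual cotoolz implementation; a complete version would also forward
>> `throw` and `close`):
>>
>> ```
>> def py_cozip(*coros):
>>     # Prime every coroutine with `next`, then broadcast each value
>>     # sent in to all of them and yield the results as a tuple.
>>     try:
>>         results = [next(coro) for coro in coros]
>>         while True:
>>             sent = yield tuple(results)
>>             results = [coro.send(sent) for coro in coros]
>>     except StopIteration:
>>         # Like `zip`, stop as soon as any one coroutine is exhausted.
>>         return
>> ```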
>>
>> The library is available as free software on PyPI or here:
>> https://github.com/llllllllll/cotoolz
>>
>> On Tue, Apr 12, 2016 at 5:40 PM, Bar Harel <bzvi7919 at gmail.com> wrote:
>>
>>> I asked a question on Stack Overflow
>>> <http://stackoverflow.com/q/36582679/1658617> about a way of
>>> modifying yield from's return value.
>>> There doesn't seem to be any way to modify the data yielded by the
>>> yield from expression without breaking the yield from "pipe". By "pipe"
>>> I mean the fact that .send() and .throw() pass through to the inner
>>> generator. Useful cases include parsers (as I've demonstrated in the
>>> question), transcoders that encode or decode the yielded values without
>>> interrupting .send() or .throw(), and coroutines in general.
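>>>
>>> For example, here is a small sketch of the problem (the names are made
>>> up):
>>>
>>> ```
>>> def inner():
>>>     received = yield 'ready'
>>>     yield received
>>>
>>> def outer():
>>>     # map() transforms the yielded values, but a map object only
>>>     # implements __next__, with no .send() or .throw(), so the
>>>     # "pipe" to inner() is broken.
>>>     yield from map(str.upper, inner())
>>>
>>> g = outer()
>>> next(g)          # 'READY'
>>> g.send('hello')  # AttributeError: 'map' object has no attribute 'send'
>>> ```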
>>>
>>> I believe it would be beneficial to many aspects and libraries in
>>> Python, most notably asyncio.
>>>
>>> I couldn't think of a good syntax though, other than a wrapping
>>> function, let's say in itertools, that creates a class overriding send,
>>> throw and __next__ and receives the generator and associated
>>> modification functions (as suggested in one of the answers).
>>>
>>> What do you think? Is it useful? Any suggested syntax?