Casey Duncan wrote:
Apologies if this already exists, but for the benefit of those less enlightened, I think it would be very helpful if the PEP included or linked to an example of an algorithm implemented three ways: plain Python, yield-from, and cofunctions.
There isn't currently a single one implemented all three ways, but my parser example is implemented with plain Python and yield-from, and the philosophers and socket server are implemented using yield-from and cofunctions. http://www.cosc.canterbury.ac.nz/greg.ewing/python/generators/
IIRC, the last two would not look much different, but maybe I'm mistaken.
You're not mistaken -- mainly it's just a matter of replacing 'yield from' with 'codef'. If the implicit-cocalling version of cofunctions gains sway, it would be more different -- all the 'yield from's would disappear, and some function definitions would change from 'def' to 'codef'.
As I understand it:
    cocall f(x, y, z)

is sugar for:

    yield from f.__cocall__(x, y, z)
and it now magically promotes the function that contains it to a cofunction (thus implementing __cocall__ for said function).
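To make the expansion concrete, here is a rough sketch in today's Python that writes out the desugaring by hand. The Adder class, the caller, and the run() trampoline are all made-up illustration names, not anything from the PEP:

```python
# 'result = cocall adder(1, 2)' would expand to
# 'result = yield from adder.__cocall__(1, 2)'; this sketch
# writes that expansion out manually with a made-up Adder class.

class Adder:
    """An object implementing the proposed __cocall__ protocol."""
    def __cocall__(self, x, y):
        # Like a generator function's __call__, this returns an
        # iterator; it can suspend with yield before returning.
        yield            # one suspension point
        return x + y     # result delivered via StopIteration.value

def caller():
    # Hand-written expansion of 'result = cocall adder(1, 2)'.
    adder = Adder()
    result = yield from adder.__cocall__(1, 2)
    return result

def run(coro):
    """Minimal trampoline: drive the generator to completion."""
    try:
        while True:
            next(coro)
    except StopIteration as e:
        return e.value

print(run(caller()))  # -> 3
```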
That's essentially correct as the PEP now stands.
From what I understand, __cocall__ does not exist because you might want to also have __call__ with different behavior, but instead it exists to allow the "cocaller" to differentiate between cofunctions and normal functions?
Yes, that's right. A cofunction's __cocall__ method does exactly the same thing as a normal generator's __call__ method does.
In theory though, I could implement an object myself that implemented both __call__ and __cocall__, correct?
You could, and in fact one version of the cofunctions proposal suggests making ordinary functions behave as though they did implement both, with __cocall__ returning an iterator that yields zero times. There would be nothing to stop you creating an object that had arbitrarily different behaviour for __call__ and __cocall__ either, although I'm not sure what use such an object would be.
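The "iterator that yields zero times" idea can be sketched in current Python. The costyle_call() helper and the example callables below are assumptions for illustration only, not part of the PEP:

```python
# Sketch: make ordinary callables usable from cocall-style code by
# wrapping their result in an iterator that yields zero times.
# 'costyle_call' is a made-up helper name.

def costyle_call(f, *args):
    """Return an iterator suitable for 'yield from', whether or not
    f implements __cocall__."""
    cocall = getattr(f, '__cocall__', None)
    if cocall is not None:
        return cocall(*args)
    def zero_yield():
        # Returns before ever reaching the yield, so this generator
        # yields zero times and delivers f's result immediately.
        return f(*args)
        yield  # unreachable; makes this function a generator
    return zero_yield()

def plain(x):            # an ordinary function
    return x * 2

class CoAdder:           # an object with __cocall__
    def __cocall__(self, x, y):
        yield            # one suspension point
        return x + y

def driver():
    a = yield from costyle_call(plain, 21)       # no suspension
    b = yield from costyle_call(CoAdder(), 1, 2) # suspends once
    return a, b

def run(g):
    try:
        while True:
            next(g)
    except StopIteration as e:
        return e.value

print(run(driver()))  # -> (42, 3)
```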
I suppose __cocall__ is to __call__ as __iter__ is to __call__ presently.
Not exactly. When you do 'for x in f(): ...', __call__ and __iter__ are *both* involved -- __call__ is invoked first, and then __iter__ on the result. But when making a cocall, __cocall__ is invoked *instead* of __call__ (and the result is expected to already be an iterator, so __iter__ is not used).
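A small sketch of the contrast, using made-up example classes (Container, CoFunc, and the run() trampoline are illustration names only):

```python
# Ordinary iteration: 'for x in f()' invokes __call__ on f first,
# then __iter__ on the result.
class Container:
    def __init__(self, items):
        self.items = items
    def __iter__(self):
        return iter(self.items)

def make_container():               # the __call__ step
    return Container([1, 2, 3])

collected = list(make_container())  # __call__, then __iter__

# A cocall: __cocall__ is invoked *instead of* __call__, and what it
# returns is already an iterator, so no __iter__ step is involved.
class CoFunc:
    def __cocall__(self):
        yield                       # suspension point
        return 'done'

def caller():
    # Hand-written expansion of 'result = cocall cf()'.
    cf = CoFunc()
    result = yield from cf.__cocall__()
    return result

def run(g):
    try:
        while True:
            next(g)
    except StopIteration as e:
        return e.value

print(collected, run(caller()))  # -> [1, 2, 3] done
```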
It really seems to me that generators should be implemented on top of coroutines and not the reverse. That would lead to a more linear path to understanding: iterators -> generators -> coroutines.
If generators didn't already exist, it might make sense to do it that way. It would be easy to create an @generator decorator that would turn a cofunction into a generator. (Such a thing might be good to have in any case.) But we're stuck with generators the way they are, so we might as well make the most of them, including using them as a foundation for a less-restricted form of suspendable function. Also keep in mind that the way they're documented and taught doesn't necessarily have to reflect the implementation strategy. It would be possible to describe cofunctions and cocalls as an independent concept, and only later explain how they relate to generators. -- Greg
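The @generator idea can be approximated today if a "cofunction" is simulated as a generator function that produces values through a primitive it delegates to. Everything here (emit, generator, count_to) is a made-up simulation, not the PEP's design:

```python
# Sketch of an '@generator' decorator re-exposing a simulated
# cofunction as an ordinary generator function.

def emit(value):
    """Simulated primitive a cofunction would cocall to yield a value."""
    yield value

def generator(cofunc):
    """Wrap a simulated cofunction as a plain generator function."""
    def wrapper(*args, **kwargs):
        # The cofunction is generator-based in this simulation, so
        # delegating with 'yield from' re-exposes its emitted values.
        return (yield from cofunc(*args, **kwargs))
    return wrapper

@generator
def count_to(n):
    for i in range(1, n + 1):
        yield from emit(i)   # stands in for 'cocall emit(i)'

print(list(count_to(3)))  # -> [1, 2, 3]
```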