From greg.ewing at canterbury.ac.nz  Sun Aug  1 12:09:55 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 01 Aug 2010 22:09:55 +1200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
Message-ID: <4C5547F3.6010902@canterbury.ac.nz>

I've been thinking about this idea for a while, and now
that I've got yield-from nailed down to my satisfaction,
I thought I'd put it out there to see if anyone else
thinks it would be a good thing.

Cofunctions
-----------

A drawback of 'yield from' as a coroutine mechanism is that it must
be used every time a function forming part of a coroutine calls
another function that can suspend the coroutine, either directly
or indirectly.

This makes the code read somewhat awkwardly and provides many
opportunities for the programmer to make errors. It also introduces
considerable coupling, since changing one's mind about whether a
function needs to be suspendable requires revisiting all the call
sites of that function.

This proposal builds on the 'yield from' proposal by introducing a
new kind of function that I will call a "cofunction".

A cofunction is a special kind of generator, with the following
characteristics:

- It is defined by using the keyword 'codef' in place of 'def'.

- It is always a generator, even if it does not contain any yields.

- Whenever a call is made inside a cofunction, it is done using a
   special COCALL opcode. This first looks for a __cocall__ method
   on the object being called. If present, it is expected to
   return an iterable object, which is treated as though 'yield from'
   had been performed on it.

   If the object being called does not have a __cocall__ method,
   or it returns NotImplemented, the call is made in the usual way
   through the __call__ method.

- Cofunctions themselves have a __cocall__ method that does the
   same thing as __call__.

Using these cofunctions, it should be possible to write coroutine code
that looks very similar to ordinary code. Cofunctions can call both
ordinary functions and other cofunctions using ordinary call syntax.
The only special consideration is that 'codef' must be used to define
any function that directly or indirectly calls another cofunction.

A few subtle details:

- Ordinary generators will *not* implement __cocall__. This is so
   that a cofunction can e.g. contain a for-loop that iterates over
   a generator without erroneously triggering yield-from behaviour.

- Some objects that wrap functions will need to be enhanced with
   __cocall__ methods that delegate to the underlying function.
   Obvious ones that spring to mind are bound methods, staticmethods
   and classmethods.

   Returning NotImplemented is specified as one of the possible
   responses of __cocall__ so that a wrapper can report that the
   wrapped object does not support __cocall__.
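
For illustration, the effect of the COCALL dispatch could be
approximated today on top of PEP 380's 'yield from' with explicit
helpers (the names 'cofunction' and 'co' below are invented for this
sketch and are not part of the proposal):

    def cofunction(f):
        # mark a generator function as supporting __cocall__
        f.__cocall__ = f
        return f

    def co(f, *args, **kwds):
        # generator performing the COCALL dispatch: delegate with
        # yield-from if the target provides __cocall__ and does not
        # return NotImplemented, otherwise make an ordinary call
        cocall = getattr(f, '__cocall__', None)
        if cocall is not None:
            it = cocall(*args, **kwds)
            if it is not NotImplemented:
                return (yield from it)
        return f(*args, **kwds)

A caller written as a plain generator would then say
'result = yield from co(f, x, y)' where the proposal would let a
cofunction simply write 'f(x, y)'.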

-- 
Greg



From ghazel at gmail.com  Sun Aug  1 13:52:25 2010
From: ghazel at gmail.com (ghazel at gmail.com)
Date: Sun, 1 Aug 2010 04:52:25 -0700
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <4C5547F3.6010902@canterbury.ac.nz>
References: <4C5547F3.6010902@canterbury.ac.nz>
Message-ID: <AANLkTikPG079O=_GaJM+Hfy_62xn6VrsG3hUFP4-ohEG@mail.gmail.com>

It does seem like there are two different use cases for generators
which the return value semantics of "yield from" starts to make very
clear. The original use is building an iterator which produces results
in sequence, and the other is the "cofunction" style of executing a
sequence of code blocks which might be interleaved with code from the
caller and which returns a value like a regular function. I find
cofunctions to be a very useful concept.
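
For example (a rough sketch using PEP 380 semantics; the names are
made up): the first style yields a stream of values, while the second
yields only to cooperate and cares mainly about its return value:

    def count_up_to(n):
        # iterator style: the yielded values are the whole point
        for i in range(n):
            yield i

    def fetch_length(data):
        # "cofunction" style: the yield just hands control back to a
        # scheduler; the return value is the point (PEP 380)
        yield
        return len(data)

    def caller(data):
        # 'yield from' both delegates and captures the return value
        length = yield from fetch_length(data)
        return length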

I like the idea of "codef" defining these cofunctions in a way which
makes them behave like generators even if they do not contain a yield
call. Certainly having to revisit call sites when you shuffle or
comment out code is troublesome.

What I am suspicious of is automatically making calls to cofunctions
from within a cofunction imply 'yield from'. Knowing whether the
function call you are making can suspend your code and return to your
parent or not is important, since your state might change between when
you call some function and when it returns. Instead, I would like to
see a different calling mechanism when calling a cofunction, which
makes it clear that you know you're calling a cofunction and that this
could have implications for your own code. "yield" (and "yield from")
serve this purpose somewhat. It seems like calling a normal function
using this special calling mechanism could treat it like a
cofunction which produced zero iterations and a single return value.
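
For example (sketch only, with a made-up helper name), an ordinary
function could be adapted so that "cocall-ing" it yields nothing and
just delivers its return value:

    def as_cofunction(f):
        def wrapper(*args, **kwds):
            if False:
                yield                 # only here to make this a generator
            return f(*args, **kwds)   # delivered via StopIteration (PEP 380)
        return wrapper

    # a coroutine could then do:  n = yield from as_cofunction(len)("abc")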

Obviously this topic is very close to the monocle framework (
http://github.com/saucelabs/monocle ), where we implement something
very much like this as best we can using existing Python
functionality. Other projects have tried to use C module
stack-swapping coroutines, and I have seen programmers (including the
developers of these projects) struggle with the unexpected,
thread-like preemption. I believe a word like "yield" makes these
cooperation points more obvious.

-Greg



From solipsis at pitrou.net  Sun Aug  1 15:20:37 2010
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sun, 1 Aug 2010 15:20:37 +0200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
References: <4C5547F3.6010902@canterbury.ac.nz>
Message-ID: <20100801152037.78c802ba@pitrou.net>


Hello,

Is it an enhancement or really an alternate proposal?

From an outsider's view (mine :-)), I think this alternative makes more
sense than trying to stretch the generator protocol far beyond what it
was designed for at the start. It would also clearly separate the two
use cases of generative iteration and coroutines.

If so many people think coroutines are important, then it may be time
to give them first-class syntactical support in Python.

Regards

Antoine.






From greg.ewing at canterbury.ac.nz  Mon Aug  2 02:07:06 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 02 Aug 2010 12:07:06 +1200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <20100801152037.78c802ba@pitrou.net>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<20100801152037.78c802ba@pitrou.net>
Message-ID: <4C560C2A.1080304@canterbury.ac.nz>

Antoine Pitrou wrote:

> Is it an enhancement or really an alternate proposal?

I would say it's complementary. There are still use cases for
'yield from', when you're dealing with generators that are
designed to produce values.

Given that both are useful, defining cofunctions in terms of
yield-from allows them to share most of the underlying machinery.
It also makes it clear how ordinary functions, generators and
cofunctions all interact with each other.

> From an outsider's view (mine :-)), I think this alternative makes more
> sense than trying to stretch the generator protocol far beyond what it
> was designed for at the start. It would also clearly separate the two
> use cases of generative iteration and coroutines.

Well, that's one way of looking at it. Another is that generators
and coroutines are such closely related concepts that it would
seem odd not to be able to define one in terms of the other,
or both in terms of some unifying construct.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Mon Aug  2 02:13:00 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 02 Aug 2010 12:13:00 +1200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <AANLkTikPG079O=_GaJM+Hfy_62xn6VrsG3hUFP4-ohEG@mail.gmail.com>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<AANLkTikPG079O=_GaJM+Hfy_62xn6VrsG3hUFP4-ohEG@mail.gmail.com>
Message-ID: <4C560D8C.90709@canterbury.ac.nz>

ghazel at gmail.com wrote:

> What I am suspicious of is automatically making calls to cofunctions
> from within a cofunction imply 'yield from'. Knowing whether the
> function call you are making can suspend your code and return to your
> parent or not is important,

The same situation occurs when dealing with threads, since any
function you call could potentially suspend your thread. This
doesn't seem to bother people much.

> I would like to
> see a different calling mechanism when calling a cofunction,

The main point of the whole thing is to avoid having to specially
mark call sites like this. If you take that away, all that's left
is the ability to define a generator without a yield, and I'm
not sure it's worth having a whole new kind of function definition
just for that.

-- 
Greg


From solipsis at pitrou.net  Mon Aug  2 02:16:36 2010
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Mon, 2 Aug 2010 02:16:36 +0200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
References: <4C5547F3.6010902@canterbury.ac.nz>
	<20100801152037.78c802ba@pitrou.net>
	<4C560C2A.1080304@canterbury.ac.nz>
Message-ID: <20100802021636.3922318c@pitrou.net>

On Mon, 02 Aug 2010 12:07:06 +1200
Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> There are still use cases for
> 'yield from', when you're dealing with generators that are
> designed to produce values.

It was the tree-walking example, right? It really looked like it solved
a non-existing problem.

> > From an outsider's view (mine :-)), I think this alternative makes
> > more sense than trying to stretch the generator protocol far beyond
> > what it was designed for at the start. It would also clearly
> > separate the two use cases of generative iteration and coroutines.
> 
> Well, that's one way of looking at it. Another is that generators
> and coroutines are such closely related concepts that it would
> seem odd not to be able to define one in terms of the other,
> or both in terms of some unifying construct.

Well, it sounds like saying classes and functions are closely related
concepts because both denote callable objects.
I think there is value in clearly distinguished concepts, rather than
some intellectually appealing over-generalization (what Joel Spolsky
calls Architecture Astronauts).

Regards

Antoine.




From ghazel at gmail.com  Mon Aug  2 02:30:07 2010
From: ghazel at gmail.com (ghazel at gmail.com)
Date: Sun, 1 Aug 2010 17:30:07 -0700
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <4C560D8C.90709@canterbury.ac.nz>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<AANLkTikPG079O=_GaJM+Hfy_62xn6VrsG3hUFP4-ohEG@mail.gmail.com> 
	<4C560D8C.90709@canterbury.ac.nz>
Message-ID: <AANLkTinJ3uc_paEM6Y-zTBuw639yCpn-1YdK8LH6JO=j@mail.gmail.com>

On Sun, Aug 1, 2010 at 5:13 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> ghazel at gmail.com wrote:
>
>> What I am suspicious of is automatically making calls to cofunctions
>> from within a cofunction imply 'yield from'. Knowing whether the
>> function call you are making can suspend your code and return to your
>> parent or not is important,
>
> The same situation occurs when dealing with threads, since any
> function you call could potentially suspend your thread. This
> doesn't seem to bother people much.

Exactly, it is the same situation as in threads, when it does not have
to be. If you can be preempted at any point, suddenly there is a need
for more fine-grained "locking" around state since you can not be sure
that some function will not pause your execution and allow something
further up the stack to modify it. With a special calling mechanism
these cooperative points are very clear, and you can write code to
handle potential state changes when the call returns. Instead of
locking around state, cooperative points are like unlocking for the
duration of the call.
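
To illustrate (a toy sketch with made-up names, assuming PEP 380):
any check made before the cooperative point may need to be repeated
after it, because other coroutines can run in between:

    def wait_for_log(msg):
        # stand-in for an operation that suspends this coroutine
        yield ('log', msg)

    def withdraw(account, amount):
        if account['balance'] < amount:
            return False
        # cooperative point: the scheduler may run other coroutines
        # here, and they may modify 'account' before we resume
        yield from wait_for_log('withdrawing %d' % amount)
        if account['balance'] < amount:   # re-check after resuming
            return False
        account['balance'] -= amount
        return True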

>> I would like to
>> see a different calling mechanism when calling a cofunction,
>
> The main point of the whole thing is to avoid having to specially
> mark call sites like this. If you take that away, all that's left
> is the ability to define a generator without a yield, and I'm
> not sure it's worth having a whole new kind of function definition
> just for that.

Well, codef is the part I would like to see if anything is added at
all. Maybe that's not worth the trouble.

-Greg


From greg.ewing at canterbury.ac.nz  Mon Aug  2 08:06:01 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 02 Aug 2010 18:06:01 +1200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <20100802021636.3922318c@pitrou.net>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<20100801152037.78c802ba@pitrou.net>
	<4C560C2A.1080304@canterbury.ac.nz>
	<20100802021636.3922318c@pitrou.net>
Message-ID: <4C566049.9070802@canterbury.ac.nz>

Antoine Pitrou wrote:

> Well, it sounds like saying classes and functions are closely related
> concepts because both denote callable objects.

I think it's more like saying that methods and functions
are related, and Python does implement methods in terms of
functions.

Similarly, cofunctions are a new concept, but they're
built out of lower-level pieces -- generators and yield-from
-- that are also available separately.

(That could be another Python catchphrase, btw:
"Batteries available separately." :-)

-- 
Greg


From bruce at leapyear.org  Mon Aug  2 08:29:35 2010
From: bruce at leapyear.org (Bruce Leban)
Date: Sun, 1 Aug 2010 23:29:35 -0700
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <4C5547F3.6010902@canterbury.ac.nz>
References: <4C5547F3.6010902@canterbury.ac.nz>
Message-ID: <AANLkTinszHH7Y9Kisf26+gG_6yGxdhMkoY_AzcGO76W+@mail.gmail.com>

On Sun, Aug 1, 2010 at 3:09 AM, Greg Ewing <greg.ewing at canterbury.ac.nz>
 wrote:

> The only special consideration is that 'codef' must be used to define
> any function that directly or indirectly calls another cofunction.
>

It seems to me that this is a big requirement. If a module that I'm using
uses cofunctions, then every use of that module must be a cofunction all the
way up to my main function.

Worse, I wouldn't be able to change the implementation of a module to use
cofunctions because it would break its current users.

Why do you think this requirement is necessary?

--- Bruce
http://www.vroospeak.com
http://google-gruyere.appspot.com

From greg.ewing at canterbury.ac.nz  Mon Aug  2 11:34:25 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 02 Aug 2010 21:34:25 +1200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <AANLkTinszHH7Y9Kisf26+gG_6yGxdhMkoY_AzcGO76W+@mail.gmail.com>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<AANLkTinszHH7Y9Kisf26+gG_6yGxdhMkoY_AzcGO76W+@mail.gmail.com>
Message-ID: <4C569121.2000206@canterbury.ac.nz>

Bruce Leban wrote:

>     The only special consideration is that 'codef' must be used to define
>     any function that directly or indirectly calls another cofunction.
> 
> It seems to me that this is a big requirement. If a module that I'm 
> using uses cofunctions, then every use of that module must be a 
> cofunction all the way up to my main function.

Well, at least all the way up to the coroutine scheduler or
driver or whatever you want to call it.

That statement is actually a bit abbreviated. The caller
doesn't strictly have to be a cofunction, but if it's not,
it has to be aware that the called cofunction is a generator
and deal with it as such, for example by using 'yield from'
(which in turn makes the caller a generator) or by looping
over it (in which case the caller can be an ordinary
function).

However you arrange things, there has to be an unbroken
chain of generators from the main loop driving the coroutines
down to the functions containing the yields. The only question
is whether to mark them as such at the function level (using
'codef') or at the call site level (using 'yield from').
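
For instance, the driver at the top of such a chain can be quite
simple (sketch only):

    def run(coroutine):
        # Drive the outermost generator to completion.  A real
        # scheduler would dispatch on the yielded values instead of
        # ignoring them.
        try:
            while True:
                next(coroutine)
        except StopIteration as e:
            return e.value   # the coroutine's return value (PEP 380)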

> Worse, I wouldn't be able to change the implementation of a module to 
> use cofunctions because it will break the current users.

There would be nothing to stop a module from using cofunctions
internally, as long as it runs its own driver loop and presents
its outside interface in the form of ordinary functions that
run to completion.

But if it exposes any cofunctions directly to its users, they
will need to be aware of the fact, just as they need to be
aware of an exposed function that is a generator.

-- 
Greg



From ncoghlan at gmail.com  Mon Aug  2 14:03:39 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 2 Aug 2010 22:03:39 +1000
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <20100802021636.3922318c@pitrou.net>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<20100801152037.78c802ba@pitrou.net>
	<4C560C2A.1080304@canterbury.ac.nz>
	<20100802021636.3922318c@pitrou.net>
Message-ID: <AANLkTindQ_QTznqA95ECpQFt5c+eCzLmjwwZdp+ZZW_X@mail.gmail.com>

On Mon, Aug 2, 2010 at 10:16 AM, Antoine Pitrou <solipsis at pitrou.net> wrote:
> On Mon, 02 Aug 2010 12:07:06 +1200
> Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
>> There are still use cases for
>> 'yield from', when you're dealing with generators that are
>> designed to produce values.
>
> It was the tree-walking example, right? It really looked like it solved
> a non-existing problem.

For generators, the important benefit of PEP 380 is that it makes sure
that nested cleanup happens in the right order.

Consider the following currently valid (but subtly broken) generator pair:

  def receive_message_components(channel):
      with channel.open() as session:
          while 1:
              data = session.wait_for_next()
              yield data
              if data == channel.EOM:
                  break

  def receive_multiple_message_components(
          server_details, channel_details, limit=None):
      with Server(server_details) as server:
          channel = server.channel(channel_details)
          n = 0
          while 1:
              for component in receive_message_components(channel):
                  yield component
              if limit is not None:
                  n += 1
                  if n >= limit:
                      break

That code is most likely broken: if an exception (e.g. GeneratorExit)
gets thrown into the outer generator, the server connection will be
closed while an open session is still using that connection (since the
inner generator doesn't get closed until the outer generator's
reference to it gets released, by which time the with statement will
have killed the server connection). However, if the body of the inner
generator were written inline in the outer generator instead then
everything would be fine - the session would be closed before the
server connection because the exception handling would correctly
propagate out from the innermost yield. PEP 380 makes it easy to
factor out subgenerators without needing to worry about subtle
misbehaviours of exception handling due to the delayed closure of the
subgenerators:

  def receive_multiple_message_components(
          server_details, channel_details, limit=None):
      with Server(server_details) as server:
          channel = server.channel(channel_details)
          n = 0
          while 1:
              yield from receive_message_components(channel)
              if limit is not None:
                  n += 1
                  if n >= limit:
                      break

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia


From scott+python-ideas at scottdial.com  Mon Aug  2 18:21:57 2010
From: scott+python-ideas at scottdial.com (Scott Dial)
Date: Mon, 02 Aug 2010 12:21:57 -0400
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <20100802021636.3922318c@pitrou.net>
References: <4C5547F3.6010902@canterbury.ac.nz>	<20100801152037.78c802ba@pitrou.net>	<4C560C2A.1080304@canterbury.ac.nz>
	<20100802021636.3922318c@pitrou.net>
Message-ID: <4C56F0A5.1080604@scottdial.com>

On 8/1/2010 8:16 PM, Antoine Pitrou wrote:
> On Mon, 02 Aug 2010 12:07:06 +1200
> Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
>> There are still use cases for
>> 'yield from', when you're dealing with generators that are
>> designed to produce values.
> 
> It was the tree-walking example, right? It really looked like it solved
> a non-existing problem.

I disagree with this opinion. I have run into this pattern before and
while I didn't need the whole generator protocol honored, I would've
benefited just from the fact that the proposed "yield from" flattened
the iteration.

A while back there was a proposal about adding an IP address
manipulation library to stdlib that I took issue with. And so I started
to write my own that met the requirements, and when it came time to
write __iter__ for the IPNetwork class, it was quite natural to use
recursive generators:

def __iter__(self):
    # can I get a "yield from" please?!
    for v in self._iter_more(self._network_address.number,
                             self._host_mask_bits):
        yield v

def _iter_more(self, number_so_far, zerobits):
    if zerobits:
        bit = zerobits[-1]
        zerobits = zerobits[:-1]

        # yield all 'x...x0y...y' addresses
        for v in self._iter_more(number_so_far, zerobits):
            yield v
        # yield all 'x...x1y...y' addresses
        for v in self._iter_more(number_so_far | (1 << bit), zerobits):
            yield v
    else:
        # construct a proper IPAddress instance for the number
        yield self._address_class(number_so_far)

Obviously, it can be flattened by hand, but I doubt it would be as
obvious to read later. I have run into this pattern in other cases where
I was writing my own (specialized) containers, and would expect others
to have as well, unless they were uncomfortable with generators and/or
wrote around the problem.
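
For comparison, the same methods written with the proposed "yield from"
would read roughly like this (sketch, assuming PEP 380):

    def __iter__(self):
        yield from self._iter_more(self._network_address.number,
                                   self._host_mask_bits)

    def _iter_more(self, number_so_far, zerobits):
        if zerobits:
            bit = zerobits[-1]
            zerobits = zerobits[:-1]
            # yield all 'x...x0y...y' addresses
            yield from self._iter_more(number_so_far, zerobits)
            # yield all 'x...x1y...y' addresses
            yield from self._iter_more(number_so_far | (1 << bit), zerobits)
        else:
            # construct a proper IPAddress instance for the number
            yield self._address_class(number_so_far)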

-- 
Scott Dial
scott at scottdial.com
scodial at cs.indiana.edu



From solipsis at pitrou.net  Mon Aug  2 18:37:41 2010
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Mon, 2 Aug 2010 18:37:41 +0200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
References: <4C5547F3.6010902@canterbury.ac.nz>
	<20100801152037.78c802ba@pitrou.net>
	<4C560C2A.1080304@canterbury.ac.nz>
	<20100802021636.3922318c@pitrou.net>
	<4C56F0A5.1080604@scottdial.com>
Message-ID: <20100802183741.42f8ac36@pitrou.net>

On Mon, 02 Aug 2010 12:21:57 -0400
Scott Dial <scott+python-ideas at scottdial.com>
wrote:
> 
> I disagree with this opinion. I have run into this pattern before and
> while I didn't need the whole generator protocol honored, I would've
> benefited just from the fact that the proposed "yield from" flattened
> the iteration.

How would you have benefitted? Is it a problem if the iteration isn't
"flattened"?

If it's because of the recursion limit, then it's a general problem
and I don't think a generator-specific solution is a good idea.
If it's an aesthetical preference then I don't think new syntax is
warranted for that.

Regards

Antoine.




From guido at python.org  Mon Aug  2 18:39:01 2010
From: guido at python.org (Guido van Rossum)
Date: Mon, 2 Aug 2010 09:39:01 -0700
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <4C5547F3.6010902@canterbury.ac.nz>
References: <4C5547F3.6010902@canterbury.ac.nz>
Message-ID: <AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com>

After mulling it over and trying to understand the thread I still
cannot get excited about this proposal. The only concrete objection I
have is that it might be hard to implement in Jython or IronPython --
IIRC we were careful to define yield in such a way that it was easy to
generate JVM bytecode for them, and IIRC the details of making it easy
had to do with the compiler breaking the generator function into
different entry points for each resumption point (i.e. after each
yield). In a codef you couldn't do this, since you don't know until
run time which calls are codef calls.

OTOH I do appreciate the desire to reduce the number of places where
one has to sprinkle 'yield' over one's code, and I've had a number of
situations recently where I had something that logically needed to be
a coroutine (to match some API) but just happened not to need any
yields, and inevitably my coding went something like (1) forget to put
a yield in, (2) frantically debug, (3) slap forehead, (4) add "if 0:
yield" to the function, (5) continue with another instance of this,
(6) lose sleep over the best place to spell the dummy yield and where
to put it. At the same time I don't want to have to mark all my
coroutines with a decorator, like Monocle requires (though maybe I
should).

Finally, regardless of what happens to codef, I am still
enthusiastically supporting PEP 380 as it stands, and am excited to
see it ported to Python 3.1 (though I hope that once we've done the
Mercurial switch, someone will create a branch for it to be merged
into 3.3).

--Guido



-- 
--Guido van Rossum (python.org/~guido)


From guido at python.org  Mon Aug  2 18:45:09 2010
From: guido at python.org (Guido van Rossum)
Date: Mon, 2 Aug 2010 09:45:09 -0700
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <20100802183741.42f8ac36@pitrou.net>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<20100801152037.78c802ba@pitrou.net> 
	<4C560C2A.1080304@canterbury.ac.nz>
	<20100802021636.3922318c@pitrou.net> 
	<4C56F0A5.1080604@scottdial.com> <20100802183741.42f8ac36@pitrou.net>
Message-ID: <AANLkTik3PoC9hRw1zYgHw3fspq0Z5akC1nF5JXeguQps@mail.gmail.com>

On Mon, Aug 2, 2010 at 9:37 AM, Antoine Pitrou <solipsis at pitrou.net> wrote:
> How would you have benefitted? Is it a problem if the iteration isn't
> "flattened"?
>
> If it's because of the recursion limit, then it's a general problem
> and I don't think a generator-specific solution is a good idea.
> If it's an aesthetical preference then I don't think new syntax is
> warranted for that.

I think the main reason for wanting the stack of yields flattened is
the cost of bumping each next() call all the way up and down the
stack. Without an optimized yield-from, yield from G is equivalent to
"for X in G: yield X" and that means if you have this nested 3 times
on the stack, each next() call incurs the overhead of three for-loop
iterations. It would be especially bad if you have a fairly deeply
nested stack and then the innermost generator yields a large number of
values.

It remains to be seen at which point this becomes prohibitive and when
the overhead of wrapping every generator in a From instance (and
passing every next() call through a method of that instance) is
actually faster, given that a for-loop iteration is just a few
bytecode instructions.
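
To make the nesting concrete (illustrative code only):

    def inner():
        for i in range(1000):
            yield i

    def middle():
        for x in inner():    # un-optimized delegation: one extra
            yield x          # frame resumed per next() call

    def outer():
        for x in middle():   # and another one here
            yield x

    # Each next() on outer() resumes outer, which resumes middle,
    # which resumes inner -- cost proportional to the nesting depth.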

-- 
--Guido van Rossum (python.org/~guido)


From greg.ewing at canterbury.ac.nz  Tue Aug  3 03:28:13 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 03 Aug 2010 13:28:13 +1200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com>
Message-ID: <4C5770AD.8000303@canterbury.ac.nz>

On 03/08/10 04:39, Guido van Rossum wrote:

> The only concrete objection I
> have is that it might be hard to implement in Jython or IronPython

As long as it's possible to implement 'yield from', it should be
possible to implement codef as well.

If nothing else, every call could expand into code that checks
for the presence of __cocall__ and then performs either a
normal call or a yield-from.

Another approach would be to compile all calls as

   yield from cowrap(func, args...)

where cowrap is defined something like

   def cowrap(func, *args, **kwds):
     if hasattr(func, '__cocall__'):
       return (yield from func.__cocall__(*args, **kwds))
     else:
       return func(*args, **kwds)

-- 
Greg


From greg.ewing at canterbury.ac.nz  Tue Aug  3 03:46:14 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 03 Aug 2010 13:46:14 +1200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <AANLkTik3PoC9hRw1zYgHw3fspq0Z5akC1nF5JXeguQps@mail.gmail.com>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<20100801152037.78c802ba@pitrou.net>
	<4C560C2A.1080304@canterbury.ac.nz>
	<20100802021636.3922318c@pitrou.net> <4C56F0A5.1080604@scottdial.com>
	<20100802183741.42f8ac36@pitrou.net>
	<AANLkTik3PoC9hRw1zYgHw3fspq0Z5akC1nF5JXeguQps@mail.gmail.com>
Message-ID: <4C5774E6.6040507@canterbury.ac.nz>

On 03/08/10 04:45, Guido van Rossum wrote:

> It remains to be seen at which point this becomes prohibitive and when
> the overhead of wrapping every generator in a From instance (and
> passing every next() call through a method of that instance) is
> actually faster, given that a for-loop iteration is just a few
> bytecode instructions.

I don't know about the trampoline-style implementations
that have been posted, but I did some timings with my
yield-from implementation, and it seems that delegating
a next() call via yield-from has only about 8% of the
overhead of doing the same with a for-loop.

I also tried an experiment where I traversed a binary
tree using recursive generators and yield-from. For
a tree depth of 20, the whole thing was between 2 and
3 times faster than using for-loops.
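
Such a traversal looks roughly like this (sketch, assuming nodes with
'left', 'right' and 'value' attributes):

    def walk(node):
        # recursive in-order traversal, delegating via yield-from
        if node is not None:
            yield from walk(node.left)
            yield node.value
            yield from walk(node.right)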

-- 
Greg



From greg.ewing at canterbury.ac.nz  Tue Aug  3 08:17:30 2010
From: greg.ewing at canterbury.ac.nz (Gregory Ewing)
Date: Tue, 03 Aug 2010 18:17:30 +1200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com>
Message-ID: <4C57B47A.8060209@canterbury.ac.nz>

On 03/08/10 04:39, Guido van Rossum wrote:
> inevitably my coding went something like (1) forget to put
> a yield in, (2) frantically debug, (3) slap forehead, (4) add "if 0:
> yield" to the function, (5) continue with another instance of this,
> (6) lose sleep over the best place to spell the dummy yield and where
> to put it. At the same time I don't want to have to mark all my
> coroutines with a decorator, like Monocle requires (though maybe I
> should).

Would you be interested in a system which requires marking
calls to coroutines, but tells you immediately when you
have forgotten to mark such a call?

It might work something like this:

1. In a cofunction, a call to another cofunction must
    be marked with 'cocall', e.g.

    z = cocall f(x, y)

2. Cofunctions *cannot* be called normally -- they do
    not have a __call__ method, only a __cocall__ method.

So if you try to call a cofunction without using cocall,
you get an exception. If you try to call an ordinary function
using cocall, you get an exception. If you try to use cocall
but forget to declare the function with codef, you get an
exception (because cocall would only be allowed inside a
cofunction).

To start things off, a builtin function could be provided
such as

   def costart(f, *args, **kwds):
     return f.__cocall__(*args, **kwds)

which would return an object that a coroutine driver could
treat as a generator.

I think this scheme would go a long way towards satisfying
Antoine's desire to conceptually separate generators and
coroutines. It would also enable an implementation to
implement coroutines using a different mechanism from
generators if it wanted to.
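
To sketch what point 2 implies (illustrative only), a cofunction could
be represented by an object that has a __cocall__ method but
deliberately no __call__ method:

    class cofunction_object:
        def __init__(self, genfunc):
            self._genfunc = genfunc
        # no __call__ here, so an ordinary call raises TypeError
        def __cocall__(self, *args, **kwds):
            # like a generator function's __call__: return the
            # generator-iterator without starting it
            return self._genfunc(*args, **kwds)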

-- 
Greg



From ghazel at gmail.com  Tue Aug  3 08:16:58 2010
From: ghazel at gmail.com (ghazel at gmail.com)
Date: Mon, 2 Aug 2010 23:16:58 -0700
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <4C57B47A.8060209@canterbury.ac.nz>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com> 
	<4C57B47A.8060209@canterbury.ac.nz>
Message-ID: <AANLkTinB0T0gCk+Z7aXAtp8U4qnm2ezHvaiv4D-mY57f@mail.gmail.com>

On Mon, Aug 2, 2010 at 11:17 PM, Gregory Ewing
<greg.ewing at canterbury.ac.nz> wrote:
> Would you be interested in a system which requires marking
> calls to coroutines, but tells you immediately when you
> have forgotten to mark such a call?
>
> It might work something like this:
>
> 1. In a cofunction, a call to another cofunction must
>    be marked with 'cocall', e.g.
>
>    z = cocall f(x, y)
>
> 2. Cofunctions *cannot* be called normally -- they do
>    not have a __call__ method, only a __cocall__ method.
>
> So if you try to call a cofunction without using cocall,
> you get an exception. If you try to call an ordinary function
> using cocall, you get an exception. If you try to use cocall
> but forget to declare the function with codef, you get an
> exception (because cocall would only be allowed inside a
> cofunction).

I like this idea.

-Greg


From guido at python.org  Tue Aug  3 16:11:59 2010
From: guido at python.org (Guido van Rossum)
Date: Tue, 3 Aug 2010 07:11:59 -0700
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <4C57B47A.8060209@canterbury.ac.nz>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com> 
	<4C57B47A.8060209@canterbury.ac.nz>
Message-ID: <AANLkTi=gzBo0YN_4KJvgmy+PbY7MVpdnYF_eyb6rC3VG@mail.gmail.com>

That could be done with a decorator, right? The decorator wraps a
function in something non-callable and cocall is a function that
unwraps it and calls it.
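
Something along those lines, presumably (rough sketch; the names here
are made up):

    class _Coroutine:
        # deliberately not callable: no __call__ method
        def __init__(self, func):
            self._func = func

    def coroutine(func):
        # decorator marking 'func' as a coroutine
        return _Coroutine(func)

    def cocall(wrapped, *args, **kwds):
        # unwrap and call; the result is a generator to 'yield from'
        return wrapped._func(*args, **kwds)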




-- 
--Guido van Rossum (python.org/~guido)


From alexander.belopolsky at gmail.com  Tue Aug  3 17:49:45 2010
From: alexander.belopolsky at gmail.com (Alexander Belopolsky)
Date: Tue, 3 Aug 2010 11:49:45 -0400
Subject: [Python-ideas] Add aware local time support to datetime module
Message-ID: <AANLkTinwyUr0cwQJFz-A=ojQcDadCcwLgqLkqhgTA46a@mail.gmail.com>

With the addition of the fixed offset timezone class and the
timezone.utc instance [0], it is easy to get UTC time as an aware
datetime instance:

>>> datetime.now(timezone.utc)
datetime.datetime(2010, 8, 3, 14, 16, 10, 670308, tzinfo=datetime.timezone.utc)

However, if you want to keep time in your local timezone, getting an
aware datetime is almost a catch 22.  If you know your timezone UTC
offset, you can do

>>> EDT = timezone(timedelta(hours=-4))
>>> datetime.now(EDT)
datetime.datetime(2010, 8, 3, 10, 20, 23, 769537,
tzinfo=datetime.timezone(datetime.timedelta(-1, 72000)))

but the problem is that there is no obvious or even correct way to
find local timezone UTC offset. [1]

In a comment on issue #5094 ("datetime lacks concrete tzinfo
implementation for UTC"), I proposed to address this problem with a
localtime([t]) function that would return the current time (or the
time corresponding to the optional datetime argument) as an aware
datetime object carrying local timezone information in a tzinfo
attribute set to an appropriate timezone instance.  This solution is
attractive in its simplicity, but there are several problems:

1. An aware datetime cannot carry all information that system
localtime() supplies in a time tuple.  Specifically, the is_dst flag
is lost.  This is not a problem for most applications as long as
timezone UTC offset and timezone name are available, but may be an
issue when interoperability with the time module is required.

2.  Datetime's tzinfo interface was designed with the idea that
<2010-11-06 12:00 EDT> + <1 day> =  <2010-11-07 12:00 EST>, not
<2010-11-07 12:00 EDT>. In other words, if I have lunch with someone
at noon (12:00 EDT) on Saturday, the day before the first Sunday in
November, and want to meet again "at the same time tomorrow", I mean
12:00 EST, not 24 hours later.  With localtime() returning a datetime
with tzinfo set to a fixed offset timezone, however, localtime() +
timedelta(1) will mean exactly 24 hours later and the result will be
expressed in a timezone that is unusual for the given location.

An alternative approach is the one recommended in the python manual.
[3]  One could implement a LocalTimezone class with utcoffset(),
tzname() and dst() extracting information from system mktime and
localtime calls.  This approach has its own shortcomings:

1. While it is natural to expect automatic timezone adjustments when
adding an integral number of days to datetimes in a business setting,
it is not as clear-cut when adding hours or minutes.

2. The tzinfo.utcoffset() interface that expects *standard* local time
as an argument is confusing to many users.  Even the "official"
example in the python manual gets it wrong. [4]

3. datetime(..., tzinfo=LocalTimezone()) is ambiguous during the
"repeated hour" when local clock is set back in DST to standard time
transition.

As far as I can tell, the only way to resolve the last problem is to
add an is_dst flag to the datetime object, which would also be the
only way to achieve full interoperability between datetime objects and
time tuples. [5]

The traditional answer to calls for improvement of timezone support in
the datetime module has been: "this is up to 3rd parties to implement."
Unfortunately, stdlib is asking 3rd parties to implement an impossible
interface without giving access to the necessary data.  The
impossibility comes from the requirement that the dst() method should
find out whether a local time represents DST or standard time, while
there is an hour each year when the same local time can be either.
The missing data is the history of changes to the system UTC offset;
the time module only gives access to the current UTC offset.

My preference is to implement the first alternative - localtime([t])
returning an aware datetime with a fixed offset timezone.  This will
address Python's current lack of access to the universally available
system facilities that are necessary to implement any kind of aware
local time support.
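
A rough sketch of such a localtime(), built only from what the time
module already exposes (for brevity it takes a POSIX timestamp rather
than a datetime, and it glosses over the is_dst issues discussed
above):

    import time
    from datetime import datetime, timedelta, timezone

    def localtime(t=None):
        if t is None:
            t = time.time()
        lt = time.localtime(t)
        # UTC offset in effect at that moment, DST-aware where the
        # platform provides tm_gmtoff
        if hasattr(lt, 'tm_gmtoff'):
            offset = lt.tm_gmtoff
        else:
            offset = -time.altzone if lt.tm_isdst > 0 else -time.timezone
        tz = timezone(timedelta(seconds=offset),
                      time.tzname[lt.tm_isdst > 0])
        return datetime.fromtimestamp(t, tz)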

[0] http://docs.python.org/dev/library/datetime.html#timezone-objects
[1] http://bugs.python.org/issue1647654
[2] http://bugs.python.org/issue5094#msg106997
[3] http://docs.python.org/library/datetime.html#tzinfo-objects
[4] http://bugs.python.org/issue9063
[5] http://bugs.python.org/issue9004


From greg.ewing at canterbury.ac.nz  Wed Aug  4 01:02:15 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 04 Aug 2010 11:02:15 +1200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <AANLkTi=gzBo0YN_4KJvgmy+PbY7MVpdnYF_eyb6rC3VG@mail.gmail.com>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com>
	<4C57B47A.8060209@canterbury.ac.nz>
	<AANLkTi=gzBo0YN_4KJvgmy+PbY7MVpdnYF_eyb6rC3VG@mail.gmail.com>
Message-ID: <4C589FF7.6080508@canterbury.ac.nz>

Guido van Rossum wrote:
> That could be done with a decorator, right? The decorator wraps a
> function in something non-callable and cocall is a function that
> unwraps it and calls it.

That would cover part of it, the part about not being able
to make a normal call to a cofunction. But it wouldn't
enforce only being able to use cocall inside a cofunction,
or remove the need for a dummy yield in a cofunction that
doesn't otherwise have any.

Also, if cocall is just a function, you would still have to
use 'yield from' on the result, so all your coroutine calls
would end up looking like

    yield from cocall(f, args)

which makes them even more verbose and thumbstickingoutish.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Thu Aug  5 13:51:29 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 05 Aug 2010 23:51:29 +1200
Subject: [Python-ideas] Draft PEP on Cofunctions - Rev 1
Message-ID: <4C5AA5C1.1060605@canterbury.ac.nz>

Comments are invited on the following draft PEP.

PEP: XXX
Title: Cofunctions
Version: $Revision$
Last-Modified: $Date$
Author: Gregory Ewing <greg.ewing at canterbury.ac.nz>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 13-Feb-2009
Python-Version: 3.x
Post-History:


Abstract
========

A syntax is proposed for defining and calling a special type of generator
called a 'cofunction'.  It is designed to provide a streamlined way of
writing generator-based coroutines, and allow the early detection of
certain kinds of error that are easily made when writing such code, which
otherwise tend to cause hard-to-diagnose symptoms.

This proposal builds on the 'yield from' mechanism described in PEP 380,
and describes some of the semantics of cofunctions in terms of it. However,
it would be possible to define and implement cofunctions independently of
PEP 380 if so desired.


Proposal
========

Cofunction definitions
----------------------

A new keyword ``codef`` is introduced which is used in place of ``def`` to
define a cofunction. A cofunction is a special kind of generator having the
following characteristics:

1. A cofunction is always a generator, even if it does not contain any
    ``yield`` or ``yield from`` expressions.

2. A cofunction cannot be called the same way as an ordinary function. An
    exception is raised if an ordinary call to a cofunction is attempted.

Cocalls
-------

Calls from one cofunction to another are made by marking the call with
a new keyword ``cocall``. The expression

::

     cocall f(*args, **kwds)

is semantically equivalent to

::

     yield from f.__cocall__(*args, **kwds)

except that the object returned by __cocall__ is expected to be an
iterator, so the step of calling iter() on it is skipped.

The full syntax of a cocall expression is expressed by the following
grammar lines:

::

     atom: cocall | <existing alternatives for atom>
     cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
     cotrailer: '[' subscriptlist ']' | '.' NAME

The ``cocall`` keyword is syntactically valid only inside a cofunction.
A SyntaxError will result if it is used in any other context.

Objects which implement __cocall__ are expected to return an object
obeying the iterator protocol. Cofunctions respond to __cocall__ the
same way as ordinary generator functions respond to __call__, i.e. by
returning a generator-iterator.

Certain objects that wrap other callable objects, notably bound methods,
will be given __cocall__ implementations that delegate to the underlying
object. Other candidates for this treatment include staticmethods and
classmethods.

New builtins and attributes
---------------------------

To facilitate interfacing cofunctions with non-coroutine code, there will
be a built-in function ``costart`` whose definition is equivalent to

::

     def costart(obj, *args, **kwds):
         return obj.__cocall__(*args, **kwds)

It is left unspecified for now whether a cofunction is a distinct type
of object or, like a generator function, is simply a specially-marked
function instance. If the latter, it is suggested that a read-only attribute
be provided to allow testing whether a given function object is a
cofunction.


Rationale
=========

The ``yield from`` syntax is reasonably self-explanatory when used for
the purpose of delegating part of the work of a generator to another
function. It can also be used to good effect in the implementation of
generator-based coroutines, but it reads somewhat awkwardly when used
for that purpose, and tends to obscure the true intent of the code.

Furthermore, using generators as coroutines is somewhat error-prone.
If one forgets to use ``yield from`` when it should have been used,
or uses it when it shouldn't have, the symptoms that result can be
obscure and confusing.

Finally, sometimes there is a need for a function to be a coroutine
even though it does not yield anything, and in these cases it is
necessary to resort to kludges such as ``if 0: yield`` to force it
to be a generator.

The ``codef`` and ``cocall`` constructs address the first issue by
making the syntax directly reflect the intent, that is, that the
function forms part of a coroutine.

The second issue is addressed
by making it impossible to mix coroutine and non-coroutine code in
ways that don't make sense. If the rules are violated, an exception
is raised that points out exactly what and where the problem is.

Lastly, the need for dummy yields is eliminated by making the
form of definition determine whether the function is a coroutine,
rather than what it contains.


Copyright
=========

This document has been placed in the public domain.



..
    Local Variables:
    mode: indented-text
    indent-tabs-mode: nil
    sentence-end-double-space: t
    fill-column: 70
    coding: utf-8
    End:


From cs at zip.com.au  Fri Aug  6 00:31:28 2010
From: cs at zip.com.au (Cameron Simpson)
Date: Fri, 6 Aug 2010 08:31:28 +1000
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <AANLkTinB0T0gCk+Z7aXAtp8U4qnm2ezHvaiv4D-mY57f@mail.gmail.com>
References: <AANLkTinB0T0gCk+Z7aXAtp8U4qnm2ezHvaiv4D-mY57f@mail.gmail.com>
Message-ID: <20100805223128.GA2910@cskk.homeip.net>

On 02Aug2010 23:16, ghazel at gmail.com <ghazel at gmail.com> wrote:
| On Mon, Aug 2, 2010 at 11:17 PM, Gregory Ewing
| <greg.ewing at canterbury.ac.nz> wrote:
| > Would you be interested in a system which requires marking
| > calls to coroutines, but tells you immediately when you
| > have forgotten to mark such a call?
| >
| > It might work something like this:
| >
| > 1. In a cofunction, a call to another cofunction must
| >    be marked with 'cocall', e.g.
| >
| >    z = cocall f(x, y)
| >
| > 2. Cofunctions *cannot* be called normally -- they do
| >    not have a __call__ method, only a __cocall__ method.
| >
| > So if you try to call a cofunction without using cocall,
| > you get an exception. If you try to call an ordinary function
| > using cocall, you get an exception. If you try to use cocall
| > but forget to declare the function with codef, you get an
| > exception (because cocall would only be allowed inside a
| > cofunction).
| 
| I like this idea.

Having just caught up with this thread, my first thought on reading the
opening post was that it could do with (2), above. So a big +1 from me.
It avoids misuse by failing early, and makes the failure reason obvious.

Cheers,
-- 
Cameron Simpson <cs at zip.com.au> DoD#743
http://www.cskk.ezoshosting.com/cs/

... you could spend *all day* customizing the title bar.  Believe me.  I
speak from experience.  - Matt Welsh


From jackdied at gmail.com  Fri Aug  6 02:15:05 2010
From: jackdied at gmail.com (Jack Diederich)
Date: Thu, 5 Aug 2010 20:15:05 -0400
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com>
Message-ID: <AANLkTin3CyXrjSm6TO_mGuiY8Cc9Y1npTnCU2YyGbNji@mail.gmail.com>

On Mon, Aug 2, 2010 at 12:39 PM, Guido van Rossum <guido at python.org> wrote:
> [snip]
>
> OTOH I do appreciate the desire to reduce the number of places where
> one has to sprinkle 'yield' over one's code, and I've had a number of
> situations recently where I had something that logically needed to be
> a coroutine (to match some API) but just happened not to need any
> yields, and inevitably my coding went something like (1) forget to put
> a yield in, (2) frantically debug, (3) slap forehead, (4) add "if 0:
> yield" to the function, (5) continue with another instance of this,
> (6) lose sleep over the best place to spell the dummy yield and where
> to put it. At the same time I don't want to have to mark all my
> coroutines with a decorator, like Monocle requires (though maybe I
> should).

We already use decorators to change interfaces, so I am +1 on reusing
what people already know.  contextlib.contextmanager is shorthand for
"the following function is a contextmanager"

from contextlib import contextmanager
@contextmanager
def myfunc(...): ...

So the generator equivalent would be

from somelib import generator
@generator
def myfunc(...): ...

Where generator is as simple as

def generator(func):
    def inner(*args, **opts):
        if False: yield   # forces inner to be a generator
        return func(*args, **opts)   # relies on PEP 380 allowing a return value here
    return inner

But the bulk of Greg's proposal is to transform calls made inside a
cofunction in one of two ways.
In the original: to make this
    return func(*args, **opts)
equivalent to this
    yield from func.__cocall__(*args, **opts)  # func must be defined with 'codef' or support __cocall__

Or in his second suggested form to make this
    cocall func(*args, **opts)
equivalent to this
    yield from func.__cocall__(*args, **opts) #  func must support __cocall__

I'm not sure if the "codef" keyword is included in the second form.

I'm -1 on the first proposal because it buries that the calling
function is a generator.  "yield from" (which it would be a synonym or
replacement for) lets you know the called function is a generator
without having to read the body of the called function.

I'm -1 on the 2nd form (explicit "cocall") because it is a synonym for
"yield from" and "yield from" fits my brain better because reads as
"this is yield-like but slightly different."

-Jack


From greg.ewing at canterbury.ac.nz  Fri Aug  6 09:38:00 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 06 Aug 2010 19:38:00 +1200
Subject: [Python-ideas] Cofunctions - an enhancement to yield-from
In-Reply-To: <AANLkTin3CyXrjSm6TO_mGuiY8Cc9Y1npTnCU2YyGbNji@mail.gmail.com>
References: <4C5547F3.6010902@canterbury.ac.nz>
	<AANLkTikDEPkX-iLXqUZRAkBhpXokgkz2LSgXYubaUg_E@mail.gmail.com>
	<AANLkTin3CyXrjSm6TO_mGuiY8Cc9Y1npTnCU2YyGbNji@mail.gmail.com>
Message-ID: <4C5BBBD8.4080805@canterbury.ac.nz>

Jack Diederich wrote:

> Or in his second suggested form to make this
>     cocall func(*args, **opts)
> equivalent to this
>     yield from func.__cocall__(*args, **opts) #  func must support __cocall__
> 
> I'm not sure if the "codef" keyword is included in the second form.

Yes, it is. An important part of it is that 'cocall' would only
be allowed inside a function defined with 'codef'. Together
with the other restrictions, this makes it impossible to mix
coroutine and non-coroutine code in invalid ways. I don't
think it's possible to get that using yield-from and decorators
(at least not without a lot of inefficient hackery).

-- 
Greg


From greg.ewing at canterbury.ac.nz  Sat Aug  7 09:12:25 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 07 Aug 2010 19:12:25 +1200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
Message-ID: <4C5D0759.30606@canterbury.ac.nz>

I've been doing some more hacking, and I now have a
working implementation of cofunctions, which I'll
upload somewhere soon.

I have also translated my yield-from examples to
use cofunctions. In the course of doing this, the
additional restrictions that cofunctions impose have
already proved their worth -- I forgot a cocall, and
it clearly told me so and pointed out exactly where
it had to go!

-- 
Greg


From greg.ewing at canterbury.ac.nz  Sat Aug  7 12:11:15 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 07 Aug 2010 22:11:15 +1200
Subject: [Python-ideas] Cofunctions - Prototype Implementation
Message-ID: <4C5D3143.2020806@canterbury.ac.nz>

I've posted my prototype implementation of cofunctions
here:

http://www.cosc.canterbury.ac.nz/greg.ewing/python/generators/cofunctions.html

-- 
Greg


From cmjohnson.mailinglist at gmail.com  Sat Aug  7 12:05:01 2010
From: cmjohnson.mailinglist at gmail.com (Carl M. Johnson)
Date: Sat, 7 Aug 2010 00:05:01 -1000
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <4C5D0759.30606@canterbury.ac.nz>
References: <4C5D0759.30606@canterbury.ac.nz>
Message-ID: <AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>

On Fri, Aug 6, 2010 at 9:12 PM, Greg Ewing wrote:

> I've been doing some more hacking, and I now have a
> working implementation of cofunctions, which I'll
> upload somewhere soon.
>
> I have also translated my yield-from examples to
> use cofunctions. In the course of doing this, the
> additional restrictions that cofunctions impose have
> already proved their worth -- I forgot a cocall, and
> it clearly told me so and pointed out exactly where
> it had to go!

This is good to hear. Without being too critical, I feel like saying
that so far I've been following the cofunctions threads and waiting
for compelling use cases. So, I'll be happy to see what you have
there. It seems like the main use case that comes to mind for
cofunctions is, essentially, quick and dirty cooperative multitasking.
Of course, as we all know on the OS side of things, cooperative
multitasking has been more or less phased out (I don't know about the
embedded space. Probably it's hanging on there for real-time purposes.)
in favor of preemptive multitasking. But Python already has preemptive
multitasking: it's called threads. Or, if one prefers, there is
multiprocessing. Of course, those are relatively heavy-weight, but
then again, so is adding new keywords and syntax. So, other use cases
would be appreciated.

> I forgot a cocall, and it clearly told me so and pointed out exactly where it had to go!

Hmm. I think this can be pushed even farther. For example, we could
use mandatory function annotations to mark what classes a function is
capable of receiving and we could mark the classes of variable names
using some new syntax. Then when, for example, you accidentally try to
send a string to sum, you could be told at compile time "TypeError." I
propose we call this "Static TypeErroring". ;-)

OTOH, "Explicit is better than implicit." So, maybe the explicit
syntax for cocalling is worth the pain. Again, I'd like to see more
motivating examples.

Cautiously-optimistically-yrs,

-- Carl Johnson


From ghazel at gmail.com  Sat Aug  7 13:02:58 2010
From: ghazel at gmail.com (ghazel at gmail.com)
Date: Sat, 7 Aug 2010 04:02:58 -0700
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
Message-ID: <AANLkTimqz_kbLUzcCyS2CxUk6ox5MMxMAVasO13NRUad@mail.gmail.com>

On Sat, Aug 7, 2010 at 3:05 AM, Carl M. Johnson
<cmjohnson.mailinglist at gmail.com> wrote:
>
> On Fri, Aug 6, 2010 at 9:12 PM, Greg Ewing wrote:
>
> > I've been doing some more hacking, and I now have a
> > working implementation of cofunctions, which I'll
> > upload somewhere soon.
> >
> > I have also translated my yield-from examples to
> > use cofunctions. In the course of doing this, the
> > additional restrictions that cofunctions impose have
> > already proved their worth -- I forgot a cocall, and
> > it clearly told me so and pointed out exactly where
> > it had to go!
>
> This is good to hear. Without being too critical, I feel like saying
> that so far I've been following the cofunctions threads and waiting
> for compelling use cases. So, I'll be happy to see what you have
> there. It seems like the main use case that comes to mind for
> cofunctions is, essentially, quick and dirty cooperative multitasking.
> Of course, as we all know on the OS side of things, cooperative
> multitasking has been more or less phased out (I don't know about the
> embeded space. Probably it's hanging on there for real time purposes.)
> in favor of preemptive multitasking. But Python already has preemptive
> multitasking: it's called threads. Or, if one prefers, there is
> multiprocessing. Of course, those are relatively heavy-weight, but
> then again, so is adding new keywords and syntax. So, other use cases
> would be appreciated.

I am excited about the cofunctions PEP and features, since it will
greatly improve the ability for monocle to have cooperative async
tasks. An extension in the yield-from PEP allows us to return normal
values without confusing semantics, cocall and codef allow us to avoid
several types of errors, and in general the idea of a cofunction is
promoted to a first-class tool, instead of an overloaded trick on top
of generators.

The need for these monocle chains is a much larger topic, but threads
and processes introduce locking and state sharing complications which
are much more involved than simply pausing a function until data is
received. Not to mention the GIL. Cooperative multitasking in the OS
was set aside, but async IO is alive and well. Allowing programmers to
use async IO with a linear blocking look-alike syntax seems to be an
important compromise which makes it easier to write efficient
concurrent networking code.

-Greg


From guido at python.org  Sat Aug  7 16:51:04 2010
From: guido at python.org (Guido van Rossum)
Date: Sat, 7 Aug 2010 07:51:04 -0700
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
Message-ID: <AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>

On Sat, Aug 7, 2010 at 3:05 AM, Carl M. Johnson
<cmjohnson.mailinglist at gmail.com> wrote:
> Of course, as we all know on the OS side of things, cooperative
> multitasking has been more or less phased out [...]

I don't think you can see this as reflecting badly on cooperative
multitasking. Rather, it is now understood better and has moved
entirely to user space, leaving the OS to deal with preemptive
threads, which cannot work well without OS support. As an example of
cooperative multitasking being alive and well, see Stackless,
Greenlets, and, in fact, coroutines built out of generators (possible
in Python 2.5 and later).

-- 
--Guido van Rossum (python.org/~guido)


From ncoghlan at gmail.com  Sun Aug  8 02:20:54 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 8 Aug 2010 10:20:54 +1000
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
Message-ID: <AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>

On Sun, Aug 8, 2010 at 12:51 AM, Guido van Rossum <guido at python.org> wrote:
> On Sat, Aug 7, 2010 at 3:05 AM, Carl M. Johnson
> <cmjohnson.mailinglist at gmail.com> wrote:
>> Of course, as we all know on the OS side of things, cooperative
>> multitasking has been more or less phased out [...]
>
> I don't think you can see this as reflecting badly on cooperative
> multitasking. Rather, it is now understood better and has moved
> entirely to user space, leaving the OS to deal with preemptive
> threads, which cannot work well without OS support. As an example of
> cooperative multitasking being alive and well, see Stackless,
> Greenlets, and, in fact, coroutines built out of generators (possible
> in Python 2.5 and later).

Even simpler: GUI event loops are fundamentally about cooperative
multi-tasking. To me, the cofunction idea is really about making it
easier to write event loop code.

I think PEP 380 works well on its own, but will work better when
paired with a separate cofunctions PEP (similar to how PEP 342 and PEP
343 were best considered as a pair of co-proposals that went together,
even though they were technically independent of each other).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia


From greg.ewing at canterbury.ac.nz  Sun Aug  8 04:00:49 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 08 Aug 2010 14:00:49 +1200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
Message-ID: <4C5E0FD1.8030200@canterbury.ac.nz>

Carl M. Johnson wrote:

> It seems like the main use case that comes to mind for
> cofunctions is, essentially, quick and dirty cooperative multitasking.
> Of course, as we all know on the OS side of things, cooperative
> multitasking has been more or less phased out ...
> in favor of preemptive multitasking. 

They're meant for different things. Preemptive multitasking
is for concurrency -- you have N cores and they can all
physically run at the same time. Cooperative multitasking
is a program structuring technique, for when it helps to
think in terms of several separate state machines interacting
with each other.

The existence of widely-used frameworks such as Twisted
attests that there are very real use cases for non-preemptive
multitasking.

Recently I've come across a discrete-event simulation package
called SimPy that does process-oriented simulation using
the generator/trampoline technique. This seems like an ideal
use case for cofunctions, and I'm going to look into creating
a cofunction-based version of it.

> Hmm. I think this can be pushed even farther. ... 
 > Then when, for example, you accidentally try to
> send a string to sum, you could be told at compile time "TypeError." I
> propose we call this "Static TypeErroring". ;-)

Well, I think you're indulging in extrapolation ad absurdum
here...

My initial proposal for cofunctions was considerably more
dynamic, but it got criticised for not being explicit enough.
It also wasn't capable of detecting and clearly diagnosing
all of the errors that the current one can.

> OTOH, "Explicit is better than implicit." So, maybe the explicit
> syntax for cocalling is worth the pain.

When you consider that the alternative to writing 'cocall'
in certain places is to write 'yield from' or some equivalent
thing in all the same places, I'd say the cofunction way is
actually *less* painful.

"Doctor, it hurts when I do this."
"Don't do that, then."
"But if I don't, it hurts twice as badly!"

-- 
Greg



From guido at python.org  Sun Aug  8 04:16:03 2010
From: guido at python.org (Guido van Rossum)
Date: Sat, 7 Aug 2010 19:16:03 -0700
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com> 
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com> 
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
Message-ID: <AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>

On Sat, Aug 7, 2010 at 5:20 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On Sun, Aug 8, 2010 at 12:51 AM, Guido van Rossum <guido at python.org> wrote:
>> On Sat, Aug 7, 2010 at 3:05 AM, Carl M. Johnson
>> <cmjohnson.mailinglist at gmail.com> wrote:
>>> Of course, as we all know on the OS side of things, cooperative
>>> multitasking has been more or less phased out [...]
>>
>> I don't think you can see this as reflecting badly on cooperative
>> multitasking. Rather, it is now understood better and has moved
>> entirely to user space, leaving the OS to deal with preemptive
>> threads, which cannot work well without OS support. As an example of
>> cooperative multitasking being alive and well, see Stackless,
>> Greenlets, and, in fact, coroutines built out of generators (possible
>> in Python 2.5 and later).
>
> Even simpler: GUI event loops are fundamentally about cooperative
> multi-tasking. To me, the cofunction idea is really about making it
> easier to write event loop code.

I actually added a reference to Twisted to my email, and then took it
out because (while I agree to a point) the programming model with
callbacks is so different that the word "task" doesn't really cover it
for me. But it is indeed all about total app control over where to
suspend execution in favor of another activity.

> I think PEP 380 works well on its own, but will work better when
> paired with a separate cofunctions PEP

We'll see. I still cannot get my head around why cofunctions are so
great. (Also the name sucks for me.)

> (similar to how PEP 342 and PEP
> 343 were best considered as a pair of co-proposals that went together,
> even though they were technically independent of each other).

That's a rather one-sided relationship though. While PEP 343 deeply
relies on the improvements to yield (especially close() and the new
exception semantics), the main subject of PEP 342 (coroutines)
couldn't care less about with-statements (unless there's a pattern I'm
missing).

-- 
--Guido van Rossum (python.org/~guido)


From greg.ewing at canterbury.ac.nz  Sun Aug  8 04:47:40 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 08 Aug 2010 14:47:40 +1200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
Message-ID: <4C5E1ACC.8020305@canterbury.ac.nz>

Guido van Rossum wrote:

> We'll see. I still cannot get my head around why cofunctions are so
> great. (Also the name sucks for me.)

I'm open to suggestions for an alternative name.

-- 
Greg



From rrr at ronadam.com  Sun Aug  8 05:17:51 2010
From: rrr at ronadam.com (Ron Adam)
Date: Sat, 07 Aug 2010 22:17:51 -0500
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <4C5D0759.30606@canterbury.ac.nz>
References: <4C5D0759.30606@canterbury.ac.nz>
Message-ID: <4C5E21DF.6070707@ronadam.com>

On 08/07/2010 02:12 AM, Greg Ewing wrote:
> I've been doing some more hacking, and I now have a
> working implementation of cofunctions, which I'll
> upload somewhere soon.
>
> I have also translated my yield-from examples to
> use cofunctions. In the course of doing this, the
> additional restrictions that cofunctions impose have
> already proved their worth -- I forgot a cocall, and
> it clearly told me so and pointed out exactly where
> it had to go!

Would it be even remotely possible...

... to write a co-function program in a way where it could be switched from 
cooperative multitasking to preemptive multitasking by the use of a single 
flag? (I'd be +10,000 for this.)

If so, it would enable a way to hide a lot of the details of multi-tasking
and multi-processing behind a convenient-to-use API.


No, I haven't thought it through all that far.  But it seems to me it might
be able to work if generators could work in a suspend-before-yield mode
instead of a suspend-after-yield mode.


Ideally pie in the sky,
   Ron

Hey, this is the idea list after all. ;-)


From stefan_ml at behnel.de  Sun Aug  8 08:50:31 2010
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Sun, 08 Aug 2010 08:50:31 +0200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <4C5E21DF.6070707@ronadam.com>
References: <4C5D0759.30606@canterbury.ac.nz> <4C5E21DF.6070707@ronadam.com>
Message-ID: <i3lk3n$tnd$1@dough.gmane.org>

Ron Adam, 08.08.2010 05:17:
> Would it be even remotely possible...
>
> ... to write a co-function program in a way where it could be switched
> from cooperative multitasking to preemptive multitasking by the use of a
> single flag? (I'd be +10,000 for this.)

I wouldn't.


> If so, it would enable a way to hide a lot of details of multi-tasking
> and multi-processing in a convenient to use api.

Totally not. Cooperative multitasking is about predictable interaction 
between parts of a program. Preemptive multitasking (in the sense of 
threading) is about non-deterministic concurrency. Except for some very 
special cases, there is no way you can take a piece of code that uses 
cooperative multitasking, switch it over to run concurrently, and still 
have it execute safely and correctly.

I may end up liking the idea of using yield statements for thread 
synchronisation points, though.

Stefan



From brtzsnr at gmail.com  Mon Aug  9 19:16:31 2010
From: brtzsnr at gmail.com (=?UTF-8?Q?Alexandru_Mo=C8=99oi?=)
Date: Mon, 9 Aug 2010 13:16:31 -0400
Subject: [Python-ideas] iterator length
Message-ID: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com>

Hello,

Sometimes it's useful to get the number of elements yielded by an
iterator. For example (if ilen is the name of the function):

def pi(n):
    return ilen(e for e in xrange(n) if isprime(e))

def count_pred(pred, iterator):
    return ilen(itertools.ifilter(pred, iterator))

A few solutions were discussed here
http://stackoverflow.com/questions/3393431/how-to-counting-not-0-elements-in-an-iterable
with the best two below:

1) sum(1 for e in iterator)
2) len(list(iterator))

First solution is slow, the second solution uses O(N) extra memory.

I propose the addition of a new function ilen() which is functionally
equivalent to:

def ilen(iterator):
    return sum(1 for e in iterator)
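
A constant-memory version that keeps the loop in C is also possible,
for example (just a sketch, Python 2.6+):

from itertools import count, izip
from collections import deque

def ilen(iterator):
    counter = count()
    # Consume the iterator, advancing the counter once per element;
    # the zero-length deque discards the pairs immediately.
    deque(izip(iterator, counter), maxlen=0)
    return next(counter)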

This function should be different from len() because its time
complexity is O(N) (most people assume that len() takes O(1)) and it
consumes the iterator.

Regards,

-- 
Alexandru Moșoi
http://www.alexandru.mosoi.ro/


From p.f.moore at gmail.com  Mon Aug  9 21:52:39 2010
From: p.f.moore at gmail.com (Paul Moore)
Date: Mon, 9 Aug 2010 20:52:39 +0100
Subject: [Python-ideas] iterator length
In-Reply-To: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com>
References: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com>
Message-ID: <AANLkTimMUQ2butzBp=mUnc9NMWUiu0Fh5jsaux1H=jY9@mail.gmail.com>

On 9 August 2010 18:16, Alexandru Moșoi <brtzsnr at gmail.com> wrote:
> 1) sum(1 for e in iterator)
> 2) len(list(iterator))
>
> First solution is slow, the second solution uses O(N) extra memory.
>
> I propose the addition of a new function ilen() which is functionally
> equivalent to:
>
> def ilen(iterator):
> ?return sum(1 for e in iterator)

You say that this solution "is slow" and then you propose it? I'm confused.

Besides which, as you define it, it exhausts the iterator, which makes
it useless. It may be useful for an *iterable*, but most of them
support len in any case.

> This function should be different from len() because it's time
> complexity is O(N) (most people assume that len() takes O(1)) and it
> consumes the iterator.

Precisely. So how is it useful?

If you could show some real code that uses your ilen function, that
would help clarify. But it still won't explain why the function should
be built in rather than just defined by your code where it's needed -
you'll have to have some very common and compelling use cases to argue
that.

Paul.


From brtzsnr at gmail.com  Mon Aug  9 22:17:23 2010
From: brtzsnr at gmail.com (=?UTF-8?Q?Alexandru_Mo=C8=99oi?=)
Date: Mon, 9 Aug 2010 16:17:23 -0400
Subject: [Python-ideas] iterator length
In-Reply-To: <AANLkTimMUQ2butzBp=mUnc9NMWUiu0Fh5jsaux1H=jY9@mail.gmail.com>
References: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com>
	<AANLkTimMUQ2butzBp=mUnc9NMWUiu0Fh5jsaux1H=jY9@mail.gmail.com>
Message-ID: <AANLkTi=94TQZc3HcF0ev_KhE+suW2o3+cOuynF521pxF@mail.gmail.com>

2010/8/9 Paul Moore <p.f.moore at gmail.com>:
> If you could show some real code that uses your ilen function, that
> would help clarify. But it still won't explain why the function should
> be built in rather than just defined by your code where it's needed -
> you'll have to have some very common and compelling use cases to argue
> that.

My requirement was to count the non-zero elements from a list, like this:
   sum(1 for e in iterator if not e)

What I'm really looking for is the number of elements in a list comprehension:
    len(list(e for e in iterator if not e))

but this is not generally useful nor optimal in terms of memory requirements.

My idea was to implement the above with the aid of itertools.ifilter:
   ilen(itertools.ifilter(pred, iterable))

if pred is None, this would cover my use case.

Since I first posted this I have learned that ilen (or something similar)
was rejected before due to similar concerns: it consumes the iterator,
and it's not a real optimization.

How about: count(pred, iterable) which returns the same value as
len(filter(pred, iterable))?
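
Just to sketch what I mean (similar in spirit to the "quantify" recipe
in the itertools docs):

def count(pred, iterable):
    if pred is None:
        pred = bool
    return sum(1 for e in iterable if pred(e))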


-- 
Alexandru Moșoi
http://www.alexandru.mosoi.ro/


From python at mrabarnett.plus.com  Mon Aug  9 22:22:51 2010
From: python at mrabarnett.plus.com (MRAB)
Date: Mon, 09 Aug 2010 21:22:51 +0100
Subject: [Python-ideas] iterator length
In-Reply-To: <AANLkTimMUQ2butzBp=mUnc9NMWUiu0Fh5jsaux1H=jY9@mail.gmail.com>
References: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com>
	<AANLkTimMUQ2butzBp=mUnc9NMWUiu0Fh5jsaux1H=jY9@mail.gmail.com>
Message-ID: <4C60639B.3040007@mrabarnett.plus.com>

Paul Moore wrote:
> On 9 August 2010 18:16, Alexandru Moșoi <brtzsnr at gmail.com> wrote:
>> 1) sum(1 for e in iterator)
>> 2) len(list(iterator))
>>
>> First solution is slow, the second solution uses O(N) extra memory.
>>
>> I propose the addition of a new function ilen() which is functionally
>> equivalent to:
>>
>> def ilen(iterator):
>>  return sum(1 for e in iterator)
> 
> You say that this solution "is slow" and then you propose it? I'm confused.
> 
That's just to describe its behaviour. An actual implementation wouldn't
necessarily do it that way.

> Besides which, as you define it, it exhausts the iterator, which makes
> it useless. It may be useful for an *iterable*, but most of them
> support len in any case.
> 
>> This function should be different from len() because it's time
>> complexity is O(N) (most people assume that len() takes O(1)) and it
>> consumes the iterator.
> 
> Precisely. So how is it useful?
> 
> If you could show some real code that uses your ilen function, that
> would help clarify. But it still won't explain why the function should
> be built in rather than just defined by your code where it's needed -
> you'll have to have some very common and compelling use cases to argue
> that.
> 
I agree.


From vano at mail.mipt.ru  Mon Aug  9 23:42:12 2010
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Tue, 10 Aug 2010 01:42:12 +0400
Subject: [Python-ideas] iterator length
In-Reply-To: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com>
References: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com>
Message-ID: <211192209.20100810014212@mail.mipt.ru>

> Hello,

> Sometimes it's useful to get the number of elements yield by an
> iterator. For example (if ilen is the name of the function):

A generic iterator is something that inherently, by definition, has
unknown and unpredictable length
(think, for example, about berries you gather in a forest).

There is no guarantee two tries give the same results,
there isn't even a guarantee that it's finite.

You just can't, absolutely, by any means, know its length otherwise
than by exhausting it. If you can, it's not a generic iterator but
something else.

-- 
Regards,
 Ivan                          mailto:vano at mail.mipt.ru



From ncoghlan at gmail.com  Tue Aug 10 04:37:32 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 10 Aug 2010 12:37:32 +1000
Subject: [Python-ideas] iterator length
In-Reply-To: <211192209.20100810014212@mail.mipt.ru>
References: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com>
	<211192209.20100810014212@mail.mipt.ru>
Message-ID: <AANLkTi=uFRHvN0AS+8fqFSS3Hr8mwoJ92-cu-U87U416@mail.gmail.com>

On Tue, Aug 10, 2010 at 7:42 AM, Ivan Pozdeev <vano at mail.mipt.ru> wrote:
>> Hello,
>
>> Sometimes it's useful to get the number of elements yield by an
>> iterator. For example (if ilen is the name of the function):
>
> A generic iterator is something that inherently, by definition, has
> unknown and unpredictable length
> (think, for example, about berries you gather in a forest).
>
> There is no guarantee two tries give the same results,
> there isn't even a guarantee that it's finite.

Indeed - iterating over a completely arbitrary iterator with no exit
criteria is a recipe for infinite loops when someone passes in
something like itertools.count or itertools.repeat.

> You just can't, absolutely, by any means, know its length otherwise
> than by exhausting it. If you can, it's not a generic iterator but
> something else.

Yep, hence the existence of the __length_hint__ API as an internal
optimisation when dealing with iterators that do have some idea of
their length (see
http://mail.python.org/pipermail/python-dev/2009-April/088108.html for
discussion as to why it is undocumented).
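
For instance, an iterator that knows how many items it has left can
advertise that through the hint (a sketch only; the protocol itself is
unofficial):

class countdown(object):
    def __init__(self, n):
        self.n = n
    def __iter__(self):
        return self
    def __next__(self):
        if self.n <= 0:
            raise StopIteration
        self.n -= 1
        return self.n
    next = __next__  # Python 2 spelling
    def __length_hint__(self):
        # CPython consults this internally, e.g. when list() preallocates.
        return self.n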

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia


From greg.ewing at canterbury.ac.nz  Tue Aug 10 09:22:31 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 10 Aug 2010 19:22:31 +1200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
Message-ID: <4C60FE37.2020303@canterbury.ac.nz>

Guido van Rossum wrote:

> We'll see. I still cannot get my head around why cofunctions are so
> great.

I think I can offer some evidence. I've been playing around with
two versions of a discrete-event simulation kernel, one using
yield-from and the other using cofunctions. Here's the main
function of one of my test cases. I've introduced a deliberate
bug -- can you spot it?

def customer(i):
   print("Customer", i, "arriving at", now())
   yield from tables.acquire(1)
   print("Customer", i, "sits down at a table at", now())
   yield from waiters.acquire(1)
   print("Customer", i, "orders spam at", now())
   hold(random.normalvariate(20, 2))
   waiters.release(1)
   print("Customer", i, "gets served spam at", now())
   yield from hold(random.normalvariate(10, 5))
   print("Customer", i, "finished eating at", now())
   tables.release(1)

The bug is that the first call to hold() is missing a 'yield
from' in front of it. If I run this, I don't get any exception --
it produces plausible-looking but incorrect results.

Here's another version, with a very similar bug in a different
place -- this time it's the second call to hold() that's missing
a 'yield from'.

def customer(i):
   print("Customer", i, "arriving at", now())
   yield from tables.acquire(1)
   print("Customer", i, "sits down at a table at", now())
   yield from waiters.acquire(1)
   print("Customer", i, "orders spam at", now())
   yield from hold(random.normalvariate(20, 2))
   waiters.release(1)
   print("Customer", i, "gets served spam at", now())
   hold(random.normalvariate(10, 5))
   print("Customer", i, "finished eating at", now())
   tables.release(1)

If I run this one, I do get an exception, but it's a rather
unhelpful one:

Traceback (most recent call last):
   File "restaurant2.py", line 35, in <module>
     run()
   File 
"/Local/Projects/D/Python/YieldFrom/3.1/YieldFrom-3.1.2/Examples/Simulation/simulation.py", 
line 25, in run
     next(current_process)
   File "restaurant2.py", line 32, in customer
     tables.release(1)
   File 
"/Local/Projects/D/Python/YieldFrom/3.1/YieldFrom-3.1.2/Examples/Simulation/resource.py", 
line 25, in release
     wakeup(self.queue[0])
   File 
"/Local/Projects/D/Python/YieldFrom/3.1/YieldFrom-3.1.2/Examples/Simulation/simulation.py", 
line 34, in wakeup
     schedule(process, now())
   File 
"/Local/Projects/D/Python/YieldFrom/3.1/YieldFrom-3.1.2/Examples/Simulation/simulation.py", 
line 16, in schedule
     heappush(event_queue, (time, process))
TypeError: unorderable types: generator() < generator()

If you examine the traceback, you'll find that *nowhere* does it
mention the location where the error actually is! Instead, a
mysterious error emanates from some place deep inside the scheduler.
I would hate to have to track down a problem like this in a large
program.

Here's the equivalent thing using cofunctions, complete with a
corresponding missing 'cocall':

codef customer(i):
   print("Customer", i, "arriving at", now())
   cocall tables.acquire(1)
   print("Customer", i, "sits down at a table at", now())
   cocall waiters.acquire(1)
   print("Customer", i, "orders spam at", now())
   cocall hold(random.normalvariate(20, 2))
   cocall waiters.release(1)
   print("Customer", i, "gets served spam at", now())
   hold(random.normalvariate(10, 5))
   print("Customer", i, "finished eating at", now())
   cocall tables.release(1)

The exception and traceback resulting from this is crystal clear:

Traceback (most recent call last):
   File "restaurant2.py", line 34, in <module>
     run()
   File 
"/Local/Projects/D/Python/YieldFrom/3.1/Cofunctions-3.1.2/Examples/Simulation/simulation.py", 
line 25, in run
     next(current_process)
   File "restaurant2.py", line 29, in customer
     hold(random.normalvariate(10, 5))
TypeError: Cofunction must be called with cocall or costart

If this doesn't convince you of the utility of cofunctions or
something like them, I don't know what will.

-- 
Greg


From jackdied at gmail.com  Tue Aug 10 14:10:44 2010
From: jackdied at gmail.com (Jack Diederich)
Date: Tue, 10 Aug 2010 08:10:44 -0400
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <4C60FE37.2020303@canterbury.ac.nz>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
Message-ID: <AANLkTinRFV6f4_u3Q5+HQ1gUyniTwoCvTvb+-EaDj2zD@mail.gmail.com>

On Tue, Aug 10, 2010 at 3:22 AM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Guido van Rossum wrote:
>
>> We'll see. I still cannot get my head around why cofunctions are so
>> great.
>
[snip]
> Here's the equivalent thing using cofunctions, complete with a
> corresponding missing 'cocall':
>
[snip]
> The exception and traceback resulting from this is crystal clear:
>
> Traceback (most recent call last):
> ?File "restaurant2.py", line 34, in <module>
> ? ?run()
> ?File
> "/Local/Projects/D/Python/YieldFrom/3.1/Cofunctions-3.1.2/Examples/Simulation/simulation.py",
> line 25, in run
> ? ?next(current_process)
> ?File "restaurant2.py", line 29, in customer
> ? ?hold(random.normalvariate(10, 5))
> TypeError: Cofunction must be called with cocall or costart
>
> If this doesn't convince you of the utility of cofunctions or
> something like them, I don't know what will.

So the benefit of cocalls is runtime type checking? Are your unit tests broken?

I was -0 on ABCs and function annotations because I was promised by
people that liked them that I could safely ignore them.  I can't
safely ignore this so I'm -1.

-Jack


From ghazel at gmail.com  Tue Aug 10 14:45:34 2010
From: ghazel at gmail.com (ghazel at gmail.com)
Date: Tue, 10 Aug 2010 05:45:34 -0700
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTinRFV6f4_u3Q5+HQ1gUyniTwoCvTvb+-EaDj2zD@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com> 
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com> 
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com> 
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com> 
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTinRFV6f4_u3Q5+HQ1gUyniTwoCvTvb+-EaDj2zD@mail.gmail.com>
Message-ID: <AANLkTi=5mdsw3qVPCgWhTpHKJRxg-3Ae3eMSxa41DxSb@mail.gmail.com>

On Tue, Aug 10, 2010 at 5:10 AM, Jack Diederich <jackdied at gmail.com> wrote:
> On Tue, Aug 10, 2010 at 3:22 AM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
>> Guido van Rossum wrote:
>>
>>> We'll see. I still cannot get my head around why cofunctions are so
>>> great.
>>
> [snip]
>> Here's the equivalent thing using cofunctions, complete with a
>> corresponding missing 'cocall':
>>
> [snip]
>> The exception and traceback resulting from this is crystal clear:
>>
>> Traceback (most recent call last):
>> ?File "restaurant2.py", line 34, in <module>
>> ? ?run()
>> ?File
>> "/Local/Projects/D/Python/YieldFrom/3.1/Cofunctions-3.1.2/Examples/Simulation/simulation.py",
>> line 25, in run
>> ? ?next(current_process)
>> ?File "restaurant2.py", line 29, in customer
>> ? ?hold(random.normalvariate(10, 5))
>> TypeError: Cofunction must be called with cocall or costart
>>
>> If this doesn't convince you of the utility of cofunctions or
>> something like them, I don't know what will.
>
> So the benefit of cocalls is runtime type checking? Are your unit tests broken?

Not entirely. I think "cocall" is a better term than "yield from" for
this task, and I see benefit in having an explicit way to declare a
cofunction, instead of the "if 0: yield" trick. The runtime type
checking benefits are certainly helpful, though, even if all they do
is make it clear why your unit tests are failing.

> I was -0 on ABCs and function annotations because I was promised by
> people that liked them that I could safely ignore them. ?I can't
> safely ignore this so I'm -1.

How were you hoping to ignore them? You can't safely ignore yield or
generators, either. If a function you are calling which returned a
list decided to change its implementation and be a generator, then
your list subscript access would stop working, since generators are
not subscriptable.
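
A trivial illustration:

def get_items():
    return [1, 2, 3]

def get_items_gen():
    yield 1
    yield 2
    yield 3

get_items()[0]       # fine
get_items_gen()[0]   # TypeError: generators don't support indexing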

-Greg


From mwm-keyword-python.b4bdba at mired.org  Tue Aug 10 15:41:45 2010
From: mwm-keyword-python.b4bdba at mired.org (Mike Meyer)
Date: Tue, 10 Aug 2010 09:41:45 -0400
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTinRFV6f4_u3Q5+HQ1gUyniTwoCvTvb+-EaDj2zD@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTinRFV6f4_u3Q5+HQ1gUyniTwoCvTvb+-EaDj2zD@mail.gmail.com>
Message-ID: <20100810094145.15d0a3d5@bhuda.mired.org>

On Tue, 10 Aug 2010 08:10:44 -0400
Jack Diederich <jackdied at gmail.com> wrote:

> On Tue, Aug 10, 2010 at 3:22 AM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> > Guido van Rossum wrote:
> >
> >> We'll see. I still cannot get my head around why cofunctions are so
> >> great.
> >
> [snip]
> > Here's the equivalent thing using cofunctions, complete with a
> > corresponding missing 'cocall':
> >
> [snip]
> > The exception and traceback resulting from this is crystal clear:
> >
> > Traceback (most recent call last):
> > ?File "restaurant2.py", line 34, in <module>
> > ? ?run()
> > ?File
> > "/Local/Projects/D/Python/YieldFrom/3.1/Cofunctions-3.1.2/Examples/Simulation/simulation.py",
> > line 25, in run
> > ? ?next(current_process)
> > ?File "restaurant2.py", line 29, in customer
> > ? ?hold(random.normalvariate(10, 5))
> > TypeError: Cofunction must be called with cocall or costart
> >
> > If this doesn't convince you of the utility of cofunctions or
> > something like them, I don't know what will.
> 
> So the benefit of cocalls is runtime type checking? Are your unit tests broken?

Isn't "runtime type checking" just another way to say "duck typing"?
Would you be happier if the error message was "'Cofunction' object is
not callable", so it matched the error you get when you call other
non-callable objects?

Given two features that are otherwise equivalent - which seems to be
the case with yield-from vs cocalls - I'll take the one that makes
debugging easier.

> I was -0 on ABCs and function annotations because I was promised by
> people that liked them that I could safely ignore them.  I can't
> safely ignore this so I'm -1.

I don't see how you can ignore any feature that some program you're
trying to debug makes use of.

       <mike
-- 
Mike Meyer <mwm at mired.org>		http://www.mired.org/consulting.html
Independent Network/Unix/Perforce consultant, email for more information.

O< ascii ribbon campaign - stop html mail - www.asciiribbon.org


From guido at python.org  Tue Aug 10 17:57:10 2010
From: guido at python.org (Guido van Rossum)
Date: Tue, 10 Aug 2010 08:57:10 -0700
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <4C60FE37.2020303@canterbury.ac.nz>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com> 
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com> 
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com> 
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com> 
	<4C60FE37.2020303@canterbury.ac.nz>
Message-ID: <AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>

I'm convinced of the utility. I still find the mechanism somehow odd
or clumsy; the need for two new keywords (codef and cocall), a new
builtin (costart), and a new api (__cocall__) doesn't sit well.

Please don't consider this a -1; I think there's something that can be
done (either to my mind or to the proposal :-).

I suppose I should look again at goroutines and see what syntax and
other rules they use.

In the mean time let me ask a few more questions (sorry if these have
been answered before, my attention is focusing in and out of this
thread):

- Is it possible to mix and match yield, yield from, and cocall in the
same function? Should / shouldn't it be?

- Would it be sufficient if codef was a decorator instead of a
keyword? (This new keyword in particular chafes me, since we've been
so successful at overloading 'def' for so many meanings -- functions,
methods, class methods, static methods, properties...)

- If we had cocall, would yield from still be useful? (I suppose yield
from is the thing of choice when using generators-as-iterators, e.g.
when walking a tree. But what bout yield from for coroutines?)

- The cocall keyword eerily reminds me of Fortran. I know that's not
fair, but still...

- The syntax worries me. Your PEP suggests that cocall binds tightly
to an atom. That would mean that if the cofunction is really a
comethod, you'd have to parenthesize it, like cocall (obj.method)(args)
? Huuuu, ugly. Also things like 'cocall foo' (no call syntax) weird me
out.

- How much of the improved error flagging of codef/cocall can be
obtained by judicious use of decorators and helper functions? (I need
this in Python 2.5 *now*. :-)
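
For instance, a rough decorator-plus-helper sketch along those lines
(all names purely hypothetical):

def cofunction(f):
    # Mark a generator function as intended for coroutine use.
    f.__is_cofunction__ = True
    return f

def cocall(f, *args, **kwds):
    # Refuse to start anything that isn't marked; the caller still has
    # to drive the result itself (with 'yield from', or a trampoline in 2.5).
    if not getattr(f, '__is_cofunction__', False):
        raise TypeError("%r is not marked as a cofunction" % (f,))
    return f(*args, **kwds)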

--Guido

On Tue, Aug 10, 2010 at 12:22 AM, Greg Ewing
<greg.ewing at canterbury.ac.nz> wrote:
> Guido van Rossum wrote:
>
>> We'll see. I still cannot get my head around why cofunctions are so
>> great.
>
> I think I can offer some evidence. I've been playing around with
> two versions of a discrete-event simulation kernel, one using
> yield-from and the other using cofunctions. Here's the main
> function of one of my test cases. I've introduced a deliberate
> bug -- can you spot it?
>
> def customer(i):
> ?print("Customer", i, "arriving at", now())
> ?yield from tables.acquire(1)
> ?print("Customer", i, "sits down at a table at", now())
> ?yield from waiters.acquire(1)
> ?print("Customer", i, "orders spam at", now())
> ?hold(random.normalvariate(20, 2))
> ?waiters.release(1)
> ?print("Customer", i, "gets served spam at", now())
> ?yield from hold(random.normalvariate(10, 5))
> ?print("Customer", i, "finished eating at", now())
> ?tables.release(1)
>
> The bug is that the first call to hold() is missing a 'yield
> from' in front of it. If I run this, I don't get any exception --
> it produces plausible-looking but incorrect results.
>
> Here's another version, with a very similar bug in a different
> place -- this time it's the second call to hold() that's missing
> a 'yield from'.
>
> def customer(i):
> ?print("Customer", i, "arriving at", now())
> ?yield from tables.acquire(1)
> ?print("Customer", i, "sits down at a table at", now())
> ?yield from waiters.acquire(1)
> ?print("Customer", i, "orders spam at", now())
> ?yield from hold(random.normalvariate(20, 2))
> ?waiters.release(1)
> ?print("Customer", i, "gets served spam at", now())
> ?hold(random.normalvariate(10, 5))
> ?print("Customer", i, "finished eating at", now())
> ?tables.release(1)
>
> If I run this one, I do get an exception, but it's a rather
> unhelpful one:
>
> Traceback (most recent call last):
> ?File "restaurant2.py", line 35, in <module>
> ? ?run()
> ?File
> "/Local/Projects/D/Python/YieldFrom/3.1/YieldFrom-3.1.2/Examples/Simulation/simulation.py",
> line 25, in run
> ? ?next(current_process)
> ?File "restaurant2.py", line 32, in customer
> ? ?tables.release(1)
> ?File
> "/Local/Projects/D/Python/YieldFrom/3.1/YieldFrom-3.1.2/Examples/Simulation/resource.py",
> line 25, in release
> ? ?wakeup(self.queue[0])
> ?File
> "/Local/Projects/D/Python/YieldFrom/3.1/YieldFrom-3.1.2/Examples/Simulation/simulation.py",
> line 34, in wakeup
> ? ?schedule(process, now())
> ?File
> "/Local/Projects/D/Python/YieldFrom/3.1/YieldFrom-3.1.2/Examples/Simulation/simulation.py",
> line 16, in schedule
> ? ?heappush(event_queue, (time, process))
> TypeError: unorderable types: generator() < generator()
>
> If you examine the traceback, you'll find that *nowhere* does it
> mention the location where the error actually is! Instead, a
> mysterious error emanates from some place deep inside the scheduler.
> I would hate to have to track down a problem like this in a large
> program.
>
> Here's the equivalent thing using cofunctions, complete with a
> corresponding missing 'cocall':
>
> codef customer(i):
> ?print("Customer", i, "arriving at", now())
> ?cocall tables.acquire(1)
> ?print("Customer", i, "sits down at a table at", now())
> ?cocall waiters.acquire(1)
> ?print("Customer", i, "orders spam at", now())
> ?cocall hold(random.normalvariate(20, 2))
> ?cocall waiters.release(1)
> ?print("Customer", i, "gets served spam at", now())
> ?hold(random.normalvariate(10, 5))
> ?print("Customer", i, "finished eating at", now())
> ?cocall tables.release(1)
>
> The exception and traceback resulting from this is crystal clear:
>
> Traceback (most recent call last):
> ?File "restaurant2.py", line 34, in <module>
> ? ?run()
> ?File
> "/Local/Projects/D/Python/YieldFrom/3.1/Cofunctions-3.1.2/Examples/Simulation/simulation.py",
> line 25, in run
> ? ?next(current_process)
> ?File "restaurant2.py", line 29, in customer
> ? ?hold(random.normalvariate(10, 5))
> TypeError: Cofunction must be called with cocall or costart
>
> If this doesn't convince you of the utility of cofunctions or
> something like them, I don't know what will.
>
> --
> Greg
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
--Guido van Rossum (python.org/~guido)


From scott+python-ideas at scottdial.com  Tue Aug 10 18:39:49 2010
From: scott+python-ideas at scottdial.com (Scott Dial)
Date: Tue, 10 Aug 2010 12:39:49 -0400
Subject: [Python-ideas] iterator length
In-Reply-To: <AANLkTi=94TQZc3HcF0ev_KhE+suW2o3+cOuynF521pxF@mail.gmail.com>
References: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com>	<AANLkTimMUQ2butzBp=mUnc9NMWUiu0Fh5jsaux1H=jY9@mail.gmail.com>
	<AANLkTi=94TQZc3HcF0ev_KhE+suW2o3+cOuynF521pxF@mail.gmail.com>
Message-ID: <4C6180D5.7040202@scottdial.com>

On 8/9/2010 4:17 PM, Alexandru Moșoi wrote:
> 2010/8/9 Paul Moore <p.f.moore at gmail.com>:
>> If you could show some real code that uses your ilen function, that
>> would help clarify.
> 
> My requirements was to count the non-zero elements from a list like this:
>    sum(1 for e in iterator if not e)
> 
> What I'm really looking for is the number of elements in a list comprehension:
>    len(list(for e in iterator if not e))

You are responding to his request for a specific example with a generic
class of examples, which is what prompted his request for a specific
example in the first place. Please give a *specific* example and be
prepared to be told that you are going about it the wrong way, since, at
this point, nobody has replied as having recognized this as a problem
they've encountered before.

-- 
Scott Dial
scott at scottdial.com
scodial at cs.indiana.edu


From brtzsnr at gmail.com  Tue Aug 10 18:45:09 2010
From: brtzsnr at gmail.com (=?UTF-8?Q?Alexandru_Mo=C8=99oi?=)
Date: Tue, 10 Aug 2010 12:45:09 -0400
Subject: [Python-ideas] iterator length
In-Reply-To: <4C6180D5.7040202@scottdial.com>
References: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com>
	<AANLkTimMUQ2butzBp=mUnc9NMWUiu0Fh5jsaux1H=jY9@mail.gmail.com>
	<AANLkTi=94TQZc3HcF0ev_KhE+suW2o3+cOuynF521pxF@mail.gmail.com>
	<4C6180D5.7040202@scottdial.com>
Message-ID: <AANLkTimdxSKm9ZpfM_PbHnZqwnjvjYa4gUvzOp5qo207@mail.gmail.com>

2010/8/10 Scott Dial <scott+python-ideas at scottdial.com>:
> You are responding to his request for a specific example with a generic
> class of examples, which is what prompted his request for a specific
> example in the first place. Please give a *specific* example and be
> prepared to be told that you are going about it the wrong way, since, at
> this point, nobody has replied as having recognized this as a problem
> they've encountered before.

My exact need is to count the not-None elements in a list. My current
solution is, as described before:
    sum(1 for e in iterator if not e)


-- 
Alexandru Moșoi
http://www.alexandru.mosoi.ro/


From rob.cliffe at btinternet.com  Tue Aug 10 18:55:28 2010
From: rob.cliffe at btinternet.com (Rob Cliffe)
Date: Tue, 10 Aug 2010 17:55:28 +0100
Subject: [Python-ideas] iterator length
References: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com><AANLkTimMUQ2butzBp=mUnc9NMWUiu0Fh5jsaux1H=jY9@mail.gmail.com><AANLkTi=94TQZc3HcF0ev_KhE+suW2o3+cOuynF521pxF@mail.gmail.com><4C6180D5.7040202@scottdial.com>
	<AANLkTimdxSKm9ZpfM_PbHnZqwnjvjYa4gUvzOp5qo207@mail.gmail.com>
Message-ID: <4F2D7FB9861A4185AB9175940DD3C8B8@robslaptop>

It seems to me that, to count the not-None elements, your logic is inverted.
Surely you want
    sum(1 for e in iterator if e)
or more accurately
    sum(1 for e in iterator if e is not None)
Rob Cliffe

----- Original Message ----- 
From: "Alexandru Mo?oi" <brtzsnr at gmail.com>
To: "Scott Dial" <scott+python-ideas at scottdial.com>
Cc: <python-ideas at python.org>
Sent: Tuesday, August 10, 2010 5:45 PM
Subject: Re: [Python-ideas] iterator length


> 2010/8/10 Scott Dial <scott+python-ideas at scottdial.com>:
>> You are responding to his request for a specific example with a generic
>> class of examples, which is what prompted his request for a specific
>> example in the first place. Please give a *specific* example and be
>> prepared to be told that you are going about it the wrong way, since, at
>> this point, nobody has replied as having recognized this as a problem
>> they've encountered before.
>
> My exact need is to count the not-None elements in a list. My current
> solution is, as described before:
>    sum(1 for e in iterator if not e)
>
>
> -- 
> Alexandru Moșoi
> http://www.alexandru.mosoi.ro/
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
> 



From guido at python.org  Tue Aug 10 19:20:42 2010
From: guido at python.org (Guido van Rossum)
Date: Tue, 10 Aug 2010 10:20:42 -0700
Subject: [Python-ideas] iterator length
In-Reply-To: <4F2D7FB9861A4185AB9175940DD3C8B8@robslaptop>
References: <AANLkTi=YrGownzjyNju3io472o=JE8BaogN-VsCHmJdv@mail.gmail.com> 
	<AANLkTimMUQ2butzBp=mUnc9NMWUiu0Fh5jsaux1H=jY9@mail.gmail.com> 
	<AANLkTi=94TQZc3HcF0ev_KhE+suW2o3+cOuynF521pxF@mail.gmail.com> 
	<4C6180D5.7040202@scottdial.com>
	<AANLkTimdxSKm9ZpfM_PbHnZqwnjvjYa4gUvzOp5qo207@mail.gmail.com> 
	<4F2D7FB9861A4185AB9175940DD3C8B8@robslaptop>
Message-ID: <AANLkTimYTJABZF8E0zBxQjRGfyFLHviDgDBnKL3bVX+n@mail.gmail.com>

Can you all take this off-line? It is turning into programming help
instead of a feature discussion.

-- 
--Guido van Rossum (python.org/~guido)


From bruce at leapyear.org  Tue Aug 10 22:15:58 2010
From: bruce at leapyear.org (Bruce Leban)
Date: Tue, 10 Aug 2010 13:15:58 -0700
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>
Message-ID: <AANLkTinb74o1qqE69eESm=3p_J1NDzPDX_2e=YjisW6v@mail.gmail.com>

I think I like this. I'd like to see it without new keywords. It's easy to
imagine @cofunction def ...

For cocall if this were some other language we might see:

<cocall>foo()
cocall:foo()
cocall$foo()
=>foo()

Is there a pythonesque alternative?

--- Bruce
(via android)

From greg.ewing at canterbury.ac.nz  Wed Aug 11 01:07:19 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 11 Aug 2010 11:07:19 +1200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTinRFV6f4_u3Q5+HQ1gUyniTwoCvTvb+-EaDj2zD@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTinRFV6f4_u3Q5+HQ1gUyniTwoCvTvb+-EaDj2zD@mail.gmail.com>
Message-ID: <4C61DBA7.5070100@canterbury.ac.nz>

Jack Diederich wrote:

> So the benefit of cocalls is runtime type checking? Are your unit tests broken?

The benefit is *diagnosis* of certain types of errors that
otherwise produce very mysterious symptoms.

Unit tests would tell me that *something* was wrong, but
not what it is or where it is.

> I was -0 on ABCs and function annotations because I was promised by
> people that liked them that I could safely ignore them.  I can't
> safely ignore this so I'm -1.

I don't see any difference in ignorability. Just like ABCs, you
can ignore it if you don't use it in your own code. You have to
understand it in order to deal with someone else's code that
uses it, but the same is true of ABCs or anything else.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Wed Aug 11 02:49:19 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 11 Aug 2010 12:49:19 +1200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>
Message-ID: <4C61F38F.5070509@canterbury.ac.nz>

Guido van Rossum wrote:
> I'm convinced of the utility. I still find the mechanism somehow odd
> or clumsy; the need for two new keywords (codef and cocall), a new
> builtin (costart), and a new api (__cocall__) doesn't sit well.

The costart function isn't strictly necessary, because you can
always invoke __cocall__ directly. It's just there for
convenience and for symmetry with all the other function/typeslot
pairs.

It's possible that 'codef' could be eliminated by following a
similar rule to generators, i.e. if a function contains 'cocall' anywhere
then it's a cofunction. But then we're back to the situation
of having to insert a dummy statement to force cofunctionness
some of the time. I expect this to be needed much more frequently
with cofunctions than with generators, because it's very rare
that you call a generator for a side effect, whereas with
coroutines it's quite normal.
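
(For reference, the dummy-statement kludge for generators looks
something like this -- the names here are illustrative only:)

    def run_to_completion():
        if 0:
            yield                  # dummy yield: forces this to be a generator
        print("body ran without ever suspending")

    g = run_to_completion()        # nothing runs yet; we just get a generator
    for _ in g:                    # exhausting it actually runs the body
        pass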

I also feel strongly that it would be too implicit. We
manage to get away with it in generators because from the
outside a generator is just a normal function that happens to
return an iterator. But a cofunction has a very different
interface, and the caller must be aware of that, so I would
much rather make it explicit at the point of definition.

A decorator could be provided to take care of the no-cocall
case, but since it wouldn't be required in most cases,
people wouldn't bother to use it when they didn't need to,
and it wouldn't serve to make the interface explicit in
general.

Maybe something could be done to force the decorator to be
used on all cofunctions, such as making them neither callable
nor cocallable until the decorator has been applied, but
things are getting terribly convoluted by then. There would
be something perverse about a definition that looks to all
the world like a plain function, except that you can't actually
do anything with it until it's been wrapped in a specific
decorator.

> - Is it possible to mix and match yield, yield from, and cocall in the
> same function? Should / shouldn't it be?

Yes, it is. A cofunction is a kind of generator, and 'yield'
and 'yield from' work just the same way in a cofunction as
they do in an ordinary generator.
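
(A small sketch of that mixing, assuming PEP 380 'yield from' semantics;
the names are illustrative only:)

    def sub():
        x = yield "need input"     # plain yield: suspend and exchange a value
        return x * 2

    def outer():
        result = yield from sub()  # delegation, per PEP 380
        yield result               # plain yield again in the same function

    g = outer()
    print(next(g))                 # -> need input
    print(g.send(21))              # -> 42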

> (This new keyword in particular chafes me, since we've been
> so successful at overloading 'def' for so many meanings -- functions,
> methods, class methods, static methods, properties...)

I understand how you feel, but this case seems fundamentally
different to me. All of those decorators are pretty much
agnostic about what they wrap -- they just take a callable
object and externally modify its behaviour. A decorator with
the same properties as codef would need to be much more
intimately connected with the thing it's wrapping.

> - If we had cocall, would yield from still be useful?

You need *some* way to suspend a cofunction, so if not
yield, some other keyword would need to be invented. There
doesn't seem to be any point to that.

If you're asking whether it needs to be able to send and
receive values when used in a cofunction, I suppose it's
not strictly necessary, but again there doesn't seem to be
any point in disallowing these things.

> - The syntax worries me. Your PEP suggests that cocall binds tightly
> to an atom. That would mean that if the cofunction is really a
> comethod, you'd have to parenthesize it,

No, if you examine the grammar in the PEP you'll see that
the atom can be followed by a subset of the trailers allowed
after atoms in other contexts, so it's possible to write
things like

    x = cocall foo.blarg[42].stuff(y)

which parses as

    x = cocall (foo.blarg[42].stuff)(y)

> Also things like 'cocall foo' (no call syntax) weird me
> out.

That's a syntax error -- a cocall *must* ultimately
terminate with an argument list.

> - How much of the improved error flagging of codef/cocall can be
> obtained by judicious use of decorators and helper functions? (I need
> this in Python 2.5 *now*. :-)

I'll have to think about that and get back to you.

-- 
Greg



From greg.ewing at canterbury.ac.nz  Wed Aug 11 03:27:04 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 11 Aug 2010 13:27:04 +1200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTinb74o1qqE69eESm=3p_J1NDzPDX_2e=YjisW6v@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>
	<AANLkTinb74o1qqE69eESm=3p_J1NDzPDX_2e=YjisW6v@mail.gmail.com>
Message-ID: <4C61FC68.5020004@canterbury.ac.nz>

Bruce Leban wrote:

> For cocall if this were some other language we might see:
> 
> <cocall>foo()
> cocall:foo()
> cocall$foo()
> =>foo()
> 
> Is there a pythonesque alternative?

It's generally considered unpythonic to assign arbitrary meanings
to randomly chosen punctuation (decorator syntax notwithstanding!),
so it will be difficult to find one.

-- 
Greg


From ncoghlan at gmail.com  Wed Aug 11 04:48:05 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 11 Aug 2010 12:48:05 +1000
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <4C61F38F.5070509@canterbury.ac.nz>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>
	<4C61F38F.5070509@canterbury.ac.nz>
Message-ID: <AANLkTi=3zOt94zX6TLWzy3XtKaZWZen=XNkrUBObU3kZ@mail.gmail.com>

On Wed, Aug 11, 2010 at 10:49 AM, Greg Ewing
<greg.ewing at canterbury.ac.nz> wrote:
>> - The syntax worries me. Your PEP suggests that cocall binds tightly
>> to an atom. That would mean that if the cofunction is really a
>> comethod, you'd have to parenthesize it,
>
> No, if you examine the grammar in the PEP you'll see that
> the atom can be followed by a subset of the trailers allowed
> after atoms in other contexts, so it's possible to write
> things like
>
>    x = cocall foo.blarg[42].stuff(y)
>
> which parses as
>
>    x = cocall (foo.blarg[42].stuff)(y)

I would expect the grammatical rules for cocall expressions to be
similar to those for yield expressions. And if they weren't, I'd want
to hear a really good excuse for the inconsistency :)

Also, a possible trick to make a @cofunction decorator work:

class cofunction:
    # Obviously missing a bunch of stuff to tidy up the metadata
    def __init__(self, f):
        self._f = f

    def __cocall__(*args, **kwds):
        self, *args = args   # unpack self from the positional args
        return (yield from self._f(*args, **kwds))

Cofunctions then wouldn't even *have* a __call__ slot, so you couldn't
call them normally by mistake, and ordinary functions wouldn't define
__cocall__ so you couldn't inadvertently use them with the new
keyword.
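
(A rough usage sketch, assuming the class above and PEP 380 'yield from'
semantics; 'fetch' and the driving code below are illustrative only:)

    @cofunction
    def fetch(url):
        data = yield url          # suspend; whoever drives us sends data back
        return data

    def consumer():
        # fetch has no __call__, so a plain fetch(...) fails loudly;
        # cooperative use goes through __cocall__:
        result = yield from fetch.__cocall__("http://example.com")
        print(result)

    g = consumer()
    url = next(g)                 # -> "http://example.com"
    try:
        g.send("response for " + url)
    except StopIteration:
        pass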

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia


From guido at python.org  Wed Aug 11 05:01:49 2010
From: guido at python.org (Guido van Rossum)
Date: Tue, 10 Aug 2010 20:01:49 -0700
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTi=3zOt94zX6TLWzy3XtKaZWZen=XNkrUBObU3kZ@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com> 
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com> 
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com> 
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com> 
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com> 
	<4C61F38F.5070509@canterbury.ac.nz>
	<AANLkTi=3zOt94zX6TLWzy3XtKaZWZen=XNkrUBObU3kZ@mail.gmail.com>
Message-ID: <AANLkTimEJ-KvV7Hrz7heb8QAM28tdg1AvGkez=z2eWmG@mail.gmail.com>

On Tue, Aug 10, 2010 at 7:48 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On Wed, Aug 11, 2010 at 10:49 AM, Greg Ewing
> <greg.ewing at canterbury.ac.nz> wrote:
>>> - The syntax worries me. Your PEP suggests that cocall binds tightly
>>> to an atom. That would mean that if the cofunction is really a
>>> comethod, you'd have to parenthesize it,
>>
>> No, if you examine the grammar in the PEP you'll see that
>> the atom can be followed by a subset of the trailers allowed
>> after atoms in other contexts, so it's possible to write
>> things like
>>
>>    x = cocall foo.blarg[42].stuff(y)
>>
>> which parses as
>>
>>    x = cocall (foo.blarg[42].stuff)(y)
>
> I would expect the grammatical rules for cocall expressions to be
> similar to those for yield expressions. And if they weren't, I'd want
> to hear a really good excuse for the inconsistency :)

This I can explain -- a cocall *must* be a call, syntactically, so
that it can take the callable, check that it has a __cocall__ method,
and call it with the given argument list. But I have to say, I don't
really like it, it's very odd syntax (even worse than decorators).

> Also, a possible trick to make a @cofunction decorator work:
>
> class cofunction:
>     # Obviously missing a bunch of stuff to tidy up the metadata
>     def __init__(self, f):
>         self._f = f
>
>     def __cocall__(*args, **kwds):
>         self, *args = args   # unpack self from the positional args
>         return (yield from self._f(*args, **kwds))
>
> Cofunctions then wouldn't even *have* a __call__ slot, so you couldn't
> call them normally by mistake, and ordinary functions wouldn't define
> __cocall__ so you couldn't inadvertently use them with the new
> keyword.
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia



-- 
--Guido van Rossum (python.org/~guido)


From rrr at ronadam.com  Wed Aug 11 06:39:07 2010
From: rrr at ronadam.com (Ron Adam)
Date: Tue, 10 Aug 2010 23:39:07 -0500
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTi=3zOt94zX6TLWzy3XtKaZWZen=XNkrUBObU3kZ@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>	<4C60FE37.2020303@canterbury.ac.nz>	<AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>	<4C61F38F.5070509@canterbury.ac.nz>
	<AANLkTi=3zOt94zX6TLWzy3XtKaZWZen=XNkrUBObU3kZ@mail.gmail.com>
Message-ID: <4C62296B.3040301@ronadam.com>



On 08/10/2010 09:48 PM, Nick Coghlan wrote:
> On Wed, Aug 11, 2010 at 10:49 AM, Greg Ewing
> <greg.ewing at canterbury.ac.nz>  wrote:
>>> - The syntax worries me. Your PEP suggests that cocall binds tightly
>>> to an atom. That would mean that if the cofunction is really a
>>> comethod, you'd have to parenthesize it,
>>
>> No, if you examine the grammar in the PEP you'll see that
>> the atom can be followed by a subset of the trailers allowed
>> after atoms in other contexts, so it's possible to write
>> things like
>>
>>    x = cocall foo.blarg[42].stuff(y)
>>
>> which parses as
>>
>>    x = cocall (foo.blarg[42].stuff)(y)
>
> I would expect the grammatical rules for cocall expressions to be
> similar to those for yield expressions. And if they weren't, I'd want
> to hear a really good excuse for the inconsistency :)
>
> Also, a possible trick to make a @cofunction decorator work:
>
> class cofunction:
>      # Obviously missing a bunch of stuff to tidy up the metadata
>      def __init__(self, f):
>          self._f = f
>
>      def __cocall__(*args, **kwds):
>          self, *args = args   # unpack self from the positional args
>          return (yield from self._f(*args, **kwds))
>
> Cofunctions then wouldn't even *have* a __call__ slot, so you couldn't
> call them normally by mistake, and ordinary functions wouldn't define
> __cocall__ so you couldn't inadvertently use them with the new
> keyword.

I was wondering if a class would work.

Could using conext() and cosend() methods instead of next() and send() give
better error messages and distinguish cofunctions from generators
without the need for __cocall__?

Can that be done with a decorator or class?

Ron










From greg.ewing at canterbury.ac.nz  Wed Aug 11 07:16:48 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 11 Aug 2010 17:16:48 +1200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTimEJ-KvV7Hrz7heb8QAM28tdg1AvGkez=z2eWmG@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>
	<4C61F38F.5070509@canterbury.ac.nz>
	<AANLkTi=3zOt94zX6TLWzy3XtKaZWZen=XNkrUBObU3kZ@mail.gmail.com>
	<AANLkTimEJ-KvV7Hrz7heb8QAM28tdg1AvGkez=z2eWmG@mail.gmail.com>
Message-ID: <4C623240.5040603@canterbury.ac.nz>

Guido van Rossum wrote:
> But I have to say, I don't
> really like it, it's very odd syntax (even worse than decorators).

I'm open to suggestions, but I haven't been able to think
of anything less awkward that doesn't use punctuation randomly.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Wed Aug 11 07:21:47 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 11 Aug 2010 17:21:47 +1200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <4C62296B.3040301@ronadam.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>
	<4C61F38F.5070509@canterbury.ac.nz>
	<AANLkTi=3zOt94zX6TLWzy3XtKaZWZen=XNkrUBObU3kZ@mail.gmail.com>
	<4C62296B.3040301@ronadam.com>
Message-ID: <4C62336B.1050005@canterbury.ac.nz>

Ron Adam wrote:

> Could using conext() and cosend() methods instead of next() and send() 
> give the better error messages and distinguish cofunctions from 
> generators without the need for __cocall__ ?

I don't think it would help much. If you forget to use yield-from
in a call that doesn't use the return value, the returned
generator just gets thrown away without anything trying to
call any of its methods.
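
(A tiny illustration of that failure mode, in plain generator terms;
the names are illustrative only:)

    def subtask():
        yield                      # suspension point
        print("subtask body ran")

    def task():
        subtask()                  # BUG: generator created and silently discarded
        yield                      # task still suspends once, so nothing complains
        print("task finished, but subtask never ran")

    for _ in task():
        pass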

-- 
Greg



From bruce at leapyear.org  Wed Aug 11 07:26:11 2010
From: bruce at leapyear.org (Bruce Leban)
Date: Tue, 10 Aug 2010 22:26:11 -0700
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTi=h0R35MSDn5AT0JjCctYVWfTLqbDVei233Zk_4@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>
	<AANLkTinb74o1qqE69eESm=3p_J1NDzPDX_2e=YjisW6v@mail.gmail.com>
	<4C61FC68.5020004@canterbury.ac.nz>
	<AANLkTimVQKwdouHE+6-49MahiivNZgg_hmisKSrNPakk@mail.gmail.com>
	<AANLkTi=h0R35MSDn5AT0JjCctYVWfTLqbDVei233Zk_4@mail.gmail.com>
Message-ID: <AANLkTim-C=cMMc_Kr4ZYhBF-UHX9A9rwA5OqpWk01gPG@mail.gmail.com>

On Aug 10, 2010 6:18 PM, "Greg Ewing" <greg.ewing at canterbury.ac.nz> wrote:
> Bruce Leban wrote:
>> Is there a pythonesque alternative?
>
> It's generally considered unpythonic to assign arbitrary meanings
> to randomly chosen punctuation (decorator syntax notwithstanding!) ...

I quite agree. I hate random punctuation.

Suppose hypothetically that cocall is just one of a class of execution
modifiers that we might ultimately want to add. In that case maybe there's a
syntax (analogous to adding decorator syntax) that doesn't require new
keywords. There's yield, yield from and cocall. Are there more?

(I don't know. I'm just wondering.)

--- Bruce
(via android)

From greg.ewing at canterbury.ac.nz  Wed Aug 11 07:45:58 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 11 Aug 2010 17:45:58 +1200
Subject: [Python-ideas] Cofunctions without codef
Message-ID: <4C623916.3010902@canterbury.ac.nz>

I think I may have found a way to eliminate the need for
the 'codef' keyword. I realised that although it's important
to prevent making a regular call to a cofunction, there's
no need to prevent making a cocall to an ordinary function.
In that case, it can just be treated like a normal call.

So I suggest the following changes:

1. The presence of 'cocall' inside a function causes it
    to be a cofunction. There is no 'codef' keyword.

2. The 'cocall' construct checks whether the function
    supports cocalls, and if so, proceeds as previously
    specified. Otherwise, it calls the function normally
    and returns the result.

(To allow for objects such as bound methods to wrap things
which could be cofunctions or not, the __cocall__ method
will be permitted to return NotImplemented as a way of
signalling that cocalls are not supported.)
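
(Expressed as a helper rather than syntax, and assuming PEP 380's
'yield from', the intended behaviour is roughly the following sketch,
where 'co_invoke' is just an illustrative name and a marked call site
would read 'x = yield from co_invoke(f, y)':)

    def co_invoke(f, *args, **kwds):
        cocall = getattr(f, '__cocall__', None)
        if cocall is not None:
            result = cocall(*args, **kwds)
            if result is not NotImplemented:
                # suspendable call: behaves like 'yield from' on the iterator
                return (yield from result)
        # no cocall support: make an ordinary call and return its result
        return f(*args, **kwds)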

Does this scheme sound any better?

-- 
Greg


From ghazel at gmail.com  Wed Aug 11 07:41:24 2010
From: ghazel at gmail.com (ghazel at gmail.com)
Date: Tue, 10 Aug 2010 22:41:24 -0700
Subject: [Python-ideas] Cofunctions without codef
In-Reply-To: <4C623916.3010902@canterbury.ac.nz>
References: <4C623916.3010902@canterbury.ac.nz>
Message-ID: <AANLkTi=yY1hNeAosJaGpJhxi_AUGYtZPhzMGG0FejESY@mail.gmail.com>

On Tue, Aug 10, 2010 at 10:45 PM, Greg Ewing
<greg.ewing at canterbury.ac.nz> wrote:
>
> 2. The 'cocall' construct checks whether the function
>    supports cocalls, and if so, proceeds as previously
>    specified. Otherwise, it calls the function normally
>    and returns the result.
>
> Does this scheme sound any better?

Well, considering:

On Sun, Aug 1, 2010 at 4:52 AM,  <ghazel at gmail.com> wrote:
> It seems like calling a normal function
> using this special calling mechanism could treat them like a
> cofunction which produced zero iterations and a single return value.

I would say yes. :)

-Greg


From greg.ewing at canterbury.ac.nz  Wed Aug 11 07:56:44 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 11 Aug 2010 17:56:44 +1200
Subject: [Python-ideas] Cofunctions: It's alive! Its alive!
In-Reply-To: <AANLkTim-C=cMMc_Kr4ZYhBF-UHX9A9rwA5OqpWk01gPG@mail.gmail.com>
References: <4C5D0759.30606@canterbury.ac.nz>
	<AANLkTi=V=WXhSa2LPk6_OGhRDRW91vAGa0eKHT0+HuEu@mail.gmail.com>
	<AANLkTi=8i7pRiC4AiDCxB=B6gE42OWDuuFQjbEY6CBp4@mail.gmail.com>
	<AANLkTikP2d6i+x+=vbmcs7ey3TksoVHFJ1kmxOmApQUU@mail.gmail.com>
	<AANLkTikk595h5VOUvGzQnsqYRL+kXLC3zs=udzxTT1=z@mail.gmail.com>
	<4C60FE37.2020303@canterbury.ac.nz>
	<AANLkTimHqO_0ZREJLiR3mH_jsPVfTtPXODcNh1F5fodT@mail.gmail.com>
	<AANLkTinb74o1qqE69eESm=3p_J1NDzPDX_2e=YjisW6v@mail.gmail.com>
	<4C61FC68.5020004@canterbury.ac.nz>
	<AANLkTimVQKwdouHE+6-49MahiivNZgg_hmisKSrNPakk@mail.gmail.com>
	<AANLkTi=h0R35MSDn5AT0JjCctYVWfTLqbDVei233Zk_4@mail.gmail.com>
	<AANLkTim-C=cMMc_Kr4ZYhBF-UHX9A9rwA5OqpWk01gPG@mail.gmail.com>
Message-ID: <4C623B9C.1010605@canterbury.ac.nz>

Bruce Leban wrote:

> In that case maybe 
> there's a syntax (analogous to adding decorator syntax) that doesn't 
> require new keywords. There's yield, yield from and cocall. 

'yield from' was a happy accident -- there just happened to
be two existing keywords that could be put together in a
suggestive way. We don't get that lucky very often.

The problem with general-purpose syntax extension hooks is
that they don't usually allow you a syntax that's really
well tailored for what you want to do. Typically you end up
with something of a kludgy compromise.

> Are there more?

Beats me. My crystal ball has been having reception problems
for years. Really should get a new one.

-- 
Greg



From greg.ewing at canterbury.ac.nz  Wed Aug 11 10:03:21 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 11 Aug 2010 20:03:21 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
Message-ID: <4C625949.1060803@canterbury.ac.nz>

Here's an updated version of the PEP reflecting my
recent suggestions on how to eliminate 'codef'.

PEP: XXX
Title: Cofunctions
Version: $Revision$
Last-Modified: $Date$
Author: Gregory Ewing <greg.ewing at canterbury.ac.nz>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 13-Feb-2009
Python-Version: 3.x
Post-History:


Abstract
========

A syntax is proposed for defining and calling a special type of generator
called a 'cofunction'.  It is designed to provide a streamlined way of
writing generator-based coroutines, and allow the early detection of
certain kinds of error that are easily made when writing such code, which
otherwise tend to cause hard-to-diagnose symptoms.

This proposal builds on the 'yield from' mechanism described in PEP 380,
and describes some of the semantics of cofunctions in terms of it. However,
it would be possible to define and implement cofunctions independently of
PEP 380 if so desired.


Specification
=============

Cofunction definitions
----------------------

A cofunction is a special kind of generator, distinguished by the presence
of the keyword ``cocall`` (defined below) at least once in its body. It may
also contain ``yield`` and/or ``yield from`` expressions, which behave as
they do in other generators.

From the outside, the distinguishing feature of a cofunction is that it cannot
be called the same way as an ordinary function. An exception is raised if an
ordinary call to a cofunction is attempted.

Cocalls
-------

Calls from one cofunction to another are made by marking the call with
a new keyword ``cocall``. The expression

::

     cocall f(*args, **kwds)

is evaluated by first checking whether the object ``f`` implements
a ``__cocall__`` method. If it does, the cocall expression is
equivalent to

::

     yield from f.__cocall__(*args, **kwds)

except that the object returned by __cocall__ is expected to be an
iterator, so the step of calling iter() on it is skipped.

If ``f`` does not have a ``__cocall__`` method, or the ``__cocall__``
method returns ``NotImplemented``, then the cocall expression is
treated as an ordinary call, and the ``__call__`` method of ``f``
is invoked.

Objects which implement __cocall__ are expected to return an object
obeying the iterator protocol. Cofunctions respond to __cocall__ the
same way as ordinary generator functions respond to __call__, i.e. by
returning a generator-iterator.

Certain objects that wrap other callable objects, notably bound methods,
will be given __cocall__ implementations that delegate to the underlying
object.

Grammar
-------

The full syntax of a cocall expression is described by the following
grammar lines:

::

     atom: cocall | <existing alternatives for atom>
     cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
     cotrailer: '[' subscriptlist ']' | '.' NAME

Note that this syntax allows cocalls to methods and elements of sequences
or mappings to be expressed naturally. For example, the following are valid:

::

     y = cocall self.foo(x)
     y = cocall funcdict[key](x)
     y = cocall a.b.c[i].d(x)

Also note that the final calling parentheses are mandatory, so that for example
the following is invalid syntax:

::

     y = cocall f     # INVALID

New builtins, attributes and C API functions
--------------------------------------------

To facilitate interfacing cofunctions with non-coroutine code, there will
be a built-in function ``costart`` whose definition is equivalent to

::

     def costart(obj, *args, **kwds):
         try:
             m = obj.__cocall__
         except AttributeError:
             result = NotImplemented
         else:
             result = m(*args, **kwds)
         if result is NotImplemented:
             raise TypeError("Object does not support cocall")
         return result

There will also be a corresponding C API function

::

     PyObject *PyObject_CoCall(PyObject *obj, PyObject *args, PyObject *kwds)

It is left unspecified for now whether a cofunction is a distinct type
of object or, like a generator function, is simply a specially-marked
function instance. If the latter, a read-only boolean attribute
``__iscofunction__`` should be provided to allow testing whether a given
function object is a cofunction.


Motivation and Rationale
========================

The ``yield from`` syntax is reasonably self-explanatory when used for the
purpose of delegating part of the work of a generator to another function. It
can also be used to good effect in the implementation of generator-based
coroutines, but it reads somewhat awkwardly when used for that purpose, and
tends to obscure the true intent of the code.

Furthermore, using generators as coroutines is somewhat error-prone. If one
forgets to use ``yield from`` when it should have been used, or uses it when it
shouldn't have, the symptoms that result can be extremely obscure and confusing.

Finally, sometimes there is a need for a function to be a coroutine even though
it does not yield anything, and in these cases it is necessary to resort to
kludges such as ``if 0: yield`` to force it to be a generator.

The ``cocall`` construct addresses the first issue by making the syntax directly
reflect the intent, that is, that the function being called forms part of a
coroutine.

The second issue is addressed by making it impossible to mix coroutine and
non-coroutine code in ways that don't make sense. If the rules are violated, an
exception is raised that points out exactly what and where the problem is.

Lastly, the need for dummy yields is eliminated by making it possible for a
cofunction to call both cofunctions and ordinary functions with the same syntax,
so that an ordinary function can be used in place of a cofunction that yields
zero times.


Record of Discussion
====================

An earlier version of this proposal required a special keyword ``codef`` to be
used in place of ``def`` when defining a cofunction, and disallowed calling an
ordinary function using ``cocall``.  However, it became evident that these
features were not necessary, and the ``codef`` keyword was dropped in the
interests of minimising the number of new keywords required.

The use of a decorator instead of ``codef`` was also suggested, but the current
proposal makes this unnecessary as well.

It has been questioned whether some combination of decorators and functions
could be used instead of a dedicated ``cocall`` syntax. While this might be
possible, to achieve equivalent error-detecting power it would be necessary
to write cofunction calls as something like

::

     yield from cocall(f)(args)

making them even more verbose and inelegant than an unadorned ``yield from``.
It is also not clear whether it is possible to achieve all of the benefits of
the cocall syntax using this kind of approach.


Prototype Implementation
========================

An implementation of an earlier version of this proposal in the form of patches
to Python 3.1.2 can be found here:

http://www.cosc.canterbury.ac.nz/greg.ewing/python/generators/cofunctions.html

If this version of the proposal is received favourably, the implementation will
be updated to match.


Copyright
=========

This document has been placed in the public domain.



..
    Local Variables:
    mode: indented-text
    indent-tabs-mode: nil
    sentence-end-double-space: t
    fill-column: 70
    coding: utf-8
    End:


From paul.dubois at gmail.com  Wed Aug 11 10:22:40 2010
From: paul.dubois at gmail.com (Paul Du Bois)
Date: Wed, 11 Aug 2010 01:22:40 -0700
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C625949.1060803@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
Message-ID: <AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>

Perhaps it's the "cocall" keyword that could be removed, rather than
"codef"? A revised semantics for "codef" could cause the body to use
the most recent PEP revision's "__cocall__ or __call__" mechanism for
all function calls, perhaps at the expense of some runtime efficiency.

p


From ncoghlan at gmail.com  Wed Aug 11 11:37:35 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 11 Aug 2010 19:37:35 +1000
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C625949.1060803@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
Message-ID: <AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>

On Wed, Aug 11, 2010 at 6:03 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> is evaluated by first checking whether the object ``f`` implements
> a ``__cocall__`` method. If it does, the cocall expression is
> equivalent to
>
> ::
>
>     yield from f.__cocall__(*args, **kwds)
>
> except that the object returned by __cocall__ is expected to be an
> iterator, so the step of calling iter() on it is skipped.

I think I'd like to see this exist for a while as:

    yield from f.cocall(*args, **kwds)

after PEP 380 is implemented, before it is given syntactic sugar.

Similar to my other suggestion, a @cofunction decorator could easily
provide a cocall method without implementing __call__.

The compiler wouldn't pick up usages of f.cocall() without yield from
with this approach, but tools like pychecker and pylint could
certainly warn about it.
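
(One possible shape for such a decorator -- a sketch only, not anything
from the PEP; it would be used as 'result = yield from f.cocall(args)':)

    class cofunction:
        def __init__(self, f):
            self._f = f
        def cocall(self, *args, **kwds):
            return self._f(*args, **kwds)   # hand back the generator-iterator
        # deliberately no __call__, so a plain f(...) raises TypeError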


> If ``f`` does not have a ``__cocall__`` method, or the ``__cocall__``
> method returns ``NotImplemented``, then the cocall expression is
> treated as an ordinary call, and the ``__call__`` method of ``f``
> is invoked.
>
> Objects which implement __cocall__ are expected to return an object
> obeying the iterator protocol. Cofunctions respond to __cocall__ the
> same way as ordinary generator functions respond to __call__, i.e. by
> returning a generator-iterator.

You want more than the iterator protocol - you want the whole
generator API (i.e. send() and throw() as well as __next__()).

> Certain objects that wrap other callable objects, notably bound methods,
> will be given __cocall__ implementations that delegate to the underlying
> object.

If you use a @cofunction decorator, you can define your own descriptor
semantics, independent of those for ordinary functions.

> The use of a decorator instead of ``codef`` was also suggested, but the
> current
> proposal makes this unnecessary as well.

I'm not sure that is really an advantage, given that using a decorator
gives much greater control over the way cofunctions behave.

> It has been questioned whether some combination of decorators and functions
> could be used instead of a dedicated ``cocall`` syntax. While this might be
> possible, to achieve equivalent error-detecting power it would be necessary
> to write cofunction calls as something like
>
> ::
>
>     yield from cocall(f)(args)
>
> making them even more verbose and inelegant than an unadorned ``yield
> from``.
> It is also not clear whether it is possible to achieve all of the benefits
> of
> the cocall syntax using this kind of approach.

As far as I can see, the only thing dedicated syntax adds is the
ability for the compiler to detect when a cofunction is called without
correctly yielding control. But pylint/pychecker would still be able
to do that with a decorator based approach.

I'd really want to see a nice clean @cofunction decorator approach
based on PEP 380 seriously attempted before we threw our hands up and
said new syntax was the only way.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia


From cmjohnson.mailinglist at gmail.com  Wed Aug 11 12:27:48 2010
From: cmjohnson.mailinglist at gmail.com (Carl M. Johnson)
Date: Wed, 11 Aug 2010 00:27:48 -1000
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C625949.1060803@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
Message-ID: <AANLkTikFkD5OSC1+CA45ftEzM-VFtzxdCRtdKd=4=wp5@mail.gmail.com>

On Tue, Aug 10, 2010 at 10:03 PM, Greg Ewing
<greg.ewing at canterbury.ac.nz> wrote:
> Here's an updated version of the PEP reflecting my
> recent suggestions on how to eliminate 'codef'.

...

> Also note that the final calling parentheses are mandatory, so that for
> example
> the following is invalid syntax:
>
> ::
>
>     y = cocall f     # INVALID

If this is the case, why not say "y = cocall f with (x)" or something
like that instead of f(x)? When I see f(x), I think, "OK, so it's
going to call f with the argument x then it will do something
cocall-ish to it." But actually the reality is first it looks up the
__cocall__ on f and only then passes in the args and kwargs. Of
course, anyone who studies cocalls will learn how it really works
before they get too deep into the theory behind them, but I still
think it's kind of misleading for people new to the world of
cocalling.

I also suspect that Nick is right that we should probably try out
something like "y = yield from cocall(f, *args, **kwargs)" and see if
it catches on before resorting to a new keyword?

-- Carl Johnson


From greg.ewing at canterbury.ac.nz  Thu Aug 12 00:42:33 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 12 Aug 2010 10:42:33 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>
Message-ID: <4C632759.2010707@canterbury.ac.nz>

Nick Coghlan wrote:

> You want more than the iterator protocol - you want the whole
> generator API (i.e. send() and throw() as well as __next__()).

I can't see why that should be necessary. A 'yield from'
manages to degrade gracefully when given something that
only supports next(), and there's no reason a cocall can't
do the same.
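
(Under PEP 380, for example, delegation to a plain iterator that has no
send() or throw() already works; a tiny sketch:)

    def delegate(it):
        return (yield from it)               # only next() is required of 'it'

    print(list(delegate(iter([1, 2, 3]))))   # -> [1, 2, 3]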

-- 
Greg


From greg.ewing at canterbury.ac.nz  Thu Aug 12 01:17:28 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 12 Aug 2010 11:17:28 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
Message-ID: <4C632F88.9070405@canterbury.ac.nz>

Paul Du Bois wrote:
> Perhaps it's the "cocall" keyword that could be removed, rather than
> "codef"? A revised semantics for "codef" could cause the body to use
> the most recent PEP revision's "__cocall__ or __call__" mechanism for
> all function calls, perhaps at the expense of some runtime efficiency.

Thinking about it overnight, I came to exactly the same
conclusion! This is actually the idea I had in mind right
back at the beginning.

I think there are some good arguments in favour of it.
If cocall sites have to be marked in some way, then when
you change your mind about whether a function is a
generator or not, you have to track down all the places
where the function is called and change them as well. If
that causes the enclosing functions to also become
generators, then you have to track down all the calls to
them as well, etc. etc. I can envisage this being a
major hassle in a large program.

Whereas if we mark the functions instead of the calls,
although some changes will still be necessary, there
ought to be far fewer of them. Generally one tends to
call functions more often than one defines them.

Also, it seems to me that changing 'def' into 'codef' is
a far less intrusive change than sprinkling some kind of
call marker throughout the body. It means we don't
have to invent a weird new calling syntax. It also means
you can read the function and think about it as normal
code instead of having to be aware at every point of
what kind of thing you're calling. It's more duck-typish.

So now I'm thinking that my original instinct was right:
cofunctions should be functions that call things in a
different way.

-- 
Greg


From ghazel at gmail.com  Thu Aug 12 06:16:48 2010
From: ghazel at gmail.com (ghazel at gmail.com)
Date: Wed, 11 Aug 2010 21:16:48 -0700
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C632F88.9070405@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com> 
	<4C632F88.9070405@canterbury.ac.nz>
Message-ID: <AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>

On Wed, Aug 11, 2010 at 4:17 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Paul Du Bois wrote:
>>
>> Perhaps it's the "cocall" keyword that could be removed, rather than
>> "codef"? A revised semantics for "codef" could cause the body to use
>> the most recent PEP revision's "__cocall__ or __call__" mechanism for
>> all function calls, perhaps at the expense of some runtime efficiency.
>
> Thinking about it overnight, I came to exactly the same
> conclusion! This is actually the idea I had in mind right
> back at the beginning.
>
> I think there are some good arguments in favour of it.
> If cocall sites have to be marked in some way, then when
> you change your mind about whether a function is a
> generator or not, you have to track down all the places
> where the function is called and change them as well.
>
> If that causes the enclosing functions to also become
> generators, then you have to track down all the calls to
> them as well, etc. etc. I can envisage this being a
> major hassle in a large program.
>
> Whereas if we mark the functions instead of the calls,
> although some changes will still be necessary, there
> ought to be far fewer of them. Generally one tends to
> call functions more often than one defines them.

There still has to be some weird way to call cofunctions from regular
functions. Changing the single definition of a function from "def" to
"codef" means revisiting all the sites which call that function in the
body of regular functions, and pushing the change up the stack as you
mentioned.

I think this is the wrong direction. But, if you want to head that
way, why not make calling a cofunction from a function also
transparent, and exhaust the iterator when the function is called?
Then it never matters which kind of function or cofunction you call
from anywhere, you can just magically change the control flow by
adding "co" to your "def"s.

> Also, it seems to me that changing 'def' into 'codef' is
> a far less intrusive change than sprinkling some kind of
> call marker throughout the body. It means we don't
> have to invent a weird new calling syntax. It also means
> you can read the function and think about it as normal
> code instead of having to be aware at every point of
> what kind of thing you're calling. It's more duck-typish.

Again, marking the points at which your function could be suspended is
a very important feature, in my mind. I would stick with explicitly
using "yield from" where needed rather than magically hiding it.

-Greg


From greg.ewing at canterbury.ac.nz  Thu Aug 12 08:31:52 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 12 Aug 2010 18:31:52 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
	<4C632F88.9070405@canterbury.ac.nz>
	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>
Message-ID: <4C639558.5020602@canterbury.ac.nz>

ghazel at gmail.com wrote:

> There still has to be some weird way to call cofunctions from regular
> functions. Changing the single definition of a function from "def" to
> "codef" means revisiting all the sites which call that function in the
> body of regular functions, and pushing the change up the stack as you
> mentioned.

Yes, but when using generators as coroutines, I believe
that invoking a coroutine from a non-coroutine will be
a relatively rare thing to do. Essentially you only do
it when starting a new coroutine, and most of the time
it can be hidden inside whatever library you're using
to schedule your coroutines.

In each of my scheduler examples, there is only one
place where this happens. It's not particularly weird,
either -- just a matter of wrapping costart() around
it, which is a normal function, no magic involved.
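
(Roughly, using the costart() builtin as defined in the PEP draft, the
simplest possible way to drive a coroutine from ordinary code might look
like this -- 'run' and 'main_task' are illustrative names only:)

    def run(cofunc, *args, **kwds):
        task = costart(cofunc, *args, **kwds)   # obtain the underlying iterator
        try:
            while True:
                next(task)        # drive it to its next suspension point
        except StopIteration:
            pass

    run(main_task)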

> I think this is the wrong direction. But, if you want to head that
> way, why not make calling a cofunction from a function also
> transparent, and exhaust the iterator when the function is called?

Because this is almost always the *wrong* thing to do.
The cofunction you're calling is expecting to be able to
suspend the whole stack of calls right back up to the
trampoline, and by automatically exhausting it you're
preventing it from being able to do so.

Calling a cofunction from a non-cofunction is overwhelmingly
likely to be an error, and should be reported as such.
For cases where you really do want to exhaust it, a function
could be provided for that purpose, but you should have
to make a conscious decision to use it.

> Again, marking the points at which your function could be suspended is
> a very important feature, in my mind.

I'm still very far from convinced about that. Or at least
I'm not convinced that the benefits of such awareness
justify the maintenance cost of keeping the call markers
up to date in the face of program changes.

Also, consider that if cocall is made to work on both
ordinary functions and cofunctions, there is nothing to
stop you from simply marking *every* call with cocall
just on the offchance. People being basically lazy, I
can well imagine someone doing this, and then they've
lost any suspendability-awareness benefit that the
call markers might bring.

Even if they don't go to that extreme, there is nothing
to ensure that cocall markers are removed when no longer
necessary, so redundant cocalls are likely to accumulate
over time, to give misleading indications to future
maintainers.

-- 
Greg


From ghazel at gmail.com  Thu Aug 12 08:44:56 2010
From: ghazel at gmail.com (ghazel at gmail.com)
Date: Wed, 11 Aug 2010 23:44:56 -0700
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C639558.5020602@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com> 
	<4C632F88.9070405@canterbury.ac.nz>
	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com> 
	<4C639558.5020602@canterbury.ac.nz>
Message-ID: <AANLkTikfuj+m83Q+h91v4+Ahc2M646=-fjV6wu4-bjh_@mail.gmail.com>

On Wed, Aug 11, 2010 at 11:31 PM, Greg Ewing
<greg.ewing at canterbury.ac.nz> wrote:
> ghazel at gmail.com wrote:
>
>> I think this is the wrong direction. But, if you want to head that
>> way, why not make calling a cofunction from a function also
>> transparent, and exhaust the iterator when the function is called?
>
> Because this is almost always the *wrong* thing to do.
> The cofunction you're calling is expecting to be able to
> suspend the whole stack of calls right back up to the
> trampoline, and by automatically exhausting it you're
> preventing it from being able to do so.

Right, this is also why you do not always join a thread immediately
after launching it. So if cofunctions have a special way of launching
and are preemptable at unmarked points and terminate at some time in
the future which is based on an indeterminate number of iterations,
why not just use threads? codef could simply be a decorator something
like this:

import threading

def codef(f):
    # Run f in a real thread and yield until it finishes.
    t = threading.Thread(target=f)
    t.start()
    while t.is_alive():
        t.join(0.010)
        yield


> Calling a cofunction from a non-cofunction is overwhelmingly
> likely to be an error, and should be reported as such.
> For cases where you really do want to exhaust it, a function
> could be provided for that purpose, but you should have
> to make a conscious decision to use it.
>
>> Again, marking the points at which your function could be suspended is
>> a very important feature, in my mind.
>
> I'm still very far from convinced about that. Or at least
> I'm not convinced that the benefits of such awareness
> justify the maintenance cost of keeping the call markers
> up to date in the face of program changes.
>
> Also, consider that if cocall is made to work on both
> ordinary functions and cofunctions, there is nothing to
> stop you from simply marking *every* call with cocall
> just on the offchance. People being basically lazy, I
> can well imagine someone doing this, and then they've
> lost any suspendability-awareness benefit that the
> call markers might bring.
>
> Even if they don't go to that extreme, there is nothing
> to ensure that cocall markers are removed when no longer
> necessary, so redundant cocalls are likely to accumulate
> over time, to give misleading indications to future
> maintainers.

The important thing about a cooperation point is that you are
specifying where it is safe to pause your function - even if it is not
paused. The accumulation of these could eventually be noisy, but it is
safe. If someone decorates all their calls with cocall, they have just
written a bunch of bugs, and it was hard to do that. Automatically
adding cocall everywhere creates a bunch of bugs which they are
unaware of.

-Greg


From ncoghlan at gmail.com  Thu Aug 12 14:12:35 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 12 Aug 2010 22:12:35 +1000
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C632759.2010707@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>
	<4C632759.2010707@canterbury.ac.nz>
Message-ID: <AANLkTimFvMDrfVtbVHXJmUKcmi3+3d5WSUPrr7kPuDrj@mail.gmail.com>

On Thu, Aug 12, 2010 at 8:42 AM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Nick Coghlan wrote:
>
>> You want more than the iterator protocol - you want the whole
>> generator API (i.e. send() and throw() as well as __next__()).
>
> I can't see why that should be necessary. A 'yield from'
> manages to degrade gracefully when given something that
> only supports next(), and there's no reason a cocall can't
> do the same.

Without send() and throw(), an object is just an iterator, never a
cofunction (as there is no way for it to make cooperative calls - you
need the extra two methods in order to receive the results of any such
calls). Implementing __cocall__ without yourself being able to make
cooperative calls doesn't make any sense.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia


From greg.ewing at canterbury.ac.nz  Thu Aug 12 14:51:01 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 Aug 2010 00:51:01 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <AANLkTimFvMDrfVtbVHXJmUKcmi3+3d5WSUPrr7kPuDrj@mail.gmail.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>
	<4C632759.2010707@canterbury.ac.nz>
	<AANLkTimFvMDrfVtbVHXJmUKcmi3+3d5WSUPrr7kPuDrj@mail.gmail.com>
Message-ID: <4C63EE35.6010805@canterbury.ac.nz>

Nick Coghlan wrote:

> Without send() and throw(), an object is just an iterator, never a
> cofunction (as there is no way for it to make cooperative calls - you
> need the extra two methods in order to receive the results of any such
> calls).

There are plenty of uses for cofunctions that never send or
receive any values using yield, but just use it as a suspension
point. In that case, send() is never used, only next(). And
I suspect that use of throw() will be even rarer.

-- 
Greg


From guido at python.org  Thu Aug 12 16:38:35 2010
From: guido at python.org (Guido van Rossum)
Date: Thu, 12 Aug 2010 07:38:35 -0700
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C639558.5020602@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
	<4C632F88.9070405@canterbury.ac.nz>
	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>
	<4C639558.5020602@canterbury.ac.nz>
Message-ID: <AANLkTikEaaXXUDx8Rm3vc7Nu=njXbUi4mZ32q6xZPEUd@mail.gmail.com>

On Wed, Aug 11, 2010 at 11:31 PM, Greg Ewing
<greg.ewing at canterbury.ac.nz> wrote:
> ghazel at gmail.com wrote:
>> Again, marking the points at which your function could be suspended is
>> a very important feature, in my mind.
>
> I'm still very far from convinced about that. Or at least
> I'm not convinced that the benefits of such awareness
> justify the maintenance cost of keeping the call markers
> up to date in the face of program changes.
>
> Also, consider that if cocall is made to work on both
> ordinary functions and cofunctions, there is nothing to
> stop you from simply marking *every* call with cocall
> just on the offchance. People being basically lazy, I
> can well imagine someone doing this, and then they've
> lost any suspendability-awareness benefit that the
> call markers might bring.
>
> Even if they don't go to that extreme, there is nothing
> to ensure that cocall markers are removed when no longer
> necessary, so redundant cocalls are likely to accumulate
> over time, to give misleading indications to future
> maintainers.

I'm with ghazel on this one. Long, long ago I used a system that
effectively used implicit cocalls. It was a threading OS with
non-preemptive scheduling, so instead of locking you'd just refrain
from calling any one of the (very few) syscalls that would allow
another thread to run. This worked fine when we just got started, but
as we started building more powerful abstractions, a common bug was
making a call to some abstraction which behind your back, sometimes,
perhaps inside more abstraction, would make a syscall. This was
nightmarish to debug (especially since it could happen that the
offending abstraction was maintained by someone else and had just
evolved to make its first syscall).

So, coming back to this, I think I am on the side of explicitly
marking cocalls. But whether it's better to use cocall or yield-from,
I don't know.

-- 
--Guido van Rossum (python.org/~guido)


From casey at pandora.com  Thu Aug 12 23:15:31 2010
From: casey at pandora.com (Casey Duncan)
Date: Thu, 12 Aug 2010 15:15:31 -0600
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C625949.1060803@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
Message-ID: <42FFF65E-8D42-4DB7-BCBD-833AAF6E04AC@pandora.com>

Apologies if this already exists, but for the benefit of those less enlightened, I think it would be very helpful if the PEP included or linked to an example of an algorithm implemented 3 ways:

- Straight python, no coroutines
- Coroutines implemented via "yield from"
- Coroutines implemented via "cocall"

iirc, the last two would not look much different, but maybe I'm mistaken.

As I understand it:

cocall f(x, y, z)

is sugar for:

yield from f.__cocall__(x, y, z)

and it now magically promotes the function that contains it to a cofunction (thus implementing __cocall__ for said function).

From what I understand, __cocall__ does not exist because you might want to also have __call__ with different behavior, but instead it exists to allow the "cocaller" to differentiate between cofunctions and normal functions? In theory though, I could implement an object myself that implemented both __call__ and __cocall__, correct? I suppose __cocall__ is to __call__ as __iter__ is to __call__ presently.

I'd say my main problem with this is conceptual complexity. Many folks have a hard time understanding generators, and the presence of the conceptually similar language concept of iterators doesn't help. This feels like yet another conceptually similar concept to generators, that just muddies the waters further.

What I'd love to see is a version of coroutines that isn't just sugared generators. It really seems to me that generators should be implemented on top of coroutines and the not the reverse. That would lead to a more linear path to understanding: iterators -> generators -> coroutines. This proposal doesn't feel like that to me, it feels more like an adjunct thing that uses the generator machinery for different ends.

I'm surely confused, but that's part of my point 8^)

-Casey

On Aug 11, 2010, at 2:03 AM, Greg Ewing wrote:

> Here's an updated version of the PEP reflecting my
> recent suggestions on how to eliminate 'codef'.
> 
> PEP: XXX
> Title: Cofunctions
> Version: $Revision$
> Last-Modified: $Date$
> Author: Gregory Ewing <greg.ewing at canterbury.ac.nz>
> Status: Draft
> Type: Standards Track
> Content-Type: text/x-rst
> Created: 13-Feb-2009
> Python-Version: 3.x
> Post-History:
> 
> 
> Abstract
> ========
> 
> A syntax is proposed for defining and calling a special type of generator
> called a 'cofunction'.  It is designed to provide a streamlined way of
> writing generator-based coroutines, and allow the early detection of
> certain kinds of error that are easily made when writing such code, which
> otherwise tend to cause hard-to-diagnose symptoms.
> 
> This proposal builds on the 'yield from' mechanism described in PEP 380,
> and describes some of the semantics of cofunctions in terms of it. However,
> it would be possible to define and implement cofunctions independently of
> PEP 380 if so desired.
> 
> 
> Specification
> =============
> 
> Cofunction definitions
> ----------------------
> 
> A cofunction is a special kind of generator, distinguished by the presence
> of the keyword ``cocall`` (defined below) at least once in its body. It may
> also contain ``yield`` and/or ``yield from`` expressions, which behave as
> they do in other generators.
> 
> From the outside, the distinguishing feature of a cofunction is that it cannot
> be called the same way as an ordinary function. An exception is raised if an
> ordinary call to a cofunction is attempted.
> 
> Cocalls
> -------
> 
> Calls from one cofunction to another are made by marking the call with
> a new keyword ``cocall``. The expression
> 
> ::
> 
>    cocall f(*args, **kwds)
> 
> is evaluated by first checking whether the object ``f`` implements
> a ``__cocall__`` method. If it does, the cocall expression is
> equivalent to
> 
> ::
> 
>    yield from f.__cocall__(*args, **kwds)
> 
> except that the object returned by __cocall__ is expected to be an
> iterator, so the step of calling iter() on it is skipped.
> 
> If ``f`` does not have a ``__cocall__`` method, or the ``__cocall__``
> method returns ``NotImplemented``, then the cocall expression is
> treated as an ordinary call, and the ``__call__`` method of ``f``
> is invoked.
> 
> Objects which implement __cocall__ are expected to return an object
> obeying the iterator protocol. Cofunctions respond to __cocall__ the
> same way as ordinary generator functions respond to __call__, i.e. by
> returning a generator-iterator.
> 
> Certain objects that wrap other callable objects, notably bound methods,
> will be given __cocall__ implementations that delegate to the underlying
> object.
> 
> Grammar
> -------
> 
> The full syntax of a cocall expression is described by the following
> grammar lines:
> 
> ::
> 
>    atom: cocall | <existing alternatives for atom>
>    cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
>    cotrailer: '[' subscriptlist ']' | '.' NAME
> 
> Note that this syntax allows cocalls to methods and elements of sequences
> or mappings to be expressed naturally. For example, the following are valid:
> 
> ::
> 
>    y = cocall self.foo(x)
>    y = cocall funcdict[key](x)
>    y = cocall a.b.c[i].d(x)
> 
> Also note that the final calling parentheses are mandatory, so that for example
> the following is invalid syntax:
> 
> ::
> 
>    y = cocall f     # INVALID
> 
> New builtins, attributes and C API functions
> --------------------------------------------
> 
> To facilitate interfacing cofunctions with non-coroutine code, there will
> be a built-in function ``costart`` whose definition is equivalent to
> 
> ::
> 
>    def costart(obj, *args, **kwds):
>        try:
>            m = obj.__cocall__
>        except AttributeError:
>            result = NotImplemented
>        else:
>            result = m(*args, **kwds)
>        if result is NotImplemented:
>            raise TypeError("Object does not support cocall")
>        return result
> 
> There will also be a corresponding C API function
> 
> ::
> 
>    PyObject *PyObject_CoCall(PyObject *obj, PyObject *args, PyObject *kwds)
> 
> It is left unspecified for now whether a cofunction is a distinct type
> of object or, like a generator function, is simply a specially-marked
> function instance. If the latter, a read-only boolean attribute
> ``__iscofunction__`` should be provided to allow testing whether a given
> function object is a cofunction.
> 
> 
> Motivation and Rationale
> ========================
> 
> The ``yield from`` syntax is reasonably self-explanatory when used for the
> purpose of delegating part of the work of a generator to another function. It
> can also be used to good effect in the implementation of generator-based
> coroutines, but it reads somewhat awkwardly when used for that purpose, and
> tends to obscure the true intent of the code.
> 
> Furthermore, using generators as coroutines is somewhat error-prone. If one
> forgets to use ``yield from`` when it should have been used, or uses it when it
> shouldn't have, the symptoms that result can be extremely obscure and confusing.
> 
> Finally, sometimes there is a need for a function to be a coroutine even though
> it does not yield anything, and in these cases it is necessary to resort to
> kludges such as ``if 0: yield`` to force it to be a generator.
> 
> The ``cocall`` construct addresses the first issue by making the syntax directly
> reflect the intent, that is, that the function being called forms part of a
> coroutine.
> 
> The second issue is addressed by making it impossible to mix coroutine and
> non-coroutine code in ways that don't make sense. If the rules are violated, an
> exception is raised that points out exactly what and where the problem is.
> 
> Lastly, the need for dummy yields is eliminated by making it possible for a
> cofunction to call both cofunctions and ordinary functions with the same syntax,
> so that an ordinary function can be used in place of a cofunction that yields
> zero times.
> 
> 
> Record of Discussion
> ====================
> 
> An earlier version of this proposal required a special keyword ``codef`` to be
> used in place of ``def`` when defining a cofunction, and disallowed calling an
> ordinary function using ``cocall``.  However, it became evident that these
> features were not necessary, and the ``codef`` keyword was dropped in the
> interests of minimising the number of new keywords required.
> 
> The use of a decorator instead of ``codef`` was also suggested, but the current
> proposal makes this unnecessary as well.
> 
> It has been questioned whether some combination of decorators and functions
> could be used instead of a dedicated ``cocall`` syntax. While this might be
> possible, to achieve equivalent error-detecting power it would be necessary
> to write cofunction calls as something like
> 
> ::
> 
>    yield from cocall(f)(args)
> 
> making them even more verbose and inelegant than an unadorned ``yield from``.
> It is also not clear whether it is possible to achieve all of the benefits of
> the cocall syntax using this kind of approach.
> 
> 
> Prototype Implementation
> ========================
> 
> An implementation of an earlier version of this proposal in the form of patches
> to Python 3.1.2 can be found here:
> 
> http://www.cosc.canterbury.ac.nz/greg.ewing/python/generators/cofunctions.html
> 
> If this version of the proposal is received favourably, the implementation will
> be updated to match.
> 
> 
> Copyright
> =========
> 
> This document has been placed in the public domain.
> 
> 
> 
> ..
>   Local Variables:
>   mode: indented-text
>   indent-tabs-mode: nil
>   sentence-end-double-space: t
>   fill-column: 70
>   coding: utf-8
>   End:
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas



From ncoghlan at gmail.com  Thu Aug 12 23:39:55 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 13 Aug 2010 07:39:55 +1000
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C63EE35.6010805@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>
	<4C632759.2010707@canterbury.ac.nz>
	<AANLkTimFvMDrfVtbVHXJmUKcmi3+3d5WSUPrr7kPuDrj@mail.gmail.com>
	<4C63EE35.6010805@canterbury.ac.nz>
Message-ID: <AANLkTim+yAxVuhTyPekB83n58R_kUFRoPYW7pzJ_xpf8@mail.gmail.com>

On Thu, Aug 12, 2010 at 10:51 PM, Greg Ewing
<greg.ewing at canterbury.ac.nz> wrote:
> Nick Coghlan wrote:
>
>> Without send() and throw(), an object is just an iterator, never a
>> cofunction (as there is no way for it to make cooperative calls - you
>> need the extra two methods in order to receive the results of any such
>> calls).
>
> There are plenty of uses for cofunctions that never send or
> receive any values using yield, but just use it as a suspension
> point. In that case, send() is never used, only next(). And
> I suspect that use of throw() will be even rarer.

Could you name some of those uses please? If you aren't getting
answers back, they sound like ordinary iterators to me. The whole
*point* of cofunctions to my mind is that they let you do things like
async I/O (where you expect a result back, in the form of a return
value or an exception) in a way that feels more like normal imperative
programming.

So, you may consider there to be plenty of uses for iterate-only
cofunctions, but I come up blank.

Cheers,
Nick.

-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia


From ncoghlan at gmail.com  Fri Aug 13 00:00:43 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 13 Aug 2010 08:00:43 +1000
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <AANLkTim+yAxVuhTyPekB83n58R_kUFRoPYW7pzJ_xpf8@mail.gmail.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>
	<4C632759.2010707@canterbury.ac.nz>
	<AANLkTimFvMDrfVtbVHXJmUKcmi3+3d5WSUPrr7kPuDrj@mail.gmail.com>
	<4C63EE35.6010805@canterbury.ac.nz>
	<AANLkTim+yAxVuhTyPekB83n58R_kUFRoPYW7pzJ_xpf8@mail.gmail.com>
Message-ID: <AANLkTi=gZ3X4aHAfXi7+_sXy7Mrvc=zs1w_tWG1+8S4z@mail.gmail.com>

On Fri, Aug 13, 2010 at 7:39 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> On Thu, Aug 12, 2010 at 10:51 PM, Greg Ewing
> <greg.ewing at canterbury.ac.nz> wrote:
>> Nick Coghlan wrote:
>>
>>> Without send() and throw(), an object is just an iterator, never a
>>> cofunction (as there is no way for it to make cooperative calls - you
>>> need the extra two methods in order to receive the results of any such
>>> calls).
>>
>> There are plenty of uses for cofunctions that never send or
>> receive any values using yield, but just use it as a suspension
>> point. In that case, send() is never used, only next(). And
>> I suspect that use of throw() will be even rarer.
>
> Could you name some of those uses please? If you aren't getting
> answers back, they sound like ordinary iterators to me. The whole
> *point* of cofunctions to my mind is that they let you do things like
> async I/O (where you expect a result back, in the form of a return
> value or an exception) in a way that feels more like normal imperative
> programming.
>
> So, you may consider there to be plenty of uses for iterate-only
> cofunctions, but I come up blank.

At the very least, a non-generator cofunction will need to offer
close() and __del__() (or its weakref equivalent) to release resources
in the event of an exception in any called cofunctions (independent of
any expected exceptions, almost anything can throw KeyboardInterrupt).

I just don't see how further blurring the lines between cofunctions
and ordinary generators is helping here. Providing dummy
implementations of send() and throw() that ignore their arguments and
devolve to next() is trivial, while still making the conceptual
separation clearer. PEP 342 is *called* "Coroutines via enhanced
generators", and it still seems to me that the usage of send() and
throw() is one of the key features distinguishing a cooperative
scheduler from ordinary iteration.

Cheers,
Nick.

-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia


From greg.ewing at canterbury.ac.nz  Fri Aug 13 01:57:39 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 Aug 2010 11:57:39 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <AANLkTikEaaXXUDx8Rm3vc7Nu=njXbUi4mZ32q6xZPEUd@mail.gmail.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
	<4C632F88.9070405@canterbury.ac.nz>
	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>
	<4C639558.5020602@canterbury.ac.nz>
	<AANLkTikEaaXXUDx8Rm3vc7Nu=njXbUi4mZ32q6xZPEUd@mail.gmail.com>
Message-ID: <4C648A73.60200@canterbury.ac.nz>

Guido van Rossum wrote:

> I'm with ghazel on this one. Long, long ago I used a system that
> effectively used implicit cocalls. It was a threading OS with
> non-preemptive scheduling, so instead of locking you'd just refrain
> from calling any one of the (very few) syscalls that would allow
> another thread to run.

What this says to me is that the mindset of "I'm using
cooperative threading, so I don't need to bother with
locks" is misguided. Even with non-preemptive scheduling,
it's still a good idea to organise your program so that
threads interact only at well defined points using
appropriate synchronisation structures, because it
makes the program easier to reason about.

Anyway, the scheme I'm proposing is not the same as the
scenario you describe above. There are some things you
can be sure of: a plain function (defined with 'def')
won't ever block. A function defined with 'codef'
*could* block, so this serves as a flag that you need
to be careful what you do inside it.

If you need to make sure that some section of code
can't block, you can factor it out into a plain function
and put a comment at the top saying "Warning -- don't
ever change this into a cofunction!"

So you effectively have a way of creating a critical
section, and a mechanism that will alert you if anything
changes in a way that would break it.
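A trivial sketch of that pattern (names invented for illustration only):

   def transfer(src, dst, amount):
       # Warning -- don't ever change this into a cofunction!
       # A plain 'def' can never suspend, so other coroutines always
       # see these two updates together.
       src.balance -= amount
       dst.balance += amount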

I think this is a better approach than relying on call
markers to alert you of potential blockages. Consider
that to verify whether a stretch of code inside a
cofunction is non-blocking that way, you need to scan
it and examine every function call to ensure that it
isn't marked. Whereas by looking up the top and seeing
that it starts with 'def', you can tell immediately
that blocking is impossible.

-- 
Greg


From vano at mail.mipt.ru  Fri Aug 13 03:02:14 2010
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Fri, 13 Aug 2010 05:02:14 +0400
Subject: [Python-ideas] Cofunctions/yield from -> fibers
Message-ID: <216611029.20100813050214@mail.mipt.ru>

The original proposal for introducing 'yield from' (which is still
in PEP 380's name) was to delegate a part of generator's work to another generator.
However, in later discussions, the focus totally shifted to
cooperative multitasking.

In the example code Greg has given in
http://mail.python.org/pipermail/python-ideas/2010-August/007927.html ,

there's not a single use case for delegating! 'yield from's essentially replace
normal function calls!
All Greg uses this stuff for is to manually switch between `threads'
simulating individual customers! The _programming_logic_ is plain function calls,
yields/cocalls are just a crutch (and quite an ugly one) to release a time slot.


So, in fact, this whole `yield from/coroutine' effort is an attempt to
inseparably mix two very loosely-related subjects:

- code delegation (sequence in which sequential/closely-related code is executed)
- execution scheduling (sequence in which parallel/loosely-related/unrelated code is executed)

in the form 'a function call _may_be_ a scheduling event at the same time'!


That's why it all feels so `clumsy/weird/twisted/fallacious'!
Cooperative threads must be switched somehow but choosing such a
quirky and implicit technique for that is completely against Python
Zen (violates about half of the items :^) )!


Instead, if it's cooperative multitasking you play with,
switching must be independent from other activity and as explicit as possible.
There's a technique just for that called 'fibers'
(MS fiber API: http://msdn.microsoft.com/en-us/library/ms682661.aspx ).

In short:
- ConvertThreadToFiber() decorates current thread as an initial fiber
- CreateFiber (*function, *parameter)->ID creates a new fiber
- SwitchToFiber(ID) switches execution to another fiber.
- ExitFiber() exits a fiber. In Python, the function may just return,
  as with threads.

As the goal of fibers is the same as that of threads, it's reasonable to
borrow from the threading API too, perhaps duplicating its interface
where applicable.

And like cofunctions, these fibers are just what
http://en.wikipedia.org/wiki/Coroutine describes. With interacting coroutines,
there is no execution stack anymore: there are 'calls' but no 'returns'!
It's essentially like passing messages back and forth as in win32k.


`Yield from's are still valid. But only as code delegation technique, i.e. a
shortcut to `for i in subgen(): yield i'.
The latter statement looks brief enough for me to discard the proposal
altogether. Send() and the like don't seem to fit with what
generators were originally intended to do - producing sequences of
arbitrary or unknown length.





From greg.ewing at canterbury.ac.nz  Fri Aug 13 04:09:47 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 Aug 2010 14:09:47 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <AANLkTikfuj+m83Q+h91v4+Ahc2M646=-fjV6wu4-bjh_@mail.gmail.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
	<4C632F88.9070405@canterbury.ac.nz>
	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>
	<4C639558.5020602@canterbury.ac.nz>
	<AANLkTikfuj+m83Q+h91v4+Ahc2M646=-fjV6wu4-bjh_@mail.gmail.com>
Message-ID: <4C64A96B.1030808@canterbury.ac.nz>

On 12/08/10 18:44, ghazel at gmail.com wrote:

> why not just use threads?

One reason not to use threads is that they're fairly heavyweight.
They use OS resources, and each one needs its own C stack that
has to be big enough for everything it might want to do. Switching
between threads can be slow, too.

In an application that requires thousands of small, cooperating
processes, threads are not a good solution. And applications
like that do exist -- discrete-event simulation is one example.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Aug 13 04:34:05 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 Aug 2010 14:34:05 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <AANLkTim+yAxVuhTyPekB83n58R_kUFRoPYW7pzJ_xpf8@mail.gmail.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>
	<4C632759.2010707@canterbury.ac.nz>
	<AANLkTimFvMDrfVtbVHXJmUKcmi3+3d5WSUPrr7kPuDrj@mail.gmail.com>
	<4C63EE35.6010805@canterbury.ac.nz>
	<AANLkTim+yAxVuhTyPekB83n58R_kUFRoPYW7pzJ_xpf8@mail.gmail.com>
Message-ID: <4C64AF1D.7050105@canterbury.ac.nz>

On 13/08/10 09:39, Nick Coghlan wrote:
> On Thu, Aug 12, 2010 at 10:51 PM, Greg Ewing
> <greg.ewing at canterbury.ac.nz>  wrote:

>> There are plenty of uses for cofunctions that never send or
>> receive any values using yield
>
> Could you name some of those uses please? If you aren't getting
> answers back, they sound like ordinary iterators to me. The whole
> *point* of cofunctions to my mind is that they let you do things like
> async I/O (where you expect a result back, in the form of a return
> value or an exception) in a way that feels more like normal imperative
> programming.

I provided an example of doing exactly that during the
yield-from debate. A full discussion can be found here:

http://www.cosc.canterbury.ac.nz/greg.ewing/python/generators/yf_current/Examples/Scheduler/scheduler.txt

Are you perhaps confusing the value produced by 'yield'
with the function return value of a cofunction or a
generator used with yield-from? They're different things,
and it's the return value that gets seen by the function
doing the cocall or yield-from. That's what enables you
to think you're writing in a normal imperative style.

In the above example, for instance, I define a function
sock_readline() that waits for data to arrive on a socket,
reads it and returns it to the caller. It's used like
this:

   line = yield from sock_readline(sock)

or if you're using cofunctions,

   line = cocall sock_readline(sock)

The definition of sock_readline looks like this:

   def sock_readline(sock):
     buf = ""
     while buf[-1:] != "\n":
       block_for_reading(sock)
       yield
       data = sock.recv(1024)
       if not data:
         break
       buf += data
     if not buf:
       close_fd(sock)
     return buf

The 'yield' in there is what suspends the coroutine, and
it neither sends nor receives any value. The data read from
the socket is returned to the caller by the return
statement at the end. [Clarification: block_for_reading
doesn't actually suspend, it just puts the current
coroutine on a list to be woken up when the socket is
ready.]
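To make the surrounding machinery concrete, here is a minimal sketch of
how block_for_reading() and a driving loop could fit together. It is
only an illustration -- the actual scheduler is in the scheduler.txt
linked above, and the helper names here (run_queue, reading, schedule)
are made up:

   import select

   current = None     # coroutine currently being resumed
   run_queue = []     # coroutines ready to run
   reading = {}       # socket -> coroutine waiting for it to be readable

   def block_for_reading(sock):
       # Doesn't suspend anything itself; it just records that the
       # current coroutine wants waking when sock becomes readable.
       reading[sock] = current

   def schedule(coro):
       run_queue.append(coro)

   def run():
       global current
       while run_queue or reading:
           if not run_queue:
               rlist, _, _ = select.select(list(reading), [], [])
               for sock in rlist:
                   run_queue.append(reading.pop(sock))
           current = run_queue.pop(0)
           try:
               next(current)               # run until the next bare 'yield'
           except StopIteration:
               continue                    # coroutine finished
           if current not in reading.values():
               run_queue.append(current)   # still runnable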

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Aug 13 05:48:52 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 Aug 2010 15:48:52 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <AANLkTi=gZ3X4aHAfXi7+_sXy7Mrvc=zs1w_tWG1+8S4z@mail.gmail.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>
	<4C632759.2010707@canterbury.ac.nz>
	<AANLkTimFvMDrfVtbVHXJmUKcmi3+3d5WSUPrr7kPuDrj@mail.gmail.com>
	<4C63EE35.6010805@canterbury.ac.nz>
	<AANLkTim+yAxVuhTyPekB83n58R_kUFRoPYW7pzJ_xpf8@mail.gmail.com>
	<AANLkTi=gZ3X4aHAfXi7+_sXy7Mrvc=zs1w_tWG1+8S4z@mail.gmail.com>
Message-ID: <4C64C0A4.9040406@canterbury.ac.nz>

On 13/08/10 10:00, Nick Coghlan wrote:

> At the very least, a non-generator cofunction will need to offer
> close() and __del__() (or its weakref equivalent) to release resources
> in the event of an exception in any called cofunctions

But if it doesn't call any other cofunctions and doesn't
use any resources besides memory, there's no need for it
to provide a close() method.

> Providing dummy
> implementations of send() and throw() that ignore their arguments and
> devolve to next() is trivial,

But it seems perverse to force people to provide such
implementations, given that yield-from is defined in
such a way that the same effect results from simply
omitting those methods.

> PEP 342 is *called* "Coroutines via enhanced
> generators", and it still seems to me that the usage of send() and
> throw() is one of the key features distinguishing a cooperative
> scheduler from ordinary iteration.

Ahhh.....

I've just looked at that PEP, and I can now see where the
confusion is coming from.

PEP 342 talks about using yield to communicate
instructions to and from a coroutine-driving trampoline.
However, that entire technique is a *workaround* for
not having something like yield-from.

If you do have yield-from, then none of that is necessary,
and you don't *need* generators to be "enhanced" with a
send() facility in order to do coroutine scheduling --
plain next() is more than sufficient, as I hope my socket
example demonstrates.

A similar thing applies to throw(). The PEP 342 motivation
for it is so that the trampoline can propagate an
exception raised in an inner generator back up the
call stack, by manually throwing it into each generator
along the way. But this technique is rendered obsolete by
yield-from as well, because any exception occurring in
an inner generator propagates up the yield-from chain
automatically, without having to do anything special.

-- 
Greg



From ncoghlan at gmail.com  Fri Aug 13 06:13:50 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 13 Aug 2010 14:13:50 +1000
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C64AF1D.7050105@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>
	<4C632759.2010707@canterbury.ac.nz>
	<AANLkTimFvMDrfVtbVHXJmUKcmi3+3d5WSUPrr7kPuDrj@mail.gmail.com>
	<4C63EE35.6010805@canterbury.ac.nz>
	<AANLkTim+yAxVuhTyPekB83n58R_kUFRoPYW7pzJ_xpf8@mail.gmail.com>
	<4C64AF1D.7050105@canterbury.ac.nz>
Message-ID: <AANLkTin-D+ds5RobjQFuV_rWi-NVWdYz_39g6P7p+oa1@mail.gmail.com>

On Fri, Aug 13, 2010 at 12:34 PM, Greg Ewing
<greg.ewing at canterbury.ac.nz> wrote:
> Are you perhaps confusing the value produced by 'yield'
> with the function return value of a cofunction or a
> generator used with yield-from? They're different things,
> and it's the return value that gets seen by the function
> doing the cocall or yield-from. That's what enables you
> to think you're writing in a normal imperative style.

I'll admit that I was forgetting the difference between the return
value of yield and that of yield from. So send() isn't essential.

> def sock_readline(sock):
>   buf = ""
>   while buf[-1:] != "\n":
>     block_for_reading(sock)
>     yield
>     data = sock.recv(1024)
>     if not data:
>       break
>     buf += data
>   if not buf:
>     close_fd(sock)
>   return buf
>
> The 'yield' in there is what suspends the coroutine, and
> it neither sends nor receives any value. The data read from
> the socket is returned to the caller by the return
> statement at the end. [Clarification: block_for_reading
> doesn't actually suspend, it just puts the current
> coroutine on a list to be woken up when the socket is
> ready.]

But the "yield" is also the point that allows the scheduler to throw
in an exception to indicate that the socket has gone away and that the
error should be propagated up the coroutine stack.

You need throw() at least to participate in proper exception handling.

Cheers,
Nick.

-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia


From greg.ewing at canterbury.ac.nz  Fri Aug 13 06:55:14 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 Aug 2010 16:55:14 +1200
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <216611029.20100813050214@mail.mipt.ru>
References: <216611029.20100813050214@mail.mipt.ru>
Message-ID: <4C64D032.1070102@canterbury.ac.nz>

On 13/08/10 13:02, Ivan Pozdeev wrote:
> The original proposal for introducing 'yield from' (which is still
> in PEP 380's name) was to delegate a part of generator's work to another generator.
> However, in later discussions, the focus totally shifted to
> cooperative multitasking.

I still regard delegation as an important use case for
yield-from. I haven't dwelled much on delegation of plain
value-yielding generators in my examples because it seems
rather obvious and straightforward. My coroutine examples
are meant to show the motivation for some of the less-obvious
aspects of yield-from, such as the ability to return a value.

> In the example code Greg has given in
> http://mail.python.org/pipermail/python-ideas/2010-August/007927.html ,
>
> there's not a single use case for delegating!

That seems like a strange statement to me. There is delegation
going on all over the place in all my examples. Whenever a
generator in one of my coroutine uses yield-from to call
another one, it's delegating some of the work of that
coroutine to that generator.

The whole point of yield-from, and, to an even greater
extent, cofunctions, is to make delegation between suspendable
functions look as similar as possible to delegation between
ordinary ones.

> Instead, if it's cooperative multitasking you play with,
> switching must be independent from other activity and as explicit as possible.
> There's a technique just for that called 'fibers'
> (MS fiber API: http://msdn.microsoft.com/en-us/library/ms682661.aspx ).

I'm confused about what you're asking for. If you're
complaining that it's weird having to write 'yield from'
or 'cocall' instead of a plain function call, then I
actually agree with you. I'm trying to move cofunctions
in the direction of eliminating such call markers
completely, but I'm meeting quite a bit of resistance.

If you're saying that switching between coroutines should
involve explicitly nominating which coroutine to run next,
it would be straightforward to write a scheduler that
works this way. In fact, I can write one right now, it
would look something like this:

   def switch_to_fibre(x):
     global current_fibre
     current_fibre = x

   def main_loop():
     while 1:
       next(current_fibre)

However, it seems to me that a function such as
switch_to_fibre() is mainly useful as a primitive on
which to build higher-level scheduling strategies.
It would be possible to reformulate all of my example
schedulers to be based on such a primitive, but I'm
not sure there would be any point in doing so.
Generators already have their own primitive notion
of "run this one next", i.e. calling next() on
them, and it seems to be quite sufficient.

> `Yield from's are still valid. But only as code delegation technique, i.e. a
> shortcut to `for i in subgen(): yield i'.
> The latter statement looks brief enough for me to discard the proposal
> altogether.

But it only delegates next(), not send(), throw() or
close(). If those things are considered important for
single-function generators to have, then they are
presumably equally important for generators that are
spread over more than one function.
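A small self-contained example of the difference (assuming PEP 380 is
available, so 'yield from' exists):

   def inner():
       x = yield 1
       yield x

   def outer_for():
       for v in inner():      # delegates next() only
           yield v

   def outer_yf():
       yield from inner()     # also delegates send(), throw(), close()

   g = outer_for()
   next(g)
   print(g.send(42))    # prints None -- the 42 never reaches inner()

   g = outer_yf()
   next(g)
   print(g.send(42))    # prints 42 -- the value is passed through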

Also, yield-from is *much* more efficient than the
equivalent for-loop -- less than 10% of the overhead
in my current implementation.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Aug 13 07:05:23 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 Aug 2010 17:05:23 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <AANLkTin-D+ds5RobjQFuV_rWi-NVWdYz_39g6P7p+oa1@mail.gmail.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTin1xraNjOpQvryu97TnmU8q0qmaCT7LNBTBSPVv@mail.gmail.com>
	<4C632759.2010707@canterbury.ac.nz>
	<AANLkTimFvMDrfVtbVHXJmUKcmi3+3d5WSUPrr7kPuDrj@mail.gmail.com>
	<4C63EE35.6010805@canterbury.ac.nz>
	<AANLkTim+yAxVuhTyPekB83n58R_kUFRoPYW7pzJ_xpf8@mail.gmail.com>
	<4C64AF1D.7050105@canterbury.ac.nz>
	<AANLkTin-D+ds5RobjQFuV_rWi-NVWdYz_39g6P7p+oa1@mail.gmail.com>
Message-ID: <4C64D293.8090908@canterbury.ac.nz>

On 13/08/10 16:13, Nick Coghlan wrote:

> But the "yield" is also the point that allows the scheduler to throw
> in an exception to indicate that the socket has gone away and that the
> error should be propagated up the coroutine stack.

If by "gone away" you mean that the other end has been closed,
that's already taken care of. The socket becomes ready to read,
the coroutine wakes up, tries to read it and discovers the EOF
condition for itself.

The same thing applies if anything happens to the socket that
would cause an exception if you tried to read it. All the
scheduler needs to do is notice that the socket is being
reported as ready by select(). When the coroutine tries to deal
with it, the exception will occur in that coroutine and be
propagated within it.

I'm not saying that no scheduler will ever want to throw an
exception into a coroutine, but it's not needed in this case.

-- 
Greg



From debatem1 at gmail.com  Fri Aug 13 07:07:31 2010
From: debatem1 at gmail.com (geremy condra)
Date: Thu, 12 Aug 2010 22:07:31 -0700
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <4C64D032.1070102@canterbury.ac.nz>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
Message-ID: <AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>

On Thu, Aug 12, 2010 at 9:55 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> On 13/08/10 13:02, Ivan Pozdeev wrote:
>>
>> The original proposal for introducing 'yield from' (which is still
>> in PEP 380's name) was to delegate a part of generator's work to another
>> generator.
>> However, in later discussions, the focus totally shifted to
>> cooperative multitasking.
>
> I still regard delegation as an important use case for
> yield-from. I haven't dwelled much on delegation of plain
> value-yielding generators in my examples because it seems
> rather obvious and straightforward. My coroutine examples
> are meant to show the motivation for some of the less-obvious
> aspects of yield-from, such as the ability to return a value.

I won't pretend to understand the current discussion or its
motivation, but I know I would appreciate it if examples of the
obvious-and-straightforward variety were added.

Geremy Condra


From ncoghlan at gmail.com  Fri Aug 13 07:35:09 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 13 Aug 2010 15:35:09 +1000
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
Message-ID: <AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>

On Fri, Aug 13, 2010 at 3:07 PM, geremy condra <debatem1 at gmail.com> wrote:
> On Thu, Aug 12, 2010 at 9:55 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
>> On 13/08/10 13:02, Ivan Pozdeev wrote:
>>>
>>> The original proposal for introducing 'yield from' (which is still
>>> in PEP 380's name) was to delegate a part of generator's work to another
>>> generator.
>>> However, in later discussions, the focus totally shifted to
>>> cooperative multitasking.
>>
>> I still regard delegation as an important use case for
>> yield-from. I haven't dwelled much on delegation of plain
>> value-yielding generators in my examples because it seems
>> rather obvious and straightforward. My coroutine examples
>> are meant to show the motivation for some of the less-obvious
>> aspects of yield-from, such as the ability to return a value.
>
> I won't pretend to understand the current discussion or its
> motivation, but I know I would appreciate it if examples of the
> obvious-and-straightforward variety were added.

Yes, it would be nice if PEP 380's generator delegation forest didn't
get lost in the cofunction trees :)

I think the cofunction discussion suggests that there are some very
different possible answers as to what is the scheduler's
responsibility and what is the responsibility of the individual
coroutines. Picking one of them as a winner by blessing it with syntax
seems rather premature.

Cheers,
Nick.

-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia


From scott+python-ideas at scottdial.com  Fri Aug 13 08:52:47 2010
From: scott+python-ideas at scottdial.com (Scott Dial)
Date: Fri, 13 Aug 2010 02:52:47 -0400
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
References: <216611029.20100813050214@mail.mipt.ru>	<4C64D032.1070102@canterbury.ac.nz>	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
Message-ID: <4C64EBBF.4060601@scottdial.com>

On 8/13/2010 1:35 AM, Nick Coghlan wrote:
> On Fri, Aug 13, 2010 at 3:07 PM, geremy condra <debatem1 at gmail.com> wrote:
>> I won't pretend to understand the current discussion or its
>> motivation, but I know I would appreciate it if examples of the
>> obvious-and-straightforward variety were added.
> 
> Yes, it would be nice if PEP 380's generator delegation forest didn't
> get lost in the cofunction trees :)

Yes, I think even something as trivial as an example of in-order iteration
over a binary tree should be included since it is accessible and the
benefits of readability, efficiency, and correctness are apparent:

class BinaryTree:
    def __init__(self, left=None, us=None, right=None):
        self.left = left
        self.us = us
        self.right = right

    def __iter__(self):
        if self.left:
            yield from self.left
        if self.us:
            yield self.us
        if self.right:
            yield from self.right

You can point out that "yield from" prevents a recursion depth problem
while also being agnostic to whether left/right is also a BinaryTree
object (e.g., a tuple or list or some other user-defined type works just
as well as an iterable leaf) -- a feat that would be rather complicated
and verbose otherwise. As a bonus, the run-time of such an iteration is
faster due to the flattening optimization that is only possible with
special syntax.

-- 
Scott Dial
scott at scottdial.com
scodial at cs.indiana.edu


From greg.ewing at canterbury.ac.nz  Fri Aug 13 09:36:56 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 Aug 2010 19:36:56 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <42FFF65E-8D42-4DB7-BCBD-833AAF6E04AC@pandora.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<42FFF65E-8D42-4DB7-BCBD-833AAF6E04AC@pandora.com>
Message-ID: <4C64F618.2010008@canterbury.ac.nz>

Casey Duncan wrote:
> Apologies if this already exists, but for the benefit of those less
> enlightened, I think it would be very helpful if the pep included or linked
> to an example of an algorithm implemented 3 ways:

There isn't currently a single one implemented all three
ways, but my parser example is implemented with plain Python
and yield-from, and the philosophers and socket server are
implemented using yield-from and cofunctions.

http://www.cosc.canterbury.ac.nz/greg.ewing/python/generators/

> iirc, the last two would not look much different, but maybe I'm mistaken.

You're not mistaken -- mainly it's just a matter of replacing
'yield from' with 'cocall'. If the implicit-cocalling version
of cofunctions gains sway, it would be more different --
all the 'yield from's would disappear, and some function
definitions would change from 'def' to 'codef'.
> 
> As I understand it:
> 
> cocall f(x, y, z)
> 
> is sugar for:
> 
> yield from f.__cocall__(x, y, z)
> 
> and it now magically promotes the function that contains it to a cofunction
> (thus implementing __cocall__ for said function).

That's essentially correct as the PEP now stands.

> From what I understand, __cocall__ does not exist because you might want to
> also have __call__ with different behavior, but instead it exists to allow
> the "cocaller" to differentiate between cofunctions and normal functions?

Yes, that's right. A cofunction's __cocall__ method does
exactly the same thing as a normal generator's __call__
method does.

> In
> theory though, I could implement an object myself that implemented both
> __call__ and __cocall__, correct?

You could, and in fact one version of the cofunctions
proposal suggests making ordinary functions behave as
though they did implement both, with __cocall__ returning
an iterator that yields zero times.

There would be nothing to stop you creating an object that
had arbitrarily different behaviour for __call__ and __cocall__
either, although I'm not sure what use such an object would
be.
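A rough sketch of such a dual-protocol wrapper for an ordinary function
(purely illustrative, not part of the proposal; it assumes PEP 380
return-value semantics so that the cocall sees the function's result):

   class cocallable:
       def __init__(self, func):
           self.func = func

       def __call__(self, *args, **kwds):
           return self.func(*args, **kwds)

       def __cocall__(self, *args, **kwds):
           return self._zero_yield(*args, **kwds)

       def _zero_yield(self, *args, **kwds):
           return self.func(*args, **kwds)  # becomes the StopIteration value
           yield                            # unreachable; marks this a generator

so that 'yield from cocallable(f).__cocall__(x)' produces f(x) without
ever suspending.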

> I suppose __cocall__ is to __call__ as __iter__ is to __call__ presently.

Not exactly. When you do

   for x in f():
     ...

__call__ and __iter__ are *both* involved -- __call__ is invoked
first, and then __iter__ on the result.

But when making a cocall, __cocall__ is invoked *instead* of __call__
(and the result is expected to already be an iterator, so __iter__
is not used).

 > It really seems to me that generators should be implemented on
> top of coroutines and not the reverse. That would lead to a more linear
> path to understanding: iterators -> generators -> coroutines.

If generators didn't already exist, it might make sense to do it
that way. It would be easy to create an @generator decorator that
would turn a cofunction into a generator. (Such a thing might be
good to have in any case.)
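For instance, assuming cofunctions expose __cocall__ as described in the
PEP draft, such a decorator could be as simple as this sketch (the name
'generator' is hypothetical):

   def generator(cofunc):
       def genfunc(*args, **kwds):
           return cofunc.__cocall__(*args, **kwds)
       return genfunc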

But we're stuck with generators the way the are, so we might as well
make the most of them, including using them as a foundation for a
less-restricted form of suspendable function.

Also keep in mind that the way they're documented and taught doesn't
necessarily have to reflect the implementation strategy. It would
be possible to describe cofunctions and cocalls as an independent
concept, and only later explain how they relate to generators.

-- 
Greg



From greg.ewing at canterbury.ac.nz  Fri Aug 13 09:54:07 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 Aug 2010 19:54:07 +1200
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
Message-ID: <4C64FA1F.5070304@canterbury.ac.nz>

Nick Coghlan wrote:

> I think the cofunction discussion suggests that there are some very
> different possible answers as to what is the scheduler's
> responsibility and what is the responsibility of the individual
> coroutines. Picking one of them as a winner by blessing it with syntax
> seems rather premature.

I don't follow. How does the syntax I've suggested
have any bearing on what is the responsibility of the
scheduler?

As far as I can see, it stays out of the way and lets
the scheduler and coroutines thrash out whatever
agreement they want between them.

-- 
Greg




From debatem1 at gmail.com  Fri Aug 13 09:59:54 2010
From: debatem1 at gmail.com (geremy condra)
Date: Fri, 13 Aug 2010 00:59:54 -0700
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <4C64EBBF.4060601@scottdial.com>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
	<4C64EBBF.4060601@scottdial.com>
Message-ID: <AANLkTimvTosJ5z-i+9NUiTArd-hAUAUw0xg=A4VFv=RD@mail.gmail.com>

On Thu, Aug 12, 2010 at 11:52 PM, Scott Dial
<scott+python-ideas at scottdial.com> wrote:
> On 8/13/2010 1:35 AM, Nick Coghlan wrote:
>> On Fri, Aug 13, 2010 at 3:07 PM, geremy condra <debatem1 at gmail.com> wrote:
>>> I won't pretend to understand the current discussion or its
>>> motivation, but I know I would appreciate it if examples of the
>>> obvious-and-straightforward variety were added.
>>
>> Yes, it would be nice if PEP 380's generator delegation forest didn't
>> get lost in the cofunction trees :)
>
> Yes, I think even something as trivial as an example of in-order iteration
> over a binary tree should be included since it is accessible and the
> benefits of readability, efficiency, and correctness are apparent:
>
> class BinaryTree:
>     def __init__(self, left=None, us=None, right=None):
>         self.left = left
>         self.us = us
>         self.right = right
>
>     def __iter__(self):
>         if self.left:
>             yield from self.left
>         if self.us:
>             yield self.us
>         if self.right:
>             yield from self.right
>
> You can point out that "yield from" prevents a recursion depth problem
> while also being agnostic to whether left/right is also a BinaryTree
> object (e.g., a tuple or list or some other user-defined type works just
> as well as an iterable leaf) -- a feat that would be rather complicated
> and verbose otherwise. As a bonus, the run-time of such an iteration is
> faster due to the flattening optimization that is only possible with
> special syntax.

...aaaaaand sold in one. Thanks for the example.

Geremy Condra


From greg.ewing at canterbury.ac.nz  Fri Aug 13 10:11:31 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 13 Aug 2010 20:11:31 +1200
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <4C64EBBF.4060601@scottdial.com>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
	<4C64EBBF.4060601@scottdial.com>
Message-ID: <4C64FE33.4070900@canterbury.ac.nz>

Scott Dial wrote:

> Yes, I think even something as trivial as an example in-order iteration
> over a binary tree should be included since it is accessible and the
> benefits of readability, efficiency, and correctness are apparent:

That's a nice example -- I've just added it to the web page
(with appropriate attribution).

> while also being agnostic to whether left/right is also a BinaryTree
> object (e.g., a tuple or list or some other user-defined type works just
> as well as an iterable leaf) -- a feat that would be rather complicated
> and verbose otherwise.

That's not actually true -- a for-loop would work with any
iterable node object just as well.

-- 
Greg


From mal at egenix.com  Fri Aug 13 13:02:56 2010
From: mal at egenix.com (M.-A. Lemburg)
Date: Fri, 13 Aug 2010 13:02:56 +0200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C64A96B.1030808@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>	<4C632F88.9070405@canterbury.ac.nz>	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>	<4C639558.5020602@canterbury.ac.nz>	<AANLkTikfuj+m83Q+h91v4+Ahc2M646=-fjV6wu4-bjh_@mail.gmail.com>
	<4C64A96B.1030808@canterbury.ac.nz>
Message-ID: <4C652660.5010907@egenix.com>

Greg Ewing wrote:
> On 12/08/10 18:44, ghazel at gmail.com wrote:
> 
>> why not just use threads?
> 
> One reason not to use threads is that they're fairly heavyweight.
> They use OS resources, and each one needs its own C stack that
> has to be big enough for everything it might want to do. Switching
> between threads can be slow, too.
> 
> In an application that requires thousands of small, cooperating
> processes, threads are not a good solution. And applications
> like that do exist -- discrete-event simulation is one example.

Sure, and those use Stackless to solve the problem, which IMHO
provides a much more Pythonic approach to these things.

Stackless also works across C function calls, a detail which will become
more important as we think about JIT compilers for Python and
which is not something we want the average Python programmer
to have to worry about. Stackless hides all these details from
the Python programmer and works well on the platforms that it
supports.

So if we really want such functionality in general Python (which
I'm not convinced of, but that may be just me), then I'd
suggest to look at an existing and proven approach first.

The techniques used by Stackless to achieve this are nasty, but then
Python also ships with ctypes which relies on similar nasty techniques
(hidden away in libffi), so I guess the barrier for entry
is lower nowadays than it was a few years ago.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, Aug 13 2010)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try our new mxODBC.Connect Python Database Interface for free ! ::::


   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/


From ncoghlan at gmail.com  Fri Aug 13 14:54:01 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 13 Aug 2010 22:54:01 +1000
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <4C64FA1F.5070304@canterbury.ac.nz>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
	<4C64FA1F.5070304@canterbury.ac.nz>
Message-ID: <AANLkTimyxaFk=iOhR8txMmVSM5akQR-eU1igRSkkfvxP@mail.gmail.com>

On Fri, Aug 13, 2010 at 5:54 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Nick Coghlan wrote:
>
>> I think the cofunction discussion suggests that there are some very
>> different possible answers as to what is the scheduler's
>> responsibility and what is the responsibility of the individual
>> coroutines. Picking one of them as a winner by blessing it with syntax
>> seems rather premature.
>
> I don't follow. How does the syntax I've suggested
> have any bearing on what is the responsibility of the
> scheduler?
>
> As far as I can see, it stays out of the way and lets
> the scheduler and coroutines thrash out whatever
> agreement they want between them.

A PEP 342 based scheduler requires coroutines that implement the
generator API (i.e. ones that support send/throw/close) but you're
claiming that it is acceptable to refer to an ordinary iterator as a
coroutine and use other channels (such as module globals) to
communicate requests to the scheduler. That's the only point I'm
objecting to - I want to see the ability to receive values and
exceptions from outside at suspension points as part of any defined
coroutine API. Part of my goal in that is to emphasise that coroutines
are *not* used in the same way as iterators, and hence have a wider
API.

Cheers,
Nick.

-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia


From scott+python-ideas at scottdial.com  Fri Aug 13 17:37:57 2010
From: scott+python-ideas at scottdial.com (Scott Dial)
Date: Fri, 13 Aug 2010 11:37:57 -0400
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <4C64FE33.4070900@canterbury.ac.nz>
References: <216611029.20100813050214@mail.mipt.ru>	<4C64D032.1070102@canterbury.ac.nz>	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>	<4C64EBBF.4060601@scottdial.com>
	<4C64FE33.4070900@canterbury.ac.nz>
Message-ID: <4C6566D5.70001@scottdial.com>

On 8/13/2010 4:11 AM, Greg Ewing wrote:
> That's not actually true -- a for-loop would work with any
> iterable node object just as well.
> 

I agree that it works, but it does not avoid the recursion-depth problem
and you pay for the descent through all of the for-loops. I assume we are
both talking about an implementation of __iter__() like:

    def __iter__(self):
        if self.left:
            for v in self.left:
                yield v
        if self.us:
            yield self.us
        if self.right:
            for v in self.right:
                yield v

But this fails with a recursion depth RuntimeError around a depth of
~1000 or so. Perhaps that is not an interesting practical problem. But,
my comment about it being "complicated and verbose" was that to avoid
that depth problem, the obvious solution is to make __iter__() implement
an in-order stack itself and manufacture some way to deal with other
types being in the tree. But again, due to how deep of a structure you
need, perhaps it's not that interesting.
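For reference, a sketch of what that explicit-stack version might look
like (only illustrative; it would replace BinaryTree.__iter__ above):

    def __iter__(self):
        stack = [('node', self)]
        while stack:
            kind, obj = stack.pop()
            if kind == 'value':
                yield obj
            elif isinstance(obj, BinaryTree):
                # push right first so that left is visited first
                if obj.right:
                    stack.append(('node', obj.right))
                if obj.us:
                    stack.append(('value', obj.us))
                if obj.left:
                    stack.append(('node', obj.left))
            else:
                # some other iterable leaf type
                for v in obj:
                    yield v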

-- 
Scott Dial
scott at scottdial.com
scodial at cs.indiana.edu


From greg.ewing at canterbury.ac.nz  Sat Aug 14 03:22:13 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 14 Aug 2010 13:22:13 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C652660.5010907@egenix.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
	<4C632F88.9070405@canterbury.ac.nz>
	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>
	<4C639558.5020602@canterbury.ac.nz>
	<AANLkTikfuj+m83Q+h91v4+Ahc2M646=-fjV6wu4-bjh_@mail.gmail.com>
	<4C64A96B.1030808@canterbury.ac.nz> <4C652660.5010907@egenix.com>
Message-ID: <4C65EFC5.4080100@canterbury.ac.nz>

M.-A. Lemburg wrote:

> Greg Ewing wrote:

>>In an application that requires thousands of small, cooperating
>>processes,

> Sure, and those use Stackless to solve the problem, which IMHO
> provides a much more Pythonic approach to these things.

At the expense of using a non-standard Python installation,
though. I'm trying to design something that can be incorporated
into standard Python and work without requiring any deep
black magic. Guido has so far rejected any idea of merging
Stackless into CPython.

Also I gather that Stackless works by copying pieces of
C stack around, which is probably more lightweight than using
an OS thread, but not as light as it could be.

And I'm not sure what criteria to judge pythonicity by in
all this. Stackless tasklets work without requiring any kind
of function or call markers -- everything looks exactly
like normal Python code. But Guido and others seem to be
objecting to my implicit-cocall proposal on the basis that
it looks *too much* like normal code. It seems to me that
the same criticism should apply even more to Stackless.

> The techniques used by Stackless to achieve this are nasty,
> but then
> Python also ships with ctypes which relies on similar nasty techniques

But at least it works provided you write your ctypes code
correctly and the library you're calling isn't buggy. I
seem to remember that there are certain C libraries that
break Stackless because they assume that their C stack
frames don't move around.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Sat Aug 14 03:54:13 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 14 Aug 2010 13:54:13 +1200
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <AANLkTimyxaFk=iOhR8txMmVSM5akQR-eU1igRSkkfvxP@mail.gmail.com>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
	<4C64FA1F.5070304@canterbury.ac.nz>
	<AANLkTimyxaFk=iOhR8txMmVSM5akQR-eU1igRSkkfvxP@mail.gmail.com>
Message-ID: <4C65F745.2020507@canterbury.ac.nz>

Nick Coghlan wrote:

> A PEP 342 based scheduler requires coroutines that implement the
> generator API (i.e. ones that support send/throw/close) but you're
> claiming that it is acceptable to refer to an ordinary iterator as a
> coroutine and use other channels (such as module globals) to
> communicate requests to the scheduler.

No, I don't use module globals to communicate requests to
the scheduler, I use function calls or coroutine calls. For
example, where a PEP 342 coroutine would do something like

   yield WaitForSocket(sock)

I would instead do

   yield from wait_for_socket(sock)

or

   cocall wait_for_socket(sock)

The implementation of wait_for_socket() may make use of
module globals internal to the scheduler to keep track of
which coroutines are waiting for which sockets, but that's
a detail private to the scheduler.

> I want to see the ability to receive values and
> exceptions from outside at suspension points as part of any defined
> coroutine API.

The *ability* is there by virtue of the fact that they
*can* implement those methods if they want. You seem to
be going further and insisting that they *must* implement
those methods, even if the implementations are empty
or trivial. I still can't see how that's a good thing.

> Part of my goal in that is to emphasise that coroutines
> are *not* used in the same way as iterators,

If you really wanted to do that, you would have to give
them a different API altogether, such as using resume()
instead of next().

-- 
Greg


From greg.ewing at canterbury.ac.nz  Sat Aug 14 04:01:54 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 14 Aug 2010 14:01:54 +1200
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <4C6566D5.70001@scottdial.com>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
	<4C64EBBF.4060601@scottdial.com> <4C64FE33.4070900@canterbury.ac.nz>
	<4C6566D5.70001@scottdial.com>
Message-ID: <4C65F912.5080404@canterbury.ac.nz>

Scott Dial wrote:
> On 8/13/2010 4:11 AM, Greg Ewing wrote:
> 
>>That's not actually true -- a for-loop would work with any
>>iterable node object just as well.
>
> I agree that it works, but it does not avoid the recursion-depth problem
> and you pay for the decent through all of the for-loops.

To clarify, I was responding only to the point that yield-from
deals with any kind of iterator, not just another generator.
I was pointing out that, to the extent that a for-loop handles
generators, it also handles other iterables as well.

-- 
Greg


From ncoghlan at gmail.com  Sat Aug 14 17:47:54 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 15 Aug 2010 01:47:54 +1000
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <4C65F745.2020507@canterbury.ac.nz>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
	<4C64FA1F.5070304@canterbury.ac.nz>
	<AANLkTimyxaFk=iOhR8txMmVSM5akQR-eU1igRSkkfvxP@mail.gmail.com>
	<4C65F745.2020507@canterbury.ac.nz>
Message-ID: <AANLkTimP91Xoxx5fMaCKBj+2BhyV3V+YXm16h-y8fhUd@mail.gmail.com>

On Sat, Aug 14, 2010 at 11:54 AM, Greg Ewing
<greg.ewing at canterbury.ac.nz> wrote:
> The implementation of wait_for_socket() may make use of
> module globals internal to the scheduler to keep track of
> which coroutines are waiting for which sockets, but that's
> a detail private to the scheduler.

By scheduler, I mean the actual dispatch loop, not just any code that
happens to live in the same module or package.

>> I want to see the ability to receive values and
>> exceptions from outside at suspension points as part of any defined
>> coroutine API.
>
> The *ability* is there by virtue of the fact that they
> *can* implement those methods if they want. You seem to
> be going further and insisting that they *must* implement
> those methods, even if the implementations are empty
> or trivial. I still can't see how that's a good thing.

Scheduler authors shouldn't have to pepper their code with conditional
checks for send/throw/close support on the coroutines. By allowing
things that only implement the iterator API without those three
methods to be called "coroutines", you're invalidating that
assumption, making schedulers that rely on it technically incorrect.

If a scheduler chooses *not* to rely on PEP 342, that's fine. But with
PEP 342 in place, we should respect its definition of the expected
coroutine API.

>> Part of my goal in that is to emphasise that coroutines
>> are *not* used in the same way as iterators,
>
> If you really wanted to do that, you would have to give
> them a different API altogether, such as using resume()
> instead of next().

The coroutine API is a superset of the iterator API, but it's still different.

Cheers,
Nick.

-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia


From 8mayday at gmail.com  Sat Aug 14 18:40:52 2010
From: 8mayday at gmail.com (Andrey Popp)
Date: Sat, 14 Aug 2010 20:40:52 +0400
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C65EFC5.4080100@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
	<4C632F88.9070405@canterbury.ac.nz>
	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>
	<4C639558.5020602@canterbury.ac.nz>
	<AANLkTikfuj+m83Q+h91v4+Ahc2M646=-fjV6wu4-bjh_@mail.gmail.com>
	<4C64A96B.1030808@canterbury.ac.nz> <4C652660.5010907@egenix.com>
	<4C65EFC5.4080100@canterbury.ac.nz>
Message-ID: <AANLkTi=_qKxV=LFH2sO=FjGBrSQ4PMd4z0x5mDaQ=VuZ@mail.gmail.com>

Hello,

Greg Ewing:
> M.-A. Lemburg wrote:
>> Sure, and those use Stackless to solve the problem, which IMHO
>> provides a much more Pythonic approach to these things.
>
> At the expense of using a non-standard Python installation,
> though. I'm trying to design something that can be incorporated
> into standard Python and work without requiring any deep
> black magic. Guido has so far rejected any idea of merging
> Stackless into CPython.

Note that there is also the greenlet library, which provides part of
the functionality of Stackless (except preemptive scheduling and
pickling), and the eventlet/gevent libraries, which are quite popular
solutions for network applications written in Python and are built on
greenlet.

> And I'm not sure what criteria to judge pythonicity by in
> all this. Stackless tasklets work without requiring any kind
> of function or call markers -- everything looks exactly
> like normal Python code. But Guido and others seem to be
> objecting to my implicit-cocall proposal on the basis that
> it looks *too much* like normal code. It seems to me that
> the same criticism should apply even more to Stackless.

For me, the fact that greenlet code looks like normal code makes it
preferable to generator-based coroutines (I think they are an overuse
of generator syntax). Also, I don't see the need for an explicit
cocall:

1) It will increase the complexity of the language without necessity:
we have no special syntax for threading, so why should we have one for
cooperative threads? The semantics are almost the same relative to
unthreaded Python, except that with cooperative threading we have to
control execution explicitly, which has less semantic impact than
preemptive threading code, I think.

2) It will hurt code reusability a lot, because we can't mix
cocalls and calls.

All these issues are solved by the greenlet library, and I think that if
Python needs cooperative threads, they should have the API and behaviour
of greenlets.

-- 
Andrey Popp

phone: +7 911 740 24 91
e-mail: 8mayday at gmail.com


From solipsis at pitrou.net  Sat Aug 14 18:57:32 2010
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sat, 14 Aug 2010 18:57:32 +0200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
	<4C632F88.9070405@canterbury.ac.nz>
	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>
	<4C639558.5020602@canterbury.ac.nz>
	<AANLkTikfuj+m83Q+h91v4+Ahc2M646=-fjV6wu4-bjh_@mail.gmail.com>
	<4C64A96B.1030808@canterbury.ac.nz> <4C652660.5010907@egenix.com>
	<4C65EFC5.4080100@canterbury.ac.nz>
	<AANLkTi=_qKxV=LFH2sO=FjGBrSQ4PMd4z0x5mDaQ=VuZ@mail.gmail.com>
Message-ID: <20100814185732.02f718c6@pitrou.net>

On Sat, 14 Aug 2010 20:40:52 +0400
Andrey Popp <8mayday at gmail.com> wrote:
> 2) That will affect code reusability a lot, because we can't mix
> cocalls and calls.
> 
> All this issues are solved with greenlet library and I think if Python
> needs cooperative threads they should have API and behave like
> greenlets.

As far as I understand, the only way Stackless and the like can make
things "transparent" is that they monkey-patch core socket
functionality (and perhaps other critical built-in functionalities).
"Soft" cooperative multithreading isn't naturally transparent, which
makes it quite different from OS-level multithreading.

Regards

Antoine.




From 8mayday at gmail.com  Sun Aug 15 15:36:30 2010
From: 8mayday at gmail.com (Andrey Popp)
Date: Sun, 15 Aug 2010 17:36:30 +0400
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <20100814185732.02f718c6@pitrou.net>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
	<4C632F88.9070405@canterbury.ac.nz>
	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>
	<4C639558.5020602@canterbury.ac.nz>
	<AANLkTikfuj+m83Q+h91v4+Ahc2M646=-fjV6wu4-bjh_@mail.gmail.com>
	<4C64A96B.1030808@canterbury.ac.nz> <4C652660.5010907@egenix.com>
	<4C65EFC5.4080100@canterbury.ac.nz>
	<AANLkTi=_qKxV=LFH2sO=FjGBrSQ4PMd4z0x5mDaQ=VuZ@mail.gmail.com>
	<20100814185732.02f718c6@pitrou.net>
Message-ID: <AANLkTi=L4OC9fKnGLSPRpXvEFnkzvY5=+TUYuTkfxuAm@mail.gmail.com>

Antoine Pitrou wrote:
> Andrey Popp wrote:
>> 2) That will affect code reusability a lot, because we can't mix
>> cocalls and calls.
>>
>> All this issues are solved with greenlet library and I think if Python
>> needs cooperative threads they should have API and behave like
>> greenlets.
>
> As far as I understand, the only way Stackless and the like can make
> things "transparent" is that they monkey-patch core socket
> functionality (and perhaps other critical built-in functionalities).
> "Soft" cooperative multithreading isn't naturally transparent, which
> makes it quite different from OS-level multithreading.

Stackless does not monkey-patch socket; gevent and eventlet do, but
that is not my point. We can just have another socket implementation
that cooperates and use it in cooperative code.

My point is how to start coroutines and switch between them: the way
it's done in greenlet (though not the implementation) is preferable
to explicit cocalls.
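
For those not familiar with it, here is the classic greenlet example of
the switching style I mean (just an illustration; it needs the
third-party greenlet package):

from greenlet import greenlet

def test1():
    print(12)
    gr2.switch()        # explicitly hand control over to the other greenlet
    print(34)

def test2():
    print(56)
    gr1.switch()        # switch back into test1, right after its switch()
    print(78)

gr1 = greenlet(test1)
gr2 = greenlet(test2)
gr1.switch()            # prints 12, 56, 34; 78 is never reached

No special syntax and no marker at every call site: switching is an
ordinary method call.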

-- 
Andrey Popp

phone: +7 911 740 24 91
e-mail: 8mayday at gmail.com


From greg.ewing at canterbury.ac.nz  Mon Aug 16 00:46:11 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 16 Aug 2010 10:46:11 +1200
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <AANLkTimP91Xoxx5fMaCKBj+2BhyV3V+YXm16h-y8fhUd@mail.gmail.com>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
	<4C64FA1F.5070304@canterbury.ac.nz>
	<AANLkTimyxaFk=iOhR8txMmVSM5akQR-eU1igRSkkfvxP@mail.gmail.com>
	<4C65F745.2020507@canterbury.ac.nz>
	<AANLkTimP91Xoxx5fMaCKBj+2BhyV3V+YXm16h-y8fhUd@mail.gmail.com>
Message-ID: <4C686E33.5010305@canterbury.ac.nz>

Nick Coghlan wrote:

> Scheduler authors shouldn't have to pepper their code with conditional
> checks for send/throw/close support on the coroutines.

Okay, I see what you're getting at now. We've been talking
about slightly different things. I've been talking about the
semantics of cocall, which, if defined in terms of yield-from
without further qualification, inherits its fallback behaviour
in the absence of a full set of generator methods.

However, you're talking about the agreed-upon interface
between a scheduler and the objects that it schedules. That's
a matter for the scheduler author to decide upon and document.
It may well be that some schedulers will require some or all
of the generator methods to be implemented, but I expect
there to be a substantial class that don't.

The wording in the PEP is only meant to specify the
language-mandated requirements for an object to be used with
cocall. If both yield-from and cocall are to exist, it's
simplest to use the same semantics for both.

Making the language definition any more complicated would
require a fairly strong justification, and I'm not convinced
that this qualifies. For the convenience of schedulers,
functions could be provided for sending, throwing and
closing that provide the same fallback behaviour as
yield-from and cocall.
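
For example, a rough sketch of what such helpers might look like (the
names here are made up; nothing like this is specified in the PEP yet):

def co_send(sub, value):
    # With nothing to send, plain iteration is all that's needed.
    if value is None:
        return next(sub)
    return sub.send(value)

def co_throw(sub, exc):
    throw = getattr(sub, 'throw', None)
    if throw is None:
        raise exc            # no throw(): propagate, as yield-from does
    return throw(exc)

def co_close(sub):
    close = getattr(sub, 'close', None)
    if close is not None:
        close()              # objects without close() are simply left alone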

-- 
Greg


From ncoghlan at gmail.com  Mon Aug 16 11:30:46 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 16 Aug 2010 19:30:46 +1000
Subject: [Python-ideas] Cofunctions/yield from -> fibers
In-Reply-To: <4C686E33.5010305@canterbury.ac.nz>
References: <216611029.20100813050214@mail.mipt.ru>
	<4C64D032.1070102@canterbury.ac.nz>
	<AANLkTineWK+MdbH2c9-nQcCbvwyS2t07MnKUzKWuGDnt@mail.gmail.com>
	<AANLkTi=+Ycx2k=TYYHYo14FowCMb_pYHDRvT586K4Dc2@mail.gmail.com>
	<4C64FA1F.5070304@canterbury.ac.nz>
	<AANLkTimyxaFk=iOhR8txMmVSM5akQR-eU1igRSkkfvxP@mail.gmail.com>
	<4C65F745.2020507@canterbury.ac.nz>
	<AANLkTimP91Xoxx5fMaCKBj+2BhyV3V+YXm16h-y8fhUd@mail.gmail.com>
	<4C686E33.5010305@canterbury.ac.nz>
Message-ID: <AANLkTin58TNfjMdRwqaWPOw4gZ9_onwPeiyWUkQem5kX@mail.gmail.com>

On Mon, Aug 16, 2010 at 8:46 AM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Nick Coghlan wrote:
>
>> Scheduler authors shouldn't have to pepper their code with conditional
>> checks for send/throw/close support on the coroutines.
>
> Okay, I see what you're getting at now. We've been talking
> about slightly different things. I've been talking about the
> semantics of cocall, which, if defined in terms of yield-from
> without further qualification, inherits its fallback behaviour
> in the absence of a full set of generator methods.

Ah, OK. To avoid anyone else being tempted to overspecify, perhaps a
comment in parentheses to point out that some coroutine schedulers may
require a broader coroutine API, such as that of PEP 342? I don't
think this is really common enough to justify yet more functions.

Although, if you added a coroutines module, you could put costart and
those helper functions in there rather than making them builtins. Such
a module would also make sense for the decorator/function based
alternative proposal.

Cheers,
Nick.


-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia


From mal at egenix.com  Mon Aug 16 17:20:28 2010
From: mal at egenix.com (M.-A. Lemburg)
Date: Mon, 16 Aug 2010 17:20:28 +0200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C65EFC5.4080100@canterbury.ac.nz>
References: <4C625949.1060803@canterbury.ac.nz>	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>	<4C632F88.9070405@canterbury.ac.nz>	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>	<4C639558.5020602@canterbury.ac.nz>	<AANLkTikfuj+m83Q+h91v4+Ahc2M646=-fjV6wu4-bjh_@mail.gmail.com>	<4C64A96B.1030808@canterbury.ac.nz>
	<4C652660.5010907@egenix.com> <4C65EFC5.4080100@canterbury.ac.nz>
Message-ID: <4C69573C.7000301@egenix.com>

Greg Ewing wrote:
> M.-A. Lemburg wrote:
> 
>> Greg Ewing wrote:
> 
>>> In an application that requires thousands of small, cooperating
>>> processes,
> 
>> Sure, and those use Stackless to solve the problem, which IMHO
>> provides a much more Pythonic approach to these things.
> 
> At the expense of using a non-standard Python installation,
> though. I'm trying to design something that can be incorporated
> into standard Python and work without requiring any deep
> black magic. Guido has so far rejected any idea of merging
> Stackless into CPython.

The problem with doing so is twofold:

 1. The use case Stackless addresses is not something an everyday
    programmer will need, so making CPython more complicated
    just to add this one extra feature doesn't appear worth
    the trouble.

 2. The Stackless implementation is not very portable, so
    the feature would only be available on a limited number of
    platforms.

Apart from that, every new feature will raise the bar for learning
Python.

If you could turn your proposal into something more like
the Stackless tasklets and move the implementation to an
extension module (perhaps with some extra help from new CPython
APIs), then I'm sure the proposal would get more followers.

> Also I gather that Stackless works by copying pieces of
> C stack around, which is probably more lightweight than using
> an OS thread, but not as light as it could be.

Well, it works great in practice and is a proven approach.
Copying in C is certainly fast enough for most needs and
the black magic is well hidden in Stackless.

> And I'm not sure what criteria to judge pythonicity by in
> all this. 

"explicit is better than implicit".

Tasklets are normal Python objects wrapping functions.
The creation of those tasklets is explicit, not implicit
via some (special) yield buried deep in the code.

> Stackless tasklets work without requiring any kind
> of function or call markers -- everything looks exactly
> like normal Python code.

Right, because everything *is* normal Python code. Tasklets
are much more like threads to the programmer, i.e. a
well understood concept.
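
A minimal sketch of what that looks like in practice (runs only under
Stackless; details written from memory):

import stackless

def worker(name):
    for i in range(2):
        print("%s %d" % (name, i))
        stackless.schedule()       # explicitly give other tasklets a turn

stackless.tasklet(worker)('a')     # creating the tasklets is explicit...
stackless.tasklet(worker)('b')
stackless.run()                    # ...but worker() itself is ordinary Python code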

> But Guido and others seem to be
> objecting to my implicit-cocall proposal on the basis that
> it looks *too much* like normal code. It seems to me that
> the same criticism should apply even more to Stackless.

I think an important part of the criticism is
hiding the fact that you are writing a cofunction
away inside the function definition itself.

Generators have the same problem, but at least you
can call them as regular Python functions, and they
only work a little differently from normal functions.

>> The techniques used by Stackless to achieve this are nasty,
>> but then
>> Python also ships with ctypes which relies on similar nasty techniques
> 
> But at least it works provided you write your ctypes code
> correctly and the library you're calling isn't buggy. I
> seem to remember that there are certain C libraries that
> break Stackless because they assume that their C stack
> frames don't move around.

Well, viruses will have a harder time for sure ;-) I am not
aware of other use cases that would need to know the
location of the stack frame in memory.

BTW: I'm sure that the functionality needed by Stackless could
also be moved into a C lib for other languages to use
(much like the libffi code).

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, Aug 16 2010)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try our new mxODBC.Connect Python Database Interface for free ! ::::


   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/


From g.rodola at gmail.com  Mon Aug 16 22:37:07 2010
From: g.rodola at gmail.com (=?ISO-8859-1?Q?Giampaolo_Rodol=E0?=)
Date: Mon, 16 Aug 2010 22:37:07 +0200
Subject: [Python-ideas] @unittest.skip_others decorator
Message-ID: <AANLkTikoFAhD4ARiB73mSxnebVj80De5fHqbP=vd4Up-@mail.gmail.com>

Hello,
when working on a test suite, no matter if writing new test cases from
scratch or modifying an existing one, the need to temporarily exclude
all other tests and focus on one particular test is very common
(at least for me).
This is especially true when working on a test suite which contains a
lot of tests producing very verbose output, or which takes a long time
to complete.

What I usually do in such cases is to scroll down to the test_main()
function and manually change support.run_unittest() like this:

-    support.run_unittest(TestCase1, TestCase2, TestCase3, TestCase4)
+    support.run_unittest(TestCase4)

...and then comment out all the test cases in the TestCase4 class except
the one I'm interested in.
This is obviously not very flexible, so I thought that maybe
unittest (or the test/support module, I'm not sure) could provide a
specific decorator for temporarily running only specific test
classes or methods.
The decorator could be used in 3 ways:

== use case #1 ==

@unittest.exclude_others
class TestCase1:
     ...

All TestCase1 test cases are run, TestCase2, TestCase3 and TestCase4
classes are excluded.


== use case #2 ==

class TestCase1:

    @unittest.exclude_others
    def test_something(self):
        ....

All TestCase1.* tests are excluded except "test_something".
TestCase2, TestCase3 and TestCase4 are run.


== use case #3 ==

@unittest.exclude_others
class TestCase1:

    @unittest.exclude_others
    def test_something(self):
        ....
Only TestCase1.test_something() is run (this is the most common use case).


Thoughts?


Kindest regards,


--- Giampaolo
http://code.google.com/p/pyftpdlib/
http://code.google.com/p/psutil/


From jseutter at gmail.com  Mon Aug 16 23:01:39 2010
From: jseutter at gmail.com (Jerry Seutter)
Date: Mon, 16 Aug 2010 15:01:39 -0600
Subject: [Python-ideas] @unittest.skip_others decorator
In-Reply-To: <AANLkTikoFAhD4ARiB73mSxnebVj80De5fHqbP=vd4Up-@mail.gmail.com>
References: <AANLkTikoFAhD4ARiB73mSxnebVj80De5fHqbP=vd4Up-@mail.gmail.com>
Message-ID: <AANLkTi=qM6-bjm+mW-bmHx3fmc0XBmhK+y-MGshS7t-D@mail.gmail.com>

Hi, you may be interested in this.  It's part of the unittest module that
ships with Python:

=======================
./python.exe -m unittest test.test_call.CFunctionCalls.test_varargs0
.
----------------------------------------------------------------------
Ran 1 test in 0.000s

OK
=======================

An added bonus is that this works across many Python projects.

Hope this helps,

Jerry Seutter


On Mon, Aug 16, 2010 at 2:37 PM, Giampaolo Rodolà <g.rodola at gmail.com> wrote:

> Hello,
> when working on a test suite, no matter if writing new test cases from
> scratch or modifying an existing one, the need of temporarily
> excluding other tests and focus on that particular one is very common
> (at least for me).
> This is especially true when working on a test suite which contains a
> lot of tests producing a very verbose output result or which takes a
> lot to complete.
>
> What I usually do in such cases is to scroll down until test_main()
> function, manually change support.run_unittest() like this:
>
> -    support.run_unittest(TestCase1, TestCase2, TestCase3, TestCase4)
> +    support.run_unittest(TestCase4)
>
> ...and then comment out all the test cases in TestCase4 class except
> the one I'm interested in.
> This is obviously not very flexible, then I thought that maybe
> unittest (or test/support module, I'm not sure) could provide a
> specific decorator for temporarily running only one specific test
> class(es) or method(s).
> The decorator could be used in 3 ways:
>
> == use case #1 ==
>
> @unittest.exclude_others
> class TestCase1:
>     ...
>
> All TestCase1 test cases are run, TestCase2, TestCase3 and TestCase4
> classes are excluded.
>
>
> == use case #2 ==
>
> class TestCase1:
>
>   @unittest.exclude_others
>    def test_something(self):
>         ....
>
> All TestCase1.* tests are excluded except "test_something".
> TestCase2, TestCase3 and TestCase4 are run.
>
>
> == use case #3 ==
>
> @unittest.exclude_others
> class TestCase1:
>
>   @unittest.exclude_others
>    def test_something(self):
>         ....
> Only TestCase1.test_something() is run (this is the most common use case).
>
>
> Thoughts?
>
>
> Kindest regards,
>
>
> --- Giampaolo
> http://code.google.com/p/pyftpdlib/
> http://code.google.com/p/psutil/
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20100816/99ee1647/attachment.html>

From tjreedy at udel.edu  Tue Aug 17 01:34:27 2010
From: tjreedy at udel.edu (Terry Reedy)
Date: Mon, 16 Aug 2010 19:34:27 -0400
Subject: [Python-ideas] @unittest.skip_others decorator
In-Reply-To: <AANLkTikoFAhD4ARiB73mSxnebVj80De5fHqbP=vd4Up-@mail.gmail.com>
References: <AANLkTikoFAhD4ARiB73mSxnebVj80De5fHqbP=vd4Up-@mail.gmail.com>
Message-ID: <i4chu1$e9r$1@dough.gmane.org>

On 8/16/2010 4:37 PM, Giampaolo Rodolà wrote:
> Hello,
> when working on a test suite, no matter if writing new test cases from
> scratch or modifying an existing one, the need of temporarily
> excluding other tests and focus on that particular one is very common
> (at least for me).
> This is especially true when working on a test suite which contains a
> lot of tests producing a very verbose output result or which takes a
> lot to complete.
>
> What I usually do in such cases is to scroll down until test_main()
> function, manually change support.run_unittest() like this:
>
> -    support.run_unittest(TestCase1, TestCase2, TestCase3, TestCase4)
> +    support.run_unittest(TestCase4)
>
> ...and then comment out all the test cases in TestCase4 class except
> the one I'm interested in.

How about using a special TestDevel class containing only the test cases
you are working on? Then comment out the standard run and uncomment the
development run:

# support.run_unittest(TestCase1, TestCase2, TestCase3, TestCase4)
support.run_unittest(TestDevel)

When done, move tests to whichever regular TestCase they belong in.
-- 
Terry Jan Reedy




From greg.ewing at canterbury.ac.nz  Tue Aug 17 02:16:32 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 17 Aug 2010 12:16:32 +1200
Subject: [Python-ideas] Cofunctions PEP - Revision 4
In-Reply-To: <4C69573C.7000301@egenix.com>
References: <4C625949.1060803@canterbury.ac.nz>
	<AANLkTi=LocPyykU+mbbpDWGYe_nQCXMA_bAgxxzkwnHG@mail.gmail.com>
	<4C632F88.9070405@canterbury.ac.nz>
	<AANLkTinOjB14dxGo61Ns23y6nboMzLa10zEqAv0i17Le@mail.gmail.com>
	<4C639558.5020602@canterbury.ac.nz>
	<AANLkTikfuj+m83Q+h91v4+Ahc2M646=-fjV6wu4-bjh_@mail.gmail.com>
	<4C64A96B.1030808@canterbury.ac.nz> <4C652660.5010907@egenix.com>
	<4C65EFC5.4080100@canterbury.ac.nz> <4C69573C.7000301@egenix.com>
Message-ID: <4C69D4E0.1000906@canterbury.ac.nz>

M.-A. Lemburg wrote:

> If you could turn your proposal into something more like
> the Stackless tasklets and move the implementation to an
> extension module (perhaps with some extra help from new CPython
> APIs), then I'm sure the proposal would get more followers.

As far as I can see, it's impossible to do what I'm
proposing with standard C and without language support.
That's why greenlets have to resort to black magic.

> "explicit is better than implicit".

This is a strange argument to be making in favour of
Stackless, though, where the fact that you're dealing
with suspendable code is almost completely *implicit*.

> Tasklets are normal Python objects wrapping functions.
> The create of those tasklets is explicit, not implicit
> via some (special) yield burried deep in the code.

So would you be more in favour of the alternative version,
where there is 'codef' but no 'cocall'?

> I think an important part of the criticism is
> hiding the fact that you are writing a cofunction
> away inside the function definition itself.

The only reason I did that is because Guido turned his
nose up at the idea of defining a function using anything
other than 'def'. I'm starting to wish I'd stuck to my
guns a bit longer in the hope of changing his sense of
smell. :-)

> Well, viruses will have a harder time for sure ;-) I am not
> aware of other use cases that would need to know the
> location of the stack frame in memory.

One way it can happen is that a task sets up a callback
referencing something in a stack frame, and then the
callback gets invoked while the task is switched out,
so the referenced data isn't in the right place. I
believe this is the kind of thing that was causing the
trouble with Tkinter.

-- 
Greg



From bochecha at fedoraproject.org  Tue Aug 17 14:13:26 2010
From: bochecha at fedoraproject.org (Mathieu Bridon)
Date: Tue, 17 Aug 2010 14:13:26 +0200
Subject: [Python-ideas] Curly braces expansion in shell-like matcher modules
Message-ID: <AANLkTi=170htwZXy-zmGhTyZy-T-NC=G_8sN6j26-ch8@mail.gmail.com>

Hi,

A few days ago I submitted a feature request for the fnmatch module to
allow shell-like curly brace expansion:
    http://bugs.python.org/issue9584

The patch I submitted was found to be incorrect, and I'm working on it
(I already have a patch that seems to be working and I need to test it
extensively).

However, some concerns were raised about the fnmatch module not being
the correct place to do that if the goal is to mimic shell behavior.
Expansion would have to be done beforehand, generating the list of
patterns that would then be given to fnmatch.

It was suggested that I take this issue to Python-ideas, so here we go.

What do people think? Should it be done in fnmatch, expanding the
curly braces to the regex? Should it be done before that?

If the latter, what would be an appropriate module to expand the
braces? glob? Another module?

FWIW (i.e. not much), my opinion is leaning towards implementing it in glob.py:
- add '{' to the magic_check regex
- in glob1 (which is called when the pattern 'has_magic'), expand the
braces and then call fnmatch.filter() on each resulting pattern

That would more closely match what is done in shells like Bash, and it
is also more straightforward to implement.
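
Roughly, the expansion step would look something like this (just a
sketch, not the actual patch):

import re

# innermost {a,b,...} group containing at least one comma and no nested braces
_brace = re.compile(r'\{([^{}]*,[^{}]*)\}')

def expand_braces(pattern):
    match = _brace.search(pattern)
    if match is None:
        return [pattern]                 # nothing (left) to expand
    head, tail = pattern[:match.start()], pattern[match.end():]
    results = []
    for alt in match.group(1).split(','):
        for expanded in expand_braces(head + alt + tail):
            if expanded not in results:  # keep each resulting pattern once
                results.append(expanded)
    return results

# expand_braces('foo-{bar,baz}.txt') -> ['foo-bar.txt', 'foo-baz.txt']

glob1 would then call fnmatch.filter() once per resulting pattern.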

Cheers,


-- 
Mathieu


From greg.ewing at canterbury.ac.nz  Tue Aug 17 14:44:51 2010
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 18 Aug 2010 00:44:51 +1200
Subject: [Python-ideas] Curly braces expansion in shell-like matcher
	modules
In-Reply-To: <AANLkTi=170htwZXy-zmGhTyZy-T-NC=G_8sN6j26-ch8@mail.gmail.com>
References: <AANLkTi=170htwZXy-zmGhTyZy-T-NC=G_8sN6j26-ch8@mail.gmail.com>
Message-ID: <4C6A8443.6020503@canterbury.ac.nz>

Mathieu Bridon wrote:

> If the latter, what would be an appropriate module to expand the
> braces? glob? Another module?

Making it an enhancement of glob() sounds like the
right thing to me.

-- 
Greg


From fdrake at acm.org  Tue Aug 17 14:51:18 2010
From: fdrake at acm.org (Fred Drake)
Date: Tue, 17 Aug 2010 08:51:18 -0400
Subject: [Python-ideas] Curly braces expansion in shell-like matcher
	modules
In-Reply-To: <AANLkTi=170htwZXy-zmGhTyZy-T-NC=G_8sN6j26-ch8@mail.gmail.com>
References: <AANLkTi=170htwZXy-zmGhTyZy-T-NC=G_8sN6j26-ch8@mail.gmail.com>
Message-ID: <AANLkTingP42rQ-Mq0Z4wkP2xEdyOh4WTc7RHr2UDPxks@mail.gmail.com>

On Tue, Aug 17, 2010 at 8:13 AM, Mathieu Bridon
<bochecha at fedoraproject.org> wrote:
> If the latter, what would be an appropriate module to expand the
> braces? glob? Another module?

Since normal ("sh-like") Unix shells apply this generally, I'd be
inclined to have a function to do this in shlex.


  -Fred

--
Fred L. Drake, Jr.    <fdrake at gmail.com>
"A storm broke loose in my mind."  --Albert Einstein


From solipsis at pitrou.net  Tue Aug 17 15:03:58 2010
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Tue, 17 Aug 2010 15:03:58 +0200
Subject: [Python-ideas] Curly braces expansion in shell-like matcher
	modules
References: <AANLkTi=170htwZXy-zmGhTyZy-T-NC=G_8sN6j26-ch8@mail.gmail.com>
Message-ID: <20100817150358.134067b1@pitrou.net>

On Tue, 17 Aug 2010 14:13:26 +0200
Mathieu Bridon
<bochecha at fedoraproject.org> wrote:
> 
> However, some concerns were raised about the fnmatch module not being
> the correct place to do that, if the goal is to mimic shell behavior.
> Expansion would have to be done before, generating the list of
> patterns that would then be given to fnmatch.

I don't think mimicking shell behaviour should be a design goal of
fnmatch or any other stdlib module. There are many different shells
and, besides, users are generally not interested in reproducing shell
behaviour when they use Python; they are simply looking for useful
functionality.

IMO, fnmatch is the right place for such an enhancement.
(and, as the doc states, "glob uses fnmatch() to match pathname
segments").

> FWIW (i.e not much), my opinion is leaning towards implementing it in glob.py:
> - add '{' to the magic_check regex
> - in glob1 (which is called when the pattern 'has_magic'), expand the
> braces and then call fnmatch.filter() on each resulting pattern
> 
> That would respect more what is done in shells like Bash, and it makes
> it also more straight-forward to implement.

It also introduces a bizarrely inconsistent behaviour in the
dubious name of compatibility. Let's not reproduce the quirks of Unix
shells, which are hardly a reference in beautiful UI and API design.

Regards

Antoine.




From fdrake at acm.org  Tue Aug 17 15:14:04 2010
From: fdrake at acm.org (Fred Drake)
Date: Tue, 17 Aug 2010 09:14:04 -0400
Subject: [Python-ideas] Curly braces expansion in shell-like matcher
	modules
In-Reply-To: <20100817150358.134067b1@pitrou.net>
References: <AANLkTi=170htwZXy-zmGhTyZy-T-NC=G_8sN6j26-ch8@mail.gmail.com>
	<20100817150358.134067b1@pitrou.net>
Message-ID: <AANLkTi=LP77LNLUnfmOus=M_iKOeekE=Mce7JwRAs5x8@mail.gmail.com>

On Tue, Aug 17, 2010 at 9:03 AM, Antoine Pitrou <solipsis at pitrou.net> wrote:
> IMO, fnmatch is the right place for such an enhancement.
> > (and, as the doc states, "glob uses fnmatch() to match pathname
> > segments").

This is a good reason not to push the implementation down into glob,
actually: the expansion may cross segment boundaries:

    foo{bar/turtle,car/monkey}_test.*

should expand to the two patterns:

    foobar/turtle_test.*
    foocar/monkey_test.*


  -Fred

--
Fred L. Drake, Jr.    <fdrake at gmail.com>
"A storm broke loose in my mind."  --Albert Einstein


From solipsis at pitrou.net  Tue Aug 17 20:29:04 2010
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Tue, 17 Aug 2010 20:29:04 +0200
Subject: [Python-ideas] @unittest.skip_others decorator
References: <AANLkTikoFAhD4ARiB73mSxnebVj80De5fHqbP=vd4Up-@mail.gmail.com>
Message-ID: <20100817202904.358f92e1@pitrou.net>

On Mon, 16 Aug 2010 22:37:07 +0200
Giampaolo Rodolà <g.rodola at gmail.com>
wrote:
> Hello,
> when working on a test suite, no matter if writing new test cases from
> scratch or modifying an existing one, the need of temporarily
> excluding other tests and focus on that particular one is very common
> (at least for me).
> This is especially true when working on a test suite which contains a
> lot of tests producing a very verbose output result or which takes a
> lot to complete.
> 
[...]
> 
> Thoughts?

This would certainly be useful to me as well.

Regards

Antoine.




From bochecha at fedoraproject.org  Tue Aug 17 21:39:27 2010
From: bochecha at fedoraproject.org (Mathieu Bridon)
Date: Tue, 17 Aug 2010 21:39:27 +0200
Subject: [Python-ideas] Curly braces expansion in shell-like matcher
 modules
In-Reply-To: <AANLkTi=LP77LNLUnfmOus=M_iKOeekE=Mce7JwRAs5x8@mail.gmail.com>
References: <AANLkTi=170htwZXy-zmGhTyZy-T-NC=G_8sN6j26-ch8@mail.gmail.com>
	<20100817150358.134067b1@pitrou.net>
	<AANLkTi=LP77LNLUnfmOus=M_iKOeekE=Mce7JwRAs5x8@mail.gmail.com>
Message-ID: <1282073967.3752.157.camel@caan>

Hi,

On Tue, 2010-08-17 at 09:14 -0400, Fred Drake wrote:
> On Tue, Aug 17, 2010 at 9:03 AM, Antoine Pitrou <solipsis at pitrou.net> wrote:
> > IMO, fnmatch is the right place for such an enhancement.
> > (and, as the doc states, "glob uses fnmatch() to match pathname
> > segments").
> 
> This is a good reason not to push the implementation down into glob,
> actually: the expansion may cross segment boundaries:
> 
>     foo{bar/turtle,car/monkey}_test.*
> 
> should expand to the two patterns:
> 
>     foobar/turtle_test.*
>     foocar/monkey_test.*

Then I have the correct behavior with the attached patch against the
glob module. :)
(I still have to write some proper unit tests for it, this is only a
working proof of concept)

Note that I wrote this patch against the Python trunk, and tested it on
Python 2.5 (Windows XP) and Python 2.6 (Fedora 13). (I didn't have time
to actually build the Python trunk and run the unit tests yet)

To test it, I use the following dictionary where keys are the patterns I
want to try and values are the corresponding expected output.

d = {
        'foo.txt': 'foo.txt',
        'foo-{bar,baz}.txt': 'foo-bar.txt foo-baz.txt',
        'foo-{bar,baz-{toto,plouf}}.txt': 'foo-bar.txt foo-baz-plouf.txt foo-baz-toto.txt',
        'foo-{bar,baz}-{toto,plouf}.txt': 'foo-bar-plouf.txt foo-bar-toto.txt foo-baz-plouf.txt foo-baz-toto.txt',
        'foo-{}.txt': 'foo-{}.txt',
        'foo-{bar}.txt': 'foo-{bar}.txt',
        'foo-{bar.txt': 'foo-{bar.txt',
        'foo-bar}.txt': 'foo-bar}.txt',
        'foo-{bar{baz,plouf}.txt': 'foo-{barbaz.txt foo-{barplouf.txt',
        'foo-{bar,baz}-{toto}.txt': 'foo-bar-{toto}.txt foo-baz-{toto}.txt',
        'foo-{bar,baz}-{toto.txt': 'foo-bar-{toto.txt foo-baz-{toto.txt',
        'foo-{bar,baz}-toto}.txt': 'foo-bar-toto}.txt foo-baz-toto}.txt',
        'tmp/foo.txt': 'tmp/foo.txt',
        'tmp/foo-{bar,baz}.txt': 'tmp/foo-bar.txt tmp/foo-baz.txt',
        'tmp/foo-{bar,baz-{toto,plouf}}.txt': 'tmp/foo-bar.txt tmp/foo-baz-plouf.txt tmp/foo-baz-toto.txt',
        'tmp/foo-{bar,baz}-{toto,plouf}.txt': 'tmp/foo-bar-plouf.txt tmp/foo-bar-toto.txt tmp/foo-baz-plouf.txt tmp/foo-baz-toto.txt',
        'tmp/foo-{}.txt': 'tmp/foo-{}.txt',
        'tmp/foo-{bar}.txt': 'tmp/foo-{bar}.txt',
        'tmp/foo-{bar.txt': 'tmp/foo-{bar.txt',
        'tmp/foo-bar}.txt': 'tmp/foo-bar}.txt',
        'tmp/foo-{bar{baz,plouf}.txt': 'tmp/foo-{barbaz.txt tmp/foo-{barplouf.txt',
        'tmp/foo-{bar,baz}-{toto}.txt': 'tmp/foo-bar-{toto}.txt tmp/foo-baz-{toto}.txt',
        'tmp/foo-{bar,baz}-{toto.txt': 'tmp/foo-bar-{toto.txt tmp/foo-baz-{toto.txt',
        'tmp/foo-{bar,baz}-toto}.txt': 'tmp/foo-bar-toto}.txt tmp/foo-baz-toto}.txt',
        '{tmp,tmp2}/foo.txt': 'tmp2/foo.txt tmp/foo.txt',
        '{tmp,tmp2}/foo-{bar,baz}.txt': 'tmp2/foo-bar.txt tmp2/foo-baz.txt tmp/foo-bar.txt tmp/foo-baz.txt',
        '{tmp,tmp2}/foo-{bar,baz-{toto,plouf}}.txt': 'tmp2/foo-bar.txt tmp2/foo-baz-plouf.txt tmp2/foo-baz-toto.txt tmp/foo-bar.txt tmp/foo-baz-plouf.txt tmp/foo-baz-toto.txt',
        '{tmp,tmp2}/foo-{bar,baz}-{toto,plouf}.txt': 'tmp2/foo-bar-plouf.txt tmp2/foo-bar-toto.txt tmp2/foo-baz-plouf.txt tmp2/foo-baz-toto.txt tmp/foo-bar-plouf.txt tmp/foo-bar-toto.txt tmp/foo-baz-plouf.txt tmp/foo-baz-toto.txt',
        '{tmp,tmp2}/foo-{}.txt': 'tmp2/foo-{}.txt tmp/foo-{}.txt',
        '{tmp,tmp2}/foo-{bar}.txt': 'tmp2/foo-{bar}.txt tmp/foo-{bar}.txt',
        '{tmp,tmp2}/foo-{bar.txt': 'tmp2/foo-{bar.txt tmp/foo-{bar.txt',
        '{tmp,tmp2}/foo-bar}.txt': 'tmp2/foo-bar}.txt tmp/foo-bar}.txt',
        '{tmp,tmp2}/foo-{bar{baz,plouf}.txt': 'tmp2/foo-{barbaz.txt tmp2/foo-{barplouf.txt tmp/foo-{barbaz.txt tmp/foo-{barplouf.txt',
        '{tmp,tmp2}/foo-{bar,baz}-{toto}.txt': 'tmp2/foo-bar-{toto}.txt tmp2/foo-baz-{toto}.txt tmp/foo-bar-{toto}.txt tmp/foo-baz-{toto}.txt',
        '{tmp,tmp2}/foo-{bar,baz}-{toto.txt': 'tmp2/foo-bar-{toto.txt tmp2/foo-baz-{toto.txt tmp/foo-bar-{toto.txt tmp/foo-baz-{toto.txt',
        '{tmp,tmp2}/foo-{bar,baz}-toto}.txt': 'tmp2/foo-bar-toto}.txt tmp2/foo-baz-toto}.txt tmp/foo-bar-toto}.txt tmp/foo-baz-toto}.txt',
        'tm{p/foo,p2/foo}.txt': 'tmp2/foo.txt tmp/foo.txt',
        'foo-bar*{txt,xml}': 'foo-bar-plouf.txt foo-bar-toto.txt foo-bar-toto}.txt foo-bar-{toto.txt foo-bar-{toto}.txt foo-bar.txt foo-bar}.txt',
        'foo?bar.{txt,xml}': 'foo-bar.txt',
        }

(note that those actually correspond to files I have in the current
folder so that they match)

Can anyone think of other interesting patterns involving braces?

Also, if the consensus is that glob is not the proper place, it would be
pretty straightforward to do it in a module that would expand the
braces before calling glob on the resulting patterns.


-- 
Mathieu
-------------- next part --------------
A non-text attachment was scrubbed...
Name: glob-curly.patch
Type: text/x-patch
Size: 2414 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20100817/a9771d18/attachment.bin>

From tjreedy at udel.edu  Tue Aug 17 22:24:56 2010
From: tjreedy at udel.edu (Terry Reedy)
Date: Tue, 17 Aug 2010 16:24:56 -0400
Subject: [Python-ideas] Curly braces expansion in shell-like matcher
	modules
In-Reply-To: <1282073967.3752.157.camel@caan>
References: <AANLkTi=170htwZXy-zmGhTyZy-T-NC=G_8sN6j26-ch8@mail.gmail.com>	<20100817150358.134067b1@pitrou.net>	<AANLkTi=LP77LNLUnfmOus=M_iKOeekE=Mce7JwRAs5x8@mail.gmail.com>
	<1282073967.3752.157.camel@caan>
Message-ID: <i4er6n$4t8$1@dough.gmane.org>

On 8/17/2010 3:39 PM, Mathieu Bridon wrote:

> Note that I wrote this patch against the Python trunk, and tested it on
> Python 2.5 (Windows XP) and Python 2.6 (Fedora 13). (I didn't have time
> to actually build the Python trunk and run the unit tests yet)

Just to make sure you are aware, the current trunk is the py3k branch. 
That is what 3.2 is being released from and what the patch needs to work 
with. The one labelled 'trunk' was 2.x and is now frozen (though it is, 
in a sense, continued as the maint27 branch).

At some point, summarize this thread on the tracker and give a link.
http://bugs.python.org/issue9584

-- 
Terry Jan Reedy



From ncoghlan at gmail.com  Tue Aug 17 23:16:48 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 18 Aug 2010 07:16:48 +1000
Subject: [Python-ideas] @unittest.skip_others decorator
In-Reply-To: <20100817202904.358f92e1@pitrou.net>
References: <AANLkTikoFAhD4ARiB73mSxnebVj80De5fHqbP=vd4Up-@mail.gmail.com>
	<20100817202904.358f92e1@pitrou.net>
Message-ID: <AANLkTimWZLKGtFCiXk_22W2RrkeTF8ecjkZjQuw_cWCq@mail.gmail.com>

On Wed, Aug 18, 2010 at 4:29 AM, Antoine Pitrou <solipsis at pitrou.net> wrote:
> This would certainly be useful to me as well.

An alternative that would work for me is being able to specify
particular test cases and methods to regrtest - currently it only
offers test module granularity from the command line.

Cheers,
Nick.

-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia


From sergio at gruposinternet.com.br  Mon Aug 23 16:55:47 2010
From: sergio at gruposinternet.com.br (=?ISO-8859-1?Q?S=E9rgio?= Surkamp)
Date: Mon, 23 Aug 2010 11:55:47 -0300
Subject: [Python-ideas] Mutable default function parameter warning
Message-ID: <20100823115547.10907135@icedearth.corp.grupos.com.br>

Hello list,

The documentation states that the default value of a function parameter,
if mutable, can change at runtime, because the default is evaluated only
once, when the function object is created.

I would like to suggest the inclusion of a default language warning
when this kind of construction is used, as it is Python-specific
behavior and can lead to "strange behavior" or misuse by programmers
who are migrating from other languages to Python.
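
For example (the function names here are just for illustration):

def append_item(item, items=[]):   # the list is created once, when 'def' runs
    items.append(item)
    return items

print(append_item(1))   # [1]
print(append_item(2))   # [1, 2] -- the "default" list kept its previous contents

def append_item_safe(item, items=None):   # the usual workaround
    if items is None:
        items = []
    items.append(item)
    return items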

This proposal was first opened as a suggestion in the bug tracker but,
at Mr. Peterson's request, I'm bringing it to this list:
http://bugs.python.org/issue9646

Regards,
-- 
  .:''''':.
.:'        `     Sérgio Surkamp | Gerente de Rede
::    ........   sergio at gruposinternet.com.br
`:.        .:'
  `:,   ,.:'     *Grupos Internet S.A.*
    `: :'        R. Lauro Linhares, 2123 Torre B - Sala 201
     : :         Trindade - Florianópolis - SC
     :.'
     ::          +55 48 3234-4109
     :
     '           http://www.gruposinternet.com.br


From merwok at netwok.org  Mon Aug 23 17:04:50 2010
From: merwok at netwok.org (=?UTF-8?B?w4lyaWMgQXJhdWpv?=)
Date: Mon, 23 Aug 2010 17:04:50 +0200
Subject: [Python-ideas] Mutable default function parameter warning
In-Reply-To: <20100823115547.10907135@icedearth.corp.grupos.com.br>
References: <20100823115547.10907135@icedearth.corp.grupos.com.br>
Message-ID: <4C728E12.3090907@netwok.org>

Hello

I think it's more the realm of code checkers like pyflakes, pychecker or
pylint. From Python's viewpoint, the behavior is perfectly legal and
documented.

Regards


From daniel at stutzbachenterprises.com  Mon Aug 23 17:08:26 2010
From: daniel at stutzbachenterprises.com (Daniel Stutzbach)
Date: Mon, 23 Aug 2010 10:08:26 -0500
Subject: [Python-ideas] Mutable default function parameter warning
In-Reply-To: <4C728E12.3090907@netwok.org>
References: <20100823115547.10907135@icedearth.corp.grupos.com.br>
	<4C728E12.3090907@netwok.org>
Message-ID: <AANLkTikdMvv5Gtswv3MyLdwUJnz+PWqrF03aOfHKnsfi@mail.gmail.com>

On Mon, Aug 23, 2010 at 10:04 AM, Éric Araujo <merwok at netwok.org> wrote:

> I think it's more the realm of code checkers like pyflakes, pychecker or
> pylint. From Python's viewpoint, the behavior is perfectly legal and
> documented.
>

+1
--
Daniel Stutzbach, Ph.D.
President, Stutzbach Enterprises, LLC <http://stutzbachenterprises.com>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20100823/e291be81/attachment.html>

From masklinn at masklinn.net  Mon Aug 23 17:09:44 2010
From: masklinn at masklinn.net (Masklinn)
Date: Mon, 23 Aug 2010 17:09:44 +0200
Subject: [Python-ideas] Mutable default function parameter warning
In-Reply-To: <4C728E12.3090907@netwok.org>
References: <20100823115547.10907135@icedearth.corp.grupos.com.br>
	<4C728E12.3090907@netwok.org>
Message-ID: <332F52BD-35E9-41E9-8E3C-CCAAC33039FD@masklinn.net>

On 2010-08-23, at 17:04, Éric Araujo wrote:
> Hello
> 
> I think it's more the realm of code checkers like pyflakes, pychecker or
> pylint.

And FWIW, PyCharm, for instance, does check for the usual suspects (dict and list used as default values).



From tjreedy at udel.edu  Mon Aug 23 20:06:13 2010
From: tjreedy at udel.edu (Terry Reedy)
Date: Mon, 23 Aug 2010 14:06:13 -0400
Subject: [Python-ideas] Mutable default function parameter warning
In-Reply-To: <20100823115547.10907135@icedearth.corp.grupos.com.br>
References: <20100823115547.10907135@icedearth.corp.grupos.com.br>
Message-ID: <i4udal$g53$1@dough.gmane.org>

On 8/23/2010 10:55 AM, Sérgio Surkamp wrote:
> Hello list,
>
> The documentation states that the default value of function parameter,
> if mutable, can change it's default value at runtime due to be
> evaluated only once on function object creation.
>
> I would like to suggest the inclusion of an default language warning
> when this kind of construction is used, as it's Python specific
> behavior and can lead to "strange behavior" or misuse by programmers
> that are migrating from other languages to Python.

I am opposed to this for multiple reasons.

1. Static checking is for checking programs, not the compiler. IDLE
currently has a Run syntax check option. I believe there have been
proposals to extend that to running external checking programs. I would
be in favor of that.
2. Dynamic warnings are already done -- by eventually raising an
exception. If no exception is raised, then the code is legal and the result
*might* be correct.
3. There are numerous Python features that *might* be incorrect. For
instance, reuse of builtin names. This is a long, slippery slope.
4. Defining 'ismutable' is even harder than defining 'iscallable'.

> This proposal was first open as a suggestion issue in bug track, but,
> as a request from Mr. Peterson, I'm rewriting it to this list.
> http://bugs.python.org/issue9646

Please close the issue when the tracker is working again.

-- 
Terry Jan Reedy




From mal at egenix.com  Mon Aug 23 20:32:46 2010
From: mal at egenix.com (M.-A. Lemburg)
Date: Mon, 23 Aug 2010 20:32:46 +0200
Subject: [Python-ideas] Mutable default function parameter warning
In-Reply-To: <i4udal$g53$1@dough.gmane.org>
References: <20100823115547.10907135@icedearth.corp.grupos.com.br>
	<i4udal$g53$1@dough.gmane.org>
Message-ID: <4C72BECE.8050406@egenix.com>

Terry Reedy wrote:
On 8/23/2010 10:55 AM, Sérgio Surkamp wrote:
>> Hello list,
>>
>> The documentation states that the default value of function parameter,
>> if mutable, can change it's default value at runtime due to be
>> evaluated only once on function object creation.
>>
>> I would like to suggest the inclusion of an default language warning
>> when this kind of construction is used, as it's Python specific
>> behavior and can lead to "strange behavior" or misuse by programmers
>> that are migrating from other languages to Python.
> 
> I am opposed to this for multiple reasons.
> 
> 1. Static checking is for checking programs, not the compiler. IDLE
> currently has a Run syntax check option. I believe there have be
> proposals to extend that to running external checking programs. I would
> be in favor of that.
> 2. Dynamic warning are already done -- by eventually raising an
> exception. If no exception is raised, then code is legal and the result
> *might* be correct.
> 3. There are numerous Python features that *might* be incorrect. For
> instance, reuse of builtin names. This is a long, slippery, slope.
> 4. Defining 'ismutable' is even harder than defining 'iscallable'.

Same here... there's a common idiom used in Python code for
function-localized globals which the above would break:

"""
# Cache used for func()
_func_cache = {}

# Note: func_cache may be overridden with a caller private
# cache, default is to use the module global cache.
def func(a, b, c, func_cache=_func_cache):
   ...

or:

# Speed up func() by localizing len and str globals/builtins
def func(a, b, c, len=len, str=str):
   ... code using len() and str() ...

"""

The first is a perfectly legitimate use case.

The latter is not nice, but often needed to shortcut lookups
for globals and builtins when used in tight loops. There have
been numerous proposals on how to solve this, but none have made
it into the core.

>> This proposal was first open as a suggestion issue in bug track, but,
>> as a request from Mr. Peterson, I'm rewriting it to this list.
>> http://bugs.python.org/issue9646
> 
> Please close the issue when the tracker is working again.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, Aug 23 2010)
>>> Python/Zope Consulting and Support ...        http://www.egenix.com/
>>> mxODBC.Zope.Database.Adapter ...             http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...        http://python.egenix.com/
________________________________________________________________________

::: Try our new mxODBC.Connect Python Database Interface for free ! ::::


   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
    D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
           Registered at Amtsgericht Duesseldorf: HRB 46611
               http://www.egenix.com/company/contact/


From jwringstad at gmail.com  Thu Aug 26 21:25:30 2010
From: jwringstad at gmail.com (Jonny)
Date: Thu, 26 Aug 2010 21:25:30 +0200
Subject: [Python-ideas] Keyword to disambiguate python version
Message-ID: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>

I would propose that an idiomatic way is created, for instance a
pragma or statement, which allows one to state the Python version a
script targets in a manner that is both obvious to the human reader and
also allows Python to reject the script should the wrong version of
the interpreter be present. I think this is a quite common
problem[1][2], which could be solved in a fairly easy and pragmatic
way.

Being a Python beginner, I'm not well-versed in the ways of
idiomatic Python, so feel free to reject or improve on my syntactic
examples; they are merely meant to demonstrate my point.

Building on [3], I would suggest some syntax like this:

#!/usr/local/bin/python
# version: python-3
import os, sys
...

Alternatively, some syntax could be used that allows one to use basic
comparison operators, a range or even simple chained logical
statements.

#!/usr/local/bin/python
# version: [2.4 .. 3.0] and not 2.6.1

# or multiple clauses

# version: >= 2.4.2
# version: < 3.0
# version: not 2.6.1
# jpython-version: ...

# or multiple keys

# min-version: 2.4.2
# min-jpython-version: 2.4.4
# max-version: 2.6.1
import os, sys
...

This way it should be fairly obvious to the casual reader which Python
version the program is intended to run under, and the Python
interpreter can simply refuse to parse/execute the program should it
encounter an incompatible requirement.

All comments, thoughts and suggestions welcome.

References
[1] http://stackoverflow.com/questions/446052/python-best-way-to-check-for-python-version-in-program-that-uses-new-language-fe
[2] http://stackoverflow.com/questions/1093322/how-do-i-check-what-version-of-python-is-running-my-script
[3] http://www.python.org/dev/peps/pep-0263/


From tony at pagedna.com  Thu Aug 26 21:56:58 2010
From: tony at pagedna.com (Tony Lownds)
Date: Thu, 26 Aug 2010 12:56:58 -0700
Subject: [Python-ideas] Keyword to disambiguate python version
In-Reply-To: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
References: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
Message-ID: <F2836B0D-02FD-40E3-9B16-B133ABD64B51@pagedna.com>


On Aug 26, 2010, at 12:25 PM, Jonny wrote:

> I would propose that an idiomatic way is created, for instance a
> pragma or statement, which allows one to disambiguate the used python
> version in a manner that is both obvious for the human reader, and as
> well allows python to reject the script, should the wrong version of
> the interpreter be present. I think this is a quite common
> problem[1][2], which could be solved in a fairly easy and pragmatic
> way.

+1. I've found 3rd party libs where I only find out what version they're
compatible with by tripping over the wrong module. IMO it would be great
for a pragma like the one Jonny mentions to be well supported.

Here is a different take on mechanism and semantics.

First, I'd like to be able to use:

#!/usr/bin/python3

This allows a simple rough cut. It is a packaging issue, but IIRC the standard
Makefile does not create python3. So the specific proposal is for the
Makefile to start doing that.  This thread might be worth reading re: this issue.
http://www.mail-archive.com/fedora-devel-list at redhat.com/msg08865.html

Second, for the pragma, how about something like:
__python__ = '2.4'

When a new Python version sees that in the main script, a warning 
should be issued if __python__ is assigned a value that is lower than 
the current version or doesn't match the version format. In addition, if a 
module is imported with a higher version of __python__ than either
the importing module or the main script, a warning should be issued.

When a lint program sees that, any features not present in the version 
of python assigned to __python__ should be an error. In addition, lint 
programs should check the use of modules using the semantics that Python
does.

-Tony



From ncoghlan at gmail.com  Thu Aug 26 23:43:08 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 27 Aug 2010 07:43:08 +1000
Subject: [Python-ideas] Keyword to disambiguate python version
In-Reply-To: <F2836B0D-02FD-40E3-9B16-B133ABD64B51@pagedna.com>
References: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
	<F2836B0D-02FD-40E3-9B16-B133ABD64B51@pagedna.com>
Message-ID: <AANLkTinEasWZ5xy9NKKBigOe+BerZKAbGZpM7jMecZcf@mail.gmail.com>

On Fri, Aug 27, 2010 at 5:56 AM, Tony Lownds <tony at pagedna.com> wrote:
> Here is a different take on mechanism and semantics.
>
> First, I'd like to be able to use:
>
> #!/usr/bin/python3
>
> This allows a simple rough cut is a packaging issue, but IIRC the standard
> Makefile does not create python3. So the specific proposal is for the
> Makefile start doing that. ?This thread might be worth reading re: this issue.
> http://www.mail-archive.com/fedora-devel-list at redhat.com/msg08865.html

The Python 3 bininstall target in the Makefile puts the binary under
the python3 name:
    (cd $(DESTDIR)$(BINDIR); $(LN) python$(VERSION)$(EXE) $(PYTHON)3$(EXE))

(Note the "3" between $(PYTHON) and $(EXE))

So if people want to flag a script specifically as python3, they can
already say that in the shebang line, either by referencing it directly
or by writing "#!/usr/bin/env python3".

This was done so that a "make install" of Py3k on a Linux box wouldn't
inadvertently clobber the system Python installation (I believe David
Malcolm from Fedora was one of a number of people making that
suggestion).

For anything else, I've never been a fan of "last version tested"
markers embedded in the code. It usually just leads to unnecessary
noise and busy-work (cf. Blizzard's "load out of date addons" flag to
work around their own version markers for World of Warcraft addons).

Cheers,
Nick.

-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia


From vano at mail.mipt.ru  Thu Aug 26 23:43:31 2010
From: vano at mail.mipt.ru (Ivan Pozdeev)
Date: Fri, 27 Aug 2010 01:43:31 +0400
Subject: [Python-ideas] Keyword to disambiguate python version
In-Reply-To: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
References: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
Message-ID: <825575878.20100827014331@mail.mipt.ru>

Hello, Jonny.

You wrote on 26 August 2010, 23:25:30:

> I would propose that an idiomatic way is created, for instance a
> pragma or statement, which allows one to disambiguate the used python
> version in a manner that is both obvious for the human reader, and as
> well allows python to reject the script, should the wrong version of
> the interpreter be present. I think this is a quite common
> problem[1][2], which could be solved in a fairly easy and pragmatic
> way.

Read sys.version / sys.api_version / sys.version_info / sys.subversion for that
kind of information.

However, depriving a user of the freedom to do as he pleases
(in this case, dictating what environment to use) is a disastrous mistake.

1) It's not version number that matters, it's functionality. There are
environments you can't even imagine where your code can be run.
There are Python builds and flavours outside CPython's version sequence.

2) Tracking all possible Python kinds and versions where your code works isn't something
worth wasting time on. And if you don't include EVERYTHING IN EXISTENCE that can
run it, it will cause trouble on compatible environments that you didn't
'officially approve'.

If someone uses code in an environment where it's unable to run, it should
just break for that very reason.

Typical environment incompatibilities are clearly visible in Python: code
adhering to the Zen of Python ('Errors should never pass silently') raises an
error where it's unable to proceed normally. Examples:

* ImportError (missing module)
* NameError (missing identifier in a module)
* TypeError/ValueError (incompatible interface)
* DeprecationWarning (exactly what it says)

The right place to tell a user which environment your code was written for is
the documentation. A user just needs to know where they can rely on you and
where they have to take responsibility themselves.



From ncoghlan at gmail.com  Thu Aug 26 23:52:25 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 27 Aug 2010 07:52:25 +1000
Subject: [Python-ideas] Keyword to disambiguate python version
In-Reply-To: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
References: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
Message-ID: <AANLkTim_P40ckqMwR+KBzT1PvxesYgs2nsUKUDSh9rv2@mail.gmail.com>

On Fri, Aug 27, 2010 at 5:25 AM, Jonny <jwringstad at gmail.com> wrote:
> I would propose that an idiomatic way is created, for instance a
> pragma or statement, which allows one to disambiguate the used python
> version in a manner that is both obvious for the human reader, and as
> well allows python to reject the script, should the wrong version of
> the interpreter be present. I think this is a quite common
> problem[1][2], which could be solved in a fairly easy and pragmatic
> way.

Your proposed solution doesn't help with either of the linked Stack
Overflow questions. The second one is completely covered by the
existing sys.version_info attribute, as it is only asking how to
obtain the details of the running Python version. The first one is
subtler, asking how to deal with code that won't even compile on
earlier versions, so it will still fail to compile instead of
degrading gracefully on older versions, even if the file is flagged
somehow as requiring a later version.

So it isn't at all clear what is to be gained here over the
try-it-and-see-what-happens try/except approach (often best) or an
explicit "sys.version_info >= required_version" check.

Cheers,
Nick.

-- 
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia


From mikegraham at gmail.com  Thu Aug 26 23:59:02 2010
From: mikegraham at gmail.com (Mike Graham)
Date: Thu, 26 Aug 2010 17:59:02 -0400
Subject: [Python-ideas] Keyword to disambiguate python version
In-Reply-To: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
References: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
Message-ID: <AANLkTinojKJeWw1gwiTj7w7LfaGdnhuKQnBSQZtudN-P@mail.gmail.com>

On Thu, Aug 26, 2010 at 3:25 PM, Jonny <jwringstad at gmail.com> wrote:
> I would propose that an idiomatic way is created, for instance a
> pragma or statement, which allows one to disambiguate the used python
> version in a manner that is both obvious for the human reader, and as
> well allows python to reject the script, should the wrong version of
> the interpreter be present. I think this is a quite common
> problem[1][2], which could be solved in a fairly easy and pragmatic
> way.

-1 from me on this one.

The risk of something not working only because some dummy put the
wrong version in the pragma outweighs the benefit of version
information being enforced programmatically rather than merely by
documentation.


From jwringstad at gmail.com  Fri Aug 27 00:11:40 2010
From: jwringstad at gmail.com (Jonny)
Date: Fri, 27 Aug 2010 00:11:40 +0200
Subject: [Python-ideas] Keyword to disambiguate python version
In-Reply-To: <825575878.20100827014331@mail.mipt.ru>
References: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
	<825575878.20100827014331@mail.mipt.ru>
Message-ID: <AANLkTikpWX7Ddb7zQrvcMatVN4bsgKWaHEPaTU-3Nwfa@mail.gmail.com>

> However, depriving users of the freedom to do as they please
> (in this case, dictating what environment to use) is a disastrous mistake.
That's a valid criticism, but maybe python could still warn about the
mismatch, even if it would not reject the program by default.

> 1) It's not the version number that matters, it's functionality. There are
> environments you can't even imagine in which your code may be run.
> There are Python builds and flavours outside CPython's version sequence.
I agree, but there are also environments that I can determine will not be
able to run my code, and I could warn on those.
As for the non-CPython builds and flavours, maybe they could match against
a "normalized" version number that they claim to be compatible with.

> The right place to tell users what environment your code was written for is
> the documentation. Users just need to know where they can rely on you and
> where they must take responsibility themselves.
I guess you are right about that. And for the really important cases,
I suppose performing a check in the makefile, or, if everything else
fails, directly in the code (yuck) should suffice.


From ncoghlan at gmail.com  Fri Aug 27 00:29:21 2010
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 27 Aug 2010 08:29:21 +1000
Subject: [Python-ideas] Keyword to disambiguate python version
In-Reply-To: <AANLkTikpWX7Ddb7zQrvcMatVN4bsgKWaHEPaTU-3Nwfa@mail.gmail.com>
References: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
	<825575878.20100827014331@mail.mipt.ru>
	<AANLkTikpWX7Ddb7zQrvcMatVN4bsgKWaHEPaTU-3Nwfa@mail.gmail.com>
Message-ID: <AANLkTi=mn0r_QV6rxAKkqRPeHBWBhuiVkJKKMHXb=HBj@mail.gmail.com>

On Fri, Aug 27, 2010 at 8:11 AM, Jonny <jwringstad at gmail.com> wrote:
>> The right place to tell users what environment your code was written for is
>> the documentation. Users just need to know where they can rely on you and
>> where they must take responsibility themselves.
> I guess you are right about that. And for the really important cases,
> I suppose performing a check in the makefile, or, if everything else
> fails, directly in the code (yuck) should suffice.

Umm, how is an explicit runtime check any uglier than a pragma? You
would need to check the pragma at runtime anyway, otherwise pyc files
wouldn't get checked reliably.

If a module breaks due to version incompatibilities, developers can
generally figure it out, especially if you document the tested
versions clearly (e.g. by using the relevant PyPI trove classifiers*).

Cheers,
Nick.

* Version specific classifiers for Python modules on PyPI:
Programming Language :: Python :: 2
Programming Language :: Python :: 2.3
Programming Language :: Python :: 2.4
Programming Language :: Python :: 2.5
Programming Language :: Python :: 2.6
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3
Programming Language :: Python :: 3.0
Programming Language :: Python :: 3.1
Programming Language :: Python :: 3.2
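
A hedged sketch of how such classifiers are typically declared in a
distutils setup.py (the project metadata below is made up):

    from distutils.core import setup   # setuptools works the same way

    setup(
        name='example-project',        # hypothetical project
        version='0.1',
        py_modules=['example'],
        classifiers=[
            'Programming Language :: Python :: 2',
            'Programming Language :: Python :: 2.6',
            'Programming Language :: Python :: 2.7',
            'Programming Language :: Python :: 3',
            'Programming Language :: Python :: 3.1',
        ],
    )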

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia


From tony at pagedna.com  Fri Aug 27 01:02:47 2010
From: tony at pagedna.com (Tony Lownds)
Date: Thu, 26 Aug 2010 16:02:47 -0700
Subject: [Python-ideas] Keyword to disambiguate python version
In-Reply-To: <825575878.20100827014331@mail.mipt.ru>
References: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
	<825575878.20100827014331@mail.mipt.ru>
Message-ID: <DB5149C1-7107-4F50-A1DD-56CDDAB9E8CB@pagedna.com>

> 1) It's not the version number that matters, it's functionality. There are
> environments you can't even imagine in which your code may be run.
> There are Python builds and flavours outside CPython's version sequence.

Very true, but other Python flavors commonly specify which version of
CPython's features they support, e.g. "Jython 2.5 implements the same language
as CPython 2.5", "IronPython 2.7 Alpha 1 was released, supporting Python 2.7 features".
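
A hedged sketch of asking an interpreter which implementation it is and
which language version it claims to provide, using the stdlib platform
module (python_implementation() is a relatively recent addition, so it
may be missing on older interpreters):

    import platform

    # e.g. 'CPython', 'Jython', 'IronPython', 'PyPy'
    impl = platform.python_implementation()
    # the language version the implementation claims, e.g. '2.5.2'
    lang = platform.python_version()

    print("%s implementing Python %s" % (impl, lang))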

> 2) ... The right place to tell users what environment your code was written for is
> the documentation. Users just need to know where they can rely on you and
> where they must take responsibility themselves.

Well, if a project you want to use doesn't clearly specify which versions it
supports, taking responsibility yourself is potentially time consuming...

I for one like the idea of having some automated system take care of that, to
lower the cost for projects of supporting a wider range of Python versions.
Establishing an idiom, as was done for __future__, is only the start of that,
and may not even be strictly necessary, but it would be a good first step.

-Tony

From sergio at gruposinternet.com.br  Fri Aug 27 02:12:30 2010
From: sergio at gruposinternet.com.br (Sérgio Surkamp)
Date: Thu, 26 Aug 2010 21:12:30 -0300
Subject: [Python-ideas] Keyword to disambiguate python version
In-Reply-To: <AANLkTinojKJeWw1gwiTj7w7LfaGdnhuKQnBSQZtudN-P@mail.gmail.com>
References: <AANLkTi=3hEQAhEqxZKrKmvuyzuzvg0gc7aCvuOToJ92R@mail.gmail.com>
	<AANLkTinojKJeWw1gwiTj7w7LfaGdnhuKQnBSQZtudN-P@mail.gmail.com>
Message-ID: <20100826211230.159b0415@icedearth.corp.grupos.com.br>

-1 from me too.

You'd need to change (or update) the source code every time a new
release goes out.

On Thu, 26 Aug 2010 17:59:02 -0400
Mike Graham <mikegraham at gmail.com> wrote:

> On Thu, Aug 26, 2010 at 3:25 PM, Jonny <jwringstad at gmail.com> wrote:
> > I would propose that an idiomatic way is created, for instance a
> > pragma or statement, which allows one to disambiguate the used
> > python version in a manner that is both obvious for the human
> > reader, and as well allows python to reject the script, should the
> > wrong version of the interpreter be present. I think this is a
> > quite common problem[1][2], which could be solved in a fairly easy
> > and pragmatic way.
> 
> -1 from me on this one.
> 
> The risk of something not working only because some dummy put the
> wrong version in the pragma outweighs the benefit of version
> information being enforced programmatically rather than merely by
> documentation.
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas


-- 
  .:''''':.
.:'        `     Sérgio Surkamp | Network Manager
::    ........   sergio at gruposinternet.com.br
`:.        .:'
  `:,   ,.:'     *Grupos Internet S.A.*
    `: :'        R. Lauro Linhares, 2123 Torre B - Sala 201
     : :         Trindade - Florianópolis - SC
     :.'
     ::          +55 48 3234-4109
     :
     '           http://www.gruposinternet.com.br