From greg.ewing at canterbury.ac.nz  Wed Apr  1 01:29:29 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 01 Apr 2009 11:29:29 +1200
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D255BC.6080503@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz> <49C789F3.30301@improva.dk>
	<49C7F0C3.10904@gmail.com> <49C81621.9040600@canterbury.ac.nz>
	<49C81A45.1070803@canterbury.ac.nz> <49C94CF6.5070301@gmail.com>
	<49C9D162.5040907@canterbury.ac.nz> <49CA20F2.7040207@gmail.com>
	<49CA4029.6050703@improva.dk> <49CABFC6.1080207@canterbury.ac.nz>
	<49CAC0FE.5010305@improva.dk> <49CACB39.3020708@canterbury.ac.nz>
	<49CAD15D.2090008@improva.dk> <49CB155E.4040504@canterbury.ac.nz>
	<49CB8E4A.3050108@improva.dk> <49CC5D85.30409@canterbury.ac.nz>
	<49CE29BF.3040502@improva.dk> <49CEB8DE.8060200@gmail.com>
	<49CEBCD5.7020107@canterbury.ac.nz> <49CF6AAF.70109@improva.dk>
	<49D05C8F.3040800@canterbury.ac.nz> <49D0A324.1030701@gmail.com>
	<49D143B1.9040009@canterbury.ac.nz> <49D1542E.7070503@improva.dk>
	<49D16294.9030205@canterbury.ac.nz> <49D17247.20705@improva.dk>
	<49D18D31.9000008@canterbury.ac.nz> <49D1E5E6.5000007@improva.dk>
	<49D207BE.8090909@gmail.com> <49D255BC.6080503@improva.dk>
Message-ID: <49D2A759.5080204@canterbury.ac.nz>

Jacob Holm wrote:

> will also remove some behavior 
> that could have been useful, such as the ability to suppress the 
> GeneratorExit if you know what you are doing.

I'm not convinced there are any use cases for suppressing
GeneratorExit in the first place. Can you provide an
example that couldn't be easily done some other way?

--
Greg


From jh at improva.dk  Wed Apr  1 03:45:25 2009
From: jh at improva.dk (Jacob Holm)
Date: Wed, 01 Apr 2009 03:45:25 +0200
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D2A759.5080204@canterbury.ac.nz>
References: <49AB1F90.7070201@canterbury.ac.nz> <49C7F0C3.10904@gmail.com>
	<49C81621.9040600@canterbury.ac.nz>	<49C81A45.1070803@canterbury.ac.nz>
	<49C94CF6.5070301@gmail.com>	<49C9D162.5040907@canterbury.ac.nz>
	<49CA20F2.7040207@gmail.com>	<49CA4029.6050703@improva.dk>
	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>
	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>
	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>
	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>
	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>
	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>
	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>
	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>
	<49D255BC.6080503@improva.dk> <49D2A759.5080204@canterbury.ac.nz>
Message-ID: <49D2C735.8020803@improva.dk>

Greg Ewing wrote:
> Jacob Holm wrote:
>
>> will also remove some behavior that could have been useful, such as 
>> the ability to suppress the GeneratorExit if you know what you are 
>> doing.
>
> I'm not convinced there are any use cases for suppressing
> GeneratorExit in the first place. Can you provide an
> example that couldn't be easily done some other way?

I don't have any real use cases, just a few examples of things you can 
do in #2 that become a bit uglier in #3 or #4. This:

def inner():
    try:
        for i in xrange(10):
            yield i
    except GeneratorExit:
        return i
    return "all"

def outer():
    val = yield from inner()
    print val
    return val


Does not behave like you would expect because the "return i" is 
swallowed by the call to inner.close() (or is it?) and the "print val" 
and "return val" statements are skipped due to the reraised 
GeneratorExit. To get a value out of the generator being closed you 
need to raise and catch your own exception:

class Return(Exception): pass

def inner():
    try:
        for i in xrange(10):
            yield i
    except GeneratorExit:
        raise Return(i)
    return "all"

def outer():
    try:
        val = yield from inner()
    except Return as r:
        val = r.args[0]
    print val
    return val


This is certainly doable, but ugly compared to the version using return.
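
The custom-exception workaround can be exercised directly in Python 3
syntax (range/print instead of 2.x xrange/print; the names are taken
from the example above). Throwing GeneratorExit by hand stands in for
what close() does internally:

```python
# Sketch of the custom-exception workaround, in Python 3 syntax.
class Return(Exception):
    pass

def inner():
    try:
        for i in range(10):
            yield i
    except GeneratorExit:
        # Smuggle the final value out in an exception, since a plain
        # "return i" here would be swallowed by close().
        raise Return(i)
    return "all"

g = inner()
first = next(g)             # advance to the first yield; i == 0
try:
    g.throw(GeneratorExit)  # simulate what close() does internally
except Return as r:
    final = r.args[0]       # the value smuggled out: 0
```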

Here is an idea that would help this a little bit. We could change close 
to return the value (if any) returned by the generator, and then attach 
that value to the reraised GeneratorExit in the yield-from expansion. 
That would allow the example to be rewritten as:

def inner():
    try:
        for i in xrange(10):
            yield i
    except GeneratorExit:
        return i
    return "all"

def outer():
    try:
        val = yield from inner()
    except GeneratorExit as e:
        val = e.value
    print val
    return val


Which I think is much nicer.

- Jacob



From guido at python.org  Wed Apr  1 06:10:28 2009
From: guido at python.org (Guido van Rossum)
Date: Tue, 31 Mar 2009 21:10:28 -0700
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D05C8F.3040800@canterbury.ac.nz>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CAD15D.2090008@improva.dk> 
	<49CB155E.4040504@canterbury.ac.nz> <49CB8E4A.3050108@improva.dk> 
	<49CC5D85.30409@canterbury.ac.nz> <49CE29BF.3040502@improva.dk> 
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz> 
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz>
Message-ID: <ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>

On Sun, Mar 29, 2009 at 10:45 PM, Greg Ewing
<greg.ewing at canterbury.ac.nz> wrote:
> The problem of how to handle GeneratorExit doesn't
> seem to have any entirely satisfactory solution.
>
> On the one hand, the inlining principle requires that
> we never re-raise it if the subgenerator turns it into
> a StopIteration (or GeneratorReturn).
>
> On the other hand, not re-raising it means that a
> broken generator can easily result from innocuously
> combining two things that are individually legitimate.
>
> I think we just have to accept this, and state that
> refactoring only preserves semantics as long as the
> code block being factored out does not catch
> GeneratorExit without re-raising it. Then we're free
> to always re-raise GeneratorExit and prevent broken
> generators from occurring.
>
> I'm inclined to think this situation is a symptom that
> the idea of being able to catch GeneratorExit at all
> is flawed. If generator finalization were implemented
> by means of a forced return, or something equally
> uncatchable, instead of an exception, we wouldn't have
> so much of a problem.
>
> Earlier I said that I thought GeneratorExit was best
> regarded as an implementation detail of generators.
> I'd like to strengthen that statement and say that it
> should be considered a detail of the *present*
> implementation of generators, subject to change in
> future or alternate Pythons.
>
> Related to that, I'm starting to come back to my
> original instinct that GeneratorExit should not be
> thrown into the subiterator at all. Rather, it should
> be taken as an indication that the delegating generator
> is being finalized, and the subiterator's close()
> method called if it has one. Then there's never any
> question about whether to re-raise it -- we should
> always do so.

This sounds fine -- though somehow I have a feeling nobody will really
care either way, and when it causes a problem, it's going to cost an
afternoon of debugging regardless. So do what's easiest to implement,
we can always fix it later.

BTW, I'd really like it if you (and others interested in PEP 380) read
Dave Beazley's excellent coroutines tutorial
(http://dabeaz.com/coroutines/), and commented on how yield-from can
make his example code easier to write or faster. The tutorial comes
with ample warnings about its mind-bending nature but I found it
excellently written and very clear on the three different use cases
for yield: iteration, receiving messages, and "traps" (cooperative
multitasking). I cannot plug this enough. (Thanks Jeremy Hylton for
mentioning it to me.)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From greg.ewing at canterbury.ac.nz  Wed Apr  1 09:23:00 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 01 Apr 2009 19:23:00 +1200
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CAD15D.2090008@improva.dk>
	<49CB155E.4040504@canterbury.ac.nz> <49CB8E4A.3050108@improva.dk>
	<49CC5D85.30409@canterbury.ac.nz> <49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz>
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
Message-ID: <49D31654.6000203@canterbury.ac.nz>

Guido van Rossum wrote:

> BTW, I'd really like it if you (and others interested in PEP 380) read
> Dave Beazley's excellent coroutines tutorial
> (http://dabeaz.com/coroutines/), and commented on how yield-from can
> make his example code easier to write or faster.

The place where yield-from enters the picture would
be in Part 8 ("The Problem with the Stack"), where
it would eliminate the need for the scheduler to do
trampolining of calls and returns.

The idea of yield being like a system call is an
interesting perspective. Imagine what it would be like
if ordinary programs had to make system calls every
time they wanted to call or return from a function!
That's the situation we have now when using generators
as coroutines, and it's the problem that yield-from
addresses.
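
The kind of delegation being discussed can be sketched with the syntax
that eventually shipped (Python 3.3+). The scheduler just iterates the
outer generator; the subgenerator's yields pass straight through, and
its return value becomes the value of the yield-from expression. The
child/parent names here are illustrative only:

```python
# Minimal sketch of generator delegation as standardized by PEP 380.
def child():
    yield "a"
    yield "b"
    return "child result"   # becomes the value of "yield from child()"

def parent():
    result = yield from child()  # suspends here for each of child's yields
    yield result

values = list(parent())     # ["a", "b", "child result"]
```

Conceptually, no trampoline is needed because each resume goes straight
to the innermost suspended frame rather than bouncing through a scheduler.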

-- 
Greg


From greg.ewing at canterbury.ac.nz  Wed Apr  1 09:29:25 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 01 Apr 2009 19:29:25 +1200
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CAD15D.2090008@improva.dk>
	<49CB155E.4040504@canterbury.ac.nz> <49CB8E4A.3050108@improva.dk>
	<49CC5D85.30409@canterbury.ac.nz> <49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz>
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
Message-ID: <49D317D5.6080705@canterbury.ac.nz>

I've just thought of another possible alternative
name for yield-from:

   y = gcall f(x)

-- 
Greg



From leif.walsh at gmail.com  Wed Apr  1 10:46:19 2009
From: leif.walsh at gmail.com (Leif Walsh)
Date: Wed, 1 Apr 2009 04:46:19 -0400 (EDT)
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D317D5.6080705@canterbury.ac.nz>
Message-ID: <fszrx5grjj3hg4kpriUYAxe124vaj_firegpg@mail.gmail.com>

On Wed, Apr 1, 2009 at 3:29 AM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> I've just thought of another possible alternative
> name for yield-from:
>
>    y = gcall f(x)

Makes me think of a google hotline, or a typo of gcal.

$\pm 0$

-- 
Cheers,
Leif


From ncoghlan at gmail.com  Wed Apr  1 13:28:28 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 01 Apr 2009 21:28:28 +1000
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D2C735.8020803@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49C81621.9040600@canterbury.ac.nz>	<49C81A45.1070803@canterbury.ac.nz>	<49C94CF6.5070301@gmail.com>	<49C9D162.5040907@canterbury.ac.nz>	<49CA20F2.7040207@gmail.com>	<49CA4029.6050703@improva.dk>	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>	<49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz> <49D2C735.8020803@improva.dk>
Message-ID: <49D34FDC.5050106@gmail.com>

Jacob Holm wrote:
> Greg Ewing wrote:
>> Jacob Holm wrote:
>>
>>> will also remove some behavior that could have been useful, such as
>>> the ability to suppress the GeneratorExit if you know what you are
>>> doing.
>>
>> I'm not convinced there are any use cases for suppressing
>> GeneratorExit in the first place. Can you provide an
>> example that couldn't be easily done some other way?
> 
> I don't have any real use cases, just a few examples of things you can
> do in #2 that become a bit uglier in #3 or #4.

You appear to be thinking of GeneratorExit as a way to ask a generator
to finish normally such that it still makes sense to try to return a
value after a GeneratorExit has been thrown in to the current frame, but
that really isn't its role.

Instead, it's more of an "Abandon Ship! Abandon Ship! All hands to the
lifeboats!" indication that gives the generator a chance to release any
resources it might be holding and bail out. The reason that close()
accepts a StopIteration as well as a GeneratorExit is that the former
still indicates that the generator has finalised itself, so the
objective of calling close() has been achieved and there is no need to
report an error.

Any code that catches GeneratorExit without reraising it is highly
suspect, just like code that suppresses SystemExit and KeyboardInterrupt.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Wed Apr  1 13:45:10 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 01 Apr 2009 21:45:10 +1000
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D317D5.6080705@canterbury.ac.nz>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>
	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>
	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>
	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>
	<49D05C8F.3040800@canterbury.ac.nz>	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
	<49D317D5.6080705@canterbury.ac.nz>
Message-ID: <49D353C6.3040509@gmail.com>

Greg Ewing wrote:
> I've just thought of another possible alternative
> name for yield-from:
> 
>   y = gcall f(x)

However, you would lose the common mnemonic with yield for both turning
the current function into a generator and indicating to the reader that
the current frame may get suspended at a particular point.

While using 'yield from' obscures the new 'generator calling' aspect of
the new expression, it maintains the use of yield to indicate both "this
is a generator" and "this frame may get suspended here for an
arbitrarily long period of time".

While "'yield from' is like calling a generator" may be a slightly odd
spelling of the concept, it is at least still memorable and fairly easy
to learn - while you aren't likely to guess all the details of what it
does a priori, you're unlikely to forget what it does after you have
learnt it the first time. That strikes me as a much better option than
asking everyone to learn new rules for what can turn a normal function
into a generator.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Wed Apr  1 15:44:48 2009
From: jh at improva.dk (Jacob Holm)
Date: Wed, 01 Apr 2009 15:44:48 +0200
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D34FDC.5050106@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49C81A45.1070803@canterbury.ac.nz>	<49C94CF6.5070301@gmail.com>	<49C9D162.5040907@canterbury.ac.nz>	<49CA20F2.7040207@gmail.com>	<49CA4029.6050703@improva.dk>	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>	<49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
Message-ID: <49D36FD0.3080602@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> Greg Ewing wrote:
>>     
>>> Jacob Holm wrote:
>>>
>>>       
>>>> will also remove some behavior that could have been useful, such as
>>>> the ability to suppress the GeneratorExit if you know what you are
>>>> doing.
>>>>         
>>> I'm not convinced there are any use cases for suppressing
>>> GeneratorExit in the first place. Can you provide an
>>> example that couldn't be easily done some other way?
>>>       
>> I don't have any real use cases, just a few examples of things you can
>> do in #2 that become a bit uglier in #3 or #4.
>>     
>
> You appear to be thinking of GeneratorExit as a way to ask a generator
> to finish normally such that it still makes sense to try to return a
> value after a GeneratorExit has been thrown in to the current frame, 
>   

Yes.  I am thinking that when using this for refactoring, there are 
likely to be cases where the closing generator needs to provide some 
final piece of information to its caller so that the caller can do *its* 
finalization.  Using return for that purpose has a number of control 
flow advantages.   If you insist we shouldn't use return for this, we 
should make close raise a RuntimeError like this:

def close(self):
    try:
        self.throw(GeneratorExit)
    except StopIteration, e:
        if e.value is not None:
            raise RuntimeError('generator responded to GeneratorExit by returning with a value')
    except GeneratorExit:
        pass
    else:
        raise RuntimeError('generator ignored GeneratorExit')


Of course I would prefer to use "return e.value" instead of the first 
RuntimeError, because that seems like the obvious thing to expect when 
you close a generator containing "try..except GeneratorExit: return 
value".   And once we have close returning a value, it would be nice to 
have access to that value in the context of the yield-from expression.  
Attaching it to the GeneratorExit (re)raised by yield-from seems like 
the only logical choice.  As my third code fragment showed, you could 
then explicitly recatch the GeneratorExit and get the value there.

> but that really isn't its role.
>
> Instead, it's more of an "Abandon Ship! Abandon Ship! All hands to the
> lifeboats!" indication that gives the generator a chance to release any
> resources it might be holding and bail out. 

That might be the prevailing wisdom concerning GeneratorExit, at least 
partly based on the fact that the only way to communicate anything 
useful out of a closing generator is to raise another exception.   
Thinking a bit about coroutines, it would be nice to use "send" for the 
normal communication and "close" to shut it down and getting a final 
result.  Example:

def averager():
    count = 0
    sum = 0
    while 1:
        try: 
            val = (yield)
        except GeneratorExit:
            return sum/count
        else:
            sum += val
            count += 1

avg = averager()
avg.next() # start coroutine
avg.send(1.0)
avg.send(2.0)
print avg.close()  # prints 1.5


To do something similar today requires either a custom exception, or the 
use of special values to tell the generator to yield the result.  I find 
this version a lot cleaner.
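
The proposed close()-returns-value behaviour is not what generators do
today, but the same effect can be simulated in Python 3 syntax by
throwing GeneratorExit by hand and reading the return value off the
resulting StopIteration (a sketch of the averager above; "total"
replaces "sum" to avoid shadowing the builtin):

```python
# The averager coroutine, with the final result recovered manually.
def averager():
    count = 0
    total = 0
    while True:
        try:
            val = (yield)
        except GeneratorExit:
            return total / count
        else:
            total += val
            count += 1

avg = averager()
next(avg)        # start the coroutine
avg.send(1.0)
avg.send(2.0)
try:
    avg.throw(GeneratorExit)   # what close() does internally
except StopIteration as e:
    result = e.value           # 1.5, the value "returned" on close
```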

> The reason that close()
> accepts a StopIteration as well as a GeneratorExit is that the former
> still indicates that the generator has finalised itself, so the
> objective of calling close() has been achieved and there is no need to
> report an error.
>   

I have argued before that accepting StopIteration in close is likely to 
hide bugs in the closed generator, because the StopIteration may come 
from a return in a finally clause.  However, since we *are* accepting 
StopIteration we might as well make it useful.

> Any code that catches GeneratorExit without reraising it is highly
> suspect, just like code that suppresses SystemExit and KeyboardInterrupt.
>   

Explicitly catching GeneratorExit and then returning is a valid use 
today that I wouldn't consider suspect.  Catching GeneratorExit and then 
exiting the except block by other means than a raise or return is 
suspect, but has valid uses.

Best regards,

- Jacob


From ncoghlan at gmail.com  Wed Apr  1 16:22:51 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 02 Apr 2009 00:22:51 +1000
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D36FD0.3080602@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49C94CF6.5070301@gmail.com>	<49C9D162.5040907@canterbury.ac.nz>	<49CA20F2.7040207@gmail.com>	<49CA4029.6050703@improva.dk>	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>	<49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk>
Message-ID: <49D378BB.1030409@gmail.com>

Jacob Holm wrote:
> Explicitly catching GeneratorExit and then returning is a valid use
> today that I wouldn't consider suspect.  Catching GeneratorExit and then
> exiting the except block by other means than a raise or return is
> suspect, but has valid uses.

What are these valid uses? The PEP 342 definition made some sense
originally when GeneratorExit was a subclass of Exception and hence easy
to suppress accidentally, but I have serious doubts about the validity
of trapping it and turning it into StopIteration now that it has been
moved out to inherit directly from BaseException.

Regardless, unless Greg goes out of his way to change the meaning of
close() in the PEP, GeneratorReturn will escape from close() (since that
only traps StopIteration). That means you'll be able to catch that
exception directly if you really want to, and if you don't it will
bubble up out of the original close() call that was made on the
outermost generator.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Wed Apr  1 17:39:34 2009
From: jh at improva.dk (Jacob Holm)
Date: Wed, 01 Apr 2009 17:39:34 +0200
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D378BB.1030409@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49C9D162.5040907@canterbury.ac.nz>	<49CA20F2.7040207@gmail.com>	<49CA4029.6050703@improva.dk>	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>	<49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk> <49D378BB.1030409@gmail.com>
Message-ID: <49D38AB6.3080009@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> Explicitly catching GeneratorExit and then returning is a valid use
>> today that I wouldn't consider suspect.  Catching GeneratorExit and then
>> exiting the except block by other means than a raise or return is
>> suspect, but has valid uses.
>>     
>
> What are these valid uses? The PEP 342 definition made some sense
> originally when GeneratorExit was a subclass of Exception and hence easy
> to suppress accidentally, but I have serious doubts about the validity
> of trapping it and turning it into StopIteration now that it has been
> moved out to inherit directly from BaseException.
>   
When catching and returning, the control flow is different than if you 
were catching and raising.  Also using "return" more clearly signals the 
intent to leave the generator than "raise".  Even with GeneratorExit 
inheriting directly from BaseException, it is still easier to intercept 
an exception than a return.

The use for catching and exiting the block normally is to share some 
code between the cases.  Of course you need to be careful when you do 
that, but it can save some duplication.

Both these uses are valid in the sense that the generators work as 
advertised and follow the rules of finalization as defined by PEP 342.  
I don't think a proposal for changing close to not accept StopIteration 
is going to fly.

> Regardless, unless Greg goes out of his way to change the meaning of
> close() in the PEP, GeneratorReturn will escape from close() (since that
> only traps StopIteration). That means you'll be able to catch that
> exception directly if you really want to, and if you don't it will
> bubble up out of the original close() call that was made on the
> outermost generator.
>   

Hmm.  I had almost forgotten about the separate GeneratorReturn 
exception.  It would be good to see how that changes  things.  So far I 
consider it a needless complication, but I would like to read a version 
of the PEP that include it to see how bad it is.

As for GeneratorReturn not being caught by close(), I find it really 
strange if returning a non-None value as a response to GeneratorExit 
makes close() raise a GeneratorReturn, whereas returning None makes 
close() finish without an exception.  If you think returning a non-None 
value is an error, we should make it a (subclass of) RuntimeError rather 
than a GeneratorReturn to clearly indicate this.

I am strongly in favor of changing close to return the value rather than 
letting the GeneratorReturn pass through or raising a RuntimeError.  I 
think the "averager" example I just gave is a good (but simplistic) 
example of the kind of code I would consider using coroutines for.  The 
need to catch an exception would make that code a lot less readable, not 
to mention slower. 

I am not about to write a separate PEP for this, but I would consider 
"return from generator" plus "close returns value returned from 
generator"  to be a worthwhile addition in itself.

- Jacob




From ncoghlan at gmail.com  Wed Apr  1 18:05:05 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 02 Apr 2009 02:05:05 +1000
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D36FD0.3080602@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49C94CF6.5070301@gmail.com>	<49C9D162.5040907@canterbury.ac.nz>	<49CA20F2.7040207@gmail.com>	<49CA4029.6050703@improva.dk>	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>	<49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk>
Message-ID: <49D390B1.7090904@gmail.com>

Jacob Holm wrote:
> def averager():
>    count = 0
>    sum = 0
>    while 1:
>        try:
>            val = (yield)
>        except GeneratorExit:
>            return sum/count
>        else:
>            sum += val
>            count += 1
> 
> avg = averager()
> avg.next() # start coroutine
> avg.send(1.0)
> avg.send(2.0)
> print avg.close()  # prints 1.5

But that's not how it works, unless you're asking Greg to change the PEP
to allow that. And while it looks cute a single layer deep like that, it
goes wrong as soon as you consider the fact that if you get a
GeneratorReturn exception on close(), you *don't know* if that result
came from the outer iterator.

A better way to write that averager would be:

def averager():
   # Works for Python 2.5+
   count = 0
   sum = 0
   while 1:
       val = (yield)
       if val is None:
           yield sum/count
           break
       sum += val
       count += 1

>>> avg = averager()
>>> avg.next() # start coroutine
>>> avg.send(1.0)
>>> avg.send(2.0)
>>> print avg.send(None)
1.5

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From guido at python.org  Wed Apr  1 18:20:10 2009
From: guido at python.org (Guido van Rossum)
Date: Wed, 1 Apr 2009 09:20:10 -0700
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D34FDC.5050106@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49D16294.9030205@canterbury.ac.nz> 
	<49D17247.20705@improva.dk> <49D18D31.9000008@canterbury.ac.nz> 
	<49D1E5E6.5000007@improva.dk> <49D207BE.8090909@gmail.com> 
	<49D255BC.6080503@improva.dk> <49D2A759.5080204@canterbury.ac.nz> 
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
Message-ID: <ca471dc20904010920m7139a08vfa49adaa6488adea@mail.gmail.com>

On Wed, Apr 1, 2009 at 4:28 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> You appear to be thinking of GeneratorExit as a way to ask a generator
> to finish normally such that it still makes sense to try to return a
> value after a GeneratorExit has been thrown in to the current frame, but
> that really isn't its role.
>
> Instead, it's more of an "Abandon Ship! Abandon Ship! All hands to the
> lifeboats!" indication that gives the generator a chance to release any
> resources it might be holding and bail out. The reason that close()
> accepts a StopIteration as well as a GeneratorExit is that the former
> still indicates that the generator has finalised itself, so the
> objective of calling close() has been achieved and there is no need to
> report an error.
>
> Any code that catches GeneratorExit without reraising it is highly
> suspect, just like code that suppresses SystemExit and KeyboardInterrupt.

Let's make that "without either returning from the generator without
yielding any more values, raising StopIteration, or re-raising
GeneratorExit." At least one example in PEP 342 catches GeneratorExit
and returns.
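
A generator that catches GeneratorExit and simply returns is fine by this rule. A minimal sketch (in modern Python 3 syntax, not taken from the thread):

```python
def consumer():
    # Sums values sent in; catching GeneratorExit and returning without
    # yielding again is exactly what close() expects, so no error occurs.
    total = 0
    try:
        while True:
            total += yield
    except GeneratorExit:
        return

c = consumer()
next(c)       # prime the coroutine
c.send(1)
c.send(2)
c.close()     # generator finalizes quietly, no RuntimeError
```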

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From guido at python.org  Wed Apr  1 18:36:34 2009
From: guido at python.org (Guido van Rossum)
Date: Wed, 1 Apr 2009 09:36:34 -0700
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D317D5.6080705@canterbury.ac.nz>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CB8E4A.3050108@improva.dk> 
	<49CC5D85.30409@canterbury.ac.nz> <49CE29BF.3040502@improva.dk> 
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz> 
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz> 
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com> 
	<49D317D5.6080705@canterbury.ac.nz>
Message-ID: <ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com>

On Wed, Apr 1, 2009 at 12:29 AM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> I've just thought of another possible alternative
> name for yield-from:
>
>   y = gcall f(x)

Nice April Fool. :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From jh at improva.dk  Wed Apr  1 18:53:39 2009
From: jh at improva.dk (Jacob Holm)
Date: Wed, 01 Apr 2009 18:53:39 +0200
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D390B1.7090904@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49C9D162.5040907@canterbury.ac.nz>	<49CA20F2.7040207@gmail.com>	<49CA4029.6050703@improva.dk>	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>	<49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk> <49D390B1.7090904@gmail.com>
Message-ID: <49D39C13.9090304@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> def averager():
>>    count = 0
>>    sum = 0
>>    while 1:
>>        try:            val = (yield)
>>        except GeneratorExit:
>>            return sum/count
>>        else:
>>            sum += val
>>            count += 1
>>
>> avg = averager()
>> avg.next() # start coroutine
>> avg.send(1.0)
>> avg.send(2.0)
>> print avg.close()  # prints 1.5
>>     
>
> But that's not how it works, unless you're asking Greg to change the PEP
> to allow that. 
I am most definitely asking Greg to change the PEP to allow that.  
Specifically I am asking for a clarification in the PEP of how  
GeneratorReturn/StopIteration is handled in close(), and requesting that 
we define close() to return the value rather than letting the 
GeneratorReturn be raised.

> And while it looks cute a single layer deep like that, it
> goes wrong as soon as you consider the fact that if you get a
> GeneratorReturn exception on close(), you *don't know* if that result
> came from the outer iterator.
>   

Using option #4 from the list I made of possible finalization strategies 
which is what most of you seemed to prefer, and assuming that close 
catches GeneratorReturn/StopIteration you *can* be sure.  There is no 
way it could come from anywhere else in the yield-from stack.  Of course 
you can raise the exception manually or call a function that does, but 
that is crazy code...

> A better way to write that averager would be:
>
> def averager():
>    # Works for Python 2.5+
>    count = 0
>    sum = 0
>    while 1:
>        val = (yield)
>        if val is None:
>            yield sum/count
>            break
>        sum += val
>        count += 1
>
>   
>>>> avg = averager()
>>>> avg.next() # start coroutine
>>>> avg.send(1.0)
>>>> avg.send(2.0)
>>>> print avg.send(None)
>>>>         
> 1.5
>
>   
Yes, I am aware that you can pass special values to send.   I find this 
version less appealing than mine for at least the following reasons:

   1. You need to use a magic "stop" value (in this case None).
   2. You are using the same "send" method for two radically different
      purposes on the same object.
   3. You need a separate "close" step to clean up afterwards (which you
      forgot).
   4. You use "yield" for different purposes at different times (mostly
      input, then a single output).
   5. I find the control flow in mine simpler to understand.  It
      explicitly mentions GeneratorExit, and immediately returns.  Yours
      must check for the magic "stop" value, yield a result, then
      break/return.

#1,2,3 makes the API of the averager object more complex than it needs 
to be.  #4 is generally considered ugly, but is sometimes necessary.  #5 
is just a personal preference.
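
For comparison, Nick's sentinel-based version can be run as follows (transcribed into Python 3 syntax, with `sum` renamed to `total` to avoid shadowing the builtin; the final close() is the separate cleanup step referred to in point 3):

```python
def averager():
    # Sentinel-based protocol: send(None) requests the result.
    count = 0
    total = 0
    while True:
        val = yield
        if val is None:
            yield total / count
            break
        total += val
        count += 1

avg = averager()
next(avg)                 # start coroutine
avg.send(1.0)
avg.send(2.0)
result = avg.send(None)   # the sentinel triggers the final yield
avg.close()               # the separate cleanup step (point 3)
```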

Best regards

- jacob



From greg.ewing at canterbury.ac.nz  Wed Apr  1 23:30:45 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 02 Apr 2009 09:30:45 +1200
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D34FDC.5050106@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49C81A45.1070803@canterbury.ac.nz> <49C94CF6.5070301@gmail.com>
	<49C9D162.5040907@canterbury.ac.nz> <49CA20F2.7040207@gmail.com>
	<49CA4029.6050703@improva.dk> <49CABFC6.1080207@canterbury.ac.nz>
	<49CAC0FE.5010305@improva.dk> <49CACB39.3020708@canterbury.ac.nz>
	<49CAD15D.2090008@improva.dk> <49CB155E.4040504@canterbury.ac.nz>
	<49CB8E4A.3050108@improva.dk> <49CC5D85.30409@canterbury.ac.nz>
	<49CE29BF.3040502@improva.dk> <49CEB8DE.8060200@gmail.com>
	<49CEBCD5.7020107@canterbury.ac.nz> <49CF6AAF.70109@improva.dk>
	<49D05C8F.3040800@canterbury.ac.nz> <49D0A324.1030701@gmail.com>
	<49D143B1.9040009@canterbury.ac.nz> <49D1542E.7070503@improva.dk>
	<49D16294.9030205@canterbury.ac.nz> <49D17247.20705@improva.dk>
	<49D18D31.9000008@canterbury.ac.nz> <49D1E5E6.5000007@improva.dk>
	<49D207BE.8090909@gmail.com> <49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz> <49D2C735.8020803@improva.dk>
	<49D34FDC.5050106@gmail.com>
Message-ID: <49D3DD05.7080506@canterbury.ac.nz>

Nick Coghlan wrote:

> Any code that catches GeneratorExit without reraising it is highly
> suspect, just like code that suppresses SystemExit and KeyboardInterrupt.

As another perspective on this, I think Jacob's
example is another case of bogus refactoring.

If you think about it from the refactoring direction,
you start with something that catches GeneratorExit,
does some cleanup, and returns. That's fine.

But then you try to chop out just the part that
catches the GeneratorExit, without doing anything
to ensure that the main generator still returns
afterwards.

This is analogous to taking a block of code containing
a 'return' out of an ordinary function and putting it
in another function. If that's all you do, you can't
expect it to have the same result, because it only
returns from the inner function, not the outer one.
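
The ordinary-function analogue is easy to demonstrate (a trivial illustration, not from the thread):

```python
def inner():
    return "from inner"   # returns only from inner

def outer():
    inner()               # the inner return does not end outer
    return "outer still ran"

assert outer() == "outer still ran"
```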

To correctly refactor Jacob's example, you need to
maintain an 'except GeneratorExit' in the main
generator somehow. Like 'return', it's not something
you can freely move across a refactoring boundary.

-- 
Greg


From jimjjewett at gmail.com  Thu Apr  2 01:21:13 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Wed, 1 Apr 2009 19:21:13 -0400
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D353C6.3040509@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49CC5D85.30409@canterbury.ac.nz> <49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz>
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
	<49D317D5.6080705@canterbury.ac.nz> <49D353C6.3040509@gmail.com>
Message-ID: <fb6fbf560904011621m78a21f87je322d7e7c3da9317@mail.gmail.com>

On 4/1/09, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Greg Ewing wrote:
>> I've just thought of another possible alternative
>> name for yield-from:

>>   y = gcall f(x)

> However, you would lose the common mnemonic with yield for both turning
> the current function into a generator and indicating to the reader that
> the current frame may get suspended at a particular point.

If the "gencall" exhausts the generator f (as "yield from" normally
does), then the current frame shouldn't be suspended any more than it
would be for an ordinary function call.  If the "return value" of the
generator really is important (and the intermediate values are
discarded), then this probably is the right syntax.   (Whether that
case is worth syntax is a matter of taste, but it does seem to be a
case Greg is trying to support.)

-jJ


From greg.ewing at canterbury.ac.nz  Thu Apr  2 07:36:05 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 02 Apr 2009 17:36:05 +1200
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CB8E4A.3050108@improva.dk>
	<49CC5D85.30409@canterbury.ac.nz> <49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz>
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
	<49D317D5.6080705@canterbury.ac.nz>
	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com>
Message-ID: <49D44EC5.30501@canterbury.ac.nz>

Guido van Rossum wrote:

>> y = gcall f(x)
> 
> Nice April Fool. :-)

Actually, it wasn't meant to be -- it was a serious
suggestion (it's not 1 April any more where I am).

I suppose I'll have to post it again tomorrow
before you'll believe that, though...

-- 
Greg


From ncoghlan at gmail.com  Thu Apr  2 11:51:13 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 02 Apr 2009 19:51:13 +1000
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <fb6fbf560904011621m78a21f87je322d7e7c3da9317@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>	
	<49CC5D85.30409@canterbury.ac.nz>
	<49CE29BF.3040502@improva.dk>	 <49CEB8DE.8060200@gmail.com>
	<49CEBCD5.7020107@canterbury.ac.nz>	 <49CF6AAF.70109@improva.dk>
	<49D05C8F.3040800@canterbury.ac.nz>	
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>	
	<49D317D5.6080705@canterbury.ac.nz> <49D353C6.3040509@gmail.com>
	<fb6fbf560904011621m78a21f87je322d7e7c3da9317@mail.gmail.com>
Message-ID: <49D48A91.9050206@gmail.com>

Jim Jewett wrote:
> On 4/1/09, Nick Coghlan <ncoghlan at gmail.com> wrote:
>> Greg Ewing wrote:
>>> I've just thought of another possible alternative
>>> name for yield-from:
> 
>>>   y = gcall f(x)
> 
>> However, you would lose the common mnemonic with yield for both turning
>> the current function into a generator and indicating to the reader that
>> the current frame may get suspended at a particular point.
> 
> If the "gencall" exhausts the generator f (as "yield from" normally
> does), then the current frame shouldn't be suspended any more than it
> would be for an ordinary function call.  If the "return value" of the
> generator really is important (and the intermediate values are
> discarded), then this probably is the right syntax.   (Whether that
> case is worth syntax is a matter of taste, but it does seem to be a
> case Greg is trying to support.)

The intermediate values aren't necessarily discarded by "yield from"
though: they're passed out to whoever is consuming the values yielded by
the outermost generator.
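
With the syntax eventually adopted in Python 3.3 (PEP 380), this pass-through is easy to see: the intermediate values reach the outer consumer, while the return value binds inside the delegating generator.

```python
def inner():
    yield 1
    yield 2
    return "inner result"    # bound inside outer, not yielded

def outer():
    result = yield from inner()   # 1 and 2 pass straight through
    yield result

assert list(outer()) == [1, 2, "inner result"]
```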

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From greg.ewing at canterbury.ac.nz  Thu Apr  2 13:28:22 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 02 Apr 2009 23:28:22 +1200
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D36FD0.3080602@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz> <49C94CF6.5070301@gmail.com>
	<49C9D162.5040907@canterbury.ac.nz> <49CA20F2.7040207@gmail.com>
	<49CA4029.6050703@improva.dk> <49CABFC6.1080207@canterbury.ac.nz>
	<49CAC0FE.5010305@improva.dk> <49CACB39.3020708@canterbury.ac.nz>
	<49CAD15D.2090008@improva.dk> <49CB155E.4040504@canterbury.ac.nz>
	<49CB8E4A.3050108@improva.dk> <49CC5D85.30409@canterbury.ac.nz>
	<49CE29BF.3040502@improva.dk> <49CEB8DE.8060200@gmail.com>
	<49CEBCD5.7020107@canterbury.ac.nz> <49CF6AAF.70109@improva.dk>
	<49D05C8F.3040800@canterbury.ac.nz> <49D0A324.1030701@gmail.com>
	<49D143B1.9040009@canterbury.ac.nz> <49D1542E.7070503@improva.dk>
	<49D16294.9030205@canterbury.ac.nz> <49D17247.20705@improva.dk>
	<49D18D31.9000008@canterbury.ac.nz> <49D1E5E6.5000007@improva.dk>
	<49D207BE.8090909@gmail.com> <49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz> <49D2C735.8020803@improva.dk>
	<49D34FDC.5050106@gmail.com> <49D36FD0.3080602@improva.dk>
Message-ID: <49D4A156.9080304@canterbury.ac.nz>

I've had another idea about this. Suppose the close()
method of a generator didn't complain about reaching
a yield after GeneratorExit is raised, but simply
raised it again, and continued doing so until either
a return occurred or an exception propagated out.

Seems to me this couldn't do any harm to a well-
behaved generator, since it has to be prepared to
deal with a GeneratorExit arising from any of its
yield points.

Yield-from would then no longer have the potential
to create broken generators, we wouldn't have to treat
GeneratorExit differently from any other exception,
and Jacob could have his subgenerators that return
values when you close them.
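
A rough sketch of the proposed semantics as a stand-alone function (an approximation only, not the real generator.close()):

```python
def close_repeating(gen):
    # Keep throwing GeneratorExit into the generator until it either
    # returns (StopIteration) or lets the exception out (GeneratorExit).
    while True:
        try:
            gen.throw(GeneratorExit)
        except (GeneratorExit, StopIteration):
            return

def reluctant():
    # Swallows GeneratorExit twice before finally returning.
    caught = 0
    while True:
        try:
            yield
        except GeneratorExit:
            caught += 1
            if caught == 3:
                return

g = reluctant()
next(g)
close_repeating(g)   # three throws, then the generator returns
```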

-- 
Greg




From ncoghlan at gmail.com  Thu Apr  2 13:50:07 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 02 Apr 2009 21:50:07 +1000
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D4A156.9080304@canterbury.ac.nz>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49C9D162.5040907@canterbury.ac.nz>
	<49CA20F2.7040207@gmail.com>	<49CA4029.6050703@improva.dk>
	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>
	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>
	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>
	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>
	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>
	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>
	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>
	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>
	<49D255BC.6080503@improva.dk>	<49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk>	<49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk> <49D4A156.9080304@canterbury.ac.nz>
Message-ID: <49D4A66F.9060900@gmail.com>

Greg Ewing wrote:
> I've had another idea about this. Suppose the close()
> method of a generator didn't complain about reaching
> a yield after GeneratorExit is raised, but simply
> raised it again, and continued doing so until either
> a return occurred or an exception propagated out.
> 
> Seems to me this couldn't do any harm to a well-
> behaved generator, since it has to be prepared to
> deal with a GeneratorExit arising from any of its
> yield points.
> 
> Yield-from would then no longer have the potential
> to create broken generators, we wouldn't have to treat
> GeneratorExit differently from any other exception,
> and Jacob could have his subgenerators that return
> values when you close them.

I think I'd prefer to see some arbitrary limit (500 seems like a nice
round number) on the number of times that GeneratorExit would be thrown
before giving up and raising RuntimeError, just so truly broken
generators that suppressed GeneratorExit in an infinite loop would
eventually trigger an exception rather than just appearing to hang.
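
That bounded variant can be sketched the same way (again just an approximation of the suggested behaviour, with a small limit for demonstration):

```python
def close_bounded(gen, limit=500):
    # Re-throw GeneratorExit up to `limit` times before giving up.
    for _ in range(limit):
        try:
            gen.throw(GeneratorExit)
        except (GeneratorExit, StopIteration):
            return
    raise RuntimeError("generator ignored GeneratorExit")

def hostile():
    # A truly broken generator that never stops swallowing GeneratorExit.
    while True:
        try:
            yield
        except GeneratorExit:
            pass

g = hostile()
next(g)
try:
    close_bounded(g, limit=5)
except RuntimeError:
    pass   # raised instead of appearing to hang
```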

The basic idea seems sound though (Jacob's averager example really was
nicer than mine).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Thu Apr  2 16:23:03 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 02 Apr 2009 16:23:03 +0200
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D4A66F.9060900@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>	<49CA20F2.7040207@gmail.com>	<49CA4029.6050703@improva.dk>	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>	<49D255BC.6080503@improva.dk>	<49D2A759.5080204@canterbury.ac.nz>	<49D2C735.8020803@improva.dk>	<49D34FDC.5050106@gmail.com>	<49D36FD0.3080602@improva.dk>
	<49D4A156.9080304@canterbury.ac.nz> <49D4A66F.9060900@gmail.com>
Message-ID: <49D4CA47.1020003@improva.dk>

Nick Coghlan wrote:
> Greg Ewing wrote:
>   
>> I've had another idea about this. Suppose the close()
>> method of a generator didn't complain about reaching
>> a yield after GeneratorExit is raised, but simply
>> raised it again, and continued doing so until either
>> a return occurred or an exception propagated out.
>>
>> Seems to me this couldn't do any harm to a well-
>> behaved generator, since it has to be prepared to
>> deal with a GeneratorExit arising from any of its
>> yield points.
>>     

It solves the return value thing, but introduces a change for existing 
generators. Well-behaved generators would not be affected, but there 
might be generators in real use that relied on the ability to ignore 
close or code using such generators that relied on getting the RuntimeError.

<sidetrack>
If there is a use case for ignoring close, that would be better served 
by another new idea I just had, the "yield raise" expression. The 
purpose of this would be to raise an exception in the caller of "next", 
"send", "throw" or "close" *without* finalizing the generator. Extending 
my "averager" example a bit:

def averager(start=0):
   count = 0
   exc = None
   sum = start
   while 1:
       try: 
           val = (yield) if exc is None else (yield raise exc)
       except GeneratorExit:
           return sum/count
       try:
           sum += val
       except BaseException as e:
           exc = e # will be reraised by "yield raise" above
       else:
           exc = None
           count += 1

avg = averager()
avg.next() # start coroutine
avg.send(1.0)
try:
    avg.send('') # raises a TypeError at the "sum += val" line,
                 # which is rerouted here by the "yield raise"
except TypeError:
    pass
avg.send(2.0)
print avg.close() # still prints 1.5


The above code would be the main use for the feature. However, a side 
benefit would be that a generator that wanted to raise an exception 
instead of closing could use a "yield raise OtherException" as response 
to GeneratorExit. I am not saying we should add the "yield raise" 
feature to the PEP, just that I think this would be a better way to 
handle the "don't close me" cases.
(I am not sure how it would fit into the PEP anyway)
</sidetrack>

> Greg Ewing wrote:
>   
>> Yield-from would then no longer have the potential
>> to create broken generators, we wouldn't have to treat
>> GeneratorExit differently from any other exception,
>> and Jacob could have his subgenerators that return
>> values when you close them.
>>     

Only true because you have redefined it so that no generators are 
broken. If I understand you correctly, you are arguing that this change 
lets us throw GeneratorExit to the subiterator without trying to reraise 
it (my #2 from several mails back). That is clearly a plus in my book 
because it adheres to the inlining principle, but I don't think you need 
the loop in close for it to be better.


Nick Coghlan wrote:
> I think I'd prefer to see some arbitrary limit (500 seems like a nice
> round number) on the number of times that GeneratorExit would be thrown
> before giving up and raising RuntimeError, just so truly broken
> generators that suppressed GeneratorExit in an infinite loop would
> eventually trigger an exception rather than just appearing to hang.
>   

Right. The possibility of turning a call that used to raise a 
RuntimeError into an infinite loop bothers me a bit. I also don't really 
see the use for it. GeneratorExit is an unambiguous signal to close, so 
I would expect the generator to handle it by closing (possibly with a 
final return value), or by raising an exception. Not doing so *should* 
be an error. There have been requests for a function that loops over the 
generator and returns the final result, but this version of close 
doesn't fit that use case because it uses throw(GeneratorExit) instead 
of next().

> The basic idea seems sound though (Jacob's averager example really was
> nicer than mine).
>   

Thank you Nick. I am glad you think so.

To summarize, I am only +0.75 on this proposal. I think it would be 
better not to loop, still return the final value from close, and still 
just throw GeneratorExit to subiterators without trying to reraise.


Cheers

- Jacob


From dangyogi at gmail.com  Thu Apr  2 16:57:16 2009
From: dangyogi at gmail.com (Bruce Frederiksen)
Date: Thu, 02 Apr 2009 10:57:16 -0400
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D4CA47.1020003@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz>	<49CA4029.6050703@improva.dk>	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>	<49D255BC.6080503@improva.dk>	<49D2A759.5080204@canterbury.ac.nz>	<49D2C735.8020803@improva.dk>	<49D34FDC.5050106@gmail.com>	<49D36FD0.3080602@improva.dk>	<49D4A156.9080304@canterbury.ac.nz>
	<49D4A66F.9060900@gmail.com> <49D4CA47.1020003@improva.dk>
Message-ID: <49D4D24C.6070005@gmail.com>

Jacob Holm wrote:
> I think it would be better not to loop, still return the final value 
> from close, and still just throw GeneratorExit to subiterators without 
> trying to reraise.
This sounds better to me too, except for the last part -- not reraising 
GeneratorExit.

If you re-define close to return the value attached to StopIteration, 
then I think that it makes sense to define it to continue to return this 
value on subsequent calls to close.  This provides a way to still 
retrieve the returned value after the generator has been finalized in 
some other way.

And then, wouldn't this allow you to discard the StopIteration in yield 
from and reraise GeneratorExit to finalize the outer generator; but 
leaving it the option to call close itself on the inner generator to 
retrieve the return value, if it still wants it?

-bruce frederiksen


From jh at improva.dk  Thu Apr  2 18:26:03 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 02 Apr 2009 18:26:03 +0200
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D4D24C.6070005@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>	<49D255BC.6080503@improva.dk>	<49D2A759.5080204@canterbury.ac.nz>	<49D2C735.8020803@improva.dk>	<49D34FDC.5050106@gmail.com>	<49D36FD0.3080602@improva.dk>	<49D4A156.9080304@canterbury.ac.nz>
	<49D4A66F.9060900@gmail.com> <49D4CA47.1020003@improva.dk>
	<49D4D24C.6070005@gmail.com>
Message-ID: <49D4E71B.1050807@improva.dk>

Bruce Frederiksen wrote:
> Jacob Holm wrote:
>> I think it would be better not to loop, still return the final value 
>> from close, and still just throw GeneratorExit to subiterators 
>> without trying to reraise.
> This sounds better to me too, except for the last part -- not 
> reraising GeneratorExit.
>
> If you re-define close to return the value attached to StopIteration, 
> then I think that it makes sense to define it to continue to return 
> this value on subsequent calls to close.  This provides a way to still 
> retrieve the returned value after the generator has been finalized in 
> some other way.

If we want close to return the value multiple times we need to store it 
somewhere.  If we are storing it, we might as well do it as a direct 
result of the return statement instead of pulling it out of the 
StopIteration.  That way we could drop the idea of a GeneratorReturn 
exception and just always call close on the subiterator to get the 
value.  If we are calling close anyway, we
don't need to pass the GeneratorExit to the subiterator ourselves, as 
close will do it for us.  But if we don't pass it to the subiterator, we 
need to (re)raise it in the yield from.  That makes this a slight 
variation of my #4, which IIRC was a common preference between you and 
Nick (and probably Greg as well).

>
> And then, wouldn't this allow you to discard the StopIteration in 
> yield from and reraise GeneratorExit to finalize the outer generator; 
> but leaving it the option to call close itself on the inner generator 
> to retrieve the return value, if it still wants it?

Yes it would.  This addresses the issue I had about not being able to 
retrieve the return value after yield-from throws GeneratorExit.  At the 
moment, I can't find any other issues with this version.  The only 
slight drawback is that it has to save the returned value, potentially 
keeping it alive longer than necessary.  This is more than compensated 
for by making things simpler and allowing more uses, such as using the 
generator in a for-loop and accessing the return value afterwards.

So +1 to this.

- Jacob


From guido at python.org  Thu Apr  2 18:37:50 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 2 Apr 2009 09:37:50 -0700
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D44EC5.30501@canterbury.ac.nz>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CE29BF.3040502@improva.dk> 
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz> 
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz> 
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com> 
	<49D317D5.6080705@canterbury.ac.nz>
	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com> 
	<49D44EC5.30501@canterbury.ac.nz>
Message-ID: <ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com>

On Wed, Apr 1, 2009 at 10:36 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Guido van Rossum wrote:
>
>>> y = gcall f(x)
>>
>> Nice April Fool. :-)
>
> Actually, it wasn't meant to be -- it was a serious
> suggestion (it's not 1 April any more where I am).
>
> I suppose I'll have to post it again tomorrow
> before you'll believe that, though...

Well, no matter what, I think it's a bad name. Let's stick with 'yield
from'. I'm also returning to the view that return from a generator
used as a task (in Dave Beazley's terms) should be spelled differently
than a plain return. Phillip Eby's 'return from yield with' may be a
little long, but how about 'return from'?

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From guido at python.org  Thu Apr  2 19:37:27 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 2 Apr 2009 10:37:27 -0700
Subject: [Python-ideas] [Python-Dev] OSError.errno => exception
	hierarchy?
In-Reply-To: <a467ca4f0904020912g7d4056c5p955198980d9e4218@mail.gmail.com>
References: <a467ca4f0904020525taabdeb8gd75ce8f73b418d66@mail.gmail.com> 
	<ca471dc20904020757o57654981o2170a54f201d5031@mail.gmail.com> 
	<a467ca4f0904020912g7d4056c5p955198980d9e4218@mail.gmail.com>
Message-ID: <ca471dc20904021037r5e1f2fefn7ce83caadf505e34@mail.gmail.com>

[Moving to python-ideas]

On Thu, Apr 2, 2009 at 9:12 AM, Gustavo Carneiro <gjcarneiro at gmail.com> wrote:
>
>
> 2009/4/2 Guido van Rossum <guido at python.org>
>>
>> On 4/2/09, Gustavo Carneiro <gjcarneiro at gmail.com> wrote:
>> > Apologies if this has already been discussed.
>> >
>> > I was expecting that by now, python 3.0,

Note, these words are pretty offensive (or perhaps
passive-aggressive). How would you respond if some Perl hacker said "I
was expecting that by now, Python 3.0, Python would have dropped the
whitespace bug."

>> > the following code:
>> >
>> >             # clean the target dir
>> >             import errno
>> >             try:
>> >                 shutil.rmtree(trace_output_path)
>> >             except OSError, ex:
>> >                 if ex.errno not in [errno.ENOENT]:
>> >                     raise
>> >
>> > Would have become something simpler, like this:
>> >
>> >             # clean the target dir
>> >             try:
>> >                 shutil.rmtree(trace_output_path)
>> >             except OSErrorNoEntry:       # or maybe os.ErrorNoEntry
>> >                 pass
>> >
>> > Apparently no one has bothered yet

Again, offensive words -- makes you sound like you are so much smarter than us.

>> > to turn OSError + errno into a hierarchy
>> > of OSError subclasses, as it should. ?What's the problem, no will to do
>> > it or no manpower?

Again poor choice of words. Note the leading question: you don't even
consider the possibility that it's a bad idea. Compare "When did you
stop beating your wife?"

>> Sounds like a bad idea. There are hundreds of errno values. I don't
>> want to have hundreds of new exceptions -- especially not since not
>> all are defined on each platform.
>
> We already have the hundreds of errno values defined in the errno module.
> It is just a matter of turning the integers that we already have into
> exception subclasses of OSError.? My idea would be to only create exceptions
> for the errors listed in the module 'errno', and use a generic OSError for
> the rest of them.

This would cause more platform problems than we already have. Consider
an errno (EWIN, mapped to OSWinError in your proposal) that exists
only on Windows, and another (ELIN, OSLinError) that exists only on
Linux. Now suppose you have to catch both of them. If you write your
code like this:

  try:
    ...
  except OSWinError:
    ...
  except OSLinError:
    ...

then on Linux, if OSLinError is raised, the "except OSWinError:"
clause will raise a NameError because that exception isn't defined,
and you'll never reach the "except OSLinError:" clause. If you reverse
the clauses you have the same problem on Windows.

While you would have the same problem if you tried to do something
like "if e.errno == errno.EWIN:" on Linux, it's easier to circumvent
-- one of the many ways to do so is to write "if
errno.errorcode[e.errno] == 'EWIN':" instead.
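[A small runnable sketch of the workaround described above. EWIN is not a
real errno name; it stands in for any symbol that may be missing on a
given platform.]

```python
import errno

def errno_matches(exc, name):
    # Look the symbolic name up at runtime instead of referencing a
    # module attribute that may not exist on this platform, so a
    # missing name yields False rather than a NameError.
    code = getattr(errno, name, None)
    return code is not None and exc.errno == code

e = OSError(errno.ENOENT, "no such file or directory")
print(errno_matches(e, "ENOENT"))  # True
print(errno_matches(e, "EWIN"))    # False: undefined name, no NameError
```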

> Compatibility could be preserved if the exceptions were made subclasses of
> OSError, so it would be trivial to catch any kind of OSError generically.
>
> PS: private email: accidental or intentional?

Accidental (my phone's experimental gmail client is missing a
reply-all). I'm adding python-ideas since that's where it belongs.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From dangyogi at gmail.com  Thu Apr  2 19:39:45 2009
From: dangyogi at gmail.com (Bruce Frederiksen)
Date: Thu, 02 Apr 2009 13:39:45 -0400
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com>
	<49CEBCD5.7020107@canterbury.ac.nz> <49CF6AAF.70109@improva.dk>
	<49D05C8F.3040800@canterbury.ac.nz>
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
	<49D317D5.6080705@canterbury.ac.nz>	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com>
	<49D44EC5.30501@canterbury.ac.nz>
	<ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com>
Message-ID: <49D4F861.9040407@gmail.com>

Guido van Rossum wrote:
>  but how about 'return from'?
>   
or 'return finally'?(??) ...

-bruce frederiksen


From guido at python.org  Thu Apr  2 19:44:36 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 2 Apr 2009 10:44:36 -0700
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D4A156.9080304@canterbury.ac.nz>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49D18D31.9000008@canterbury.ac.nz> 
	<49D1E5E6.5000007@improva.dk> <49D207BE.8090909@gmail.com> 
	<49D255BC.6080503@improva.dk> <49D2A759.5080204@canterbury.ac.nz> 
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com> 
	<49D36FD0.3080602@improva.dk> <49D4A156.9080304@canterbury.ac.nz>
Message-ID: <ca471dc20904021044q3765adc8g827e4b9015c7ae9c@mail.gmail.com>

On Thu, Apr 2, 2009 at 4:28 AM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> I've had another idea about this. Suppose the close()
> method of a generator didn't complain about reaching
> a yield after GeneratorExit is raised, but simply
> raised it again, and continued doing so until either
> a return occured or an exception propagated out.
>
> Seems to me this couldn't do any harm to a well-
> behaved generator, since it has to be prepared to
> deal with a GeneratorExit arising from any of its
> yield points.

The feature doesn't exist for the benefit of well-behaved generators.
It exists to help people who don't understand generators well enough
yet to only write well-behaved ones. This is an important goal to me
-- generators are a complex enough topic that I prefer to be on the
strict side rather than giving weird code a random meaning.

> Yield-from would then no longer have the potential
> to create broken generators, we wouldn't have to treat
> GeneratorExit differently from any other exception,
> and Jacob could have his subgenerators that return
> values when you close them.

I need a longer description of the problems that you are trying to
solve here -- I haven't been able to catch up with all the threads.
How would yield-from create a broken generator? (As opposed to all the
ways that allowing GeneratorExit to be ignored allows creating broken
generators.) Is there an example shorter than a page that shows the
usefulness of subgenerators returning values when closed?

Please, please, please, we need to stop the bikeshedding and scope
expansion, and start converging to a *simpler* proposal.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From jh at improva.dk  Thu Apr  2 20:37:32 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 02 Apr 2009 20:37:32 +0200
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D4F861.9040407@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>	<49D317D5.6080705@canterbury.ac.nz>	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com>	<49D44EC5.30501@canterbury.ac.nz>	<ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com>
	<49D4F861.9040407@gmail.com>
Message-ID: <49D505EC.8010305@improva.dk>

Bruce Frederiksen wrote:
> Guido van Rossum wrote:
>> but how about 'return from'?
> or 'return finally'?(??) ...
>
Or what about "yield return"? That clearly marks the construct as 
belonging in a generator. It also mixes well with the idea of a "yield 
raise" that I mentioned in another mail (not a suggestion for this PEP).

- Jacob


From guido at python.org  Thu Apr  2 20:44:57 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 2 Apr 2009 11:44:57 -0700
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D505EC.8010305@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CF6AAF.70109@improva.dk> 
	<49D05C8F.3040800@canterbury.ac.nz>
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com> 
	<49D317D5.6080705@canterbury.ac.nz>
	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com> 
	<49D44EC5.30501@canterbury.ac.nz>
	<ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com> 
	<49D4F861.9040407@gmail.com> <49D505EC.8010305@improva.dk>
Message-ID: <ca471dc20904021144gc85aafbk2abb25dc03afb106@mail.gmail.com>

On Thu, Apr 2, 2009 at 11:37 AM, Jacob Holm <jh at improva.dk> wrote:
> Bruce Frederiksen wrote:
>>
>> Guido van Rossum wrote:
>>>
>>> but how about 'return from'?
>>
>> or 'return finally'?(??) ...
>>
> Or what about "yield return"? That clearly marks the construct as belonging
> in a generator. It also mixes well with the idea of a "yield raise" that I
> mentioned in another mail (not a suggestion for this PEP).

Not totally weird. After all Dave Beazley's trampoline uses "yield
g()" to call a sub-generator and "yield x" to return a value x from a
sub-generator to the calling generator via the trampoline's stack...

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From dangyogi at gmail.com  Thu Apr  2 21:35:43 2009
From: dangyogi at gmail.com (Bruce Frederiksen)
Date: Thu, 02 Apr 2009 15:35:43 -0400
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D505EC.8010305@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>	<49D317D5.6080705@canterbury.ac.nz>	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com>	<49D44EC5.30501@canterbury.ac.nz>	<ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com>
	<49D4F861.9040407@gmail.com> <49D505EC.8010305@improva.dk>
Message-ID: <49D5138F.4070703@gmail.com>

Jacob Holm wrote:
> Bruce Frederiksen wrote:
>> Guido van Rossum wrote:
>>> but how about 'return from'?
>> or 'return finally'?(??) ...
>>
> Or what about "yield return"? That clearly marks the construct as 
> belonging in a generator. It also mixes well with the idea of a "yield 
> raise" that I mentioned in another mail (not a suggestion for this PEP).
Another strange one: 'close with X'.

This hinges on the 'close' method returning X and also that this could 
be done syntactically *without* making 'close' a reserved word by 
relying on 'with' already being a reserved word with very limited usage 
in the grammar:

    generator_return_stmt: NAME 'with' testlist

And then verify later that NAME is 'close' (or raise SyntaxError).

I'm not familiar enough with Python's parser to know whether it could
handle this or not (LL vs. LR)...

-bruce frederiksen


From gjcarneiro at gmail.com  Thu Apr  2 22:10:55 2009
From: gjcarneiro at gmail.com (Gustavo Carneiro)
Date: Thu, 2 Apr 2009 21:10:55 +0100
Subject: [Python-ideas] [Python-Dev] OSError.errno => exception
	hierarchy?
In-Reply-To: <ca471dc20904021037r5e1f2fefn7ce83caadf505e34@mail.gmail.com>
References: <a467ca4f0904020525taabdeb8gd75ce8f73b418d66@mail.gmail.com>
	<ca471dc20904020757o57654981o2170a54f201d5031@mail.gmail.com>
	<a467ca4f0904020912g7d4056c5p955198980d9e4218@mail.gmail.com>
	<ca471dc20904021037r5e1f2fefn7ce83caadf505e34@mail.gmail.com>
Message-ID: <a467ca4f0904021310u32bc492bx3964fe299ac44a10@mail.gmail.com>

2009/4/2 Guido van Rossum <guido at python.org>

> [Moving to python-ideas]
>
> On Thu, Apr 2, 2009 at 9:12 AM, Gustavo Carneiro <gjcarneiro at gmail.com>
> wrote:
> >
> >
> > 2009/4/2 Guido van Rossum <guido at python.org>
> >>
> >> On 4/2/09, Gustavo Carneiro <gjcarneiro at gmail.com> wrote:
> >> > Apologies if this has already been discussed.
> >> >
> >> > I was expecting that by now, python 3.0,
>
> Note, these words are pretty offensive (or perhaps
> passive-aggressive). How would you respond if some Perl hacker said "I
> was expecting that by now, Python 3.0, Python would have dropped the
> whitespace bug."


Sorry.  I suck at this.  I think I'm too impatient (read: time constrained)
to be polite :P


>
> >> > the following code:
> >> >
> >> >             # clean the target dir
> >> >             import errno
> >> >             try:
> >> >                 shutil.rmtree(trace_output_path)
> >> >             except OSError, ex:
> >> >                 if ex.errno not in [errno.ENOENT]:
> >> >                     raise
> >> >
> >> > Would have become something simpler, like this:
> >> >
> >> >             # clean the target dir
> >> >             try:
> >> >                 shutil.rmtree(trace_output_path)
> >> >             except OSErrorNoEntry:       # or maybe os.ErrorNoEntry
> >> >                 pass
> >> >
> >> > Apparently no one has bothered yet
>
> Again, offensive words -- makes you sound like you are so much smarter than
> us.
>
> >> > to turn OSError + errno into a hierarchy
> >> > of OSError subclasses, as it should.  What's the problem, no will to
> do
> >> > it or no manpower?
>
> Again poor choice of words. Note the leading question: you don't even
> consider the possibility that it's a bad idea. Compare "When did you
> stop beating your wife?"


Maybe I didn't express myself very well.  By "no will to do it" I meant "no
interest to do it".  If you thought it was a bad idea you would have no
interest in doing it.

To be frank, what I thought most likely was that, with all the refactoring
going on in Python 3, this issue, which was an obvious cleanup to me (but
now I realize it is not obvious at all to everyone else), was overlooked
because there were bigger problems to solve.

So there are usually only three reasons why something is not done:

  1- It's a bad idea;
  2- No one thought of it;
  3- It's a good and known idea, but there was no manpower to do it.

I was betting more on 2 and 3, but not entirely ruling out 1.


>
> >> Sounds like a bad idea. There are hundreds of errno values. I don't
> >> want to have hundreds of new exceptions -- especially not since not
> >> all are defined on each platform.
> >
> > We already have the hundreds of errno values defined in the errno module.
> > It is just a matter of turning the integers that we already have into
> > exception subclasses of OSError.  My idea would be to only create
> exceptions
> > for the errors listed in the module 'errno', and use a generic OSError
> for
> > the rest of them.
>
> This would cause more platform problems than we already have. Consider
> an errno (EWIN, mapped to OSWinError in your proposal) that exists
> only on Windows, and another (ELIN, OSLinError) that exists only on
> Linux. Now suppose you have to catch both of them. If you write your
> code like this:
>
>  try:
>    ...
>  except OSWinError:
>    ...
>  except OSLinError:
>    ...
>
> then on Linux, if OSLinError is raised, the "except OSWinError:"
> clause will raise a NameError because that exception isn't defined,
> and you'll never reach the "except OSLinError:" clause. If you reverse
> the clauses you have the same problem on Windows.
>
> While you would have the same problem if you tried to do something
> like "if e.errno == errno.EWIN:" on Linux, it's easier to circumvent
> -- one of the many ways to do so is to write "if
> errno.errorcode[e.errno] == 'EWIN':" instead..
>

I just committed some code like "if e.errno == errno.EWIN".  I  had not
realized not all constants are defined on all systems, although it is
documented so I have no excuse.

The problem you report:

>  try:
>    ...
>  except OSWinError:
>    ...
>  except OSLinError:
>    ...
>
>
Would be solved if both OSWinError and OSLinError were always defined in
both Linux and Windows Python.  Programs could be written to catch both
OSWinError and OSLinError, except that on Linux OSWinError would never
actually be raised, and on Windows OSLinError would never occur.  Problem
solved.

The downsides of this?  I can only see memory, at the moment, but I might be
missing something.

Now just one final word why I think this matters.  The currently correct way
to remove a directory tree and only ignore the error "it does not exist" is:

try:
    shutil.rmtree("dirname")
except OSError, e:
    if errno.errorcode[e.errno] != 'ENOENT':
       raise

However, only very experienced programmers will know to write that correct
code (apparently I am not experienced enough!).

What I am proposing is that the simpler correct code would be something
like:

try:
    shutil.rmtree("dirname")
except OSNoEntryError:
    pass

Much simpler, no?

Right now, developers are tempted to write code like:

    shutil.rmtree("dirname", ignore_errors=True)

Or:

try:
    shutil.rmtree("dirname")
except OSError:
    pass

Both of which follow the error hiding anti-pattern [1].

[1] http://en.wikipedia.org/wiki/Error_hiding

Thanks for reading this far.

-- 
Gustavo J. A. M. Carneiro
INESC Porto, Telecommunications and Multimedia Unit
"The universe is always one step beyond logic." -- Frank Herbert
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090402/3f7389a4/attachment.html>

From guido at python.org  Thu Apr  2 22:37:47 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 2 Apr 2009 13:37:47 -0700
Subject: [Python-ideas] [Python-Dev] OSError.errno => exception
	hierarchy?
In-Reply-To: <a467ca4f0904021310u32bc492bx3964fe299ac44a10@mail.gmail.com>
References: <a467ca4f0904020525taabdeb8gd75ce8f73b418d66@mail.gmail.com> 
	<ca471dc20904020757o57654981o2170a54f201d5031@mail.gmail.com> 
	<a467ca4f0904020912g7d4056c5p955198980d9e4218@mail.gmail.com> 
	<ca471dc20904021037r5e1f2fefn7ce83caadf505e34@mail.gmail.com> 
	<a467ca4f0904021310u32bc492bx3964fe299ac44a10@mail.gmail.com>
Message-ID: <ca471dc20904021337i43a1e995sc0d62ab9a64ef99d@mail.gmail.com>

On Thu, Apr 2, 2009 at 1:10 PM, Gustavo Carneiro
<gjcarneiro at gmail.com> wrote:
> Sorry.  I suck at this.  I think I'm too impatient (read: time constrained)
> to be polite :P

OK, ditto. :)

>> This would cause more platform problems than we already have. Consider
>> an errno (EWIN, mapped to OSWinError in your proposal) that exists
>> only on Windows, and another (ELIN, OSLinError) that exists only on
>> Linux. Now suppose you have to catch both of them. If you write your
>> code like this:
>>
>>  try:
>>    ...
>>  except OSWinError:
>>    ...
>>  except OSLinError:
>>    ...
>>
>> then on Linux, if OSLinError is raised, the "except OSWinError:"
>> clause will raise a NameError because that exception isn't defined,
>> and you'll never reach the "except OSLinError:" clause. If you reverse
>> the clauses you have the same problem on Windows.
>>
>> While you would have the same problem if you tried to do something
>> like "if e.errno == errno.EWIN:" on Linux, it's easier to circumvent
>> -- one of the many ways to do so is to write "if
>> errno.errorcode[e.errno] == 'EWIN':" instead..
>
> I just committed some code like "if e.errno == errno.EWIN".  I had not
> realized not all constants are defined on all systems, although it is
> documented so I have no excuse.
>
> The problem you report:
>>
>>  try:
>>    ...
>>  except OSWinError:
>>    ...
>>  except OSLinError:
>>    ...
>>
>
> Would be solved if both OSWinError and OSLinError were always defined in
> both Linux and Windows Python.  Programs could be written to catch both
> OSWinError and OSLinError, except that on Linux OSWinError would never
> actually be raised, and on Windows OSLinError would never occur.  Problem
> solved.

Yeah, but now you'd have to generate the list of exceptions (which
would be enormously long) based on the union of all errno codes in the
universe.

Unless you only want to do it for some errno codes and not for others,
which sounds like asking for trouble.

Also you need a naming scheme that works for all errnos and doesn't
require manual work. Frankly, the only scheme that I can think of that
could be automated would be something like OSError_ENAME.

And, while OSError is built-in, I think these exceptions (because
there are so many) should not be built-in, and probably not even live
in the 'os' namespace -- the best place for them would be the errno
module, so errno.OSError_ENAME.
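[A rough sketch of how such a family of exceptions could be generated
mechanically from the errno module, using the OSError_ENAME naming scheme
above. This is purely illustrative; nothing like it exists in the stdlib.]

```python
import errno

# Build one OSError subclass per errno known on this platform.
errno_exceptions = {}
for code, name in errno.errorcode.items():
    errno_exceptions[code] = type('OSError_' + name, (OSError,), {})

def make_os_error(code, message):
    # Fall back to plain OSError for codes without a symbolic name.
    cls = errno_exceptions.get(code, OSError)
    return cls(code, message)

err = make_os_error(errno.ENOENT, "no such file or directory")
print(type(err).__name__)        # OSError_ENOENT
print(isinstance(err, OSError))  # True
```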

> The downsides of this?  I can only see memory, at the moment, but I might be
> missing something.

It's an enormous amount of work to make it happen across all
platforms. And it doesn't really solve an important problem.

> Now just one final word why I think this matters.  The currently correct way
> to remove a directory tree and only ignore the error "it does not exist" is:
>
> try:
>     shutil.rmtree("dirname")
> except OSError, e:
>     if errno.errorcode[e.errno] != 'ENOENT':
>        raise
>
> However, only very experienced programmers will know to write that correct
> code (apparently I am not experienced enought!).

That doesn't strike me as correct at all, since it doesn't distinguish
between ENOENT being raised for some file deep down in the tree vs.
the root not existing. (This could happen if after you did
os.listdir() some other process deleted some file.)

A better way might be

try:
  shutil.rmtree(<dir>)
except OSError:
  if os.path.exists(<dir>):
    raise

Though I don't know what you wish to happen if <dir> were a dangling symlink.
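[A runnable sketch of that approach; the helper name is just for
illustration.]

```python
import os
import shutil
import tempfile

def rmtree_if_exists(path):
    # Attempt removal; only re-raise if the directory still exists
    # afterwards, so a missing directory is silently ignored.
    try:
        shutil.rmtree(path)
    except OSError:
        if os.path.exists(path):
            raise

path = tempfile.mkdtemp()
rmtree_if_exists(path)   # removes the directory
rmtree_if_exists(path)   # already gone: error swallowed, no exception
print(os.path.exists(path))  # False
```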

> What I am proposing is that the simpler correct code would be something
> like:
>
> try:
>     shutil.rmtree("dirname")
> except OSNoEntryError:
>     pass
>
> Much simpler, no?

And wrong.

> Right now, developers are tempted to write code like:
>
>     shutil.rmtree("dirname", ignore_errors=True)
>
> Or:
>
> try:
>     shutil.rmtree("dirname")
> except OSError:
>     pass
>
> Both of which follow the error hiding anti-pattern [1].
>
> [1] http://en.wikipedia.org/wiki/Error_hiding
>
> Thanks for reading this far.

Thanks for not wasting any more of my time.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From guido at python.org  Thu Apr  2 22:43:39 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 2 Apr 2009 13:43:39 -0700
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D5138F.4070703@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49D05C8F.3040800@canterbury.ac.nz> 
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com> 
	<49D317D5.6080705@canterbury.ac.nz>
	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com> 
	<49D44EC5.30501@canterbury.ac.nz>
	<ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com> 
	<49D4F861.9040407@gmail.com> <49D505EC.8010305@improva.dk> 
	<49D5138F.4070703@gmail.com>
Message-ID: <ca471dc20904021343x7d22ccbcif9241109c1dc8df3@mail.gmail.com>

On Thu, Apr 2, 2009 at 12:35 PM, Bruce Frederiksen <dangyogi at gmail.com> wrote:
> Another strange one: 'close with X'.
>
> This hinges on the 'close' method returning X and also that this could be
> done syntactically *without* making 'close' a reserved word by relying on
> 'with' already being a reserved word with very limited usage in the grammar:
>
>    generator_return_stmt: NAME 'with' testlist
>
> And then verify later that NAME is 'close' (or raise SyntaxError).
>
> I'm not that familiar with Python's parser to know if it could handle this
> or not (LL vs. LR)...

The current parser generator cannot deal with this -- when it sees a
NAME at the start of the line it has to decide which non-terminal to
pick to parse the rest of the line. Besides (before you put effort
into trying to fix this or prove me wrong) this syntax looks too weird
-- we don't normally refer to leaving a stack frame as "closing"
anything. Close is a verb we apply to other things, e.g. files -- or
generators. But not the current frame or generator.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From zac256 at gmail.com  Thu Apr  2 23:14:42 2009
From: zac256 at gmail.com (Zac Burns)
Date: Thu, 2 Apr 2009 14:14:42 -0700
Subject: [Python-ideas] Modules could behave like new-style objects
Message-ID: <333edbe80904021414h765f1a0bmfe39ac6efcdae8d7@mail.gmail.com>

I would like to see modules behave more like new-style objects. One
should be able to add properties, descriptors, __getattr__, and such.

One common use case for me is implementing wrapper modules that
populate dynamically (like the ctypes.windll object - except as a
module).



--
Zachary Burns
(407)590-4814
Aim - Zac256FL
Production Engineer
Zindagi Games


From aahz at pythoncraft.com  Thu Apr  2 23:23:41 2009
From: aahz at pythoncraft.com (Aahz)
Date: Thu, 2 Apr 2009 14:23:41 -0700
Subject: [Python-ideas] Modules could behave like new-style objects
In-Reply-To: <333edbe80904021414h765f1a0bmfe39ac6efcdae8d7@mail.gmail.com>
References: <333edbe80904021414h765f1a0bmfe39ac6efcdae8d7@mail.gmail.com>
Message-ID: <20090402212340.GA13035@panix.com>

On Thu, Apr 02, 2009, Zac Burns wrote:
>
> I would like to see modules behave more like new-style objects. One
> should be able to add properties, descriptors, __getattr__, and such.
> 
> One common use case for me is implementing wrapper modules that
> populate dynamically (like the ctypes.windll object - except as a
> module).

Guido can speak up for himself, but in the past he's been pretty negative
about this idea; you may want to hunt up previous threads.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"Debugging is twice as hard as writing the code in the first place.
Therefore, if you write the code as cleverly as possible, you are, by
definition, not smart enough to debug it."  --Brian W. Kernighan


From ncoghlan at gmail.com  Thu Apr  2 23:53:27 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 03 Apr 2009 07:53:27 +1000
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <ca471dc20904021144gc85aafbk2abb25dc03afb106@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CF6AAF.70109@improva.dk>
	<49D05C8F.3040800@canterbury.ac.nz>	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
	<49D317D5.6080705@canterbury.ac.nz>	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com>
	<49D44EC5.30501@canterbury.ac.nz>	<ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com>
	<49D4F861.9040407@gmail.com> <49D505EC.8010305@improva.dk>
	<ca471dc20904021144gc85aafbk2abb25dc03afb106@mail.gmail.com>
Message-ID: <49D533D7.4080300@gmail.com>

Guido van Rossum wrote:
> On Thu, Apr 2, 2009 at 11:37 AM, Jacob Holm <jh at improva.dk> wrote:
>> Bruce Frederiksen wrote:
>>> Guido van Rossum wrote:
>>>> but how about 'return from'?
>>> or 'return finally'?(??) ...
>>>
>> Or what about "yield return"? That clearly marks the construct as belonging
>> in a generator. It also mixes well with the idea of a "yield raise" that I
>> mentioned in another mail (not a suggestion for this PEP).
> 
> Not totally weird. After all Dave Beazley's trampoline uses "yield
> g()" to call a sub-generator and "yield x" to return a value x from a
> sub-generator to the calling generator via the trampoline's stack...

Using 'yield return' rather than a bare return wouldn't get any
objections from me. As has been said before, the current SyntaxError
definitely makes it easier to learn some of the ins and outs of generators.

That would leave us with:

'yield': pass a value back to and receive a value from this generator's
client
'yield from': pass control to a subgenerator and receive a value back
from it
'yield return': finish this generator with GeneratorReturn
'return': finish this generator with StopIteration

I think that leaves us with one remaining question: should we save the
return value on the generator iterator and make it available as the
return value of the close() method?

My inclination is that finalising a generator in a way that allows the
return value to be retrieved should be left out of the PEP for now, as
it is something that can be:
a) easily done externally to the generator*
b) added to close() later if we decide it would be a good idea

In order to leave that avenue open in the future however, close() must
be defined in the PEP to trap GeneratorReturn as well as StopIteration.

So +1 to having close() accept GeneratorReturn as a legitimate reaction
to being thrown GeneratorExit, but -0 on saving the return value on the
generator iterator object (at least in the initial version of the PEP)

Cheers,
Nick.

* For example:

  def get_result_send(self, g, sentinel=None):
    # Using a sentinel to tell the generator to finish
    try:
      while 1:
        g.send(sentinel)
    except StopIteration:
      return None
    except GeneratorReturn as gr:
      return gr.value

  def get_result_throw(self, g, sentinel=GeneratorExit):
    # Using GeneratorExit to tell the generator to finish
    try:
      while 1:
        try:
          g.throw(sentinel)
        except sentinel:
          break
      return None
    except GeneratorReturn as gr:
      return gr.value

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From zac256 at gmail.com  Thu Apr  2 23:57:44 2009
From: zac256 at gmail.com (Zac Burns)
Date: Thu, 2 Apr 2009 14:57:44 -0700
Subject: [Python-ideas] Modules could behave like new-style objects
In-Reply-To: <20090402212340.GA13035@panix.com>
References: <333edbe80904021414h765f1a0bmfe39ac6efcdae8d7@mail.gmail.com>
	<20090402212340.GA13035@panix.com>
Message-ID: <333edbe80904021457k5b8846b2odec01c04488e02cd@mail.gmail.com>

Thanks Aahz,

I did a search for previous threads but all the keywords and
combinations of keywords I could think of showed an overwhelming
amount of unrelated things. Does anyone know where I can find this
information?

--
Zachary Burns
(407)590-4814
Aim - Zac256FL
Production Engineer
Zindagi Games



On Thu, Apr 2, 2009 at 2:23 PM, Aahz <aahz at pythoncraft.com> wrote:
> On Thu, Apr 02, 2009, Zac Burns wrote:
>>
>> I would like to see modules behave more like new-style objects. One
>> should be able to add properties, descriptors, __getattr__, and such.
>>
>> One common use case for me is implementing wrapper modules that
>> populate dynamically (like the ctypes.windll object - except as a
>> module).
>
> Guido can speak up for himself, but in the past he's been pretty negative
> about this idea; you may want to hunt up previous threads.
> --
> Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/
>
> "Debugging is twice as hard as writing the code in the first place.
> Therefore, if you write the code as cleverly as possible, you are, by
> definition, not smart enough to debug it."  --Brian W. Kernighan
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>


From ncoghlan at gmail.com  Fri Apr  3 00:19:52 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 03 Apr 2009 08:19:52 +1000
Subject: [Python-ideas] Modules could behave like new-style objects
In-Reply-To: <333edbe80904021457k5b8846b2odec01c04488e02cd@mail.gmail.com>
References: <333edbe80904021414h765f1a0bmfe39ac6efcdae8d7@mail.gmail.com>	<20090402212340.GA13035@panix.com>
	<333edbe80904021457k5b8846b2odec01c04488e02cd@mail.gmail.com>
Message-ID: <49D53A08.7040406@gmail.com>

Zac Burns wrote:
> Thanks Aahz,
> 
> I did a search for previous threads but all the keywords and
> combinations of keywords I could think of showed an overwhelming
> amount of unrelated things. Does anyone know where I can find this
> information?

Here's an example thread:
http://mail.python.org/pipermail/python-dev/2003-January/032106.html

It would also pose some difficult technical questions - currently
"attr", "globals()['attr']" and "sys.modules[__name__].attr" all access
the same value in different ways. If descriptors are allowed on normal
modules, should the first two expressions be changed to invoke them? Or
will the 3 expressions no longer be alternate ways of saying the same thing?

Given the behaviour of code in the body of a class definition, it would
most likely be the latter, but it would take quite a bit of thought to
figure out the full ramifications of that.
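[For reference, the workaround typically used instead of new module
semantics is to put an instance of a ModuleType subclass into
sys.modules, since a class *can* carry properties and descriptors. A
sketch under that assumption, not a language change:]

```python
import types

class _PropModule(types.ModuleType):
    # A property defined on the module's type -- something a plain
    # module object cannot have.
    @property
    def version_info(self):
        return (1, 0)

# In a real module you would do, in the module body itself:
#     sys.modules[__name__] = _PropModule(__name__)
# Here we just build a standalone instance for illustration.
mod = _PropModule("demo")
print(mod.version_info)  # (1, 0)
```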

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From gjcarneiro at gmail.com  Fri Apr  3 01:13:05 2009
From: gjcarneiro at gmail.com (Gustavo Carneiro)
Date: Fri, 3 Apr 2009 00:13:05 +0100
Subject: [Python-ideas] [Python-Dev] OSError.errno => exception
	hierarchy?
In-Reply-To: <ca471dc20904021337i43a1e995sc0d62ab9a64ef99d@mail.gmail.com>
References: <a467ca4f0904020525taabdeb8gd75ce8f73b418d66@mail.gmail.com>
	<ca471dc20904020757o57654981o2170a54f201d5031@mail.gmail.com>
	<a467ca4f0904020912g7d4056c5p955198980d9e4218@mail.gmail.com>
	<ca471dc20904021037r5e1f2fefn7ce83caadf505e34@mail.gmail.com>
	<a467ca4f0904021310u32bc492bx3964fe299ac44a10@mail.gmail.com>
	<ca471dc20904021337i43a1e995sc0d62ab9a64ef99d@mail.gmail.com>
Message-ID: <a467ca4f0904021613g5ef4b33ag105da705e1b4b9b3@mail.gmail.com>

(cross-posting back to python-dev to finalize discussions)

2009/4/2 Guido van Rossum <guido at python.org>
[...]

> > The problem you report:
> >>
> >>  try:
> >>    ...
> >>  except OSWinError:
> >>    ...
> >>  except OSLinError:
> >>    ...
> >>
> >
> > Would be solved if both OSWinError and OSLinError were always defined in
> > both Linux and Windows Python.  Programs could be written to catch both
> > OSWinError and OSLinError, except that on Linux OSWinError would never
> > actually be raised, and on Windows OSLinError would never occur.  Problem
> > solved.
>
> Yeah, but now you'd have to generate the list of exceptions (which
> would be enormously long) based on the union of all errno codes in the
> universe.
>
> Unless you only want to do it for some errno codes and not for others,
> which sounds like asking for trouble.
>
> Also you need a naming scheme that works for all errnos and doesn't
> require manual work. Frankly, the only scheme that I can think of that
> could be automated would be something like OSError_ENAME.
>
> And, while OSError is built-in, I think these exceptions (because
> there are so many) should not be built-in, and probably not even live
> in the 'os' namespace -- the best place for them would be the errno
> module, so errno.OSError_ENAME.
>
> > The downsides of this?  I can only see memory, at the moment, but I might be
> > missing something.
>
> It's an enormous amount of work to make it happen across all
> platforms. And it doesn't really solve an important problem.


I partially agree.  It will be a lot of work, and while I think the problem
is valid, I agree it is not very important.


>
>
> > Now just one final word why I think this matters.  The currently correct way
> > to remove a directory tree and only ignore the error "it does not exist" is:
> >
> > try:
> >     shutil.rmtree("dirname")
> > except OSError, e:
> >     if errno.errorcode[e.errno] != 'ENOENT':
> >        raise
> >
> > However, only very experienced programmers will know to write that correct
> > code (apparently I am not experienced enough!).
>
> That doesn't strike me as correct at all, since it doesn't distinguish
> between ENOENT being raised for some file deep down in the tree vs.
> the root not existing. (This could happen if after you did
> os.listdir() some other process deleted some file.)


OK.  Maybe in a generic case this could happen, although I'm sure this won't
happen in my particular scenario.  This is about a build system, and I am
assuming there are no two concurrent builds (or else a lot of other things
would fail anyway).


> A better way might be
>
> try:
>  shutil.rmtree(<dir>)
> except OSError:
>  if os.path.exists(<dir>):
>   raise


Sure, this works, but at the cost of an extra system call.  I think it's
more elegant to check the errno (assuming the corner case you pointed out
above is not an issue).


> Though I don't know what you wish to happen if <dir> were a dangling
> symlink.
>
> > What I am proposing is that the simpler correct code would be something
> > like:
> >
> > try:
> >     shutil.rmtree("dirname")
> > except OSNoEntryError:
> >     pass
> >
> > Much simpler, no?
>
> And wrong.
>
> > Right now, developers are tempted to write code like:
> >
> >     shutil.rmtree("dirname", ignore_errors=True)
> >
> > Or:
> >
> > try:
> >     shutil.rmtree("dirname")
> > except OSError:
> >     pass
> >
> > Both of which follow the error hiding anti-pattern [1].
> >
> > [1] http://en.wikipedia.org/wiki/Error_hiding
> >
> > Thanks for reading this far.
>
> Thanks for not wasting any more of my time.


OK, I won't waste more time.  If this were an obvious improvement beyond
doubt to most people, I would pursue it, but since it's not, I can live with
it.

Thanks anyway,

-- 
Gustavo J. A. M. Carneiro
INESC Porto, Telecommunications and Multimedia Unit
"The universe is always one step beyond logic." -- Frank Herbert
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090403/5db48395/attachment.html>

From jh at improva.dk  Fri Apr  3 01:14:01 2009
From: jh at improva.dk (Jacob Holm)
Date: Fri, 03 Apr 2009 01:14:01 +0200
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D533D7.4080300@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CF6AAF.70109@improva.dk>
	<49D05C8F.3040800@canterbury.ac.nz>	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
	<49D317D5.6080705@canterbury.ac.nz>	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com>
	<49D44EC5.30501@canterbury.ac.nz>	<ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com>
	<49D4F861.9040407@gmail.com> <49D505EC.8010305@improva.dk>
	<ca471dc20904021144gc85aafbk2abb25dc03afb106@mail.gmail.com>
	<49D533D7.4080300@gmail.com>
Message-ID: <49D546B9.5090601@improva.dk>

Nick Coghlan wrote:
> Using 'yield return' rather than a bare return wouldn't get any
> objections from me. As has been said before, the current SyntaxError
> definitely makes it easier to learn some of the ins and outs of generators.
>
> That would leave us with:
>
> 'yield': pass a value back to and receive a value from this generator's
> client
> 'yield from': pass control to a subgenerator and receive a value back
> from it
> 'yield return': finish this generator with GeneratorReturn
> 'return': finish this generator with StopIteration
>   

FWIW, I still don't see the need for a GeneratorReturn exception.  I 
don't understand why it should be an error to ignore the return value, 
or to loop over a generator that returns a value.  I assume it makes 
sense to someone since it is being discussed, so perhaps one of you 
someones would care to explain it?

> I think that leaves us with one remaining question: should we save the
> return value on the generator iterator and make it available as the
> return value of the close() method?
>   

I think so, yes.  It makes a big difference to some of the examples I 
have shown.

> My inclination is that finalising a generator in a way that allows the
> return value to be retrieved should be left out of the PEP for now, as
> it is something that can be:
> a) easily done externally to the generator*
>   

Yes, you can hack your way around most limitations.  In this case if you 
need the feature it makes quite a big difference to both the calling and 
the called code.

> b) added to close() later if we decide it would be a good idea
>   

That is true, but I think the semantics of "yield-from" becomes more 
coherent if we do it now.  Alternatively, we could drop the "yield 
return" idea from the proposal and make "yield from" a statement.  I 
would hate to see it go because coupled with returning the value from 
close it has some really nice uses, but that would be the other way I 
see to make the proposal coherent.   Having "yield return" without 
returning the value from close just feels wrong.

> In order to leave that avenue open in the future however, close() must
> be defined in the PEP to trap GeneratorReturn as well as StopIteration.
>   

But if we do that without storing the value and returning it on the next 
close, you cannot use "yield return" as a response to GeneratorExit in a 
subiterator without losing the returned value.  (This of course depends 
on how we end up handling GeneratorExit and close in the yield-from 
expression).  Instead you will need to manually raise a different 
exception in the subiterator.  And if you do that, the resulting 
generator can no longer be closed *without* some wrapper to catch the 
exception.


> So +1 to having close() accept GeneratorReturn as a legitimate reaction
> to being thrown GeneratorExit, but -0 on saving the return value on the
> generator iterator object (at least in the initial version of the PEP)
>
>   

And I am +1/+1 on this, although I would rather see the "yield return" 
statement just storing the value directly on the generator, raising a 
normal StopIteration, and not using a GeneratorReturn exception at all.

- Jacob



From guido at python.org  Fri Apr  3 01:28:18 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 2 Apr 2009 16:28:18 -0700
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D546B9.5090601@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49D317D5.6080705@canterbury.ac.nz> 
	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com> 
	<49D44EC5.30501@canterbury.ac.nz>
	<ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com> 
	<49D4F861.9040407@gmail.com> <49D505EC.8010305@improva.dk> 
	<ca471dc20904021144gc85aafbk2abb25dc03afb106@mail.gmail.com> 
	<49D533D7.4080300@gmail.com> <49D546B9.5090601@improva.dk>
Message-ID: <ca471dc20904021628k2b35981avb8f56b04227f7198@mail.gmail.com>

On Thu, Apr 2, 2009 at 4:14 PM, Jacob Holm <jh at improva.dk> wrote:
> Nick Coghlan wrote:
>>
>> Using 'yield return' rather than a bare return wouldn't get any
>> objections from me. As has been said before, the current SyntaxError
>> definitely makes it easier to learn some of the ins and outs of
>> generators.
>>
>> That would leave us with:
>>
>> 'yield': pass a value back to and receive a value from this generator's
>> client
>> 'yield from': pass control to a subgenerator and receive a value back
>> from it
>> 'yield return': finish this generator with GeneratorReturn
>> 'return': finish this generator with StopIteration
>>
>
> FWIW, I still don't see the need for a GeneratorReturn exception.  I don't
> understand why it should be an error to ignore the return value, or to loop
> over a generator that returns a value.  I assume it makes sense to someone
> since it is being discussed, so perhaps one of you someones would care to
> explain it?

I've explained this more than once in some of the many yield-from
threads, but since I am myself asking for a summary of previous
threads I'll explain it again.

Generators are a mind-bendingly complex issue and it's easy for
someone who is just starting to write a generator for the first time
to get a detail or two wrong. We intentionally decided to make "return
<value>" invalid syntax in a generator to help those people. Surely it
would have been easier to code if we just ignored the value. But we
went the extra mile to help people negotiate the steep learning curve.
I don't want to lose this.

>> I think that leaves us with one remaining question: should we save the
>> return value on the generator iterator and make it available as the
>> return value of the close() method?
>>
>
> I think so, yes.  It makes a big difference to some of the examples I have
> shown.

I must have missed that.

>> My inclination is that finalising a generator in a way that allows the
>> return value to be retrieved should be left out of the PEP for now, as
>> it is something that can be:
>> a) easily done externally to the generator*
>>
>
> Yes, you can hack your way around most limitations.  In this case if you
> need the feature it makes quite a big difference to both the calling and the
> called code.
>
>> b) added to close() later if we decide it would be a good idea
>>
>
> That is true, but I think the semantics of "yield-from" becomes more
> coherent if we do it now.  Alternatively, we could drop the "yield return"
> idea from the proposal and make "yield from" a statement.  I would hate to
> see it go because coupled with returning the value from close it has some
> really nice uses, but that would be the other way I see to make the proposal
> coherent.  Having "yield return" without returning the value from close
> just feels wrong.
>
>> In order to leave that avenue open in the future however, close() must
>> be defined in the PEP to trap GeneratorReturn as well as StopIteration.
>>
>
> But if we do that without storing the value and returning it on the next
> close, you cannot use "yield return" as a response to GeneratorExit in a
> subiterator without losing the returned value.  (This of course depends on
> how we end up handling GeneratorExit and close in the yield-from
> expression).  Instead you will need to manually raise a different exception
> in the subiterator.  And if you do that, the resulting generator can no
> longer be closed *without* some wrapper to catch the exception.
>
>
>> So +1 to having close() accept GeneratorReturn as a legitimate reaction
>> to being thrown GeneratorExit, but -0 on saving the return value on the
>> generator iterator object (at least in the initial version of the PEP)
>>
>>
>
> And I am +1/+1 on this, although I would rather see the "yield return"
> statement just storing the value directly on the generator, raising a normal
> StopIteration, and not using a GeneratorReturn exception at all.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From greg.ewing at canterbury.ac.nz  Fri Apr  3 02:18:25 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 13:18:25 +1300
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D4A66F.9060900@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CA20F2.7040207@gmail.com>
	<49CA4029.6050703@improva.dk> <49CABFC6.1080207@canterbury.ac.nz>
	<49CAC0FE.5010305@improva.dk> <49CACB39.3020708@canterbury.ac.nz>
	<49CAD15D.2090008@improva.dk> <49CB155E.4040504@canterbury.ac.nz>
	<49CB8E4A.3050108@improva.dk> <49CC5D85.30409@canterbury.ac.nz>
	<49CE29BF.3040502@improva.dk> <49CEB8DE.8060200@gmail.com>
	<49CEBCD5.7020107@canterbury.ac.nz> <49CF6AAF.70109@improva.dk>
	<49D05C8F.3040800@canterbury.ac.nz> <49D0A324.1030701@gmail.com>
	<49D143B1.9040009@canterbury.ac.nz> <49D1542E.7070503@improva.dk>
	<49D16294.9030205@canterbury.ac.nz> <49D17247.20705@improva.dk>
	<49D18D31.9000008@canterbury.ac.nz> <49D1E5E6.5000007@improva.dk>
	<49D207BE.8090909@gmail.com> <49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz> <49D2C735.8020803@improva.dk>
	<49D34FDC.5050106@gmail.com> <49D36FD0.3080602@improva.dk>
	<49D4A156.9080304@canterbury.ac.nz> <49D4A66F.9060900@gmail.com>
Message-ID: <49D555D1.9050701@canterbury.ac.nz>

Nick Coghlan wrote:

> I think I'd prefer to see some arbitrary limit (500 seems like a nice
> round number) on the number of times that GeneratorExit would be thrown
> before giving up

Is it really worth singling out this particular way
of writing an infinite loop?

If you're catching GeneratorExit then you presumably
have the need to clean up and exit on your mind, so
I don't think this is a likely mistake to make.

-- 
Greg



From rrr at ronadam.com  Fri Apr  3 02:35:43 2009
From: rrr at ronadam.com (Ron Adam)
Date: Thu, 02 Apr 2009 19:35:43 -0500
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D36FD0.3080602@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz>	<49C94CF6.5070301@gmail.com>	<49C9D162.5040907@canterbury.ac.nz>	<49CA20F2.7040207@gmail.com>	<49CA4029.6050703@improva.dk>	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail
	.com>	<49D255BC.6080503@improva.dk>	<49D2A759.5080204@canterbury.ac.nz>	<49D2C735.8020803@improva.dk>
	<49D34FDC.5050106@gmail.com> <49D36FD0.3080602@improva.dk>
Message-ID: <49D559DF.6000303@ronadam.com>



Jacob Holm wrote:

> That might be the prevailing wisdom concerning GeneratorExit, at least 
> partly based on the fact that the only way to communicate anything 
> useful out of a closing generator is to raise another exception.   
> Thinking a bit about coroutines, it would be nice to use "send" for the 
> normal communication and "close" to shut it down and getting a final 
> result.  Example:
> 
> def averager():
>    count = 0
>    sum = 0
>    while 1:
>        try:            val = (yield)
>        except GeneratorExit:
>            return sum/count
>        else:
>            sum += val
>            count += 1
> 
> avg = averager()
> avg.next() # start coroutine
> avg.send(1.0)
> avg.send(2.0)
> print avg.close()  # prints 1.5
> 
> 
> To do something similar today requires either a custom exception, or the 
> use of special values to tell the generator to yield the result.  I find 
> this version a lot cleaner.

This doesn't seem less clean than the above to me.

def averager():
     sum = 0
     count = 0
     try:
         while 1:
             sum += yield
             count += 1
     finally:
         yield sum / count

avg = averager()
avg.next()
avg.send(1.0)
avg.send(2.0)
print avg.next()   # prints 1.5
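
For comparison, the custom-exception approach Jacob alludes to could be
sketched like this (written so it runs on Python 3, where `avg.next()` is
spelled `next(avg)`; the `GetResult` exception name is made up for the
sketch):

```python
class GetResult(Exception):
    """Hypothetical sentinel asking the coroutine for its final value."""

def averager():
    total = 0.0
    count = 0
    while True:
        try:
            val = (yield)
        except GetResult:
            yield total / count   # hand back the result, then finish
            return
        total += val
        count += 1

avg = averager()
next(avg)                      # start coroutine
avg.send(1.0)
avg.send(2.0)
print(avg.throw(GetResult))    # prints 1.5
avg.close()
```

The result comes back as the value returned by `throw()`, since the
generator responds to the injected exception by yielding one last time.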









From greg.ewing at canterbury.ac.nz  Fri Apr  3 03:40:44 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 14:40:44 +1300
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D4CA47.1020003@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CA4029.6050703@improva.dk>
	<49CABFC6.1080207@canterbury.ac.nz> <49CAC0FE.5010305@improva.dk>
	<49CACB39.3020708@canterbury.ac.nz> <49CAD15D.2090008@improva.dk>
	<49CB155E.4040504@canterbury.ac.nz> <49CB8E4A.3050108@improva.dk>
	<49CC5D85.30409@canterbury.ac.nz> <49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz>
	<49D0A324.1030701@gmail.com> <49D143B1.9040009@canterbury.ac.nz>
	<49D1542E.7070503@improva.dk> <49D16294.9030205@canterbury.ac.nz>
	<49D17247.20705@improva.dk> <49D18D31.9000008@canterbury.ac.nz>
	<49D1E5E6.5000007@improva.dk> <49D207BE.8090909@gmail.com>
	<49D255BC.6080503@improva.dk> <49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk> <49D4A156.9080304@canterbury.ac.nz>
	<49D4A66F.9060900@gmail.com> <49D4CA47.1020003@improva.dk>
Message-ID: <49D5691C.6030901@canterbury.ac.nz>

Jacob Holm wrote:
> Well-behaved generators would not be affected, but there 
> might be generators in real use that relied on the ability to ignore 
> close or code using such generators that relied on getting the 
> RuntimeError.

I think that's stretching things a bit. To my mind,
any code that *relies* on getting a RuntimeError is
just perverse -- sort of like saying that the code
is only correct if it has a bug in it.

> the "yield raise" expression. The 
> purpose of this would be to raise an exception in the caller of "next", 
> "send", "throw" or "close" *without* finalizing the generator. Extending 
> my "averager" example a bit:

Sorry, but your example is now so convoluted that I
can't follow it. I would never recommend that anyone
write code like that.

> Only true because you have redefined it so that no generators are 
> broken.

Not quite -- a generator that gets into an infinite
loop catching GeneratorExits would still be broken,
you just wouldn't be told about it with an exception.
But it's true that the class of non-broken generators
would be considerably expanded.

I would argue that the generators being included were
unfairly classified as broken before, because they
simply hadn't been given enough opportunity to
finalize themselves.

> The possibility of turning a call that used to raise a 
> RuntimeError into an infinite loop bothers me a bit.

I still have trouble believing that this will be a
serious problem in practice. I suspect it will occur
quite rarely, and if it does occur, you debug it using
the usual techniques for diagnosing an infinite loop.
Hit Ctrl-C, examine the traceback, and do some poking
around.

> To summarize, I am only +0.75 on this proposal. I think it would be 
> better not to loop, still return the final value from close, and still 
> just throw GeneratorExit to subiterators without trying to reraise.

But we've established that this combination makes it
very easy to create broken generators through no
fault of your own. That's not acceptable to my mind.

-- 
Greg



From jimjjewett at gmail.com  Fri Apr  3 03:43:08 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Thu, 2 Apr 2009 21:43:08 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
Message-ID: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>

Summary:

I can understand a generator that returns something useful with each yield.

I can understand a generator that is used only for cooperative
multi-tasking, and returns nothing useful until the end.

yield from *as an expression* only really makes sense if the generator
is sending useful information *both* ways.  I can understand that sort
of generator only while reading the PEP; the code smell is strong
enough that I forget it by the next day.

Details:

On 4/2/09, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Jim Jewett wrote:

>> If the "gencall" exhausts the generator f ...
>> If the "return value" of the generator really is important ...

> The intermediate values aren't necessarily discarded by "yield from"
> though: they're passed out to whoever is consuming the values yielded by
> the outermost generator.

I think this may be where I start to have trouble.  When I see the
statement form:

    def outer_g():
        yield from inner_g()

I expect inner_g to suspend execution, and it isn't that hard to
remember that it might do so several times.  I also expect the results
of inner_g.next() to be passed on out to outer_g's own caller, thereby
suspending outer_g.  So far, so good.

But when yield from is used as an expression, and I see:

    x = yield from g()

I somehow expect only a single call to g.next(), whose value gets
assigned to x, and not passed out.  I did read the PEP, in several
versions, and understood it at the time ... and still managed (several
times) to forget and misinterpret by a day or two later.

And yes, I realize it doesn't make too much sense to call g.next()
only once -- so I kept looking for a loop around that yield from
statement.  When there wasn't one, I shrugged it off like a with
clause -- and still didn't remember the actual PEP-intended meaning.

The times I did remember that (even) the expression form looped, I was
still boggled that it would return something other than None after it
was exhausted.  Greg's answer was that it was for threading, and the
final return was the real value.  This seems like a different category
of generator, but I could get my head around it -- so long as I forgot
that the yield itself was returning anything useful.
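
For the record, the PEP-intended meaning of the expression form can be
sketched as follows, assuming PEP 380 semantics (the bound value is the
subgenerator's *return* value; everything it yields passes through to the
outer caller):

```python
def inner():
    yield 1
    yield 2
    return 'done'            # becomes the value of the yield-from expression

def outer():
    x = yield from inner()   # yields 1 and 2 out to outer's caller
    yield x                  # only then is x bound, to 'done'

print(list(outer()))         # prints [1, 2, 'done']
```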

-jJ


From greg.ewing at canterbury.ac.nz  Fri Apr  3 03:45:01 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 14:45:01 +1300
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D4CA47.1020003@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CA4029.6050703@improva.dk>
	<49CABFC6.1080207@canterbury.ac.nz> <49CAC0FE.5010305@improva.dk>
	<49CACB39.3020708@canterbury.ac.nz> <49CAD15D.2090008@improva.dk>
	<49CB155E.4040504@canterbury.ac.nz> <49CB8E4A.3050108@improva.dk>
	<49CC5D85.30409@canterbury.ac.nz> <49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz>
	<49D0A324.1030701@gmail.com> <49D143B1.9040009@canterbury.ac.nz>
	<49D1542E.7070503@improva.dk> <49D16294.9030205@canterbury.ac.nz>
	<49D17247.20705@improva.dk> <49D18D31.9000008@canterbury.ac.nz>
	<49D1E5E6.5000007@improva.dk> <49D207BE.8090909@gmail.com>
	<49D255BC.6080503@improva.dk> <49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk> <49D4A156.9080304@canterbury.ac.nz>
	<49D4A66F.9060900@gmail.com> <49D4CA47.1020003@improva.dk>
Message-ID: <49D56A1D.1080400@canterbury.ac.nz>

Jacob Holm wrote:
> There have been requests for a function that loops over the 
> generator and returns the final result, but this version of close 
> doesn't fit that use case

That has nothing to do with closing behaviour. Such a
function wouldn't close() the subgenerator, it would
just keep calling next() until it finishes naturally
and catch the GeneratorReturn.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Apr  3 03:51:18 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 14:51:18 +1300
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D4D24C.6070005@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49CABFC6.1080207@canterbury.ac.nz> <49CAC0FE.5010305@improva.dk>
	<49CACB39.3020708@canterbury.ac.nz> <49CAD15D.2090008@improva.dk>
	<49CB155E.4040504@canterbury.ac.nz> <49CB8E4A.3050108@improva.dk>
	<49CC5D85.30409@canterbury.ac.nz> <49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz>
	<49D0A324.1030701@gmail.com> <49D143B1.9040009@canterbury.ac.nz>
	<49D1542E.7070503@improva.dk> <49D16294.9030205@canterbury.ac.nz>
	<49D17247.20705@improva.dk> <49D18D31.9000008@canterbury.ac.nz>
	<49D1E5E6.5000007@improva.dk> <49D207BE.8090909@gmail.com>
	<49D255BC.6080503@improva.dk> <49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk> <49D4A156.9080304@canterbury.ac.nz>
	<49D4A66F.9060900@gmail.com> <49D4CA47.1020003@improva.dk>
	<49D4D24C.6070005@gmail.com>
Message-ID: <49D56B96.1020700@canterbury.ac.nz>

Bruce Frederiksen wrote:

> If you re-define close to return the value attached to StopIteration, 

There may be a misconception here. I haven't been
intending for close() to return the return value.
That's not necessary to support Jacob's desire for
the subgenerator to be able to return a value
while the outer generator is being closed.

That's because the subgenerator would *not* have
its close() method called -- rather, GeneratorExit
would be thrown into it. If it returned, this would
manifest as a GeneratorReturn which could be caught
and treated accordingly.
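
A minimal sketch of that mechanism, written for the Python that eventually
shipped, where the role proposed for GeneratorReturn is played by a plain
StopIteration carrying a value:

```python
def sub():
    try:
        yield
    except GeneratorExit:
        return 42            # "returning while being closed"

it = sub()
next(it)
try:
    it.throw(GeneratorExit)  # throw, rather than calling it.close()
except StopIteration as e:
    print(e.value)           # prints 42 -- the caught "return" value
```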

I wouldn't necessarily be against having close()
return the value from GeneratorReturn, but that's
a separate issue to be decided independently.

-- 
Greg


From steve at pearwood.info  Fri Apr  3 03:57:37 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Fri, 3 Apr 2009 12:57:37 +1100
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D555D1.9050701@canterbury.ac.nz>
References: <49AB1F90.7070201@canterbury.ac.nz> <49D4A66F.9060900@gmail.com>
	<49D555D1.9050701@canterbury.ac.nz>
Message-ID: <200904031257.37927.steve@pearwood.info>

On Fri, 3 Apr 2009 11:18:25 am Greg Ewing wrote:
> Nick Coghlan wrote:
> > I think I'd prefer to see some arbitrary limit (500 seems like a
> > nice round number) on the number of times that GeneratorExit would
> > be thrown before giving up
>
> Is it really worth singling out this particular way
> of writing an infinite loop?

Perhaps I've missed something, but it seems to me that the right limit 
to use would be the recursion limit, and the right exception to raise 
would be RecursionError rather than RuntimeError.




-- 
Steven D'Aprano


From greg.ewing at canterbury.ac.nz  Fri Apr  3 05:17:37 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 16:17:37 +1300
Subject: [Python-ideas] [Python-Dev] OSError.errno =>
	exception	hierarchy?
In-Reply-To: <ca471dc20904021037r5e1f2fefn7ce83caadf505e34@mail.gmail.com>
References: <a467ca4f0904020525taabdeb8gd75ce8f73b418d66@mail.gmail.com>
	<ca471dc20904020757o57654981o2170a54f201d5031@mail.gmail.com>
	<a467ca4f0904020912g7d4056c5p955198980d9e4218@mail.gmail.com>
	<ca471dc20904021037r5e1f2fefn7ce83caadf505e34@mail.gmail.com>
Message-ID: <49D57FD1.6090203@canterbury.ac.nz>

Guido van Rossum wrote:
> Consider
> an errno (EWIN, mapped to OSWinError in your proposal) that exists
> only on Windows, and another (ELIN, OSLinError) that exists only on
> Linux.

Also, some platforms may have a rather large number
of possible error codes. Not sure what MacOSX is like,
but Classic MacOS had literally *hundreds* of OSError
values. Having a class for each one would be rather
unwieldy.

What might make more sense is to have a way of
attaching a guard expression to an except clause,
maybe

   except OSError as e if e.errno == ESPAM:
     ...
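
Absent such syntax, the usual filtering idiom is a re-raise inside the
handler (a sketch; ENOENT stands in for whichever errno is being filtered):

```python
import errno
import os

try:
    os.rmdir("no-such-directory")
except OSError as e:
    if e.errno != errno.ENOENT:
        raise                # unexpected error: propagate it
    # ENOENT: the directory was already gone, so ignore it
```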

-- 
Greg


From guido at python.org  Fri Apr  3 05:22:31 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 2 Apr 2009 20:22:31 -0700
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D559DF.6000303@ronadam.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49D17247.20705@improva.dk> 
	<49D18D31.9000008@canterbury.ac.nz> <49D1E5E6.5000007@improva.dk> 
	<49D255BC.6080503@improva.dk> <49D2A759.5080204@canterbury.ac.nz> 
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com> 
	<49D36FD0.3080602@improva.dk> <49D559DF.6000303@ronadam.com>
Message-ID: <ca471dc20904022022q5a0bd13bg646ab1a3f7b004fe@mail.gmail.com>

On Thu, Apr 2, 2009 at 5:35 PM, Ron Adam <rrr at ronadam.com> wrote:
>
>
> Jacob Holm wrote:
>
>> That might be the prevailing wisdom concerning GeneratorExit, at least
>> partly based on the fact that the only way to communicate anything useful
>> out of a closing generator is to raise another exception.  Thinking a bit
>> about coroutines, it would be nice to use "send" for the normal
>> communication and "close" to shut it down and getting a final result.
>> Example:
>>
>> def averager():
>>    count = 0
>>    sum = 0
>>    while 1:
>>        try:            val = (yield)
>>        except GeneratorExit:
>>            return sum/count
>>        else:
>>            sum += val
>>            count += 1
>>
>> avg = averager()
>> avg.next() # start coroutine
>> avg.send(1.0)
>> avg.send(2.0)
>> print avg.close()  # prints 1.5
>>
>>
>> To do something similar today requires either a custom exception, or the
>> use of special values to tell the generator to yield the result.  I find
>> this version a lot cleaner.
>
> This doesn't seem less clean than the above to me.
>
> def averager():
>      sum = 0
>      count = 0
>      try:
>          while 1:
>              sum += yield
>              count += 1
>      finally:
>          yield sum / count
>
> avg = averager()
> avg.next()
> avg.send(1.0)
> avg.send(2.0)
> print avg.next()   # prints 1.5

But your version isn't clean -- it relies on "sum += yield" raising a
TypeError when yield returns None (due to .next() being the same as
.send(None)).
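
Concretely, a sketch of the mechanism being described (rewritten to run on
Python 3, where `avg.next()` is spelled `next(avg)`): the bare `next()` call
sends None, `total += None` raises TypeError inside the generator, and the
`finally` clause intercepts it by yielding the average:

```python
def averager():
    total = 0.0
    count = 0
    try:
        while True:
            total += yield    # next() sends None -> TypeError here
            count += 1
    finally:
        yield total / count   # runs while the TypeError is propagating

avg = averager()
next(avg)
avg.send(1.0)
avg.send(2.0)
print(next(avg))   # prints 1.5 -- but only because of the TypeError
```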

That's not to say I like Jacob's version that much, but I now
understand his use case. I note that Dave Beazley works around this
carefully in his tutorial (dabeaz.com/coroutines/) by using examples
that produce output on stdout -- and much later, in his multitasking
scheduler example, his trampoline actually interprets yielding a value
that is neither a SystemCall instance nor a generator as a return from
a generator. (This is similar to the abuse that your example is giving
yield, actually.) I'll have to ponder this more.

__________
PS. Somehow the headers in your email made my reply add this:

  Python-Ideas <public-python-ideas-+ZN9ApsXKcEdnm+yROfE0A at ciao.gmane.org>,
Nick Coghlan <public-ncoghlan-Re5JQEeQqe8AvxtiuMwx3w at ciao.gmane.org>

Whoever did that, and whatever they did to cause it, please don't do it again.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From guido at python.org  Fri Apr  3 05:25:10 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 2 Apr 2009 20:25:10 -0700
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <49D3DD05.7080506@canterbury.ac.nz>
References: <49AB1F90.7070201@canterbury.ac.nz> <49D17247.20705@improva.dk> 
	<49D18D31.9000008@canterbury.ac.nz> <49D1E5E6.5000007@improva.dk> 
	<49D207BE.8090909@gmail.com> <49D255BC.6080503@improva.dk> 
	<49D2A759.5080204@canterbury.ac.nz> <49D2C735.8020803@improva.dk> 
	<49D34FDC.5050106@gmail.com> <49D3DD05.7080506@canterbury.ac.nz>
Message-ID: <ca471dc20904022025i79ede6cfmd49706331152429@mail.gmail.com>

On Wed, Apr 1, 2009 at 2:30 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> As another perspective on this, I think Jacob's
> example is another case of bogus refactoring.

Be that as it may, I wonder if we shouldn't back off from the
refactoring use case a bit and instead just ponder the different types
of code you can write using generators. There's the traditional "pull"
style (iterators), "push" style (like the averaging example), and then
there are "tasks". (Have you read Dave Beazley's couroutines tutorial
yet? Or am I the only one who likes it? :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From guido at python.org  Fri Apr  3 05:37:05 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 2 Apr 2009 20:37:05 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
Message-ID: <ca471dc20904022037v7ed3eaacmf2e10b07809cecef@mail.gmail.com>

On Thu, Apr 2, 2009 at 6:43 PM, Jim Jewett <jimjjewett at gmail.com> wrote:
> yield from *as an expression* only really makes sense if the generator
> is sending useful information *both* ways. I can understand that sort
> of generator only while reading the PEP; the code smell is strong
> enough that I forget it by the next day.

Read Dave Beazley's coroutines tutorial (dabeaz.com/coroutines) and
check out the contortions in the scheduler to support subgenerators
(Part 8).

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From greg.ewing at canterbury.ac.nz  Fri Apr  3 05:41:32 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 16:41:32 +1300
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz>
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
	<49D317D5.6080705@canterbury.ac.nz>
	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com>
	<49D44EC5.30501@canterbury.ac.nz>
	<ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com>
Message-ID: <49D5856C.8080706@canterbury.ac.nz>

Guido van Rossum wrote:

> I'm also returning to the view that return from a generator
> used as a task (in Dave Beazley's terms) should be spelled differently
> than a plain return.

Oh, no... I thought you wanted to avoid a bikeshed
discussion on that...

> Phillip Eby's 'return from yield with' may be a
> little long, but how about 'return from'?

That makes no sense. The thing after the 'from'
isn't what you're returning from!

-- 
Greg




From greg.ewing at canterbury.ac.nz  Fri Apr  3 07:16:51 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 18:16:51 +1300
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <ca471dc20904021044q3765adc8g827e4b9015c7ae9c@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz>
	<49D18D31.9000008@canterbury.ac.nz> <49D1E5E6.5000007@improva.dk>
	<49D207BE.8090909@gmail.com> <49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz> <49D2C735.8020803@improva.dk>
	<49D34FDC.5050106@gmail.com> <49D36FD0.3080602@improva.dk>
	<49D4A156.9080304@canterbury.ac.nz>
	<ca471dc20904021044q3765adc8g827e4b9015c7ae9c@mail.gmail.com>
Message-ID: <49D59BC3.2080606@canterbury.ac.nz>

Guido van Rossum wrote:

> The feature doesn't exist for the benefit of well-behaved generators.

I know. I just mean that it won't break any existing
correct generators, and should allow reasonably-written
future generators to behave reasonably.

> It exists to help people who don't understand generators well enough
> yet to only write well-behaved ones.

If you haven't delved into the details of generators,
you won't know about GeneratorExit, so you won't be
trying to catch it.

> I need a longer description of the problems that you are trying to
> solve here

It's a bit subtle, but I'll try to recap. Suppose
Fred writes the following generator:

   def fred():
     try:
       yield 1
       x = 17
     except GeneratorExit:
       x = 42
     print "x =", x

By current standards, this is a legitimate generator.
Now, the refactoring principle suggests that it
should be possible to rewrite it like this:

   def fred_outer():
     x = yield from fred_inner()
     print "x =", x

   def fred_inner():
     try:
       yield 1
       x = 17
     except GeneratorExit:
       x = 42
     return x

If we treat GeneratorExit just like any other exception
and throw it into the subgenerator, this does in fact
work.

Now for the problem: Suppose Mary comes along and wants
to re-use Fred's inner generator. She writes this:

   def mary():
     y = yield from fred_inner()
     print "y =", y
     yield 2

If close() is called on Mary's generator while it's
suspended inside the call to fred_inner(), a RuntimeError
occurs, because the GeneratorExit got swallowed and
Mary tried to do another yield.

This is not reasonable behaviour, because Mary didn't
do anything wrong. Neither did Fred do anything wrong
when he wrote fred_inner() -- it's a perfectly well-
behaved generator by current standards. But put the
two together and a broken generator results.
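The failure mode can be reproduced directly, without any yield-from
machinery, because it follows from an existing rule: a generator that
yields again after being thrown GeneratorExit makes close() raise
RuntimeError. A minimal sketch (written in modern Python 3 for
runnability; the names are made up):

```python
def swallows_exit():
    try:
        yield 1
    except GeneratorExit:
        pass          # swallow GeneratorExit, like Fred's inner generator
    yield 2           # one more yield, like Mary's generator after the call

g = swallows_exit()
next(g)               # suspend at the first yield
try:
    g.close()         # throws GeneratorExit in; the extra yield is illegal
except RuntimeError as e:
    print("close() raised RuntimeError:", e)
```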

One way to fix this is to place a small restriction
on the refactoring principle: we state that you can't
factor out a block of code that catches GeneratorExit
and doesn't reraise it before exiting the block.

This allows us to treat GeneratorExit as a special case,
and always reraise it regardless of what the subiterator
does. Mary's generator is then no longer broken. Fred's
doesn't work any more, but he can't complain, because
he performed an invalid refactoring.
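A sketch of this restriction in yield-from terms, runnable on any
Python whose yield-from expansion always re-raises GeneratorExit after
closing the subiterator (which is what this rule amounts to; Python
3.3+ behaves this way):

```python
def fred_inner():
    try:
        yield 1
    except GeneratorExit:
        pass   # catches GeneratorExit without re-raising -- under the
               # restriction, this no longer suppresses the close

def mary():
    y = yield from fred_inner()
    print("y =", y)   # never reached when closed mid-call
    yield 2

m = mary()
next(m)    # suspend inside fred_inner()
m.close()  # closes fred_inner(), then re-raises GeneratorExit in mary():
           # no RuntimeError, and mary's generator terminates cleanly
```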

My proposal for changing the way close() works is just
an alternative way of tackling this problem that would
remove the need for special-casing GeneratorExit either
in the expansion or the statement of the refactoring
principle, and allow generators such as Fred's above
to work.

> Please, please, please, we need to stop the bikeshedding and scope
> expansion, and start converging to a *simpler* proposal.

I'm all in favour of simplicity, but it's not clear
what is simpler here. There's a tradeoff between
complexity in the yield-from expansion and complexity
in the behaviour of close().

BTW, if you're after simplicity, I still think that
using a different exception to return values from
generators, and using a different syntax to do so,
are both unnecessary complications.

-- 
Greg




From rrr at ronadam.com  Fri Apr  3 07:58:40 2009
From: rrr at ronadam.com (Ron Adam)
Date: Fri, 03 Apr 2009 00:58:40 -0500
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <ca471dc20904022022q5a0bd13bg646ab1a3f7b004fe@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49D17247.20705@improva.dk>
	<49D18D31.9000008@canterbury.ac.nz>
	<49D1E5E6.5000007@improva.dk> <49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk> <49D559DF.6000303@ronadam.com>
	<ca471dc20904022022q5a0bd13bg646ab1a3f7b004fe@mail.gmail.com>
Message-ID: <gr48ih$dv$1@ger.gmane.org>



Guido van Rossum wrote:
> On Thu, Apr 2, 2009 at 5:35 PM, Ron Adam <rrr at ronadam.com> wrote:
>>
>> Jacob Holm wrote:
>>
>>> That might be the prevailing wisdom concerning GeneratorExit, at least
>>> partly based on the fact that the only way to communicate anything useful
>>> out of a closing generator is to raise another exception.   Thinking a bit
>>> about coroutines, it would be nice to use "send" for the normal
>>> communication and "close" to shut it down and getting a final result.
>>>  Example:
>>>
>>> def averager():
>>>   count = 0
>>>   sum = 0
>>>   while 1:
>>>       try:            val = (yield)
>>>       except GeneratorExit:
>>>           return sum/count
>>>       else:
>>>           sum += val
>>>           count += 1
>>>
>>> avg = averager()
>>> avg.next() # start coroutine
>>> avg.send(1.0)
>>> avg.send(2.0)
>>> print avg.close()  # prints 1.5
>>>
>>>
>>> To do something similar today requires either a custom exception, or the
>>> use of special values to tell the generator to yield the result.  I find
>>> this version a lot cleaner.
>> This doesn't seem less cleaner than the above to me.
>>
>> def averager():
>>    sum = 0
>>    count = 0
>>    try:
>>        while 1:
>>            sum += yield
>>            count += 1
>>    finally:
>>        yield sum / count
>>
>> avg = averager()
>> avg.next()
>> avg.send(1.0)
>> avg.send(2.0)
>> print avg.next()   # prints 1.5
> 
> But your version isn't clean -- it relies on "sum += yield" raising a
> TypeError when yield returns None (due to .next() being the same as
> .send(None)).

Something I noticed is that function and method calls raise TypeError in 
cases where the argument count is mismatched.  If there were a different 
exception for mismatched arguments and .next() sent no arguments, it 
could be rewritten in a somewhat cleaner way.

def averager():
     sum = 0
     count = 0
     try:
         while 1:
             sum += yield
             count += 1
     except ArgumentCountError:
         yield sum / count

avg = averager()
avg.next()
avg.send(1.0)
avg.send(2.0)
print avg.next()   # prints 1.5

It seems to me that a separate exception for mismatched arguments 
might be useful in a more general way.
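For reference, the TypeError path Guido mentions can be observed
directly today, since next() is equivalent to send(None). This is just
a runnable Python 3 transcription of the quoted version:

```python
def averager():
    total = 0.0
    count = 0
    try:
        while True:
            total += yield      # next() sends None: "total += None" -> TypeError
            count += 1
    finally:
        yield total / count     # yield the result while unwinding

avg = averager()
next(avg)          # prime the coroutine
avg.send(1.0)
avg.send(2.0)
print(next(avg))   # 1.5
```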


> That's not to say I like Jacob's version that much, but I now
> understand his use case. I note that Dave Beazley works around this
> carefully in his tutorial (dabeaz.com/coroutines/) by using examples
> that produce output on stdout -- and much later, in his multitasking
> scheduler example, his trampoline actually interprets yielding a value
> that is neither a SystemCall instance nor a generator as a return from
> a generator. (This is similar to the abuse that your example is giving
> yield, actually.) I'll have to ponder this more.

Yes, generators seem very limiting to me.  Generators with more complex 
input and output requirements are bound to carry added complexity to 
offset the limits of a single sequential I/O channel.


> __________
> PS. Somehow the headers in your email made my reply add this:
> 
>   Python-Ideas <public-python-ideas-+ZN9ApsXKcEdnm+yROfE0A at ciao.gmane.org>,
> Nick Coghlan <public-ncoghlan-Re5JQEeQqe8AvxtiuMwx3w at ciao.gmane.org>
> 
> Whoever did that, and whatever they did to cause it, please don't do it again.

I'll manually remove the extra email addresses until I (or someone else) 
can explain why that happens.  I hope that's enough for now.

I read several Python groups through Gmane using Thunderbird on Ubuntu; 
Python-ideas is the only one where the email addresses are changed like 
that. (?)

Ron








From greg.ewing at canterbury.ac.nz  Fri Apr  3 08:41:49 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 18:41:49 +1200
Subject: [Python-ideas] Modules could behave like new-style objects
In-Reply-To: <333edbe80904021457k5b8846b2odec01c04488e02cd@mail.gmail.com>
References: <333edbe80904021414h765f1a0bmfe39ac6efcdae8d7@mail.gmail.com>
	<20090402212340.GA13035@panix.com>
	<333edbe80904021457k5b8846b2odec01c04488e02cd@mail.gmail.com>
Message-ID: <49D5AFAD.1020504@canterbury.ac.nz>

Zac Burns wrote:
> On Thu, Apr 2, 2009 at 2:23 PM, Aahz <aahz at pythoncraft.com> wrote:
> 
>>Guido can speak up for himself, but in the past he's been pretty negative
>>about this idea; you may want to hunt up previous threads.

I think the basic objection is that it would be
inconsistent, because __getattr__ et al only have
effect when defined in the object's class, not the
object itself, and modules are not classes.

What might be workable is a way to specify the class
you want a module to have, something like

__moduleclass__ = MyFancyModule

But that doesn't quite work, because it needs to get
evaluated before creating the module object, but
you need to execute the module code first so you can
import MyFancyModule from somewhere... it all gets
rather messy.
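One way out of that chicken-and-egg problem is to create the module
first and change its class afterwards. A sketch of how later Pythons
made this workable (assignment of __class__ on module instances is
permitted in Python 3.5+; the names here are illustrative):

```python
import types

class FancyModule(types.ModuleType):
    # a property defined on the class works, because special lookup
    # goes through the type, not the module instance
    @property
    def answer(self):
        return 42

mod = types.ModuleType('demo')   # stand-in for a freshly imported module
mod.__class__ = FancyModule      # retroactively give the module a class
print(mod.answer)                # 42
```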

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Apr  3 09:23:54 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 19:23:54 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
Message-ID: <49D5B98A.20700@canterbury.ac.nz>

Jim Jewett wrote:

> yield from *as an expression* only really makes sense if the generator
> is sending useful information *both* ways.

No, that's not the only way it makes sense. In my
multitasking example, none of the yields send or
receive any values. But they're still needed,
because they define the points at which the task
can be suspended.

> The times I did remember that (even) the expression form looped,

The yield-from expression itself doesn't loop. What
it does do is yield multiple times, if the generator
being called yields multiple times. But it has to
be driven by whatever is calling the whole thing
making a sufficient number of next() or send() calls,
in a loop or otherwise.
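In yield-from terms (using the Python 3 spelling, where the
subgenerator's return value becomes the value of the expression), the
caller still drives every step:

```python
def inner():
    yield 'a'
    yield 'b'
    return 'done'                 # becomes the value of the yield-from expression

def outer():
    result = yield from inner()   # does not loop by itself...
    yield result                  # ...each value is pulled by the caller

g = outer()
print(next(g))   # 'a'  -- the first next() reaches inner's first yield
print(next(g))   # 'b'
print(next(g))   # 'done'
```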

In hindsight, the wording in the PEP about the
subiterator being "run to exhaustion" might be a bit
misleading. I'll see if I can find a better way to
phrase it.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Apr  3 09:32:15 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 19:32:15 +1200
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <200904031257.37927.steve@pearwood.info>
References: <49AB1F90.7070201@canterbury.ac.nz> <49D4A66F.9060900@gmail.com>
	<49D555D1.9050701@canterbury.ac.nz>
	<200904031257.37927.steve@pearwood.info>
Message-ID: <49D5BB7F.2010407@canterbury.ac.nz>

Steven D'Aprano wrote:

> Perhaps I've missed something, but it seems to me that the right limit 
> to use would be the recursion limit, and the right exception to raise 
> would be RecursionError rather than RuntimeError.

I'm not sure about that. The kind of code needed
to cause a problem would be something like

   def i_refuse_to_die():
     while 1:
       try:
         yield 42
       except GeneratorExit:
         pass

which looks more like a plain infinite loop than
anything involving recursion, so I think getting
a RecursionError would be more confusing than
helpful.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Apr  3 10:02:04 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 20:02:04 +1200
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <ca471dc20904022025i79ede6cfmd49706331152429@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49D17247.20705@improva.dk>
	<49D18D31.9000008@canterbury.ac.nz> <49D1E5E6.5000007@improva.dk>
	<49D207BE.8090909@gmail.com> <49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz> <49D2C735.8020803@improva.dk>
	<49D34FDC.5050106@gmail.com> <49D3DD05.7080506@canterbury.ac.nz>
	<ca471dc20904022025i79ede6cfmd49706331152429@mail.gmail.com>
Message-ID: <49D5C27C.9050804@canterbury.ac.nz>

Guido van Rossum wrote:
> I wonder if we shouldn't back off from the
> refactoring use case a bit and instead just ponder the different types
> of code you can write using generators. There's the traditional "pull"
> style (iterators), "push" style (like the averaging example), and then
> there are "tasks".

I'm not sure how to respond to that, because the whole
issue at stake is whether a certain kind of refactoring
should be considered legal. It's orthogonal to whether
you're using push/pull/task style generators.

> (Have you read Dave Beazley's coroutines tutorial
> yet? Or am I the only one who likes it? :-)

Yes, I've read it, and I quite like it too.

As for where yield-from fits into it, mainly it would
be in section 8, where it would eliminate the need for
trampolining to handle calls/returns.

It doesn't directly help with pipelines of coroutines,
because you're processing the values at each step
rather than just passing them through. But it would
enable a single step of the pipeline to be spread over
more than one function more easily (something he refrains
from doing at that stage in the tutorial, because it
would require the trampolining technique that he doesn't
develop until later).

-- 
Greg



From denis.spir at free.fr  Fri Apr  3 10:19:15 2009
From: denis.spir at free.fr (spir)
Date: Fri, 3 Apr 2009 10:19:15 +0200
Subject: [Python-ideas] name export
Message-ID: <20090403101915.479e4e0b@o>

Hello,

When I write tool modules that export useful names to client code, I usually use __all__ to select the proper names. Sure, it's a potential source of identifier conflicts. I also have a custom __*names__ module attribute that at least lets the client see which names are defined in the imported module:

# module M
__Mnames__ = [...]
__all__ = ["__Mnames__"] + __Mnames__

Then
	from M import * ; print __Mnames__
outputs needed naming information:

	from M import * ; print __Mnames__ ; print dir()
==>
	['a', 'b', 'c']
	['__Mnames__', '__builtins__', '__doc__', '__file__', '__name__', 'a', 'b', 'c']

[Indeed, you'd have the same info with M.__all__, but I find it strange to have both "from M import *" and "import M" only to access its __all__ attribute. Also, it happens that a module name and its main defined name are identical, like time.time.]

A complication arises when the toolset is structured as a hierarchy of modules. Then I have a top module that (often) only serves as a name interface. Each module exports a "name summary" attribute. For instance, in the case below (where M22 is an internal tool module, not to be exported):

M0
	M1
	M2
		M21
		M22

the import/export scheme would be:

# M21
__M21names__ = [...]
__all__ = ["__M21names__"] + __M21names__
# M22
__all__ = [...]
# M2
from M21 import *
from M22 import *
__M2names__ = [...] + __M21names__
__all__ = ["__M2names__"] +__M2names__
# M1
__M1names__ = [...]
__all__ = ["__M1names__"] + __M1names__
# M0
from M1 import *
from M2 import *
__M0names__ = [...] + __M1names__ + __M2names__
__all__ = ["__M0names__"] + __M0names__

Now, when I modify a module in a way that changes, deletes, or adds exported names, I only need to handle the update _locally_. The update automatically climbs up the export chain. [Only a module name change remains problematic -- I'm thinking of an automatic module name lookup/update tool.]

Well, all of this looks a bit forced and rather unpythonic to me. It works fine, but I'm not satisfied. I wonder whether I'm overlooking something obvious. And if yes, what?
Or conversely, do you think there could/should be a nicer (and builtin) way of doing such things? 

Denis
------
la vita e estrany


From denis.spir at free.fr  Fri Apr  3 10:28:23 2009
From: denis.spir at free.fr (spir)
Date: Fri, 3 Apr 2009 10:28:23 +0200
Subject: [Python-ideas] name export -- woops!
In-Reply-To: <20090403101915.479e4e0b@o>
References: <20090403101915.479e4e0b@o>
Message-ID: <20090403102823.674f8943@o>

Sorry, I didn't intend to send this post yet -- it was rather a rough version waiting for further research -- but I hit the wrong button. Anyway, now it's published... 

Le Fri, 3 Apr 2009 10:19:15 +0200,
spir <denis.spir at free.fr> s'exprima ainsi:

> Hello,
> 
> When I write tool modules that export useful names to client code, I
> usually use __all__ to select proper names. Sure, it's a potential source
> of identifier conflict. I have another custom __*names__ module attribute
> that allows the client at least to control which names are defined in the
> imported module:
> 
> # module M
> __Mnames__ = [...]
> __all__ = ["__Mnames__"] + __M_names__
> 
> Then
> 	from M import * ; print __Mnames__
> outputs needed naming information:
> 
> 	from M import * ; print __Mnames__ ; print dir()
> ==>
> 	['a', 'b', 'c']
> 	['__Mnames__', '__builtins__', '__doc__', '__file__', '__name__',
> 'a', 'b', 'c']
> 
> [Indeed, you'd have the same info with M.__all__, but I find it strange to
> have both "from M import *" and "import M" only to access its __all__
> attribute. Also, it happens that a module name and it's main defined name
> are identical, like time.time.]
> 
> A complication arises when the toolset is structured as a hierarchy of
> modules. Then I have a top module that (oft) only serves as name interface.
> Each module exports a "name summary" attribute. For instance, in the case
> below (where M22 is an internal tool module, not to be exported):
> 
> M0
> 	M1
> 	M2
> 		M21
> 		M22
> 
> the import/export schedule would be:
> 
> # M21
> __M21names__ = [...]
> __all__ = ["__M21names__"] + __M21names__
> # M22
> __all__ = [...]
> # M2
> from M21 import *
> from M22 import *
> __M2names__ = [...] + __M21names__
> __all__ = ["__M2names__"] +__M2names__
> # M1
> __M1names__ = [...]
> __all__ = ["__M1names__"] + __M1names__
> # M0
> from M1 import *
> from M2 import *
> __M0names__ = [...] + __M1names__ + __M2names__
> __all__ = ["__M0names__"] + __M0names__
> 
> Now, when I modify a module in a way that leads to change/delete/add
> exported names, I only need to care with this update _locally_. The update
> will automatically climb up the export chain. [Only a module name change
> remains problematic -- I'm thinking at an automatic module name
> lookup/update tool.]
> 
> Well, all of this looks a bit forced and rather unpythonic to me. It works
> fine, but I'm not satisfied. I wonder whether I'm overlooking something
> obvious. And if yes, what? Or conversely, do you think there could/should
> be a nicer (and builtin) way of doing such things? 
> 
> Denis
> ------
> la vita e estrany
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
> 


------
la vita e estrany


From greg.ewing at canterbury.ac.nz  Fri Apr  3 10:40:17 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 03 Apr 2009 20:40:17 +1200
Subject: [Python-ideas] Yet another alternative name for yield-from
In-Reply-To: <49D505EC.8010305@improva.dk>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com> <49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk> <49D05C8F.3040800@canterbury.ac.nz>
	<ca471dc20903312110y287cc5fbhf3abfb8fdac5d3dd@mail.gmail.com>
	<49D317D5.6080705@canterbury.ac.nz>
	<ca471dc20904010936h36fc2315saf9df127463bd343@mail.gmail.com>
	<49D44EC5.30501@canterbury.ac.nz>
	<ca471dc20904020937v3e4ed057t6b2b216331159274@mail.gmail.com>
	<49D4F861.9040407@gmail.com> <49D505EC.8010305@improva.dk>
Message-ID: <49D5CB71.4080104@canterbury.ac.nz>

Jacob Holm wrote:

> Or what about "yield return"?

That makes no sense either.

The 'yield from' syntax works because it says, at least
in part, what it's doing. All these suggestions are just
gibberish.

-- 
Greg


From lists at cheimes.de  Fri Apr  3 10:56:11 2009
From: lists at cheimes.de (Christian Heimes)
Date: Fri, 03 Apr 2009 10:56:11 +0200
Subject: [Python-ideas] Modules could behave like new-style objects
In-Reply-To: <333edbe80904021414h765f1a0bmfe39ac6efcdae8d7@mail.gmail.com>
References: <333edbe80904021414h765f1a0bmfe39ac6efcdae8d7@mail.gmail.com>
Message-ID: <gr4ivb$riu$1@ger.gmane.org>

Zac Burns schrieb:
> I would like to see modules behave more like new-style objects. One
> should be able to add properties, descriptors, __getattr__, and such.
> 
> One common use case for me is implementing wrapper modules that
> populate dynamically (like the ctypes.windll object - except as a
> module).

Modules are instances of a new-style class and behave exactly like
new-style instances. Perhaps you have missed the point that modules are
*instances* of the module type? As you probably know one can't overwrite
magic methods like __getattr__ on the instance. :)
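Christian's point can be checked directly: lookup of magic methods like
__getattr__ consults the type, not the instance. A small sketch:

```python
class Plain:
    pass

obj = Plain()
obj.__getattr__ = lambda name: 42   # stored as a plain instance attribute

try:
    obj.missing                     # lookup ignores the instance attribute
except AttributeError:
    print("instance-level __getattr__ was ignored")

class WithHook:
    def __getattr__(self, name):    # defined on the class: it works
        return 42

print(WithHook().missing)           # 42
```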

Christian



From andre.roberge at gmail.com  Fri Apr  3 12:53:29 2009
From: andre.roberge at gmail.com (Andre Roberge)
Date: Fri, 3 Apr 2009 07:53:29 -0300
Subject: [Python-ideas] clear() method for lists
Message-ID: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>

Hi everyone,

On the general Python list, a suggestion was made to add a clear() method to
list, as the "obvious" way to do
del some_list[:]
or
some_list[:] = []

since the clear() method is currently the obvious way to remove all elements
from dict and set objects.

I believe that this would be a lot more intuitive to beginners learning the
language, making Python more uniform.
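For reference, the two current spellings both clear the list in place,
which matters when the list is aliased (a quick sketch):

```python
a = [1, 2, 3]
b = a            # second reference to the same list
del a[:]         # clears in place
print(b)         # [] -- the alias sees the change

c = [1, 2, 3]
d = c
c[:] = []        # also clears in place
print(d)         # []

e = [1, 2, 3]
f = e
e = []           # rebinding, NOT clearing: f keeps the old contents
print(f)         # [1, 2, 3]
```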

André
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090403/605c9239/attachment.html>

From list at qtrac.plus.com  Fri Apr  3 13:15:20 2009
From: list at qtrac.plus.com (Mark Summerfield)
Date: Fri, 3 Apr 2009 12:15:20 +0100
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
Message-ID: <200904031215.21277.list@qtrac.plus.com>

On 2009-04-03, Andre Roberge wrote:
> Hi everyone,
>
> On the general Python list, a suggestion was made to add a clear() method
> to list, as the "obvious" way to do
> del some_list[:]
> or
> some_list[:] = []
>
> since the clear() method is currently the obvious way to remove all
> elements from dict and set objects.
>
> I believe that this would be a lot more intuitive to beginners learning the
> language, making Python more uniform.
>
> André

Hi,

I have a use case for list.clear() (might be a bit obscure though).

If you have a class that includes a list as an attribute (e.g., a list
"subclass" that uses aggregation rather than inheritance), you might
want to delegate many list methods to the list attribute and only
implement those you want to treat specially. I show an example of this
in "Programming in Python 3" (pages 367/8) where I have a @delegate
decorator that accepts an attribute name and a tuple of methods to
delegate to, e.g.:

    @delegate("__list", ("pop", "__delitem__", "__getitem__", ...))
    class MyList:
        ...
        def clear(self):
            self.__list = []

But because there is no list.clear(), the clear() method must be
implemented rather than delegated even though it doesn't do anything
special.
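A minimal sketch of such a @delegate decorator (my own illustration, not
the book's implementation; the attribute and method names are made up):

```python
def delegate(attr_name, method_names):
    # class decorator: forward each named method to the wrapped attribute
    def make_forwarder(name):
        def forwarder(self, *args, **kwargs):
            return getattr(getattr(self, attr_name), name)(*args, **kwargs)
        return forwarder
    def decorator(cls):
        for name in method_names:
            setattr(cls, name, make_forwarder(name))
        return cls
    return decorator

@delegate("_items", ("append", "pop", "__len__", "__getitem__"))
class MyList:
    def __init__(self):
        self._items = []
    def clear(self):          # must be written by hand: no list.clear() yet
        self._items = []

ml = MyList()
ml.append(1)
ml.append(2)
print(len(ml), ml[0])   # 2 1
ml.clear()
print(len(ml))          # 0
```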


+1

-- 
Mark Summerfield, Qtrac Ltd, www.qtrac.eu
    C++, Python, Qt, PyQt - training and consultancy
	"Programming in Python 3" - ISBN 0137129297



From gnewsg at gmail.com  Fri Apr  3 14:22:20 2009
From: gnewsg at gmail.com (Giampaolo Rodola')
Date: Fri, 3 Apr 2009 05:22:20 -0700 (PDT)
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
Message-ID: <30f8400d-b378-46d5-8a12-1874b0fc37ed@u8g2000yqn.googlegroups.com>

On 3 Apr, 12:53, Andre Roberge <andre.robe... at gmail.com> wrote:
> Hi everyone,
>
> On the general Python list, a suggestion was made to add a clear() method to
> list, as the "obvious" way to do
> del some_list[:]
> or
> some_list[:] = []
>
> since the clear() method is currently the obvious way to remove all elements
> from dict and set objects.
>
> I believe that this would be a lot more intuitive to beginners learning the
> language, making Python more uniform.
>
> André


I always wondered why there wasn't such a thing from the beginning.
+ 1 from me.


--- Giampaolo
http://code.google.com/p/pyftpdlib


From theller at ctypes.org  Fri Apr  3 14:57:26 2009
From: theller at ctypes.org (Thomas Heller)
Date: Fri, 03 Apr 2009 14:57:26 +0200
Subject: [Python-ideas] Modules could behave like new-style objects
In-Reply-To: <gr4ivb$riu$1@ger.gmane.org>
References: <333edbe80904021414h765f1a0bmfe39ac6efcdae8d7@mail.gmail.com>
	<gr4ivb$riu$1@ger.gmane.org>
Message-ID: <gr5134$6au$1@ger.gmane.org>

Christian Heimes schrieb:
> Zac Burns schrieb:
>> I would like to see modules behave more like new-style objects. One
>> should be able to add properties, descriptors, __getattr__, and such.
>> 
>> One common use case for me is implementing wrapper modules that
>> populate dynamically (like the ctypes.windll object - except as a
>> module).
> 
> Modules are new style classes and they behave exactly like new style
> classes. Perhaps you have missed the point that modules are *instances*
> of the module type? As you probably know one can't overwrite magic
> methods like __getattr__ on the instance. :)

But one could probably implement the module type's __getattr__ method
to delegate to the instance, somehow.  Yes, I know that Guido doesn't like it.

What one could also do is to replace the module object in sys.modules
with a magic object that specializes the behaviour.
The 'include' function in this module does it:

http://svn.python.org/view/ctypes/trunk/ctypeslib/ctypeslib/dynamic_module.py?revision=HEAD&view=markup
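The sys.modules trick can be sketched like this (a toy stand-in for what
dynamic_module.py does; the lazy attribute behaviour here is invented
purely for illustration):

```python
import sys
import types

class MagicModule(types.ModuleType):
    def __getattr__(self, name):          # called only for missing attributes
        value = "resolved:%s" % name      # placeholder for real lazy loading
        setattr(self, name, value)        # cache for subsequent lookups
        return value

# install the magic object under a module name of its own
sys.modules['demo_dynamic'] = MagicModule('demo_dynamic')

import demo_dynamic                       # import just returns the cached entry
print(demo_dynamic.spam)                  # resolved:spam
```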

-- 
Thanks,
Thomas



From ncoghlan at gmail.com  Fri Apr  3 15:13:37 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 03 Apr 2009 23:13:37 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
Message-ID: <49D60B81.6060209@gmail.com>

Jim Jewett wrote:
> The times I did remember that (even) the expression form looped, I was
> still boggled that it would return something other than None after it
> was exhausted.  Greg's answer was that it was for threading, and the
> final return was the real value.  This seems like a different category
> of generator, but I could get my head around it -- so long as I forgot
> that the yield itself was returning anything useful.

Greg tried to clarify this a bit already, but I think Jacob's averager
example is an interesting case where it makes sense to both yield
multiple times and also "return a value".

While it is just a toy example, I believe it does a great job of
illustrating the control flow expectations. (Writing this email has
certainly clarified a lot of things about the PEP in my *own* mind).

The following reworking of Jacob's example assumes a couple of things
that differ from the current PEP:

- the particular colour my bikeshed is painted when it comes to
returning values from a generator is "return finally" (the idea being to
emphasise that this represents a special "final" value for the generator
that happens only after all of the normal yields are done).

- rather than trying to change the meaning of GeneratorExit and close(),
3 new generator methods would be added: next_return(), send_return() and
throw_return(). The new methods have the same signatures as their
existing counterparts, but if the generator raises GeneratorReturn, they
trap it and return the associated value instead. Like close(), they
complain with a RuntimeError if the generator doesn't finish. For example:

  def throw_return(self, *exc_info):
    try:
      self.throw(*exc_info)
      raise RuntimeError("Generator did not terminate")
    except GeneratorReturn as gr:
      return gr.value

(Note that I've also removed the 'yield raise' idea from the example -
if next() or send() triggers termination of the generator with an
exception other than StopIteration, then that exception is already
propagated into the calling scope by the existing generator machinery. I
realise Jacob was trying to make it possible to "yield an exception"
without terminating the coroutine, but that idea is well beyond the
scope of the PEP)

You then get:

  class CalcAverage(Exception): pass

  def averager(start=0):
    # averager that maintains a running average
    # and returns the final average when done
    count = 0
    sum = start
    while 1:
      avg = sum / count if count else 0.0
      try:
        val = yield avg
      except CalcAverage:
        return finally avg
      sum += val
      count += 1

  avg = averager()
  avg.next() # start coroutine
  avg.send(1.0) # yields 1.0
  avg.send(2.0) # yields 1.5
  print avg.throw_return(CalcAverage) # prints 1.5

Now, suppose I want to write another toy coroutine that calculates the
averages of two sequences and then returns the difference:

  def average_diff(start=0):
    avg1 = yield from averager(start)
    avg2 = yield from averager(start)
    return finally avg2 - avg1

  diff = average_diff()
  diff.next() # start coroutine
              # yields 0.0
  avg.send(1.0) # yields 1.0
  avg.send(2.0) # yields 1.5
  diff.throw(CalcAverage) # Starts calculation of second average
                          # yields 0.0
  diff.send(2.0) # yields 2.0
  diff.send(3.0) # yields 2.5
  print diff.throw_return(CalcAverage) # Prints 1.0 (from "2.5 - 1.5")

The same example could be rewritten to use "None" as a sentinel value
instead of throwing in an exception (average_diff doesn't change, so I
haven't rewritten that part):

  def averager(start=0):
    count = 0
    exc = None
    sum = start
    while 1:
      avg = sum / count
      val = yield avg
      if val is None:
        return finally avg
      sum += val
      count += 1

  # yielded values are same as the throw_return() approach
  diff = average_diff()
  diff.next() # start coroutine
  diff.send(1.0)
  diff.send(2.0)
  diff.send(None) # Starts calculation of second average
  diff.send(2.0)
  diff.send(3.0)
  print diff.send_return(None) # Prints 1.0 (from "2.5 - 1.5")

Notice how the coroutines in this example can be thought of as simple
state machines that the calling code needs to know how to drive. That
state sequence is as much a part of the coroutine's signature as are the
arguments to the constructor and the final return value.
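The whole two-stage example runs in the same spirit under the form PEP 380 finally standardised: plain "return EXPR" inside generators, with "yield from" capturing the value. The send_return() below is a stand-in helper for the proposed method of that name, not an actual generator method:

```python
# averager/average_diff under the eventual PEP 380 semantics.

def averager():
    count, total = 0, 0.0
    avg = None
    while True:
        val = yield avg
        if val is None:        # sentinel: stop and report the average
            return avg
        total += val
        count += 1
        avg = total / count

def average_diff():
    avg1 = yield from averager()   # first sequence
    avg2 = yield from averager()   # second sequence
    return avg2 - avg1

def send_return(gen, value):
    """Helper standing in for the proposed send_return() method."""
    try:
        gen.send(value)
    except StopIteration as stop:
        return stop.value
    raise RuntimeError("generator did not terminate")

diff = average_diff()
next(diff)            # start; the first averager yields None
diff.send(1.0)
diff.send(2.0)
diff.send(None)       # first average (1.5) captured; second averager starts
diff.send(2.0)
diff.send(3.0)
print(send_return(diff, None))   # -> 1.0  (2.5 - 1.5)
```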

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Fri Apr  3 15:18:15 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 03 Apr 2009 23:18:15 +1000
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D555D1.9050701@canterbury.ac.nz>
References: <49AB1F90.7070201@canterbury.ac.nz> <49CA4029.6050703@improva.dk>
	<49CABFC6.1080207@canterbury.ac.nz>	<49CAC0FE.5010305@improva.dk>
	<49CACB39.3020708@canterbury.ac.nz>	<49CAD15D.2090008@improva.dk>
	<49CB155E.4040504@canterbury.ac.nz>	<49CB8E4A.3050108@improva.dk>
	<49CC5D85.30409@canterbury.ac.nz>	<49CE29BF.3040502@improva.dk>
	<49CEB8DE.8060200@gmail.com>	<49CEBCD5.7020107@canterbury.ac.nz>
	<49CF6AAF.70109@improva.dk>	<49D05C8F.3040800@canterbury.ac.nz>
	<49D0A324.1030701@gmail.com>	<49D143B1.9040009@canterbury.ac.nz>
	<49D1542E.7070503@improva.dk>	<49D16294.9030205@canterbury.ac.nz>
	<49D17247.20705@improva.dk>	<49D18D31.9000008@canterbury.ac.nz>
	<49D1E5E6.5000007@improva.dk>	<49D207BE.8090909@gmail.com>
	<49D255BC.6080503@improva.dk>	<49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk>	<49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk>	<49D4A156.9080304@canterbury.ac.nz>
	<49D4A66F.9060900@gmail.com> <49D555D1.9050701@canterbury.ac.nz>
Message-ID: <49D60C97.3040203@gmail.com>

Greg Ewing wrote:
> Nick Coghlan wrote:
> 
>> I think I'd prefer to see some arbitrary limit (500 seems like a nice
>> round number) on the number of times that GeneratorExit would be thrown
>> before giving up
> 
> Is it really worth singling out this particular way
> of writing an infinite loop?
> 
> If you're catching GeneratorExit then you presumably
> have the need to clean up and exit on your mind, so
> I don't think this is a likely mistake to make.

I came up with a different answer that I like better - don't mess with
GeneratorExit and close() at all, and instead provide next_return(),
send_return() and throw_return() methods that *expect* to get a
GeneratorReturn exception in response (and complain if it doesn't happen).

(I expand on this idea in a lot more detail in my reply to Jim)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From lists at cheimes.de  Fri Apr  3 16:12:07 2009
From: lists at cheimes.de (Christian Heimes)
Date: Fri, 03 Apr 2009 16:12:07 +0200
Subject: [Python-ideas] Modules could behave like new-style objects
In-Reply-To: <gr5134$6au$1@ger.gmane.org>
References: <333edbe80904021414h765f1a0bmfe39ac6efcdae8d7@mail.gmail.com>	<gr4ivb$riu$1@ger.gmane.org>
	<gr5134$6au$1@ger.gmane.org>
Message-ID: <gr55fn$mrv$1@ger.gmane.org>

Thomas Heller wrote:
> But one could probably implement the module type's __getattr__ method
> to delegate to the instance, somehow.  Yes, I know that Guido doesn't like it.

As a matter of fact Guido has mentioned the very same idea a couple of
months ago.

Christian



From jh at improva.dk  Fri Apr  3 17:15:16 2009
From: jh at improva.dk (Jacob Holm)
Date: Fri, 03 Apr 2009 17:15:16 +0200
Subject: [Python-ideas] yield-from and @coroutine decorator
 [was:x=(yield from) confusion]
In-Reply-To: <49D60B81.6060209@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
Message-ID: <49D62804.8080102@improva.dk>

Hi Nick,

Your reworking of my "averager" example has highlighted another issue 
for me, which I will get to below. First a few comments on your message.

Nick Coghlan wrote:
> [snip]
>
> The following reworking of Jacob's example assumes a couple of things
> that differ from the current PEP:
>
> - the particular colour my bikeshed is painted when it comes to
> returning values from a generator is "return finally" (the idea being to
> emphasise that this represents a special "final" value for the generator
> that happens only after all of the normal yields are done).
>   

We should probably drop that particular bikeshed discussion until we 
actually know the details of what the construct should do, esp in the 
context of close(). I am starting to lose track of all the different 
possible versions.

> - rather than trying to change the meaning of GeneratorExit and close(),
> 3 new generator methods would be added: next_return(), send_return() and
> throw_return(). The new methods have the same signatures as their
> existing counterparts, but if the generator raises GeneratorReturn, they
> trap it and return the associated value instead. Like close(), they
> complain with a RuntimeError if the generator doesn't finish. For example:
>
>   def throw_return(self, *exc_info):
>     try:
>       self.throw(*exc_info)
>       raise RuntimeError("Generator did not terminate")
>     except GeneratorReturn as gr:
>       return gr.value
>   

I don't much like the idea of adding these methods, but that is not the 
point of this mail.

> (Note that I've also removed the 'yield raise' idea from the example -
> if next() or send() triggers termination of the generator with an
> exception other than StopIteration, then that exception is already
> propagated into the calling scope by the existing generator machinery. I
> realise Jacob was trying to make it possible to "yield an exception"
> without terminating the coroutine, but that idea is well beyond the
> scope of the PEP)
>   

I think it was pretty clearly marked as out of scope for this PEP, but I 
still like the idea.

> You then get:
>
>   class CalcAverage(Exception): pass
>
>   def averager(start=0):
>     # averager that maintains a running average
>     # and returns the final average when done
>     count = 0
>     exc = None
>     sum = start
>     while 1:
>       avg = sum / count
>       try:
>         val = yield avg
>       except CalcAverage:
>         return finally avg
>       sum += val
>       count += 1
>
>   avg = averager()
>   avg.next() # start coroutine
>   avg.send(1.0) # yields 1.0
>   avg.send(2.0) # yields 1.5
>   print avg.throw_return(CalcAverage) # prints 1.5
>   

This version has a bug. It will raise ZeroDivisionError on the initial 
next() call used to start the generator. A better version, if you insist 
on yielding the running average, would be:

def averager(start=0):
    # averager that maintains a running average
    # and returns the final average when done
    count = 0
    sum = start
    avg = None
    while 1:
        try:
            val = yield avg
        except CalcAverage:
            return finally avg
        sum += val
        count += 1
        avg = sum/count


> Now, suppose I want to write another toy coroutine that calculates the
> averages of two sequences and then returns the difference:
>
>   def average_diff(start=0):
>     avg1 = yield from averager(start)
>     avg2 = yield from averager(start)
>     return finally avg2 - avg1
>
>   diff = average_diff()
>   diff.next() # start coroutine
>               # yields 0.0
>   avg.send(1.0) # yields 1.0
>   avg.send(2.0) # yields 1.5
>   diff.throw(CalcAverage) # Starts calculation of second average
>                           # yields 0.0
>   diff.send(2.0) # yields 2.0
>   diff.send(3.0) # yields 2.5
>   print diff.throw_return(CalcAverage) # Prints 1.0 (from "2.5 - 1.5")
>
>   
(There is another minor bug here: the two avg.send() calls should have 
been diff.send()).


Now for my problem. The original averager example was inspired by the 
tutorial http://dabeaz.com/coroutines/ that Guido pointed to. (Great 
stuff, btw). One pattern that is recommended by the tutorial and used 
throughout is to decorate all coroutines with a decorator like:

def coroutine(func):
    def start(*args,**kwargs):
        cr = func(*args,**kwargs)
        cr.next()
        return cr
    return start


The idea is that it saves you from the initial next() call used to start 
the coroutine. The problem is that you cannot use such a decorated 
coroutine in any flavor of the yield-from expression we have considered 
so far, because the yield-from will start out by doing an *additional* 
next call and yield that value.

I have a few vague ideas of how we might change "yield from" to support 
this, but nothing concrete enough to put here. Is this a problem we 
should try to fix, and if so, how?
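The clash is easy to demonstrate in today's Python 3, where yield-from exists: delegating to an already-primed coroutine means the implicit first next() performed by yield-from resumes the generator with None, so a value nobody sent leaks in. (The decorator and coroutine names below are illustrative.)

```python
def coroutine(func):
    # Beazley-style priming decorator: advance to the first yield
    def start(*args, **kwargs):
        cr = func(*args, **kwargs)
        next(cr)
        return cr
    return start

@coroutine
def recorder():
    received = []
    while True:
        item = yield received
        received.append(item)

def delegator():
    yield from recorder()

d = delegator()
# yield from begins by resuming the already-primed coroutine with an
# implicit next() - equivalent to send(None) - so a None that the
# caller never sent has already been recorded:
print(next(d))   # -> [None]
```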


not-trying-to-be-difficult-ly yours
- Jacob



From ncoghlan at gmail.com  Fri Apr  3 18:08:13 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 04 Apr 2009 02:08:13 +1000
Subject: [Python-ideas] yield-from and @coroutine decorator
 [was:x=(yield from) confusion]
In-Reply-To: <49D62804.8080102@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com> <49D62804.8080102@improva.dk>
Message-ID: <49D6346D.3000002@gmail.com>

Jacob Holm wrote:
>> - the particular colour my bikeshed is painted when it comes to
>> returning values from a generator is "return finally" (the idea being to
>> emphasise that this represents a special "final" value for the generator
>> that happens only after all of the normal yields are done).
>>   
> 
> We should probably drop that particular bikeshed discussion until we
> actually know the details of what the construct should do, esp in the
> context of close(). I am starting to lose track of all the different
> possible versions.

Note that the syntax for returning values from generators is largely
independent of the semantics. Guido has pointed out that disallowing the
naive "return EXPR" in generators is an important learning tool for
inexperienced generator users, and I think he's right.

"return finally" reads pretty well and doesn't add a new keyword, while
still allowing generator return values to be written easily. I haven't
seen other suggestions I particularly like, so I figured I'd run with
that one for the revised example :)

>> - rather than trying to change the meaning of GeneratorExit and close(),
>> 3 new generator methods would be added: next_return(), send_return() and
>> throw_return(). The new methods have the same signatures as their
>> existing counterparts, but if the generator raises GeneratorReturn, they
>> trap it and return the associated value instead. Like close(), they
>> complain with a RuntimeError if the generator doesn't finish. For
>> example:
>>
>>   def throw_return(self, *exc_info):
>>     try:
>>       self.throw(*exc_info)
>>       raise RuntimeError("Generator did not terminate")
>>     except GeneratorReturn as gr:
>>       return gr.value
>>   
> 
> I don't much like the idea of adding these methods, but that is not the
> point of this mail.

They don't have to be generator methods - they could easily be functions
in a coroutine module. However, I definitely prefer the idea of new
methods or functions that support a variety of interaction styles over
trying to redefine generator finalisation tools (i.e. GeneratorExit and
close()) to cover this completely different use case. Why create a
potential backwards compatibility problem for ourselves when there are
equally clean alternative solutions?

I also don't like the idea of imposing a specific coroutine return idiom
in the PEP - better to have a system that supports both sentinel values
(via next_return() and send_return()) and sentinel exceptions (via
send_throw()).

> Now for my problem. The original averager example was inspired by the
> tutorial http://dabeaz.com/coroutines/ that Guido pointed to. (Great
> stuff, btw). One pattern that is recommended by the tutorial and used
> throughout is to decorate all coroutines with a decorator like:
> 
> def coroutine(func):
>    def start(*args,**kwargs):
>        cr = func(*args,**kwargs)
>        cr.next()
>        return cr
>    return start
> 
> 
> The idea is that it saves you from the initial next() call used to start
> the coroutine. The problem is that you cannot use such a decorated
> coroutine in any flavor of the yield-from expression we have considered
> so far, because the yield-from will start out by doing an *additional*
> next call and yield that value.
> 
> I have a few vague ideas of how we might change "yield from" to support
> this, but nothing concrete enough to put here. Is this a problem we
> should try to fix, and if so, how?

Hmm, that's a tricky one. It sounds like it is definitely an issue the
PEP needs to discuss, but I don't currently have an opinion as to what
it should say.

> not-trying-to-be-difficult-ly yours

We have a long way to go before we even come close to consuming as many
pixels as PEP 308 or PEP 343 - a fact for which Greg is probably grateful ;)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jimjjewett at gmail.com  Fri Apr  3 18:42:20 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Fri, 3 Apr 2009 12:42:20 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D5B98A.20700@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D5B98A.20700@canterbury.ac.nz>
Message-ID: <fb6fbf560904030942m731f4bfbr709f59fc309b26b2@mail.gmail.com>

On 4/3/09, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Jim Jewett wrote:

>> yield from *as an expression* only really makes sense if the generator
>> is sending useful information *both* ways.

> No, that's not the only way it makes sense. In my
> multitasking example, none of the yields send or
> receive any values.

err... I didn't mean both directions, I meant "from the callee to the
caller as a yielded value" and "from the callee to the caller as a
final return value that can't be yielded normally."

> But they're still needed,
> because they define the points at which the task
> can be suspended.

If they don't send or receive values, then why do they need to be
expressions instead of statements?


>> The times I did remember that (even) the expression form looped,

> The yield-from expression itself doesn't loop. What
> it does do is yield multiple times,

That sounds to me like an implicit loop.

    yield from iter <==>  for val in iter: yield val

So the outside generator won't progress to its own next line (and
subsequent yield) until it has finished looping over the inner
generator.
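That reading matches the semantics that were eventually standardised; in today's Python 3 the implicit loop is easy to check directly:

```python
def inner():
    yield 1
    yield 2

def outer():
    yield from inner()   # re-yields each value from inner
    yield 3              # not reached until inner is exhausted

print(list(outer()))   # -> [1, 2, 3]
```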

-jJ


From jimjjewett at gmail.com  Fri Apr  3 18:48:38 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Fri, 3 Apr 2009 12:48:38 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D60B81.6060209@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
Message-ID: <fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>

On 4/3/09, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Greg tried to clarify this a bit already, but I think Jacob's averager
> example is an interesting case where it makes sense to both yield
> multiple times and also "return a value".

>   def averager(start=0):
>     # averager that maintains a running average
>     # and returns the final average when done
>     count = 0
>     exc = None
>     sum = start
>     while 1:
>       avg = sum / count
>       try:
>         val = yield avg
>       except CalcAverage:
>         return finally avg
>       sum += val
>       count += 1

It looks to me like it returns (or yields) the running average either way.

I see a reason to send in a sentinel value, saying "Don't update the
average, just tell me the current value."

I don't see why that sentinel has to terminate the generator, nor do I
see why that final average has to be returned rather than yielded.

-jJ


From jimjjewett at gmail.com  Fri Apr  3 19:06:52 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Fri, 3 Apr 2009 13:06:52 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <ca471dc20904022037v7ed3eaacmf2e10b07809cecef@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<ca471dc20904022037v7ed3eaacmf2e10b07809cecef@mail.gmail.com>
Message-ID: <fb6fbf560904031006y2a934e3bg1c34c828fac07263@mail.gmail.com>

On 4/2/09, Guido van Rossum <guido at python.org> wrote:
> On Thu, Apr 2, 2009 at 6:43 PM, Jim Jewett <jimjjewett at gmail.com> wrote:
>> yield from *as an expression* only really makes sense if the generator
>> is sending useful information *both* ways.  I can understand that sort
>> of generator only while reading the PEP; the code smell is strong
>> enough that I forget it by the next day.

> Read Dave Beazley's coroutines tutorial (dabeaz.com/coroutines) and
> check out the contortions in the scheduler to support subgenerators
> (Part 8).

I have.  I still don't see it really helping with anything except
maybe (in http://dabeaz.com/coroutines/pyos8.py) the Task.run method.

Even there, I don't see an expression form helping very much.  Passing
through the intermediate yields could be nice, but a statement can do
that.  Grabbing a separate final value could change the "try ...
except StopIteration" into an "x=yield from", but ... I'm not sure
that could really work, because I don't see how all the code inside
the try block goes away.  (It might go away if you moved that logic
from the task to the scheduler, but then the protocol seems to get
even more complicated, as the scheduler has to distinguish between
"I've used up this time slot", as well as "I have intermediate
results, but might be called again", and "I'm done".)

-jJ


From guido at python.org  Fri Apr  3 19:19:53 2009
From: guido at python.org (Guido van Rossum)
Date: Fri, 3 Apr 2009 10:19:53 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D60B81.6060209@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D60B81.6060209@gmail.com>
Message-ID: <ca471dc20904031019v74dcf36dj7df9bd7566fb291a@mail.gmail.com>

-1 on adding more methods to generators.
+1 on adding this as a recipe to the docs.

On Fri, Apr 3, 2009 at 6:13 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Jim Jewett wrote:
>> The times I did remember that (even) the expression form looped, I was
>> still boggled that it would return something other than None after it
>> was exhausted. Greg's answer was that it was for threading, and the
>> final return was the real value. This seems like a different category
>> of generator, but I could get my head around it -- so long as I forgot
>> that the yield itself was returning anything useful.
>
> Greg tried to clarify this a bit already, but I think Jacob's averager
> example is an interesting case where it makes sense to both yield
> multiple times and also "return a value".
>
> While it is just a toy example, I believe it does a great job of
> illustrating the control flow expectations. (Writing this email has
> certainly clarified a lot of things about the PEP in my *own* mind).
>
> The following reworking of Jacob's example assumes a couple of things
> that differ from the current PEP:
>
> - the particular colour my bikeshed is painted when it comes to
> returning values from a generator is "return finally" (the idea being to
> emphasise that this represents a special "final" value for the generator
> that happens only after all of the normal yields are done).
>
> - rather than trying to change the meaning of GeneratorExit and close(),
> 3 new generator methods would be added: next_return(), send_return() and
> throw_return(). The new methods have the same signatures as their
> existing counterparts, but if the generator raises GeneratorReturn, they
> trap it and return the associated value instead. Like close(), they
> complain with a RuntimeError if the generator doesn't finish. For example:
>
>   def throw_return(self, *exc_info):
>     try:
>       self.throw(*exc_info)
>       raise RuntimeError("Generator did not terminate")
>     except GeneratorReturn as gr:
>       return gr.value
>
> (Note that I've also removed the 'yield raise' idea from the example -
> if next() or send() triggers termination of the generator with an
> exception other than StopIteration, then that exception is already
> propagated into the calling scope by the existing generator machinery. I
> realise Jacob was trying to make it possible to "yield an exception"
> without terminating the coroutine, but that idea is well beyond the
> scope of the PEP)
>
> You then get:
>
>   class CalcAverage(Exception): pass
>
>   def averager(start=0):
>     # averager that maintains a running average
>     # and returns the final average when done
>     count = 0
>     exc = None
>     sum = start
>     while 1:
>       avg = sum / count
>       try:
>         val = yield avg
>       except CalcAverage:
>         return finally avg
>       sum += val
>       count += 1
>
>   avg = averager()
>   avg.next() # start coroutine
>   avg.send(1.0) # yields 1.0
>   avg.send(2.0) # yields 1.5
>   print avg.throw_return(CalcAverage) # prints 1.5
>
> Now, suppose I want to write another toy coroutine that calculates the
> averages of two sequences and then returns the difference:
>
>   def average_diff(start=0):
>     avg1 = yield from averager(start)
>     avg2 = yield from averager(start)
>     return finally avg2 - avg1
>
>   diff = average_diff()
>   diff.next() # start coroutine
>               # yields 0.0
>   avg.send(1.0) # yields 1.0
>   avg.send(2.0) # yields 1.5
>   diff.throw(CalcAverage) # Starts calculation of second average
>                           # yields 0.0
>   diff.send(2.0) # yields 2.0
>   diff.send(3.0) # yields 2.5
>   print diff.throw_return(CalcAverage) # Prints 1.0 (from "2.5 - 1.5")
>
> The same example could be rewritten to use "None" as a sentinel value
> instead of throwing in an exception (average_diff doesn't change, so I
> haven't rewritten that part):
>
>   def averager(start=0):
>     count = 0
>     exc = None
>     sum = start
>     while 1:
>       avg = sum / count
>       val = yield avg
>       if val is None:
>         return finally avg
>       sum += val
>       count += 1
>
>   # yielded values are same as the throw_return() approach
>   diff = average_diff()
>   diff.next() # start coroutine
>   diff.send(1.0)
>   diff.send(2.0)
>   diff.send(None) # Starts calculation of second average
>   diff.send(2.0)
>   diff.send(3.0)
>   print diff.send_return(None) # Prints 1.0 (from "2.5 - 1.5")
>
> Notice how the coroutines in this example can be thought of as simple
> state machines that the calling code needs to know how to drive. That
> state sequence is as much a part of the coroutine's signature as are the
> arguments to the constructor and the final return value.
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
> ---------------------------------------------------------------
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From guido at python.org  Fri Apr  3 19:22:17 2009
From: guido at python.org (Guido van Rossum)
Date: Fri, 3 Apr 2009 10:22:17 -0700
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <49D60C97.3040203@gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49D255BC.6080503@improva.dk> 
	<49D2A759.5080204@canterbury.ac.nz> <49D2C735.8020803@improva.dk> 
	<49D34FDC.5050106@gmail.com> <49D36FD0.3080602@improva.dk> 
	<49D4A156.9080304@canterbury.ac.nz> <49D4A66F.9060900@gmail.com> 
	<49D555D1.9050701@canterbury.ac.nz> <49D60C97.3040203@gmail.com>
Message-ID: <ca471dc20904031022v746f959enf59f5a1a2f322e0d@mail.gmail.com>

On Fri, Apr 3, 2009 at 6:18 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Greg Ewing wrote:
>> Nick Coghlan wrote:
>>
>>> I think I'd prefer to see some arbitrary limit (500 seems like a nice
>>> round number) on the number of times that GeneratorExit would be thrown
>>> before giving up
>>
>> Is it really worth singling out this particular way
>> of writing an infinite loop?
>>
>> If you're catching GeneratorExit then you presumably
>> have the need to clean up and exit on your mind, so
>> I don't think this is a likely mistake to make.
>
> I came up with a different answer that I like better - don't mess with
> GeneratorExit and close() at all, and instead provide next_return(),
> send_return() and throw_return() methods that *expect* to get a
> GeneratorReturn exception in response (and complain if it doesn't happen).
>
> (I expand on this idea in a lot more detail in my reply to Jim)

Since there are so many threads, let me repeat that I'm -1 on adding
more methods, but +1 on adding them to the docs as recipes.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From gerald.britton at gmail.com  Fri Apr  3 19:33:14 2009
From: gerald.britton at gmail.com (Gerald Britton)
Date: Fri, 3 Apr 2009 13:33:14 -0400
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <200904031215.21277.list@qtrac.plus.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com> 
	<200904031215.21277.list@qtrac.plus.com>
Message-ID: <5d1a32000904031033x36eb2fafqb395ac793c36161c@mail.gmail.com>

About your example, what is the advantage over inheriting from list?
I did this myself to build a kind of treed list class that supports
nested lists:

class TreeList(list):
    # uses most list methods
    def __iter__(self):
        # does a recursive descent through nested lists.
        ...

I only had to implement methods that I wanted to add some extra sauce to.
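A runnable guess at the TreeList sketch above, written in modern Python 3 and assuming the intended behaviour is that iteration flattens nested lists:

```python
class TreeList(list):
    """A list subclass whose iteration descends into nested lists."""

    def __iter__(self):
        for item in super().__iter__():    # walk the top-level items
            if isinstance(item, list):
                yield from TreeList(item)  # recurse into sublists
            else:
                yield item

t = TreeList([1, [2, [3, 4]], 5])
print([x for x in t])   # -> [1, 2, 3, 4, 5]
```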


On Fri, Apr 3, 2009 at 7:15 AM, Mark Summerfield <list at qtrac.plus.com> wrote:
> On 2009-04-03, Andre Roberge wrote:
>> Hi everyone,
>>
>> On the general Python list, a suggestion was made to add a clear() method
>> to list, as the "obvious" way to do
>> del some_list[:]
>> or
>> some_list[:] = []
>>
>> since the clear() method is currently the obvious way to remove all
>> elements from dict and set objects.
>>
>> I believe that this would be a lot more intuitive to beginners learning the
>> language, making Python more uniform.
>>
>> André
>
> Hi,
>
> I have a use case for list.clear() (might be a bit obscure though).
>
> If you have a class that includes a list as an attribute (e.g., a list
> "subclass" that uses aggregation rather than inheritance), you might
> want to delegate many list methods to the list attribute and only
> implement those you want to treat specially. I show an example of this
> in "Programming in Python 3" (pages 367/8) where I have a @delegate
> decorator that accepts an attribute name and a tuple of methods to
> delegate to, e.g.:
>
>     @delegate("__list", ("pop", "__delitem__", "__getitem__", ...))
>     class MyList:
>         ...
>         def clear(self):
>             self.__list = []
>
> But because there is no list.clear(), the clear() method must be
> implemented rather than delegated even though it doesn't do anything
> special.
>
>
> +1
>
> --
> Mark Summerfield, Qtrac Ltd, www.qtrac.eu
>    C++, Python, Qt, PyQt - training and consultancy
>        "Programming in Python 3" - ISBN 0137129297
>
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>
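Mark's @delegate decorator could plausibly be written along these lines (a guess, not the book's actual implementation; a single-underscore attribute is used to sidestep the name mangling that "__list" would trigger):

```python
def delegate(attribute_name, method_names):
    """Class decorator: forward each named method to an attribute."""
    def decorator(cls):
        for name in method_names:
            def make_forwarder(method_name):
                def forwarder(self, *args, **kwargs):
                    target = getattr(self, attribute_name)
                    return getattr(target, method_name)(*args, **kwargs)
                return forwarder
            setattr(cls, name, make_forwarder(name))
        return cls
    return decorator

@delegate("_items", ("pop", "__len__", "__getitem__"))
class MyList:
    def __init__(self, items=()):
        self._items = list(items)

    def clear(self):           # still written by hand, since list
        self._items = []       # itself has no clear() method here

ml = MyList([1, 2, 3])
print(len(ml), ml[0], ml.pop())   # -> 3 1 3
ml.clear()
print(len(ml))                    # -> 0
```

Note that the dunder forwarders work because they are installed on the class, where Python looks up special methods.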



-- 
Gerald Britton


From jh at improva.dk  Fri Apr  3 19:47:19 2009
From: jh at improva.dk (Jacob Holm)
Date: Fri, 03 Apr 2009 19:47:19 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet
 another	alternative name for yield-from]
In-Reply-To: <fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
Message-ID: <49D64BA7.2000006@improva.dk>

Jim Jewett wrote:
> It looks to me like it returns (or yields) the running average either way.
>   
That is because Nick has mangled my beautiful example - sorry Nick :)

You can see my original example at:

http://mail.python.org/pipermail/python-ideas/2009-April/003841.html

and a few arguments why I think it is better at:

http://mail.python.org/pipermail/python-ideas/2009-April/003847.html

> I see a reason to send in a sentinel value, saying "Don't update the
> average, just tell me the current value."
>
> I don't see why that sentinel has to terminate the generator, nor do I
> see why that final average has to be returned rather than yielded.
>   

Yielding the current value on each send was not part of the original 
example because I was thinking in terms of well-behaved coroutines as 
described in http://dabeaz.com/coroutines/. I agree that it makes sense 
for running averages, but it is not that hard to come up with similar 
examples where the intermediate state is not really useful and/or may be 
expensive to compute.

The reason for closing would be that once you have computed the final 
result, you want whatever resources the coroutine is using to be freed. 
Since only the final result is assumed to be useful, it makes perfect 
sense to close the coroutine at the same time as you are requesting the 
final result.

- Jacob


From george.sakkis at gmail.com  Fri Apr  3 20:00:07 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Fri, 3 Apr 2009 14:00:07 -0400
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <5d1a32000904031033x36eb2fafqb395ac793c36161c@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<200904031215.21277.list@qtrac.plus.com>
	<5d1a32000904031033x36eb2fafqb395ac793c36161c@mail.gmail.com>
Message-ID: <91ad5bf80904031100w5adc5d3bq5d78f8760ee21656@mail.gmail.com>

On Fri, Apr 3, 2009 at 1:33 PM, Gerald Britton <gerald.britton at gmail.com> wrote:

> About your example, what is the advantage over inheriting from list?

The fact that you can pick and choose which methods to provide, while
with inheritance you get all of them whether you want them or not. For
example one may want to create a FixedSizeList that provides only the
read-only methods plus __setitem__, but not any method that changes
the size of the list. Although you could emulate that with inheritance
(by making all forbidden methods raise AttributeError), it's less
clear than delegation and it breaks introspection.
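
A minimal sketch of that delegation style (class and method selection are hypothetical): only read access and `__setitem__` are forwarded, so the size-changing list methods simply do not exist on the wrapper:

```python
class FixedSizeList:
    """Wraps a list but exposes no size-changing methods."""
    def __init__(self, items):
        self._items = list(items)

    def __len__(self):
        return len(self._items)

    def __getitem__(self, index):
        return self._items[index]

    def __setitem__(self, index, value):
        self._items[index] = value

    def __iter__(self):
        return iter(self._items)

fl = FixedSizeList([1, 2, 3])
fl[0] = 10                 # in-place assignment is allowed
missing = [m for m in ("append", "pop", "insert") if not hasattr(fl, m)]
```

Calling `fl.append(4)` raises AttributeError, and unlike the override-and-raise approach, `hasattr`/`dir` correctly report that the methods are absent.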

George


> I did this myself to build a kind of treed list class that supports
> nested lists:
>
> class TreeList(list):
>     # uses most list methods
>     def __iter__(self):
>         # does a recursive descent through nested lists.
>
> I only had to implement methods that I wanted to add some extra sauce to.
>
>
> On Fri, Apr 3, 2009 at 7:15 AM, Mark Summerfield <list at qtrac.plus.com> wrote:
>> On 2009-04-03, Andre Roberge wrote:
>>> Hi everyone,
>>>
>>> On the general Python list, a suggestion was made to add a clear() method
>>> to list, as the "obvious" way to do
>>> del some_list[:]
>>> or
>>> some_list[:] = []
>>>
>>> since the clear() method is currently the obvious way to remove all
>>> elements from dict and set objects.
>>>
>>> I believe that this would be a lot more intuitive to beginners learning the
>>> language, making Python more uniform.
>>>
>>> André
>>
>> Hi,
>>
>> I have a use case for list.clear() (might be a bit obscure though).
>>
>> If you have a class that includes a list as an attribute (e.g., a list
>> "subclass" that uses aggregation rather than inheritance), you might
>> want to delegate many list methods to the list attribute and only
>> implement those you want to treat specially. I show an example of this
>> in "Programming in Python 3" (pages 367/8) where I have a @delegate
>> decorator that accepts an attribute name and a tuple of methods to
>> delegate to, e.g.:
>>
>>    @delegate("__list", ("pop", "__delitem__", "__getitem__", ...))
>>    class MyList:
>>        ...
>>        def clear(self):
>>            self.__list = []
>>
>> But because there is no list.clear(), the clear() method must be
>> implemented rather than delegated even though it doesn't do anything
>> special.
>>
>>
>> +1
>>
>> --
>> Mark Summerfield, Qtrac Ltd, www.qtrac.eu
>>     C++, Python, Qt, PyQt - training and consultancy
>>         "Programming in Python 3" - ISBN 0137129297
>>
>> _______________________________________________
>> Python-ideas mailing list
>> Python-ideas at python.org
>> http://mail.python.org/mailman/listinfo/python-ideas
>>
>
>
>
> --
> Gerald Britton
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>


From fuzzyman at gmail.com  Fri Apr  3 20:33:37 2009
From: fuzzyman at gmail.com (Michael Foord)
Date: Fri, 3 Apr 2009 19:33:37 +0100
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <30f8400d-b378-46d5-8a12-1874b0fc37ed@u8g2000yqn.googlegroups.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<30f8400d-b378-46d5-8a12-1874b0fc37ed@u8g2000yqn.googlegroups.com>
Message-ID: <6f4025010904031133g1a0b61dbjabbc59ff5ad64c19@mail.gmail.com>

+1 from me

The current methods are all slightly ugly.

Michael

2009/4/3 Giampaolo Rodola' <gnewsg at gmail.com>

> On 3 Apr, 12:53, Andre Roberge <andre.robe... at gmail.com> wrote:
> > Hi everyone,
> >
> > On the general Python list, a suggestion was made to add a clear() method
> to
> > list, as the "obvious" way to do
> > del some_list[:]
> > or
> > some_list[:] = []
> >
> > since the clear() method is currently the obvious way to remove all
> elements
> > from dict and set objects.
> >
> > I believe that this would be a lot more intuitive to beginners learning
> the
> > language, making Python more uniform.
> >
> > André
>
>
> I always wondered why there wasn't such a thing from the beginning.
> + 1 from me.
>
>
> --- Giampaolo
> http://code.google.com/p/pyftpdlib
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
http://www.ironpythoninaction.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090403/c260cc5c/attachment.html>

From jimjjewett at gmail.com  Fri Apr  3 20:48:22 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Fri, 3 Apr 2009 14:48:22 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D64BA7.2000006@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
Message-ID: <fb6fbf560904031148h69eeeed8n80ccb7165d71a11f@mail.gmail.com>

On 4/3/09, Jacob Holm <jh at improva.dk> wrote:

> Yielding the current value on each send was not part of the original
> example ...

So in the original you cared about the final value, but not the
intermediate yields.  My question is whether there is a sane case
where you care about *both*, *and* you care about distinguishing them.
 And is this case really common enough that we don't want it marked up
with something more explicit, like a sentinel or a raise?

> The reason for closing would be that once you have computed the final
> result, you want whatever resources the coroutine is using to be freed.
> Since only the final result is assumed to be useful, it makes perfect
> sense to close the coroutine at the same time as you are requesting the
> final result.

    def outer(unfinished=object()):
        g=inner(unfinished)
        for result in g:
            yield unfinished    # cooperative multi-tasking, so co-operate
            if result is not unfinished: break
        ...

I see some value in simplifying that, or adding more power.

But I'm not convinced the current proposal actually is much simpler,
or that the extra power wouldn't be better written in a more explicit
manner.  I think the above still translates into

    def outer(unfinished=object()):
        #    # now also need to set an initial value of result
        #    # *OR* distinguish intermediate from final results.
        #    result=unfinished
        g=inner(unfinished)
        #    # loop is now implicit
        #    while result is unfinished:
        #        result = yield from g
        result = yield from g
        ...
-jJ


From george.sakkis at gmail.com  Fri Apr  3 20:54:57 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Fri, 3 Apr 2009 14:54:57 -0400
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
Message-ID: <91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>

On Fri, Apr 3, 2009 at 6:53 AM, Andre Roberge <andre.roberge at gmail.com> wrote:

> Hi everyone,
>
> On the general Python list, a suggestion was made to add a clear() method to
> list, as the "obvious" way to do
> del some_list[:]
> or
> some_list[:] = []
>
> since the clear() method is currently the obvious way to remove all elements
> from dict and set objects.
>
> I believe that this would be a lot more intuitive to beginners learning the
> language, making Python more uniform.

It's obviously more explicit, but at the same time it's a rather infrequent
operation and the current ways are not particularly ugly. +0, no
strong feelings either way.

George


From erratic at devel.ws  Fri Apr  3 21:09:09 2009
From: erratic at devel.ws (Paige Thompson)
Date: Fri, 3 Apr 2009 12:09:09 -0700
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
Message-ID: <5061b39c0904031209mbb9851ete35a7c6ff452d553@mail.gmail.com>

i instinctively want to do add instead of append. some of the other
functions i would want to use are Contains, maybe IndexOf.

is there any kind of plugin for a query language like LINQ for python? thatd
be dope

-Adele
(sent from my gphone!)

On Apr 3, 2009 11:55 AM, "George Sakkis" <george.sakkis at gmail.com> wrote:

On Fri, Apr 3, 2009 at 6:53 AM, Andre Roberge <andre.roberge at gmail.com>
wrote: > Hi everyone, > > O...
It's obviously more explicit, but at the same time it's a rather infrequent
operation and the current ways are not particularly ugly. +0, no
strong feelings either way.

George

_______________________________________________ Python-ideas mailing list
Python-ideas at python.org ht...
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090403/c52b9205/attachment.html>

From erratic at devel.ws  Fri Apr  3 21:21:58 2009
From: erratic at devel.ws (Paige Thompson)
Date: Fri, 3 Apr 2009 12:21:58 -0700
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
Message-ID: <5061b39c0904031221r5040ab71u3ea68025a60c3b7b@mail.gmail.com>

if any of you are not C# coders: generics and IEnumerables/IQueryables are
what's god to me. We have lists, which are generic, though the generics part
is kind of type-safe specific. Example:

List<string> mylist = new List<string>(); // can't remember if that's the
exact syntax but that's essentially *it*

you could also do
List<object> hodgepodge

.... could be strings ints guids etc.

anyway i think c# lists and dictionaries (also generic) are god.

Dictionary<guid, string>

at any rate, typesafe or not the list and dictionary objects have a lot of
neat stuff inside of them, some of which are LINQ (essentially an inline
query language) specific. maybe some of the devs would like to steal some of
the brilliance out of there-- their claim to fame is patterns and practices.
Having worked with these and not to put python down in anyway at all i think
their lists and dictionaries (generics) are wonderful to work with. I hear
java is similar.

-Adele
(sent from my gphone!)

On Apr 3, 2009 11:55 AM, "George Sakkis" <george.sakkis at gmail.com> wrote:

On Fri, Apr 3, 2009 at 6:53 AM, Andre Roberge <andre.roberge at gmail.com>
wrote: > Hi everyone, > > O...
It's obviously more explicit, but at the same time it's a rather infrequent
operation and the current ways are not particularly ugly. +0, no
strong feelings either way.

George

_______________________________________________ Python-ideas mailing list
Python-ideas at python.org ht...
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090403/3ed39be6/attachment.html>

From fuzzyman at gmail.com  Fri Apr  3 21:32:51 2009
From: fuzzyman at gmail.com (Michael Foord)
Date: Fri, 3 Apr 2009 20:32:51 +0100
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <5061b39c0904031209mbb9851ete35a7c6ff452d553@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
	<5061b39c0904031209mbb9851ete35a7c6ff452d553@mail.gmail.com>
Message-ID: <6f4025010904031232s2bdfccbav3dc76e3ab6b220d@mail.gmail.com>

2009/4/3 Paige Thompson <erratic at devel.ws>

> i instinctively want to do add instead of append. some of the other
> functions i would want to use are Contains, maybe IndexOf.
>

Sound like you want to be using a different language!

Lists support all those operations; they are just spelled differently...



> is there any kind of plugin for a query language like LINQ for python?
> thatd be dope
>


We have list comprehensions, which are like LINQ over objects on steroids;
however, Python has nothing like LINQ to other data providers built into the
language.
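
As a rough illustration (data and query invented for the example), a typical LINQ-to-objects query maps directly onto a comprehension:

```python
# Roughly the C# query:
#   from p in people where p.age >= 18 orderby p.name select p.name
people = [("Ann", 22), ("Bo", 15), ("Cy", 31)]
adult_names = sorted(name for name, age in people if age >= 18)
```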

There is a third party extension / framework / tool called DejaVu though,
which I have heard good things about.

Michael


> -Adele
> (sent from my gphone!)
>
> On Apr 3, 2009 11:55 AM, "George Sakkis" <george.sakkis at gmail.com> wrote:
>
> On Fri, Apr 3, 2009 at 6:53 AM, Andre Roberge <andre.roberge at gmail.com>
> wrote: > Hi everyone, > > O...
> It's obviously more explicit, but at the same time it's a rather infrequent
> operation and the current ways are not particularly ugly. +0, no
> strong feelings either way.
>
> George
>
> _______________________________________________ Python-ideas mailing list
> Python-ideas at python.org ht...
>
>
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>
>


-- 
http://www.ironpythoninaction.com/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090403/95e07f39/attachment.html>

From george.sakkis at gmail.com  Fri Apr  3 22:24:20 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Fri, 3 Apr 2009 16:24:20 -0400
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <5061b39c0904031209mbb9851ete35a7c6ff452d553@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
	<5061b39c0904031209mbb9851ete35a7c6ff452d553@mail.gmail.com>
Message-ID: <91ad5bf80904031324p36e26684i5b376e477c790012@mail.gmail.com>

On Fri, Apr 3, 2009 at 3:09 PM, Paige Thompson <erratic at devel.ws> wrote:

> i instinctively want to do add instead of append. some of the other
> functions i would want to use are Contains, maybe IndexOf.
>
> is there any kind of plugin for a query language like LINQ for python? thatd
> be dope

If you're so happy with C#, why bother with Python in the first place ?

> -Adele
> (sent from my gphone!)

George
(sent from my gmail!)


From guido at python.org  Fri Apr  3 23:21:16 2009
From: guido at python.org (Guido van Rossum)
Date: Fri, 3 Apr 2009 14:21:16 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D64BA7.2000006@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com> 
	<49D64BA7.2000006@improva.dk>
Message-ID: <ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>

On Fri, Apr 3, 2009 at 10:47 AM, Jacob Holm <jh at improva.dk> wrote:
> You can see my original example at:
>
> http://mail.python.org/pipermail/python-ideas/2009-April/003841.html
>
> and a few arguments why I think it is better at:
>
> http://mail.python.org/pipermail/python-ideas/2009-April/003847.html
[...]
> The reason for closing would be that once you have computed the final
> result, you want whatever resources the coroutine is using to be freed.
> Since only the final result is assumed to be useful, it makes perfect sense
> to close the coroutine at the same time as you are requesting the final
> result.

Hm. I am beginning to see what you are asking for. Your averager
example is somewhat convincing.

Interestingly, in a sense it seems unrelated to yield-from: the
averager doesn't seem to need yield-from, it receives values sent to
it using send(). An alternative version (that works today), which I
find a bit clearer, uses exceptions instead of a sentinel value: when
you are done with sending it the sequence of values, you throw() a
special exception into it, and in response it raises another exception
back with a value attribute. I'm showing the usage example first, then
the support code.

Usage example:

@coroutine
def summer():
  sum = 0
  while True:
    try:
      value = yield
    except Terminate:
      raise Done(sum)
    else:
      sum += value

def main():
  a = summer()
  a.send(1)
  a.send(2)
  print finalize(a)

Support code:

class Terminate(Exception):
  """Exception thrown into the generator to ask it to stop."""

class Done(Exception):
  """Exception raised by the generator when it catches Terminate."""
  def __init__(self, value=None):
    self.value = value

def coroutine(func):
  """Decorator around a coroutine, to make the initial next() call."""
  def wrapper(*args, **kwds):
    g = func(*args, **kwds)
    g.next()
    return g
  return wrapper

def finalize(g):
  """Throw Terminate into a coroutine and extract the value from Done."""
  try:
    g.throw(Terminate)
  except Done as e:
    return e.value
  else:
    g.close()
    raise RuntimeError("Expected Done(<value>)")

I use a different exception to throw into the generator than the one it
raises in response, so that mistakes (e.g. the generator not catching
Terminate) are caught, and no confusion can exist with the built-in
exceptions StopIteration and GeneratorExit.

Now I'll compare this manual version with your (Jacob Holm's) proposal:

- instead of Done you use GeneratorExit
- hence, instead of g.throw(Done) you can use g.close()
- instead of Terminate you use StopException
- you want g.close() to extract and return the value from StopException
- you use "return value" instead of "raise Done(value)"

The usage example then becomes, with original version indicated in comments:

@coroutine
def summer():
  sum = 0
  while True:
    try:
      value = yield
    except GeneratorExit:    # except Terminate:
      return sum    # raise Done(sum)
    else:
      sum += value

def main():
  a = summer()
  a.send(1)
  a.send(2)
  print a.close()    # print finalize(a)

At this point, I admit that I am not yet convinced. On the one hand,
my support code melts away, except for the @coroutine decorator. On
the other hand:

- the overloading of GeneratorExit and StopIteration reduces
diagnostics for common beginner's mistakes when writing regular
(iterator-style) generator code
- the usage example isn't much simpler
- the support code isn't rocket science
- the coroutine use case is specialized enough that a little support
seems okay (you still need @coroutine anyway)

The last three points don't sway me either way: they pit minor
conveniences against minor inconveniences.

However, the first point worries me a lot. The concern over
StopIteration can be dealt with by introducing a new exception that is
raised only by "return value" inside a generator.

But I'm also worried that the mere need to catch GeneratorExit for a
purpose other than resource cleanup will cause examples using it to
pop up on the web, which will then be copied and modified by clueless
beginners, and *increase* the probability of bad code being written.
(That's why I introduce *new* exceptions in my support code -- they
don't have predefined meanings in other contexts.)

Finally, I am not sure of the connection with "yield from". I don't
see a way to exploit it for this example. As an exercise, I
constructed an "averager" generator out of the above "summer" and a
similar "counter", and I didn't see a way to exploit "yield from". The
only connection seems to be PEP 380's proposal to turn "return value"
inside a generator into "raise StopIteration(value)", and that's the
one part of the PEP with which I have a problem anyway (the beginner's
issues above). Oh, and "yield from" competes with @coroutine over
when the initial next() call is made, which again suggests the two
styles (yield-from and coroutines) are incompatible.

All in all, I think I would be okay with turning "return value" inside
a generator into raising *some* exception, as long as that exception
is not StopIteration (nor derives from it, nor from GeneratorExit).
PEP 380 and its implementation would become just a tad more complex,
but I think that's worth it. Generators used as iterators would raise
a (normally) uncaught exception if they returned a value, and that's
my main requirement. I'm still not convinced that more is needed, in
particular I'm still -0 on catching this value in gen_close() and
returning the value attribute from there.

As I've said before, I don't care whether "return None" would be
treated more like "return" or more like "return value" -- for
beginners' code I don't think it matters, and for advanced code they
should be equivalent.

I'll stop arguing for new syntax to return a value from a generator
(like Phillip Eby's proposed "return from yield with <value>"): I
don't think it adds enough to overcome the pain for the parser and
other tools.

Finally, as far as a name for the new exception, I think something
long like ReturnFromGenerator would be fine, since most of the time it
is handled implicitly by coroutine support code (whether this is user
code or gen_close()) or the yield-from implementation.

I'm sorry for being so long winded and yet somewhat inconclusive. I
wouldn't have bothered if I didn't think there was something worth
pursuing. But it sure seems elusive.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From jimjjewett at gmail.com  Sat Apr  4 00:10:55 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Fri, 3 Apr 2009 18:10:55 -0400
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <200904031257.37927.steve@pearwood.info>
References: <49AB1F90.7070201@canterbury.ac.nz> <49D4A66F.9060900@gmail.com>
	<49D555D1.9050701@canterbury.ac.nz>
	<200904031257.37927.steve@pearwood.info>
Message-ID: <fb6fbf560904031510w4f510cfemd42326a0d0a0d88e@mail.gmail.com>

On 4/2/09, Steven D'Aprano <steve at pearwood.info> wrote:
> On Fri, 3 Apr 2009 11:18:25 am Greg Ewing wrote:
>> Nick Coghlan wrote:
>> > I think I'd prefer to see some arbitrary limit (500 seems like a
>> > nice round number) on the number of times that GeneratorExit would
>> > be thrown before giving up

>> Is it really worth singling out this particular way
>> of writing an infinite loop?

generators are trickier, so I would say yes, except that ...

Someone who is already intentionally catching and ignoring an
Exception may not be in the mood to respond to subtle hints.

> Perhaps I've missed something, but it seems to me that the right limit
> to use would be the recursion limit, and the right exception to raise
> would be RecursionError rather than RuntimeError.

The recursion limit is normally a way to prevent memory exhaustion.
In this case, the stack doesn't grow; it is still a generic "while
True: pass" that just happens to bounce between two frames instead of
sticking to one.

-jJ


From tjreedy at udel.edu  Sat Apr  4 01:21:10 2009
From: tjreedy at udel.edu (Terry Reedy)
Date: Fri, 03 Apr 2009 19:21:10 -0400
Subject: [Python-ideas] Yield-From: Finalization guarantees
In-Reply-To: <ca471dc20904022025i79ede6cfmd49706331152429@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49D17247.20705@improva.dk>
	<49D18D31.9000008@canterbury.ac.nz>
	<49D1E5E6.5000007@improva.dk> <49D207BE.8090909@gmail.com>
	<49D255BC.6080503@improva.dk> <49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
	<49D3DD05.7080506@canterbury.ac.nz>
	<ca471dc20904022025i79ede6cfmd49706331152429@mail.gmail.com>
Message-ID: <gr65l5$s8a$2@ger.gmane.org>

Guido van Rossum wrote:
> On Wed, Apr 1, 2009 at 2:30 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
>> As another perspective on this, I think Jacob's
>> example is another case of bogus refactoring.
> 
> Be that as it may, I wonder if we shouldn't back off from the
> refactoring use case a bit and instead just ponder the different types
> of code you can write using generators. There's the traditional "pull"
> style (iterators), "push" style (like the averaging example), and then
> there are "tasks". (Have you read Dave Beazley's couroutines tutorial
> yet? Or am I the only one who likes it? :-)

I liked it so much that I posted the url to the Python list, where 
others gave it a positive response also.  Unlike his previous intro 
talk, it was not something to breeze through in an evening, so I 
bookmarked it.  Thanks for mentioning it.

Terry




From jh at improva.dk  Sat Apr  4 01:25:46 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 04 Apr 2009 01:25:46 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
Message-ID: <49D69AFA.5070600@improva.dk>

Hi Guido

Thank you for taking the time.  I'll try to be brief so I don't spend
more of it than necessary...  (I'll probably fail though)

Guido van Rossum wrote:
> Hm. I am beginning to see what you are asking for. Your averager
> example is somewhat convincing.
>
> Interestingly, in a sense it seems unrelated to yield-from: the
> averager doesn't seem to need yield-from, it receives values sent to
> it using send(). 

It is related to yield-from only because the "return value from 
generator" concept is introduced there.  I was trying to work out the 
details of how I would like that to work in the context of GeneratorExit 
and needed a near-trivial example.


[snip "summer" example using existing features and comparison with 
version using suggested new features]
> At this point, I admit that I am not yet convinced. On the one hand,
> my support code melts away, except for the @coroutine decorator. On
> the other hand:
>
> - the overloading of GeneratorExit and StopIteration reduces
> diagnostics for common beginner's mistakes when writing regular
> (iterator-style) generator code
> - the usage example isn't much simpler
> - the support code isn't rocket science
> - the coroutine use case is specialized enough that a little support
> seems okay (you still need @coroutine anyway)
>
> The last three points don't sway me either way: they pit minor
> conveniences against minor inconveniences.
>
> However, the first point worries me a lot. The concern over
> StopIteration can be dealt with by introducing a new exception that is
> raised only by "return value" inside a generator.
>   

I am still trying to get a clear picture of what kind of mistakes you 
are trying to protect against. 

If it is people accidentally writing return in a generator when they 
really mean yield, that is what I thought the proposal for an alternate 
syntax was for.  That sounds like a good idea to me, especially if we 
could also ban or discourage the use of normal return.  But the 
alternate syntax doesn't have to mean a different exception.

Since you are no longer pushing an alternative syntax for return but 
still want a different exception, I'll assume there is some other 
beginner mistake you are worried about.  My guess is it is some mistake 
at the places where the generator is used, but I am having a hard time 
figuring out where the mistake could be in ignoring the returned value.  
Perhaps you (or someone who has more time) can provide an example where 
this is a bad thing?

> But I'm also worried that the mere need to catch GeneratorExit for a
> purpose other than resource cleanup will cause examples using it to
> pop up on the web, which will then be copied and modified by clueless
> beginners, and *increase* the probability of bad code being written.
> (That's why I introduce *new* exceptions in my support code -- they
> don't have predefined meanings in other contexts.)
>   

That worry I can understand.

> Finally, I am not sure of the connection with "yield from". I don't
> see a way to exploit it for this example. As an exercise, I
> constructed an "averager" generator out of the above "summer" and a
> similar "counter", and I didn't see a way to exploit "yield from". The
> only connection seems to be PEP 380's proposal to turn "return value"
> inside a generator into "raise StopIteration(value)", and that's the
> one part of the PEP with which I have a problem anyway (the beginner's
> issues above). 

Yes, the only connection is that this is where "return value" is 
introduced.   I could easily see "return value" as a separate PEP, 
except PEP 380 provides one of the better reasons for its inclusion.  It 
might be good to figure out how this feature should work by itself 
before complicating things by integrating it in the yield-from semantics.

> Oh, and "yield from" competes with @couroutine over
> when the initial next() call is made, which again suggests the two
> styles (yield-from and coroutines) are incompatible.
>   

It is a serious problem, because one of the major points of the PEP is 
that it should be useful for refactoring coroutines.  As a matter of 
fact, I started another thread on this specific issue earlier today 
which only Nick has so far responded to.  I think it is solvable, but 
requires some more work.

> All in all, I think I would be okay with turning "return value" inside
> a generator into raising *some* exception, as long as that exception
> is not StopIteration (nor derives from it, nor from GeneratorExit).
> PEP 380 and its implementation would become just a tad more complex,
> but I think that's worth it. Generators used as iterators would raise
> a (normally) uncaught exception if they returned a value, and that's
> my main requirement. I'm still not convinced that more is needed, in
> particular I'm still -0 on catching this value in gen_close() and
> returning the value attribute from there.
>   

As long as close is not catching it without also returning the value.  
That would be *really* annoying.

> As I've said before, I don't care whether "return None" would be
> treated more like "return" or more like "return value" -- for
> beginners' code I don't think it matters, and for advanced code they
> should be equivalent.
>
> I'll stop arguing for new syntax to return a value from a generator
> (like Phillip Eby's proposed "return from yield with <value>"): I
> don't think it adds enough to overcome the pain for the parser and
> other tools.
>   

And here I was starting to think that a new syntax for return could 
solve the problem of beginner mistakes without needing a new exception.

> Finally, as far as a name for the new exception, I think something
> long like ReturnFromGenerator would be fine, since most of the time it
> is handled implicitly by coroutine support code (whether this is user
> code or gen_close()) or the yield-from implementation.
>   

GeneratorReturn is the name we have been using for this beast so far, 
but I really don't care what it is called.

> I'm sorry for being so long winded and yet somewhat inconclusive. I
> wouldn't have bothered if I didn't think there was something worth
> pursuing. But it sure seems elusive.
>   

And I thank you again for your time.

Best regards
- Jacob


From ben+python at benfinney.id.au  Sat Apr  4 01:33:11 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Sat, 04 Apr 2009 10:33:11 +1100
Subject: [Python-ideas] clear() method for lists
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
Message-ID: <873acpe2iw.fsf@benfinney.id.au>

George Sakkis <george.sakkis at gmail.com> writes:

> On Fri, Apr 3, 2009 at 6:53 AM, Andre Roberge <andre.roberge at gmail.com> wrote:
> 
> > On the general Python list, a suggestion was made to add a clear() method to
> > list, as the "obvious" way to do
> > del some_list[:]
> > or
> > some_list[:] = []
> 
> It's obviously more explicit, but at the same it's a rather
> infrequent operation and the current ways are not particularly ugly.

More than explicit, it would make “clear” the One Obvious Way To Do It
for base collection types. +1 from me.

-- 
 \          “It's a good thing we have gravity or else when birds died |
  `\             they'd just stay right up there. Hunters would be all |
_o__)                                        confused.” —Steven Wright |
Ben Finney



From jh at improva.dk  Sat Apr  4 01:34:16 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 04 Apr 2009 01:34:16 +0200
Subject: [Python-ideas] yield-from and @coroutine decorator
 [was:x=(yield from) confusion]
In-Reply-To: <49D6346D.3000002@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com> <49D62804.8080102@improva.dk>
	<49D6346D.3000002@gmail.com>
Message-ID: <49D69CF8.1050202@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> We should probably drop that particular bikeshed discussion until we
>> actually know the details of what the construct should do, esp in the
>> context of close(). I am starting to lose track of all the different
>> possible versions.
>>     
>
> Note that the syntax for returning values from generators is largely
> independent of the semantics. Guido has pointed out that disallowing the
> naive "return EXPR" in generators is an important learning tool for
> inexperienced generator users, and I think he's right.
>   

I agree that a separate syntax for returning a value from a 
generator/coroutine is probably a good idea.  (I am still not convinced 
we need a separate exception for it, but that is a separate 
discussion).  I even think it would be a good idea to deprecate the use 
of the normal "return" in generators, but that is probably not going to 
happen.

> "return finally" reads pretty well and doesn't add a new keyword, while
> still allowing generator return values to be written easily. I haven't
> seen other suggestions I particularly like, so I figured I'd run with
> that one for the revised example :)
>   

You can call it whatever you want, as long as it works predictably for 
the use cases we are finding.


[snip]
>> Now for my problem. The original averager example was inspired by the
>> tutorial http://dabeaz.com/coroutines/ that Guido pointed to. (Great
>> stuff, btw). One pattern that is recommended by the tutorial and used
>> throughout is to decorate all coroutines with a decorator like:
>>
>> def coroutine(func):
>>    def start(*args,**kwargs):
>>        cr = func(*args,**kwargs)
>>        cr.next()
>>        return cr
>>    return start
>>
>>
>> The idea is that it saves you from the initial next() call used to start
>> the coroutine. The problem is that you cannot use such a decorated
>> coroutine in any flavor of the yield-from expression we have considered
>> so far, because the yield-from will start out by doing an *additional*
>> next call and yield that value.
>>
>> I have a few vague ideas of how we might change "yield from" to support
>> this, but nothing concrete enough to put here. Is this a problem we
>> should try to fix, and if so, how?
>>     
>
> Hmm, that's a tricky one. It sounds like it is definitely an issue the
> PEP needs to discuss, but I don't currently have an opinion as to what
> it should say.
>   

Here is one possible fix, never mind the syntax.  We could change the 
yield from expression from the current:

  RESULT = yield from EXPR

by adding an extra form, possibly one of:

  RESULT = yield STARTEXPR from EXPR
  RESULT = yield from EXPR with STARTEXPR
  RESULT = yield from EXPR as NAME starting with STARTEXPR(NAME)

And letting STARTEXPR, if given, take the place of the initial _i.next() 
in the expansion(s).  The point is that we need to yield *something* first, 
before rerouting all send(), next() and throw() calls to the subiterator.
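
The clash can be demonstrated concretely in modern Python 3, which later
adopted yield-from. The decorator is the one from the tutorial; "echoer"
and "delegator" are illustrative names:

```python
def coroutine(func):
    # Decorator from the tutorial: prime the generator so that callers
    # can use send() immediately.
    def start(*args, **kwargs):
        cr = func(*args, **kwargs)
        next(cr)  # advance to the first yield, discarding its value
        return cr
    return start

@coroutine
def echoer():
    # Trivial coroutine: yields back whatever is sent in.
    result = None
    while True:
        result = yield result

def delegator():
    # yield from performs an implicit first next() on echoer(), even
    # though the decorator already primed it.
    yield from echoer()

d = delegator()
print(next(d))     # -> None  (the "extra" yield burned by yield from)
print(d.send(42))  # -> 42
```

The first next() on the delegator is forwarded into the already-primed
coroutine, costing one wasted round-trip that yields None.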

>> not-trying-to-be-difficult-ly yours
>>     
>
> We have a long way to go before we even come close to consuming as many
> pixels as PEP 308 or PEP 343 - a fact for which Greg is probably grateful ;)
>   
He and everybody else, I would think.  But AFAICT we are not even close 
to finished, so we may get there yet.

- Jacob



From andre.roberge at gmail.com  Sat Apr  4 01:36:46 2009
From: andre.roberge at gmail.com (Andre Roberge)
Date: Fri, 3 Apr 2009 20:36:46 -0300
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <873acpe2iw.fsf@benfinney.id.au>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
	<873acpe2iw.fsf@benfinney.id.au>
Message-ID: <7528bcdd0904031636g2fb62cecm38e25f74c0aa2d34@mail.gmail.com>

On Fri, Apr 3, 2009 at 8:33 PM, Ben Finney
<ben+python at benfinney.id.au> wrote:

> George Sakkis <george.sakkis at gmail.com> writes:
>
> > On Fri, Apr 3, 2009 at 6:53 AM, Andre Roberge <andre.roberge at gmail.com>
> wrote:
> >
> > > On the general Python list, a suggestion was made to add a clear()
> method to
> > > list, as the "obvious" way to do
> > > del some_list[:]
> > > or
> > > some_list[:] = []
> >
> > It's obviously more explicit, but at the same it's a rather
> > infrequent operation and the current ways are not particularly ugly.
>
> More than explicit, it would make "clear" the One Obvious Way To Do It
> for base collection types. +1 from me.
>

Seeing mostly positive responses so far ... Is it worth writing a pre-PEP to
formalize this suggestion?

André



>
> --
>  \          "It's a good thing we have gravity or else when birds died |
>   `\             they'd just stay right up there. Hunters would be all |
> _o__)                                       confused." --Steven Wright |
> Ben Finney
>
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090403/755508c1/attachment.html>

From ncoghlan at gmail.com  Sat Apr  4 02:32:07 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 04 Apr 2009 10:32:07 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
Message-ID: <49D6AA87.5000608@gmail.com>

Jim Jewett wrote:
> On 4/3/09, Nick Coghlan <ncoghlan at gmail.com> wrote:
>> Greg tried to clarify this a bit already, but I think Jacob's averager
>> example is an interesting case where it makes sense to both yield
>> multiple times and also "return a value".
> 
>>   def averager(start=0):
>>     # averager that maintains a running average
>>     # and returns the final average when done
>>     count = 0
>>     exc = None
>>     sum = start
>>     while 1:
>>       avg = sum / count
>>       try:
>>         val = yield avg
>>       except CalcAverage:
>>         return finally avg
>>       sum += val
>>       count += 1
> 
> It looks to me like it returns (or yields) the running average either way.
> 
> I see a reason to send in a sentinel value, saying "Don't update the
> average, just tell me the current value."
> 
> I don't see why that sentinel has to terminate the generator, nor do I
> see why that final average has to be returned rather than yielded.

It doesn't *have* to do anything - you could set it up that way if you
wanted to. However, when the running average is only yielded, then the
location providing the numbers (i.e. the top level code in my example)
is also the only location which can receive the running average. That's
why anyone using coroutines now *has* to have a top-level scheduler to
handle the "coroutine stack" by knowing which coroutines are calling
each other and detecting when one has "returned" (i.e. yielded a special
sentinel value that the scheduler recognises) so the return value can be
passed back to the calling coroutine.

I hoped to show the advantage of the separate return value with the
difference calculating coroutine: in that example, having the separate
return value makes it easy to identify the values which need to be
returned to a location *other* than the source of the numbers being
averaged. The example goes through some distinct stages:

- top level code sending numbers to first averager via send() and
receiving running averages back via yield
- top level code telling first averager to finish up (via either throw()
or send() depending on implementation), first averager returning final
average value to the difference calculator
- top level code sending numbers to second averager and receiving
running averages back
- top level code telling second averager to finish up, second averager
returning final average value to the difference calculator, difference
calculator returning difference between the two averages to the top
level code

That is, yield, send() and throw() involve communication between the
currently active subcoroutine and the client of the whole coroutine.
They bypass the current stack in the coroutine itself. The return
values, on the other hand, *do* involve unwinding the coroutine stack,
just like they do with normal function calls.

I'll have another go, this time comparing the "normal" calls to the
coroutine version:

  def average(seq, start=0):
    if seq:
      return sum(seq, start) / len(seq)
    return start

  def average_diff(seq1, seq2):
    avg1 = average(seq1)
    avg2 = average(seq2)
    return avg2 - avg1

OK, simple and straightforward. The idea of 'yield from' is to allow
"average_diff" to be turned into a coroutine that receives the values to
be averaged one by one *without* having to inline the actual average
calculation.

I'll also go back to Jacob's original averaging example that *doesn't*
yield a running average - it is more in line with examples where the
final calculation is expensive, so you won't do it until you're told you
have all the data.

What might that look like as 'yield from' style coroutines?

  class CalcAverage(Exception): pass

  def average_cr(start=0):
    count = 0
    exc = None
    sum = start
    while 1:
      try:
        # The yield surrenders control to the code
        # that is supplying the numbers to be
        # averaged
        val = yield
      except CalcAverage:
        # The return finally passes the final
        # average back to the code that was
        # asking for the average (which is NOT
        # necessarily the same code that was
        # supplying the numbers
        if count:
          return finally sum / count
        return finally start
      sum += val
      count += 1

  # Note how similar this is to the normal version above
  def average_diff_cr(start):
    avg1 = yield from average_cr(start, seq1)
    avg2 = yield from average_cr(start, seq2)
    return avg2 - avg1
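
For reference, here is a runnable modern-Python translation of the sketch
above: plain "return" stands in for the proposed "return finally" (as PEP
380 eventually specified, with the value carried on StopIteration), and the
driver loop feeding in the numbers is an assumption about the top-level code:

```python
class CalcAverage(Exception):
    pass

def average_cr(start=0):
    # Averager coroutine: accumulate sent values until CalcAverage is
    # thrown in, then return the average (carried on StopIteration).
    count = 0
    total = start
    while True:
        try:
            val = yield          # surrender control to the number supplier
        except CalcAverage:      # the "finish up" signal
            return (total / count) if count else start
        total += val
        count += 1

def average_diff_cr(start=0):
    # The return values of the sub-coroutines unwind the coroutine
    # stack, just like normal function calls.
    avg1 = yield from average_cr(start)
    avg2 = yield from average_cr(start)
    return avg2 - avg1

def run(diff, seq1, seq2):
    # Assumed minimal driver: feed each sequence, then throw CalcAverage
    # to unwind into the next stage.
    next(diff)  # prime: advance to the first yield inside average_cr
    for seq in (seq1, seq2):
        for val in seq:
            diff.send(val)
        try:
            diff.throw(CalcAverage)
        except StopIteration as exc:
            return exc.value  # raised only after the second sequence
    raise RuntimeError("coroutine did not finish")

print(run(average_diff_cr(), [1, 2, 3], [10, 20, 30]))  # -> 18.0
```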

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Sat Apr  4 02:39:46 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 04 Apr 2009 10:39:46 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49D69AFA.5070600@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
Message-ID: <49D6AC52.4010608@gmail.com>

Jacob Holm wrote:
> Since you are no longer pushing an alternative syntax for return but
> still want a different exception, I'll assume there is some other
> beginner mistake you are worried about.  My guess is it is some mistake
> at the places where the generator is used, but I am having a hard time
> figuring out where the mistake could be in ignoring the returned value. 
> Perhaps you (or someone who has more time) can provide an example where
> this is a bad thing?

I can't speak for Guido, but the two easy beginner mistakes I think are
worth preventing:

- using 'return' where you meant 'yield' (however, if even 'return
finally' doesn't appeal to Guido as alternative syntax for "no, this is
a coroutine, I really mean it" then I'm fine with that)

- trying to iterate normally over a coroutine instead of calling it
appropriately (raising GeneratorReturn instead of StopIteration means
that existing iterative code will let the new exception escape rather
than silently suppressing the return exception)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Sat Apr  4 02:45:09 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 04 Apr 2009 10:45:09 +1000
Subject: [Python-ideas] Yield-From: Handling of GeneratorExit
In-Reply-To: <ca471dc20904031022v746f959enf59f5a1a2f322e0d@mail.gmail.com>
References: <49AB1F90.7070201@canterbury.ac.nz> <49D255BC.6080503@improva.dk>
	<49D2A759.5080204@canterbury.ac.nz>
	<49D2C735.8020803@improva.dk> <49D34FDC.5050106@gmail.com>
	<49D36FD0.3080602@improva.dk> <49D4A156.9080304@canterbury.ac.nz>
	<49D4A66F.9060900@gmail.com> <49D555D1.9050701@canterbury.ac.nz>
	<49D60C97.3040203@gmail.com>
	<ca471dc20904031022v746f959enf59f5a1a2f322e0d@mail.gmail.com>
Message-ID: <49D6AD95.2010705@gmail.com>

Guido van Rossum wrote:
> On Fri, Apr 3, 2009 at 6:18 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
>> Greg Ewing wrote:
>>> Nick Coghlan wrote:
>>>
>>>> I think I'd prefer to see some arbitrary limit (500 seems like a nice
>>>> round number) on the number of times that GeneratorExit would be thrown
>>>> before giving up
>>> Is it really worth singling out this particular way
>>> of writing an infinite loop?
>>>
>>> If you're catching GeneratorExit then you presumably
>>> have the need to clean up and exit on your mind, so
>>> I don't think this is a likely mistake to make.
>> I came up with a different answer that I like better - don't mess with
>> GeneratorExit and close() at all, and instead provide next_return(),
>> send_return() and throw_return() methods that *expect* to get a
>> GeneratorReturn exception in response (and complain if it doesn't happen).
>>
>> (I expand on this idea in a lot more detail in my reply to Jim)
> 
> Since there are so many threads, let me repeat that I'm -1 on adding
> more methods, but +1 on adding them to the docs as recipes.

If we end up adding a support library for coroutines (depending on how
the discussion of the @coroutine problem goes), then that may be another
place for them.

Fattening the generator API even further bothered me a bit as well, so
I'm actually happy to be overruled on that particular idea :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From george.sakkis at gmail.com  Sat Apr  4 02:50:54 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Fri, 3 Apr 2009 20:50:54 -0400
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <7528bcdd0904031636g2fb62cecm38e25f74c0aa2d34@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
	<873acpe2iw.fsf@benfinney.id.au>
	<7528bcdd0904031636g2fb62cecm38e25f74c0aa2d34@mail.gmail.com>
Message-ID: <91ad5bf80904031750s3208efbcsa0487f28edbaf14d@mail.gmail.com>

On Fri, Apr 3, 2009 at 7:36 PM, Andre Roberge <andre.roberge at gmail.com> wrote:
>
>
> On Fri, Apr 3, 2009 at 8:33 PM, Ben Finney <ben+python at benfinney.id.au>
> wrote:
>>
>> George Sakkis <george.sakkis at gmail.com> writes:
>>
>> > On Fri, Apr 3, 2009 at 6:53 AM, Andre Roberge <andre.roberge at gmail.com>
>> > wrote:
>> >
>> > > On the general Python list, a suggestion was made to add a clear()
>> > > method to
>> > > list, as the "obvious" way to do
>> > > del some_list[:]
>> > > or
>> > > some_list[:] = []
>> >
>> > It's obviously more explicit, but at the same it's a rather
>> > infrequent operation and the current ways are not particularly ugly.
>>
>> More than explicit, it would make "clear" the One Obvious Way To Do It
>> for base collection types. +1 from me.
>
> Seeing mostly positive responses so far ... Is it worth writing a pre-PEP to
> formalize this suggestion?

I might be wrong on this, but I don't think there has to be a PEP for
something as small as a single simple well-defined method. A feature
request on the tracker should be enough.

George


From greg.ewing at canterbury.ac.nz  Sat Apr  4 02:58:21 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 04 Apr 2009 12:58:21 +1200
Subject: [Python-ideas] yield-from and @coroutine decorator
 [was:x=(yield from) confusion]
In-Reply-To: <49D6346D.3000002@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com> <49D62804.8080102@improva.dk>
	<49D6346D.3000002@gmail.com>
Message-ID: <49D6B0AD.8020205@canterbury.ac.nz>

Nick Coghlan wrote:

> "return finally" reads pretty well and doesn't add a new keyword

Still doesn't mean anything, though. Ordinary returns
happen "finally" too (most of the time, anyway), so
what's the difference?

-- 
Greg


From lists at cheimes.de  Sat Apr  4 02:56:47 2009
From: lists at cheimes.de (Christian Heimes)
Date: Sat, 04 Apr 2009 02:56:47 +0200
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <7528bcdd0904031636g2fb62cecm38e25f74c0aa2d34@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>	<91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>	<873acpe2iw.fsf@benfinney.id.au>
	<7528bcdd0904031636g2fb62cecm38e25f74c0aa2d34@mail.gmail.com>
Message-ID: <gr6b8g$bob$1@ger.gmane.org>

Andre Roberge wrote:
> Seeing mostly positive responses so far ... Is it worth writing a pre-PEP to
> formalize this suggestion?

The feature doesn't require a formal PEP. The implementation requires 10
lines of simple C code. For a full patch you have to provide a simple
_abcoll.MutableSequence.clear() method (just pop() until an IndexError
is raised), some documentation updates and a bunch of unit tests.
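
The pop()-until-IndexError fallback described above could be sketched in
pure Python like this (the standalone helper name is illustrative):

```python
def clear(seq):
    # Remove every item: pop() until the sequence raises IndexError.
    try:
        while True:
            seq.pop()
    except IndexError:
        pass

some_list = [1, 2, 3]
clear(some_list)
print(some_list)  # -> []
```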

Christian

Index: Objects/listobject.c
===================================================================
--- Objects/listobject.c        (revision 71106)
+++ Objects/listobject.c        (working copy)
@@ -785,6 +785,14 @@
 }

 static PyObject *
+listclear(PyListObject *self, PyObject *unused)
+{
+       if (list_resize(self, 0) == 0)
+               Py_RETURN_NONE;
+       return NULL;
+}
+
+static PyObject *
 listextend(PyListObject *self, PyObject *b)
 {
        PyObject *it;      /* iter(v) */
@@ -2458,6 +2466,8 @@
 "L.__sizeof__() -- size of L in memory, in bytes");
 PyDoc_STRVAR(append_doc,
 "L.append(object) -- append object to end");
+PyDoc_STRVAR(clear_doc,
+"L.clear() -- clear the list");
 PyDoc_STRVAR(extend_doc,
 "L.extend(iterable) -- extend list by appending elements from the
iterable");
 PyDoc_STRVAR(insert_doc,
@@ -2486,6 +2496,7 @@
        {"__reversed__",(PyCFunction)list_reversed, METH_NOARGS, reversed_doc},
        {"__sizeof__",  (PyCFunction)list_sizeof, METH_NOARGS, sizeof_doc},
        {"append",      (PyCFunction)listappend,  METH_O, append_doc},
+       {"clear",       (PyCFunction)listclear,   METH_NOARGS, clear_doc},
        {"insert",      (PyCFunction)listinsert,  METH_VARARGS, insert_doc},
        {"extend",      (PyCFunction)listextend,  METH_O, extend_doc},
        {"pop",         (PyCFunction)listpop,     METH_VARARGS, pop_doc},


>>> l = [1, 2, 3]
[35104 refs]
>>> l
[1, 2, 3]
[35111 refs]
>>> l.clear()
[35111 refs]
>>> l
[]



From greg.ewing at canterbury.ac.nz  Sat Apr  4 03:15:59 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 04 Apr 2009 13:15:59 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <fb6fbf560904030942m731f4bfbr709f59fc309b26b2@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D5B98A.20700@canterbury.ac.nz>
	<fb6fbf560904030942m731f4bfbr709f59fc309b26b2@mail.gmail.com>
Message-ID: <49D6B4CF.8080608@canterbury.ac.nz>

Jim Jewett wrote:

> err... I didn't mean both directions, I meant "from the callee to the
> caller as a yielded value" and "from the callee to the caller as a
> final return value that can't be yielded normally."

I think perhaps we're misunderstanding each other. You
seemed to be saying that the only time you would want
a generator to return a value is when you were also
using it to either send or receive values using yield,
and I was just pointing out that's not true.

If that's not what you meant, you'll have to explain
more clearly what you do mean.

-- 
Greg



From ben+python at benfinney.id.au  Sat Apr  4 03:14:31 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Sat, 04 Apr 2009 12:14:31 +1100
Subject: [Python-ideas] clear() method for lists
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
	<873acpe2iw.fsf@benfinney.id.au>
	<7528bcdd0904031636g2fb62cecm38e25f74c0aa2d34@mail.gmail.com>
Message-ID: <87y6uhcj9k.fsf@benfinney.id.au>

Andre Roberge <andre.roberge at gmail.com> writes:

> On Fri, Apr 3, 2009 at 8:33 PM, Ben Finney
> > wrote:
> > More than explicit, it would make "clear" the One Obvious Way To
> > Do It for base collection types. +1 from me.
> 
> Seeing mostly positive responses so far ... Is it worth writing a
> pre-PEP to formalize this suggestion?

I wouldn't think it worth a PEP. Simpler to produce a patch
implementing and documenting the change, and see what reception it
gets on python-dev.

-- 
 \          "I got an answering machine for my phone. Now when someone |
  `\      calls me up and I'm not home, they get a recording of a busy |
_o__)                                         signal." --Steven Wright |
Ben Finney



From ncoghlan at gmail.com  Sat Apr  4 03:47:08 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 04 Apr 2009 11:47:08 +1000
Subject: [Python-ideas] yield-from and @coroutine decorator
 [was:x=(yield from) confusion]
In-Reply-To: <49D6B0AD.8020205@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>
	<49D62804.8080102@improva.dk>	<49D6346D.3000002@gmail.com>
	<49D6B0AD.8020205@canterbury.ac.nz>
Message-ID: <49D6BC1C.90007@gmail.com>

Greg Ewing wrote:
> Nick Coghlan wrote:
> 
>> "return finally" reads pretty well and doesn't add a new keyword
> 
> Still doesn't mean anything, though. Ordinary returns
> happen "finally" too (most of the time, anyway), so
> what's the difference?

It needs to be mnemonic, not literal. The difference between a generator
return and a normal function return is that with a generator you will
typically have at least one yield before the actual return, whereas with
a normal function, return or an exception are the only way to leave the
function's frame. So "return finally" is meant to help you remember that
it happens only after all the yields are done.

Guido has said he is OK with losing the novice assistance on this one
though, so it's probably a moot point.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Sat Apr  4 03:47:50 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 04 Apr 2009 03:47:50 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49D6AC52.4010608@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk> <49D6AC52.4010608@gmail.com>
Message-ID: <49D6BC46.9000808@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> Since you are no longer pushing an alternative syntax for return but
>> still want a different exception, I'll assume there is some other
>> beginner mistake you are worried about.  My guess is it is some mistake
>> at the places where the generator is used, but I am having a hard time
>> figuring out where the mistake could be in ignoring the returned value. 
>> Perhaps you (or someone who has more time) can provide an example where
>> this is a bad thing?
>>     
>
> I can't speak for Guido, but the two easy beginner mistakes I think are
> worth preventing:
>
> - using 'return' where you meant 'yield' (however, if even 'return
> finally' doesn't appeal to Guido as alternative syntax for "no, this is
> a coroutine, I really mean it" then I'm fine with that)
>   
Good, this one I understand.

> - trying to iterate normally over a coroutine instead of calling it
> appropriately (raising GeneratorReturn instead of StopIteration means
> that existing iterative code will let the new exception escape rather
> than silently suppressing the return exception)
>   

But this one I still don't get.  Let me try a couple of cases:

1)  We have a coroutine that expects you to call send and/or throw with 
specific values, and ends up returning a value.  A beginner may try to 
iterate over it, but will most likely get an exception on the first 
next() call because the input is not valid. Or he would get an infinite 
loop because None is not changing the state of the coroutine.  In any 
case, it is unlikely that he will get to see either StopIteration or the 
new exception, because the input is not what the coroutine expects. The 
new exception doesn't help here.

2) We have a generator that e.g. pulls values from a file, yielding the 
processed values as it goes along, and returning some form of summary at 
the end.   If I iterate over it with a for-loop, I get all the values 
as usual ... followed by an exception.  Why do I have to get an 
exception there just because the generator has some information that its 
implementer thought I might want?   Ignoring the value in this case 
seems perfectly reasonable, so having to catch an exception is just 
noise here.

3) We have a coroutine that computes something expensive, occasionally 
yielding to let other code run. It neither sends nor receives values, 
just uses yield for cooperative multitasking.  When it is done it 
returns a value.  If you loop over this coroutine, you will get a bunch 
of Nones, followed by the new exception.  You could argue that the new 
exception helps you here.  One way of accessing the returned value would 
be to catch it and look at an attribute.  However, for this case I would 
prefer to just call close on the generator to get the value afterwards.  
A beginner might be helped by the unexpected exception, but I think even 
a beginner would find that something strange was going on when the only 
value he gets for the loop variable is None.  He might even look up the 
documentation for the coroutine he was calling and see how it was 
supposed to be used.

4) ... ?

Do you have other concrete use cases I haven't thought of where the 
new exception would help?
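
Case 2 above can be written out concretely with the semantics Python later
adopted (PEP 380, where the return value rides on StopIteration); the
generator name and data are illustrative:

```python
def summarize(values):
    # Yield processed values as we go, return a summary at the end.
    total = 0
    for v in values:
        total += v
        yield v * 2          # the processed value
    return total             # the summary (carried on StopIteration)

# For-loop user: the return value is quietly ignored.
doubled = list(summarize([1, 2, 3]))   # [2, 4, 6]

# A caller that wants the summary drives the generator by hand.
gen = summarize([1, 2, 3])
out = []
try:
    while True:
        out.append(next(gen))
except StopIteration as exc:
    summary = exc.value
# out == [2, 4, 6], summary == 6
```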

- Jacob



From python at rcn.com  Sat Apr  4 03:52:21 2009
From: python at rcn.com (Raymond Hettinger)
Date: Fri, 3 Apr 2009 18:52:21 -0700
Subject: [Python-ideas] clear() method for lists
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com><91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com><873acpe2iw.fsf@benfinney.id.au>
	<7528bcdd0904031636g2fb62cecm38e25f74c0aa2d34@mail.gmail.com>
Message-ID: <AD08093125F34C038B2A5BC519CD20BB@RaymondLaptop1>

Just ask Guido for his blessing.  The implementation is trivial.


Raymond

  >Seeing mostly positive responses so far ... Is it worth writing a pre-PEP to formalize this suggestion?
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090403/a69b1e8f/attachment.html>

From jh at improva.dk  Sat Apr  4 14:01:13 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 04 Apr 2009 14:01:13 +0200
Subject: [Python-ideas] yield-from and @coroutine decorator
 [was:x=(yield from) confusion]
In-Reply-To: <49D69CF8.1050202@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>
	<49D62804.8080102@improva.dk>	<49D6346D.3000002@gmail.com>
	<49D69CF8.1050202@improva.dk>
Message-ID: <49D74C09.2020305@improva.dk>

Jacob Holm wrote:
> Here is one possible fix, never mind the syntax. We could change the 
> yield from expression from the current:
>
> RESULT = yield from EXPR
>
> by adding an extra form, possibly one of:
>
> RESULT = yield STARTEXPR from EXPR
> RESULT = yield from EXPR with STARTEXPR
> RESULT = yield from EXPR as NAME starting with STARTEXPR(NAME)
>
> And letting STARTEXPR if given take the place of the initial _i.next() 
> in the expansion(s). The point is we need to yield *something* first, 
> before rerouting all send(), next() and throw() calls to the subiterator.
>

Another possible fix would be to have new syntax for specifying that the 
initial call to the coroutine should be using send or throw instead. 
This could be seen as a restriction on what could be used as 
STARTEXPR(NAME) in the earlier syntax idea.


Yet another fix that requires no extra syntax would be to store the 
latest value yielded by the generator in a property on the generator 
(raising AttributeError before the first yield). Then the initial:

_y = _i.next()


could be replaced with:

try:
    _y = _i.gi_latest_yield  # or whatever its name would be.
except AttributeError:
    _y = _i.next()


The benefit of this version is that it requires no new syntax, it avoids 
the extra next() call for coroutines, and it opens some new ways of 
using generators. It also supports almost everything that would be 
possible with the syntax-based fix. (Everything if the property is 
actually writable, but I don't really see a use for that except perhaps 
for deleting it).
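
For illustration, the proposed property can be approximated today by a
wrapper object; the class name and the gi_latest_yield attribute are
hypothetical, and the sketch uses modern Python 3 method names:

```python
class LatestYieldWrapper:
    # Wrap a generator and record the most recent value it yielded.
    # Accessing gi_latest_yield before the first yield raises
    # AttributeError, matching the proposal.
    def __init__(self, gen):
        self._gen = gen

    def __iter__(self):
        return self

    def __next__(self):
        return self.send(None)

    def send(self, value):
        self.gi_latest_yield = self._gen.send(value)
        return self.gi_latest_yield

    def throw(self, *exc_info):
        self.gi_latest_yield = self._gen.throw(*exc_info)
        return self.gi_latest_yield

    def close(self):
        return self._gen.close()

def counter():
    for i in range(3):
        yield i

g = LatestYieldWrapper(counter())
next(g)                    # 0
next(g)                    # 1
print(g.gi_latest_yield)   # -> 1
```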

I can even remember that I have wanted such a property before, although 
I don't recall the exact use case.

One bad thing about it is that the initial yield made by the yield-from 
is then the value that the coroutine decorator was meant to discard 
(usually None). That might be a reason for allowing the property to be 
writable, or for a change in syntax after all. On the other hand, if 
this is a problem you can manually call the coroutine the way you want 
before using it in yield-from, which would then initialize the value 
exactly like with the second syntax idea.

Of the three ideas so far, I much prefer the one without extra syntax.

- Jacob



From ncoghlan at gmail.com  Sat Apr  4 14:29:00 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 04 Apr 2009 22:29:00 +1000
Subject: [Python-ideas] yield-from and @coroutine decorator
 [was:x=(yield from) confusion]
In-Reply-To: <49D74C09.2020305@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>
	<49D62804.8080102@improva.dk>	<49D6346D.3000002@gmail.com>
	<49D69CF8.1050202@improva.dk> <49D74C09.2020305@improva.dk>
Message-ID: <49D7528C.9030605@gmail.com>

Jacob Holm wrote:
> Yet another fix that requires no extra syntax would be to store the
> latest value yielded by the generator in a property on the generator
> (raising AttributeError before the first yield). Then the initial:
> 
> _y = _i.next()
> 
> 
> could be replaced with:
> 
> try:
>    _y = _i.gi_latest_yield  # or whatever its name would be.
> except AttributeError:
>    _y = _i.next()
> 
> 
> The benefit of this version is that it requires no new syntax, it avoids
> the extra next() call for coroutines, and it opens some new ways of
> using generators. It also supports almost everything that would be
> possible with the syntax-based fix. (Everything if the property is
> actually writable, but I don't really see a use for that except perhaps
> for deleting it).
> 
> I can even remember that I have wanted such a property before, although
> I don't recall the exact use case.
> 
> One bad thing about it is that the initial yield made by the yield-from
> is then the value that the coroutine decorator was meant to discard
> (usually None). That might be a reason for allowing the property to be
> writable, or for a change in syntax after all. On the other hand, if
> this is a problem you can manually call the coroutine the way you want
> before using it in yield-from, which would then initialize the value
> exactly like with the second syntax idea.
> 
> Of the three ideas so far, I much prefer the one without extra syntax.

This issue is still bouncing around in my brain, so I don't have a lot
say about it yet, but a special attribute on the generator-iterator
object that the yield from expression could check was the first possible
approach that occurred to me.

Although, rather than it being the "latest yield" from the generator, I
was thinking more of just an ordinary attribute that a @coroutine
decorator could set to indicate what to yield when firing it up with
'yield from'.
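A rough sketch of that decorator idea follows. Since generator-iterator
objects reject arbitrary attribute assignment, a side table stands in for the
proposed attribute here; the names `coroutine` and `_initial_yield` are
illustrative only, not anything from the PEP:

```python
import functools

# Hypothetical stand-in for an attribute on the generator-iterator itself
# (real generator objects don't allow setting attributes on them).
_initial_yield = {}

def coroutine(func):
    """Prime the coroutine, remembering the value its first yield produced."""
    @functools.wraps(func)
    def start(*args, **kwargs):
        gen = func(*args, **kwargs)
        _initial_yield[id(gen)] = next(gen)  # prime; record the discarded value
        return gen
    return start

@coroutine
def accumulate():
    total = 0
    while True:
        total += yield total

acc = accumulate()
first = acc.send(1)                 # already primed, so send() works at once
second = acc.send(2)
recorded = _initial_yield[id(acc)]  # what a yield-from expansion could re-yield
```

A yield-from expansion could then consult the recorded value (or the real
attribute, if one existed) instead of making a second next() call.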

On your syntax ideas, note that the parser can't do anything tricky with
expressions of the form "yield EXPR" - the parser will treat that as a
normal yield and get confused if you try to add anything after it.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Sat Apr  4 14:50:16 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 04 Apr 2009 14:50:16 +0200
Subject: [Python-ideas] yield-from and @coroutine decorator
 [was:x=(yield from) confusion]
In-Reply-To: <49D7528C.9030605@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>
	<49D62804.8080102@improva.dk>	<49D6346D.3000002@gmail.com>
	<49D69CF8.1050202@improva.dk> <49D74C09.2020305@improva.dk>
	<49D7528C.9030605@gmail.com>
Message-ID: <49D75788.6020806@improva.dk>

Nick Coghlan wrote:
> This issue is still bouncing around in my brain, so I don't have a lot
> say about it yet, but a special attribute on the generator-iterator
> object that the yield from expression could check was the first possible
> approach that occurred to me.
>   

Ok, keep it bouncing.

> Although, rather than it being the "latest yield" from the generator, I
> was thinking more of just an ordinary attribute that a @coroutine
> decorator could set to indicate what to yield when firing it up with
> 'yield from'.
>   

I made it the latest yield because I have had a use case for that in the 
past, and it seemed like a natural thing to do.

> On your syntax ideas, note that the parser can't do anything tricky with
> expressions of the form "yield EXPR" - the parser will treat that as a
> normal yield and get confused if you try to add anything after it.
>   

I don't really like the idea of new syntax anyway, now that it seems 
there is a way to avoid it. But thanks for the reminder.

- Jacob


From aahz at pythoncraft.com  Sat Apr  4 18:23:11 2009
From: aahz at pythoncraft.com (Aahz)
Date: Sat, 4 Apr 2009 09:23:11 -0700
Subject: [Python-ideas] name export
In-Reply-To: <20090403101915.479e4e0b@o>
References: <20090403101915.479e4e0b@o>
Message-ID: <20090404162311.GA8988@panix.com>

On Fri, Apr 03, 2009, spir wrote:
> 
> When I write tool modules that export useful names to client code, I
> usually use __all__ to select proper names. Sure, it's a potential
> source of identifier conflict. I have another custom __*names__ module
> attribute that allows the client at least to control which names are
> defined in the imported module:
>
> # module M
> __Mnames__ = [...]
> __all__ = ["__Mnames__"] + __Mnames__
> 
> Then
> 	from M import * ; print __Mnames__
> outputs needed naming information:
> 
> 	from M import * ; print __Mnames__ ; print dir()
> ==>
> 	['a', 'b', 'c']
> 	['__Mnames__', '__builtins__', '__doc__', '__file__', '__name__', 'a', 'b', 'c']
> 
> [Indeed, you'd have the same info with M.__all__, but I find it
> strange to have both "from M import *" and "import M" only to access
> its __all__ attribute. Also, it happens that a module name and its
> main defined name are identical, like time.time.]

Your problem is that you're using import * -- stop doing that and you
won't have an issue.  The only good use cases for import * IMO are
interactive Python and packages, and in the latter case I don't see why
anyone would need the information you propose except for debugging
purposes.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"Debugging is twice as hard as writing the code in the first place.
Therefore, if you write the code as cleverly as possible, you are, by
definition, not smart enough to debug it."  --Brian W. Kernighan


From guido at python.org  Sat Apr  4 18:34:12 2009
From: guido at python.org (Guido van Rossum)
Date: Sat, 4 Apr 2009 09:34:12 -0700
Subject: [Python-ideas] yield-from and @coroutine decorator
	[was:x=(yield from) confusion]
In-Reply-To: <49D74C09.2020305@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D60B81.6060209@gmail.com> <49D62804.8080102@improva.dk> 
	<49D6346D.3000002@gmail.com> <49D69CF8.1050202@improva.dk> 
	<49D74C09.2020305@improva.dk>
Message-ID: <ca471dc20904040934v5f35c4a5y4eb1207df5145eae@mail.gmail.com>

On Sat, Apr 4, 2009 at 5:01 AM, Jacob Holm <jh at improva.dk> wrote:
> Another possible fix would be to have new syntax for specifying that the
> initial call to the coroutine should be using send or throw instead. This
> could be seen as a restriction on what could be used as STARTEXPR(NAME) in
> the earlier syntax idea.

All, please stop making more proposals. I've got it all in my head but
no time to write it up right now. Hopefully before the weekend is over
I'll find the time.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From guido at python.org  Sat Apr  4 22:29:00 2009
From: guido at python.org (Guido van Rossum)
Date: Sat, 4 Apr 2009 13:29:00 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D69AFA.5070600@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com> 
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com> 
	<49D69AFA.5070600@improva.dk>
Message-ID: <ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>

[Answering somewhat out of order; new proposal developed at the end.]

On Fri, Apr 3, 2009 at 4:25 PM, Jacob Holm <jh at improva.dk> wrote:
> I am still trying to get a clear picture of what kind of mistakes you are
> trying to protect against.
> If it is people accidently writing return in a generator when they really
> mean yield, that is what I thought the proposal for an alternate syntax
was
> for.  That sounds like a good idea to me, especially if we could also ban
or
> discourage the use of normal return.  But the alternate syntax doesn't
have
> to mean a different exception.

I am leaning the other way now. New syntax for returning a value is a
high-cost proposition. Instead, I think we can guard against most of the
same mistakes (mixing yield and return in a generator used as an iterator)
by using a different exception to pass the value. This would delay the
failure to runtime, but it would still fail loudly, which is good enough for
me.

I want to name the new exception ReturnFromGenerator to minimize the
similarity with GeneratorExit: if we had both GeneratorExit and
GeneratorReturn there would be endless questions on the newbie forums about
the differences between the two, and people might use the wrong one. Since
ReturnFromGenerator goes *out* of the generator and GeneratorExit goes *in*,
there really are no useful parallels, and similar names would cause
confusion.

> I could easily see "return value" as a separate PEP, except PEP 380
> provides one of the better reasons for its inclusion.  It might be good to
> figure out how this feature should work by itself before complicating
things
> by integrating it in the yield-from semantics.

Here are my current thoughts on this. When a generator returns, the return
statement is treated normally (whether or not it has a value) until the
frame is about to be left (i.e. after any finally-clauses have run). Then,
it is converted to StopIteration if there was no value or
ReturnFromGenerator if there was a value. I don't care which one is picked
for an explicit "return None" -- that should be decided by implementation
expediency. (E.g. if one requires adding new opcodes and one doesn't, I'd
pick the one that doesn't.)

Normal loops (for-loops, list comprehensions, other implied loops) only
catch StopIteration, so that returning a value is still wrong here. But some
other contexts treat ReturnFromGenerator similarly to StopIteration, except the
latter conveys None and the former conveys an explicit value. This applies
to yield-from as well as to explicit or implied closing of the generator
(close() or deallocation).

So g.close() returns the value (I think I earlier said I didn't like that --
I turned around on this one). Its pseudo-code is roughly:

def close(it):
  try:
    it.throw(GeneratorExit)
  except (GeneratorExit, StopIteration):
    return None
  except ReturnFromGenerator as e:  # This block is really the only new thing
    return e.value
  # Other exceptions are passed out unchanged
  else:
    # throw() yielded a value -- unchanged
    raise RuntimeError(.....)

Deleting a generator is like closing and printing (!) a traceback (to
stderr) if close() raises an exception. A returned value is just ignored.
Explicit pseudo-code without falling back to close():

def __del__(it):
  try:
    it.throw(GeneratorExit)
  except (GeneratorExit, StopIteration, ReturnFromGenerator):
    pass
  except:
    # Some other exception happened
    <print traceback>
  else:
    # throw() yielded another value
    <print traceback>
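For concreteness, the close() behaviour proposed above can be emulated today
with a user-defined exception. This is only a sketch: `ReturnFromGenerator`
here is a stand-in class, not the proposed builtin, and the generator raises
it explicitly where the proposal would use a plain return statement:

```python
class ReturnFromGenerator(Exception):
    """Stand-in for the proposed exception; carries the returned value."""
    def __init__(self, value):
        self.value = value

def close_with_value(gen):
    """Emulate the proposed g.close() that returns the generator's value."""
    try:
        gen.throw(GeneratorExit)
    except (GeneratorExit, StopIteration):
        return None
    except ReturnFromGenerator as e:  # the only new part
        return e.value
    else:
        # throw() yielded a value instead of finishing
        raise RuntimeError('generator ignored GeneratorExit')

def counter():
    total = 0
    try:
        while True:
            total += yield
    except GeneratorExit:
        raise ReturnFromGenerator(total)  # stands in for 'return total'

g = counter()
next(g)  # prime
g.send(1)
g.send(2)
result = close_with_value(g)  # -> 3
```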

I have also worked out what I want yield-from to do, see end of this
message.

[Guido]
>> Oh, and "yield from" competes with @coroutine over
>> when the initial next() call is made, which again suggests the two
>> styles (yield-from and coroutines) are incompatible.
>
> It is a serious problem, because one of the major points of the PEP is
that
> it should be useful for refactoring coroutines.  As a matter of fact, I
> started another thread on this specific issue earlier today which only
Nick
> has so far responded to.  I think it is solvable, but requires some more
> work.

I think that's the thread where I asked you and Nick to stop making more
proposals. I am worried that a solution would become too complex, and I want
to keep the "naive" interpretation of "yield from EXPR" to be as close as
possible to "for x in EXPR: yield x". I think the @coroutine generator
(whether built-in or not) or explicit "priming" by a next() call is fine.

-----

So now let me develop my full thoughts on yield-from. This is unfortunately
long, because I want to show some intermediate stages. I am using a green
font for new code. I am using stages, where each stage provides a better
approximation of the desired semantics. Note that each stage *adds* some
semantics for corner cases that weren't handled the same way in the previous
stage. Each stage proposes an expansion for "RETVAL = yield from EXPR". I am
using Py3k syntax.

1. Stage one uses the for-loop equivalence:

for x in EXPR:
 yield x
RETVAL = None

2. Stage two expands the for-loop into an explicit while-loop that has the
same meaning. It also sets RETVAL when breaking out of the loop. This
prepares for the subsequent stages. Note that we have an explicit iter(EXPR)
call here, since that is what a for-loop does:

it = iter(EXPR)
while True:
  try:
   x = next(it)
  except StopIteration:
   RETVAL = None; break
  yield x

3. Stage three further rearranges stage 2 without making semantic changes.
Again, this prepares for later stages:

it = iter(EXPR)
try:
  x = next(it)
except StopIteration:
  RETVAL = None
else:
  while True:
    yield x
    try:
      x = next(it)
    except StopIteration:
      RETVAL = None; break

4. Stage four adds handling for ReturnFromGenerator, in both places where
next() is called:

it = iter(EXPR)
try:
  x = next(it)
except StopIteration:
  RETVAL = None
except ReturnFromGenerator as e:
  RETVAL = e.value
else:
  while True:
    yield x
    try:
      x = next(it)
    except StopIteration:
      RETVAL = None; break
    except ReturnFromGenerator as e:
      RETVAL = e.value; break

5. Stage five shows what should happen if "yield x" above returns a value:
it is passed into the subgenerator using send(). I am ignoring for now what
happens if it is not a generator; this will be cleared up later. Note that
the initial next() call does not change into a send() call, because there is
no value to send before we have yielded:

it = iter(EXPR)
try:
  x = next(it)
except StopIteration:
  RETVAL = None
except ReturnFromGenerator as e:
  RETVAL = e.value
else:
  while True:
    v = yield x
    try:
      x = it.send(v)
    except StopIteration:
      RETVAL = None; break
    except ReturnFromGenerator as e:
      RETVAL = e.value; break

6. Stage six adds more refined semantics for when "yield x" raises an
exception: it is thrown into the generator, except if it is GeneratorExit,
in which case we close() the generator and re-raise it (in this case the
loop cannot continue so we do not set RETVAL):

it = iter(EXPR)
try:
  x = next(it)
except StopIteration:
  RETVAL = None
except ReturnFromGenerator as e:
  RETVAL = e.value
else:
  while True:
    try:
      v = yield x
    except GeneratorExit:
      it.close()
      raise
    except:
      try:
        x = it.throw(*sys.exc_info())
      except StopIteration:
        RETVAL = None; break
      except ReturnFromGenerator as e:
        RETVAL = e.value; break
    else:
      try:
        x = it.send(v)
      except StopIteration:
        RETVAL = None; break
      except ReturnFromGenerator as e:
        RETVAL = e.value; break

7. In stage 7 we finally ask ourselves what should happen if it is not a
generator (but some other iterator). The best answer seems subtle: send()
should degenerate to next(), and all exceptions should simply be re-raised.
We can conceptually specify this by simply re-using the for-loop expansion:

it = iter(EXPR)
if <it is not a generator>:
  for x in it:
    yield x
  RETVAL = None
else:
  try:
    x = next(it)
  except StopIteration:
    RETVAL = None
  except ReturnFromGenerator as e:
    RETVAL = e.value
  else:
    while True:
      try:
        v = yield x
      except GeneratorExit:
        it.close()
        raise
      except:
        try:
          x = it.throw(*sys.exc_info())
        except StopIteration:
          RETVAL = None; break
        except ReturnFromGenerator as e:
          RETVAL = e.value; break
      else:
        try:
          x = it.send(v)
        except StopIteration:
          RETVAL = None; break
        except ReturnFromGenerator as e:
          RETVAL = e.value; break

Note: I don't mean that we literally should have a separate code path for
non-generators. But writing it this way adds the generator test to one place
in the spec, which helps understanding why I am choosing these semantics.
The entire code of stage 6 degenerates to stage 1 if we make the following
substitutions:

it.send(v)               -> next(it)
it.throw(*sys.exc_info()) -> raise
it.close()               -> pass

(Except for some edge cases if the incoming exception is StopIteration or
ReturnFromGenerator, so we'd have to do the test before entering the
try/except block around the throw() or send() call.)

We could do this based on the presence or absence of the send/throw/close
attributes: this would be duck typing. Or we could use isinstance(it,
types.GeneratorType). I'm not sure there are strong arguments for either
interpretation. The type check might be a little faster. We could even check
for an exact type, since GeneratorType is final. Perhaps the most important
consideration is that if EXPR produces a file stream object (which has a
close() method), it would not consistently be closed: it would be closed if
the outer generator was closed before reaching the end, but not if the loop
was allowed to run until the end of the file. So I'm leaning towards only
making the generator-specific method calls if it is really a generator.
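The type-based dispatch leaned towards above could look like this; the helper
name `_uses_generator_protocol` is illustrative, not part of any proposal:

```python
import types

def _uses_generator_protocol(it):
    """Decide whether a yield-from expansion should call send/throw/close.

    Exact type check: generators cannot be subclassed, so this is safe and
    a little faster than a chain of hasattr() duck-type checks.  The
    duck-typing alternative -- hasattr(it, 'send') and hasattr(it, 'throw')
    and hasattr(it, 'close') -- would also match file objects via close(),
    which is exactly the inconsistency described above.
    """
    return type(it) is types.GeneratorType

def gen():
    yield 1

is_gen = _uses_generator_protocol(gen())           # generator: use send/throw
is_list_iter = _uses_generator_protocol(iter([1]))  # plain iterator: use next
```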

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090404/0dad23ab/attachment.html>

From erratic at devel.ws  Sat Apr  4 22:44:20 2009
From: erratic at devel.ws (Paige Thompson)
Date: Sat, 4 Apr 2009 13:44:20 -0700
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <91ad5bf80904031324p36e26684i5b376e477c790012@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<91ad5bf80904031154h1d009f64g60da09003487d6b8@mail.gmail.com>
	<5061b39c0904031209mbb9851ete35a7c6ff452d553@mail.gmail.com>
	<91ad5bf80904031324p36e26684i5b376e477c790012@mail.gmail.com>
Message-ID: <5061b39c0904041344g639ca8f3q820301ddf830f745@mail.gmail.com>

because im just a troll

-Adele
(sent from my gphone!)

On Apr 3, 2009 1:24 PM, "George Sakkis" <george.sakkis at gmail.com> wrote:

On Fri, Apr 3, 2009 at 3:09 PM, Paige Thompson <erratic at devel.ws> wrote:
> i instinctively want to ...
If you're so happy with C#, why bother with Python in the first place ?

> -Adele
> (sent from my gphone!)
George
(sent from my gmail!)
_______________________________________________

Python-ideas mailing list
Python-ideas at python.org

http://mail.python.org/mailman/listinfo/python-ideas
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090404/3100d466/attachment.html>

From guido at python.org  Sat Apr  4 22:48:19 2009
From: guido at python.org (Guido van Rossum)
Date: Sat, 4 Apr 2009 13:48:19 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D6BC46.9000808@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com> 
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com> 
	<49D69AFA.5070600@improva.dk> <49D6AC52.4010608@gmail.com> 
	<49D6BC46.9000808@improva.dk>
Message-ID: <ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com>

On Fri, Apr 3, 2009 at 6:47 PM, Jacob Holm <jh at improva.dk> wrote:
> 1) We have a coroutine that expects you to call send and/or throw with
> specific values, and ends up returning a value.  A beginner may try to
> iterate over it, but will most likely get an exception on the first next()
> call because the input is not valid. Or he would get an infinite loop
> because None is not changing the state of the coroutine.  In any case, it is
> unlikely that he will get to see either StopIteration or the new exception,
> because the input is not what the coroutine expects. The new exception
> doesn't help here.
>
> 2) We have a generator that e.g. pulls values from a file, yielding the
> processed values as it goes along, and returning some form of summary at the
> end.  If I iterate over it with a for-loop, I get all the values as usual
> ... followed by an exception.  Why do I have to get an exception there just
> because the generator has some information that its implementer thought I
> might want?  Ignoring the value in this case seems perfectly reasonable, so
> having to catch an exception is just noise here.

Sorry, I read this message after writing a long response to an earlier
message of yours where I rejected the idea of new syntax for returning a
value from a generator. I find this example somewhat convincing, and
more so because the extra processing of ReturnFromGenerator makes my
proposal a bit messy: there are three "except StopIteration" clauses,
all with parallel "except ReturnFromGenerator" clauses.

Though the real implementation would probably merge all that code into
a single C-level function.

And new syntax *is* a much bigger burden than a new exception. I think
I need to ponder this for a while and think more about how important
it really is to hold the hand of newbies trying to write a vanilla
generator, vs. how important this use case really is (it's easily
solved with a class, for example).
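The class-based alternative mentioned in passing might look like this sketch
(names are invented for illustration): an ordinary iterator that exposes its
summary as an attribute instead of smuggling it out through an exception:

```python
class SummingIterator:
    """Yields each value while accumulating a summary readable afterwards."""

    def __init__(self, data):
        self._it = iter(data)
        self.total = 0  # the "summary", available once iteration finishes

    def __iter__(self):
        return self

    def __next__(self):
        value = next(self._it)
        self.total += value
        return value

s = SummingIterator([1, 2, 3])
consumed = list(s)  # a plain for-loop works; no special exception to catch
summary = s.total
```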

> 3) We have a coroutine that computes something expensive, occasionally
> yielding to let other code run. It neither sends nor receives values, just
> uses yield for cooperative multitasking.  When it is done it returns a
> value.  If you loop over this coroutine, you will get a bunch of Nones,
> followed by the new exception.  You could argue that the new exception helps
> you here.  One way of accessing the returned value would be to catch it and
> look at an attribute.  However, for this case I would prefer to just call
> close on the generator to get the value afterwards.  A beginner might be
> helped by the unexpected exception, but I think even a beginner would find
> that something strange was going on when the only value he gets for the loop
> variable is None.  He might even look up the documentation for the coroutine
> he was calling and see how it was supposed to be used.

Well if you have nothing else to do you could just use "yield from"
over this coroutine and get the return value through that syntax.

And if you're going to call next() on it with other activities in
between, you have to catch StopIteration from that next() call anyway
-- you would have to also catch ReturnFromGenerator to extract the
value.

I don't believe that once the generator has raised StopIteration or
ReturnFromGenerator, the return value should be saved somewhere to be
retrieved with an explicit close() call -- I want to be able to free
all resources once the generator frame is dead.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From benjamin at python.org  Sat Apr  4 23:38:50 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Sat, 4 Apr 2009 21:38:50 +0000 (UTC)
Subject: [Python-ideas] list.index() extension
Message-ID: <loom.20090404T210511-367@post.gmane.org>

I would like to propose an extra parameter `predicate` to list.index() like this:

def index(self, obj, predicate=operator.eq):
    for idx, item in enumerate(self):
        if predicate(item, obj):
            return idx
    raise ValueError

My main use-case is 2to3 where a node has to locate itself in the parent's node
list by identity. Instead of writing out the search manually as is done now, it
would be nice to just write `self.parent.nodes.index(self, operator.is_)`.

I can also imagine this might be useful:

print "The first number less than 5 in this list is %s" % (my_list.index(5,
operator.lt),)
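As a runnable sketch, the same idea as a free function (the bare `Node` class
is invented here just to mimic the 2to3 use case):

```python
import operator

def index(seq, obj, predicate=operator.eq):
    """Return the index of the first item for which predicate(item, obj) holds."""
    for idx, item in enumerate(seq):
        if predicate(item, obj):
            return idx
    raise ValueError('no matching item')  # list.index also raises ValueError

class Node:
    pass

a, b = Node(), Node()
nodes = [a, b]
by_identity = index(nodes, b, operator.is_)     # locate a node by identity
first_lt = index([9, 7, 3, 8], 5, operator.lt)  # first item less than 5
```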



From greg.ewing at canterbury.ac.nz  Sat Apr  4 23:51:21 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 05 Apr 2009 09:51:21 +1200
Subject: [Python-ideas] yield-from and @coroutine decorator
 [was:x=(yield from) confusion]
In-Reply-To: <49D69CF8.1050202@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com> <49D62804.8080102@improva.dk>
	<49D6346D.3000002@gmail.com> <49D69CF8.1050202@improva.dk>
Message-ID: <49D7D659.8060903@canterbury.ac.nz>

Jacob Holm wrote:

>  RESULT = yield STARTEXPR from EXPR
>  RESULT = yield from EXPR with STARTEXPR
>  RESULT = yield from EXPR as NAME starting with STARTEXPR(NAME)
> 
> And letting STARTEXPR if given take the place of the initial _i.next() 

No, that's not satisfactory at all, because it introduces
a spurious value into the stream of yielded values seen
by the user of the outer generator.

For refactoring to work correctly, the first value yielded
by the yield-from expression *must* be the first value
yielded by the subgenerator. There's no way of achieving
that when using the Beazley decorator, because it thinks
the first yielded value is of no interest and discards it.

>> We have a long way to go before we even come close to consuming as many
>> pixels as PEP 308 or PEP 343 - a fact for which Greg is probably 
>> grateful ;)

At least Guido will know if someone manages to unsubscribe
him from the list -- he won't be getting 500 messages about
yield-from every day. :-)

-- 
Greg


From leif.walsh at gmail.com  Sat Apr  4 23:50:44 2009
From: leif.walsh at gmail.com (Leif Walsh)
Date: Sat, 4 Apr 2009 17:50:44 -0400
Subject: [Python-ideas] list.index() extension
In-Reply-To: <loom.20090404T210511-367@post.gmane.org>
References: <loom.20090404T210511-367@post.gmane.org>
Message-ID: <cc7430500904041450y4a85026jea9add038367f6d@mail.gmail.com>

On Sat, Apr 4, 2009 at 5:38 PM, Benjamin Peterson <benjamin at python.org> wrote:
> I would like to propose an extra parameter `predicate` to list.index() like this:
>
> def index(self, obj, predicate=operator.eq):
>     for idx, item in enumerate(self):
>         if predicate(item, obj):
>             return idx
>     raise ValueError
>
> My main use-case is 2to3 where a node has to locate itself in the parent's node
> list by identity. Instead of writing out the search manually as is done now, it
> would be nice to just write `self.parent.nodes.index(self, operator.is_)`.
>
> I can also imagine this might be useful:
>
> print "The first number less than 5 in this list is %s" % (my_list.index(5,
> operator.lt),)

print "The first number less than 5 in this list is my_list[%d]=%s" %
((idx, elt) for idx, elt in enumerate(my_list) if elt < 5).next()

Okay, it's sort of ugly and rubyish, but I think it solves your case
sufficiently that we don't need to change index().  If you can come up
with a more pressing reason though, I'm all ears (and fingers,
evidently).

-- 
Cheers,
Leif


From benjamin at python.org  Sun Apr  5 00:00:13 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Sat, 4 Apr 2009 22:00:13 +0000 (UTC)
Subject: [Python-ideas] list.index() extension
References: <loom.20090404T210511-367@post.gmane.org>
	<cc7430500904041450y4a85026jea9add038367f6d@mail.gmail.com>
Message-ID: <loom.20090404T215956-523@post.gmane.org>

2009/4/4 Leif Walsh <leif.walsh at gmail.com>:
> On Sat, Apr 4, 2009 at 5:38 PM, Benjamin Peterson <benjamin at python.org> wrote:
>> I can also imagine this might be useful:
>>
>> print "The first number less than 5 in this list is %s" % (my_list.index(5,
>> operator.lt),)
>
> print "The first number less than 5 in this list is my_list[%d]=%s" %
> ((idx, elt) for idx, elt in enumerate(my_list) if elt < 5).next()

That does something different. Mine would tell you the first index of a
number less than 5, and yours would tell you what that value was.

>
> Okay, it's sort of ugly and rubyish, but I think it solves your case
> sufficiently that we don't need to change index().  If you can come up
> with a more pressing reason though, I'm all ears (and fingers,
> evidently).

Did you see the 2to3 one?



From leif.walsh at gmail.com  Sun Apr  5 00:07:37 2009
From: leif.walsh at gmail.com (Leif Walsh)
Date: Sat, 4 Apr 2009 18:07:37 -0400
Subject: [Python-ideas] list.index() extension
In-Reply-To: <1afaf6160904041458x6d1ea0c5n5df35fad7464989c@mail.gmail.com>
References: <loom.20090404T210511-367@post.gmane.org>
	<cc7430500904041450y4a85026jea9add038367f6d@mail.gmail.com> 
	<1afaf6160904041458x6d1ea0c5n5df35fad7464989c@mail.gmail.com>
Message-ID: <cc7430500904041507p6acb300cradb437caad2ad0b4@mail.gmail.com>

On Sat, Apr 4, 2009 at 5:58 PM, Benjamin Peterson <benjamin at python.org> wrote:
> 2009/4/4 Leif Walsh <leif.walsh at gmail.com>:
>> On Sat, Apr 4, 2009 at 5:38 PM, Benjamin Peterson <benjamin at python.org> wrote:
>>> print "The first number less than 5 in this list is %s" % (my_list.index(5,
>>> operator.lt),)
>>
>> print "The first number less than 5 in this list is my_list[%d]=%s" %
>> ((idx, elt) for idx, elt in enumerate(my_list) if elt < 5).next()
>
> That does something different. My would tell you the first index of a
> number less than 5 and your what tell you what that was.

Mine does both, actually, and you can get whichever part you need out of it.

> Did you see the 2to3 one?

Yes.  I don't know the details of the implementations of those
classes, but I think you could easily cook up something quite similar
to what I did above.

I'm not really -1 or +1 on this, I just think it's probably easier for
you to use a generator expression than to try to convince python-ideas
that this needs to happen.  My finger could be way off the list's
pulse though.

-- 
Cheers,
Leif


From lists at cheimes.de  Sun Apr  5 00:45:57 2009
From: lists at cheimes.de (Christian Heimes)
Date: Sun, 05 Apr 2009 00:45:57 +0200
Subject: [Python-ideas] list.index() extension
In-Reply-To: <loom.20090404T210511-367@post.gmane.org>
References: <loom.20090404T210511-367@post.gmane.org>
Message-ID: <gr8nv5$lk3$1@ger.gmane.org>

Benjamin Peterson schrieb:
> I would like to propose an extra parameter `predicate` to list.index() like this:
> 
> def index(self, obj, predicate=operator.eq):
>     for idx, item in enumerate(self):
>         if predicate(item, obj):
>             return idx
>     raise ValueError

+0 from me

Are you planning on adding the same feature to count(), too?

Christian



From ncoghlan at gmail.com  Sun Apr  5 00:54:08 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 05 Apr 2009 08:54:08 +1000
Subject: [Python-ideas] name export
In-Reply-To: <20090404162311.GA8988@panix.com>
References: <20090403101915.479e4e0b@o> <20090404162311.GA8988@panix.com>
Message-ID: <49D7E510.5040400@gmail.com>

Aahz wrote:
> Your problem is that you're using import * -- stop doing that and you
> won't have an issue.  The only good use cases for import * IMO are
> interactive Python and packages, and in the latter case I don't see why
> anyone would need the information you propose except for debugging
> purposes.

We've recently discovered another use for it: overriding a pure Python
implementation with optional native language accelerated components.
Using "from _accelerated_name import *" allows other Python
implementations to easily choose a different subset of functions and
classes to accelerate.

That one is only relevant to people writing modules that they would like
to work unchanged with more than one Python implementation though.
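The pattern Nick describes is roughly the following (the module names here
are made up; the stdlib uses the same shape for modules with optional C
accelerators):

```python
# tool.py -- portable pure-Python implementations come first.

__all__ = ['process']

def process(data):
    """Portable pure-Python implementation."""
    return sorted(data)

# An implementation that ships an accelerator module overrides only the
# names that module actually defines; everything else keeps the fallback.
try:
    from _tool_speedups import *  # hypothetical accelerator module
except ImportError:
    pass
```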

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Sun Apr  5 01:11:32 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 05 Apr 2009 09:11:32 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet
 another	alternative name for yield-from]
In-Reply-To: <ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
Message-ID: <49D7E924.6040101@gmail.com>

Guido van Rossum wrote:
> [Guido]
>>> Oh, and "yield from" competes with @coroutine over
>>> when the initial next() call is made, which again suggests the two
>>> styles (yield-from and coroutines) are incompatible.
>>
>> It is a serious problem, because one of the major points of the PEP is
>> that it should be useful for refactoring coroutines.  As a matter of
>> fact, I started another thread on this specific issue earlier today
>> which only Nick has so far responded to.  I think it is solvable, but
>> requires some more work.
> 
> I think that's the thread where I asked you and Nick to stop making more
> proposals. I am worried that a solution would become too complex, and I
> want to keep the "naive" interpretation of "yield from EXPR" to be as
> close as possible to "for x in EXPR: yield x". I think the @coroutine
> generator (whether built-in or not) or explicit "priming" by a next()
> call is fine.

The trick is that if the definition of "yield from" *includes* the
priming step, then we are saying that coroutines *shouldn't* be primed
in a decorator. I don't actually have a problem with that, so long as we
realise that existing coroutines that are automatically primed when
created won't work unmodified with "yield from" (since they would get
primed twice - once by the wrapper function and once by the "yield from"
expression).

To be honest, I see that "auto-priming" behaviour as similar to merging
creation of threading.Thread instances with calling t.start() on them -
while it is sometimes convenient to do that, making it impossible to
separate the creation from the activation (the way a @coroutine decorator
does) actually seems like an undesirable thing to do.
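For readers following along, a minimal @coroutine decorator of the kind being
discussed might look like this (a sketch of the idea from the thread, not any
stdlib API; the accumulator example is made up for illustration):

```python
import functools

def coroutine(func):
    """Advance a new generator to its first yield automatically."""
    @functools.wraps(func)
    def primed(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)  # prime: run the body up to the first yield
        return gen
    return primed

@coroutine
def accumulator():
    total = 0
    while True:
        total += yield total

acc = accumulator()   # already primed: no explicit next() call needed
print(acc.send(10))   # -> 10
print(acc.send(5))    # -> 15
```

Passing such an already-primed generator to a "yield from" expansion that
includes the priming step would advance it a second time - which is exactly
the double-priming conflict described above.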

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From greg.ewing at canterbury.ac.nz  Sun Apr  5 01:14:25 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 05 Apr 2009 11:14:25 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
Message-ID: <49D7E9D1.3050700@canterbury.ac.nz>

Guido van Rossum wrote:

> We could do this based on the presence or absence of the 
> send/throw/close attributes: this would be duck typing. Or we could use 
> isinstance(it, types.GeneratorType).

I don't like the idea of switching the entire behaviour
based on a blanket generator/non-generator distinction.

Failure to duck-type is unpythonic unless there's a very
good reason for it, and I don't see any strong reason here.
Why shouldn't an iterator be able to emulate a generator
by providing all the necessary methods?
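As an illustration of that duck-typing argument, an ordinary class can emulate
the generator protocol simply by supplying the methods itself (CountingChannel
is a made-up example, not something from the thread):

```python
class CountingChannel(object):
    """An iterator that duck-types the generator protocol."""
    def __init__(self):
        self.total = 0
        self.closed = False
    def __iter__(self):
        return self
    def __next__(self):
        return self.send(None)
    next = __next__  # Python 2 spelling of the same method
    def send(self, value):
        if self.closed:
            raise StopIteration
        if value is not None:
            self.total += value
        return self.total
    def throw(self, exc_type, value=None, tb=None):
        # Simplified: a real generator would run except/finally clauses.
        self.close()
        if value is not None:
            raise value
        raise exc_type
    def close(self):
        self.closed = True
```

An isinstance(it, types.GeneratorType) check would reject such an object even
though it provides everything a delegating "yield from" would need.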

On the other hand, if we're looking at the presence of
methods, what happens if e.g. it has a throw() method
but not a send() method? Do we treat it as though the
throw() method didn't exist just because it doesn't have
the full complement of generator methods? That doesn't
seem very pythonic either.

> if EXPR produces 
> a file stream object (which has a close() method), it would not 
> consistently be closed: it would be closed if the outer generator was 
> closed before reaching the end, but not if the loop was allowed to run 
> until the end of the file.

I don't think this is a serious problem, for the following
reasons:

1. We've already more or less decided that yield-from is not
going to address the case of shared subiterators, so if anything
else would care about the file being closed unexpectedly, you
shouldn't be using yield-from on it.

2. It's well known that you can't rely on automatic closing
of files in non-refcounting implementations, so code wanting
to ensure the file is closed will need to do so explicitly
using a finally clause or something equivalent, which will
get triggered by closing the outer generator.
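Point 2 can be made concrete: a generator that must guarantee its file is
closed can use its own finally clause, which runs both on normal exhaustion
and when the generator (or an outer generator delegating to it) is closed.
A minimal sketch:

```python
def read_lines(path):
    f = open(path)
    try:
        for line in f:
            yield line
    finally:
        # Runs when the generator is exhausted, garbage-collected,
        # or explicitly close()d - e.g. via the outer generator's close().
        f.close()
```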

-- 
Greg


From leif.walsh at gmail.com  Sun Apr  5 01:22:53 2009
From: leif.walsh at gmail.com (Leif Walsh)
Date: Sat, 4 Apr 2009 19:22:53 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D7E9D1.3050700@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com> 
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com> 
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com> 
	<49D7E9D1.3050700@canterbury.ac.nz>
Message-ID: <cc7430500904041622q5a08d29bma4836640b445361e@mail.gmail.com>

I haven't been following this discussion too much (as I would have no
time for anything else if I did, it seems), but I think I understand
the problem with priming a coroutine, and then trying to use it in
yield from, and I may have a solution.  I don't understand what it
means to 'yield from' a coroutine, but I'll here's my proposed fix:

Give all generators/coroutines a 'prime' (or better named) function.
This prime function can set some 'is_primed' internal variable so that
it never primes more than once.  Now, yield from and @coroutine can
(and this is the hazy part because I don't know what yield from is
really doing under the hood) both use prime(), so yielding from a
non-decorated coroutine will have the same effect as yielding from a
decorated coroutine.
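A rough sketch of that idea follows. The external WeakSet stands in for the
proposed 'is_primed' internal variable, since generator objects don't accept
new attributes from Python code; prime() and counter() are illustrative names:

```python
import weakref

_primed = weakref.WeakSet()  # generators we have already primed

def prime(gen):
    """Advance a generator to its first yield at most once."""
    if gen not in _primed:
        next(gen)
        _primed.add(gen)
    return gen

def counter():
    total = 0
    while True:
        total += yield total

c = prime(counter())
prime(c)           # no-op: already primed
print(c.send(4))   # -> 4
```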

-- 
Cheers,
Leif


From benjamin at python.org  Sun Apr  5 01:23:23 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Sat, 4 Apr 2009 23:23:23 +0000 (UTC)
Subject: [Python-ideas] list.index() extension
References: <loom.20090404T210511-367@post.gmane.org>
	<gr8nv5$lk3$1@ger.gmane.org>
Message-ID: <loom.20090404T231931-372@post.gmane.org>

Christian Heimes <lists at ...> writes:
> 
> Are you planing in adding the same feature for count, too?

I don't care, but I suppose for consistency it should get the extra argument,
too, as well as tuple's index() and count().






From leif.walsh at gmail.com  Sun Apr  5 01:23:32 2009
From: leif.walsh at gmail.com (Leif Walsh)
Date: Sat, 4 Apr 2009 19:23:32 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <cc7430500904041622q5a08d29bma4836640b445361e@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com> 
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com> 
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com> 
	<49D7E9D1.3050700@canterbury.ac.nz>
	<cc7430500904041622q5a08d29bma4836640b445361e@mail.gmail.com>
Message-ID: <cc7430500904041623h78aa2173pc377311431ea03ef@mail.gmail.com>

On Sat, Apr 4, 2009 at 7:22 PM, Leif Walsh <leif.walsh at gmail.com> wrote:
> but I'll here's my proposed fix:

hoo-ray copyediting!

-- 
Cheers,
Leif


From greg.ewing at canterbury.ac.nz  Sun Apr  5 01:27:00 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 05 Apr 2009 11:27:00 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk> <49D6AC52.4010608@gmail.com>
	<49D6BC46.9000808@improva.dk>
	<ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com>
Message-ID: <49D7ECC4.1030008@canterbury.ac.nz>

Guido van Rossum wrote:

> how important this use case really is (it's easily
> solved with a class, for example).

Yes, it's my feeling that a class would be better for
this kind of thing too.

Let's not lose sight of the fundamental motivation
for all this, the way I see it at least: yield-from is
primarily to permit factoring of generator code. Any
proposals for enhancements or extensions ought to be
justified in relation to that.

> I don't believe that once the generator has raised StopIteration or
> ReturnFromGenerator, the return value should be saved somewhere to be
> retrieved with an explicit close() call -- I want to be able to free
> all resources once the generator frame is dead.

I agree with that.

As a corollary, I *don't* think that close() should
return the value of a ReturnFromGenerator even if it
gets one, because unless the value is stored, you'll
only get it the first time close() is called, and
only if the generator has not already completed
normally. That would make it too unreliable for any
practical use as far as I can see.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Sun Apr  5 01:36:51 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 05 Apr 2009 11:36:51 +1200
Subject: [Python-ideas] list.index() extension
In-Reply-To: <gr8nv5$lk3$1@ger.gmane.org>
References: <loom.20090404T210511-367@post.gmane.org>
	<gr8nv5$lk3$1@ger.gmane.org>
Message-ID: <49D7EF13.6030001@canterbury.ac.nz>

Christian Heimes wrote:

>>def index(self, obj, predicate=operator.eq):
>>    for idx, item in enumerate(self):
>>        if predicate(item, obj):
>>            return idx
>>    raise IndexError

This looks more like it belongs in the itertools module,
if there isn't something there already that does it.

-- 
Greg


From benjamin at python.org  Sun Apr  5 01:51:17 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Sat, 4 Apr 2009 23:51:17 +0000 (UTC)
Subject: [Python-ideas] list.index() extension
References: <loom.20090404T210511-367@post.gmane.org>
	<gr8nv5$lk3$1@ger.gmane.org> <49D7EF13.6030001@canterbury.ac.nz>
Message-ID: <loom.20090404T235013-362@post.gmane.org>

Greg Ewing <greg.ewing at ...> writes:

> 
> Christian Heimes wrote:
> 
> >>def index(self, obj, predicate=operator.eq):
> >>    for idx, item in enumerate(self):
> >>        if predicate(item, obj):
> >>            return idx
> >>    raise IndexError
> 
> This looks more like it belongs in the itertools module,
> if there isn't something there already that does it.

There's `next(itertools.ifilter(lambda x: x is my_obj, some_list))`, but that
returns the object and not the index as I want.






From cs at zip.com.au  Sun Apr  5 01:44:10 2009
From: cs at zip.com.au (Cameron Simpson)
Date: Sun, 5 Apr 2009 09:44:10 +1000
Subject: [Python-ideas] list.index() extension
In-Reply-To: <49D7EF13.6030001@canterbury.ac.nz>
Message-ID: <20090404234410.GA20535@cskk.homeip.net>

On 05Apr2009 11:36, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Christian Heimes wrote:
>>> def index(self, obj, predicate=operator.eq):
>>>    for idx, item in enumerate(self):
>>>        if predicate(item, obj):
>>>            return idx
>>>    raise IndexError
>
> This looks more like it belongs in the itertools module,
> if there isn't something there already that does it.

Isn't it trivially built on top of itertools.takewhile?
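One way the takewhile approach might look, with the caveat that counting the
non-matching prefix only yields an index when the sequence's length is known
(index_by is an illustrative name, not an existing function):

```python
from itertools import takewhile

def index_by(seq, obj, predicate):
    """Index of the first item for which predicate(item, obj) is true."""
    # Count the leading items that do NOT satisfy the predicate.
    idx = sum(1 for _ in takewhile(lambda item: not predicate(item, obj), seq))
    if idx == len(seq):
        raise ValueError("no matching element")
    return idx
```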
-- 
Cameron Simpson <cs at zip.com.au> DoD#743
http://www.cskk.ezoshosting.com/cs/


From leif.walsh at gmail.com  Sun Apr  5 01:57:52 2009
From: leif.walsh at gmail.com (Leif Walsh)
Date: Sat, 4 Apr 2009 19:57:52 -0400
Subject: [Python-ideas] list.index() extension
In-Reply-To: <loom.20090404T235013-362@post.gmane.org>
References: <loom.20090404T210511-367@post.gmane.org>
	<gr8nv5$lk3$1@ger.gmane.org> <49D7EF13.6030001@canterbury.ac.nz>
	<loom.20090404T235013-362@post.gmane.org>
Message-ID: <cc7430500904041657u785898cdm31352bb996989c8c@mail.gmail.com>

On Sat, Apr 4, 2009 at 7:51 PM, Benjamin Peterson <benjamin at python.org> wrote:
> There's `next(itertools.ifilter(some_list, lambda x: x is my_obj))`, but that
> returns the object and not the index as I want.

Maybe you didn't understand what I meant.  Try the following code,
with your favorite list and object.  I promise it works.

try:
    print "The index of the first element that 'is obj' is %d." % \
        (idx for idx, elt in enumerate(lst) if elt is obj).next()
except StopIteration:
    print "obj is not in lst"

-- 
Cheers,
Leif


From benjamin at python.org  Sun Apr  5 02:39:24 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Sun, 5 Apr 2009 00:39:24 +0000 (UTC)
Subject: [Python-ideas] list.index() extension
References: <loom.20090404T210511-367@post.gmane.org>
	<gr8nv5$lk3$1@ger.gmane.org> <49D7EF13.6030001@canterbury.ac.nz>
	<loom.20090404T235013-362@post.gmane.org>
	<cc7430500904041657u785898cdm31352bb996989c8c@mail.gmail.com>
Message-ID: <loom.20090405T003847-380@post.gmane.org>



2009/4/4 Leif Walsh <leif.walsh at gmail.com>:
> Maybe you didn't understand what I meant.  Try the following code,
> with your favorite list and object.  I promise it works.
>
> try:
>  print "The index of the first element that 'is obj' is %d." % (idx
> for idx, elt in enumerate(lst) if elt is obj).next()
> except StopIteration:
>  print "obj is not in lst"

Yes, I saw that but it still strikes me as much uglier than necessary.



-- 
Regards,
Benjamin





From leif.walsh at gmail.com  Sun Apr  5 03:21:40 2009
From: leif.walsh at gmail.com (Leif Walsh)
Date: Sat, 4 Apr 2009 21:21:40 -0400 (EDT)
Subject: [Python-ideas] list.index() extension
In-Reply-To: <loom.20090405T003847-380@post.gmane.org>
Message-ID: <ft51sq8jnpfwr7mg06UYAxe124vaj_firegpg@mail.gmail.com>

On Sat, Apr 4, 2009 at 8:39 PM, Benjamin Peterson <benjamin at python.org> wrote:
> Yes, I saw that but it still strikes me as much uglier than necessary.

Fair enough. ;-)

I'll leave you alone then.  Have fun.

-- 
Cheers,
Leif

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 197 bytes
Desc: OpenPGP digital signature
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090404/fb3026e8/attachment.pgp>

From aahz at pythoncraft.com  Sun Apr  5 03:55:34 2009
From: aahz at pythoncraft.com (Aahz)
Date: Sat, 4 Apr 2009 18:55:34 -0700
Subject: [Python-ideas] list.index() extension
In-Reply-To: <loom.20090404T210511-367@post.gmane.org>
References: <loom.20090404T210511-367@post.gmane.org>
Message-ID: <20090405015534.GA19165@panix.com>

On Sat, Apr 04, 2009, Benjamin Peterson wrote:
>
> I would like to propose an extra parameter `predicate` to list.index()
> like this:
>
> def index(self, obj, predicate=operator.eq):
>     for idx, item in enumerate(self):
>         if predicate(item, obj):
>             return idx
>     raise IndexError
> 
> My main use-case is 2to3, where a node has to locate itself in
> its parent's node list by identity. Instead of writing out the
> search manually as is done now, it would be nice to just write
> `self.parent.nodes.index(self, operator.is_)`.

-1 -- it complicates the documentation too much for a small feature.
Your function looks just fine, and I see no reason to add a new method.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"Debugging is twice as hard as writing the code in the first place.
Therefore, if you write the code as cleverly as possible, you are, by
definition, not smart enough to debug it."  --Brian W. Kernighan


From tjreedy at udel.edu  Sun Apr  5 04:14:30 2009
From: tjreedy at udel.edu (Terry Reedy)
Date: Sat, 04 Apr 2009 22:14:30 -0400
Subject: [Python-ideas] name export
In-Reply-To: <49D7E510.5040400@gmail.com>
References: <20090403101915.479e4e0b@o> <20090404162311.GA8988@panix.com>
	<49D7E510.5040400@gmail.com>
Message-ID: <gr9464$ar0$1@ger.gmane.org>

Nick Coghlan wrote:
> Aahz wrote:
>> Your problem is that you're using import * -- stop doing that and you
>> won't have an issue.  The only good use cases for import * IMO are
>> interactive Python and packages, and in the latter case I don't see why
>> anyone would need the information you propose except for debugging
>> purposes.
> 
> We've recently discovered another use for it: overriding a pure Python
> implementation with optional native language accelerated components.
> Using "from _accelerated_name import *" allows other Python
> implementations to easily choose a different subset of functions and
> classes to accelerate.

It also allows the set of accelerated components to change (typically, 
grow) within an implementation.

> That one is only relevant to people writing modules that they would like
> to work unchanged with more than one Python implementation though.

tjr



From tjreedy at udel.edu  Sun Apr  5 04:20:47 2009
From: tjreedy at udel.edu (Terry Reedy)
Date: Sat, 04 Apr 2009 22:20:47 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
Message-ID: <gr94hu$cj7$1@ger.gmane.org>

Guido van Rossum wrote:

> So now let me develop my full thoughts on yield-from. This is 
> unfortunately long, because I want to show some intermediate stages.

I found this extremely helpful.  Whatever expansion you finally decide 
on (pun intended ;-), an explanation like this in the PEP would be nice.

tjr



From steve at pearwood.info  Sun Apr  5 04:52:23 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Sun, 5 Apr 2009 13:52:23 +1100
Subject: [Python-ideas] list.index() extension
In-Reply-To: <49D7EF13.6030001@canterbury.ac.nz>
References: <loom.20090404T210511-367@post.gmane.org>
	<gr8nv5$lk3$1@ger.gmane.org> <49D7EF13.6030001@canterbury.ac.nz>
Message-ID: <200904051252.24476.steve@pearwood.info>

On Sun, 5 Apr 2009 09:36:51 am Greg Ewing wrote:
> Christian Heimes wrote:
> >>def index(self, obj, predicate=operator.eq):
> >>    for idx, item in enumerate(self):
> >>        if predicate(item, obj):
> >>            return idx
> >>    raise IndexError
>
> This looks more like it belongs in the itertools module,
> if there isn't something there already that does it.

Not to me. Surely itertools is for functions which return iterators, not 
arbitrary functions that take an iterable argument?

It certainly does look like a useful function to have, but I don't know 
where in the standard library it should go, given that Guido dislikes 
grab-bag modules of miscellaneous functions. To my mind, that makes it 
a candidate to become a list/tuple/str method.

I like Christian's suggestion. I've often wished for a built-in way to 
do element testing by identity instead of equality, and by "often" I 
mean occasionally. So I'm +1 on the proposal.


-- 
Steven D'Aprano


From steve at pearwood.info  Sun Apr  5 05:01:56 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Sun, 5 Apr 2009 14:01:56 +1100
Subject: [Python-ideas] list.index() extension
In-Reply-To: <loom.20090404T210511-367@post.gmane.org>
References: <loom.20090404T210511-367@post.gmane.org>
Message-ID: <200904051301.56929.steve@pearwood.info>

On Sun, 5 Apr 2009 07:38:50 am Benjamin Peterson wrote:

> def index(self, obj, predicate=operator.eq):
>     for idx, item in enumerate(self):
>         if predicate(item, obj):
>             return idx
>     raise IndexError


Isn't that a misuse of IndexError?

>>> [1, 2, 3].index(5)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: list.index(x): x not in list

>>> [1, 2, 3][4]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
IndexError: list index out of range



-- 
Steven D'Aprano


From kay at fiber-space.de  Sun Apr  5 08:20:17 2009
From: kay at fiber-space.de (Kay Schluehr)
Date: Sun, 05 Apr 2009 08:20:17 +0200
Subject: [Python-ideas] copyable generators
In-Reply-To: <20090404162311.GA8988@panix.com>
References: <20090403101915.479e4e0b@o> <20090404162311.GA8988@panix.com>
Message-ID: <49D84DA1.7090509@fiber-space.de>

Dear Python idea-lists,

I've noticed that subgenerators, coroutines and such stuff are heavily 
discussed and are on topic here right now. What I'd like to ask is about 
the mood of re-considering the deferred PEP 323 about copyable iterators 
and related ideas.

I did some work a while ago on copyable and picklable generators
in pure Python that works within some limitations, but basically relies on a
fierce and almost impenetrable (although documented) bytecode hack. It
is bundled in a package called generator_tools. The design is mostly
"test-driven", which means that I wouldn't dare in my life to try a more
formal specification of the used heuristics and special case treatments.

http://pypi.python.org/pypi/generator_tools/0.3.6

I know that this package has users, and there were also quite a few 
contributors who provided bug reports and fixes, so it isn't an academic 
exercise. I would love to see this package abandoned in the future 
and replaced by a proper, more complete and faster CPython 
implementation. I know I'm not really in a position to call for 
volunteers, in particular because I'm not a core developer and don't want 
to hack the CPython C code base. But maybe this would be a cool idea for 
a summer-of-code project or something like that. I know that Stackless 
Python has a working implementation, so it should at least be feasible.

Regards


-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090405/3d41dbef/attachment.html>

From list at qtrac.plus.com  Sun Apr  5 09:29:24 2009
From: list at qtrac.plus.com (Mark Summerfield)
Date: Sun, 5 Apr 2009 08:29:24 +0100
Subject: [Python-ideas] clear() method for lists
In-Reply-To: <5d1a32000904031033x36eb2fafqb395ac793c36161c@mail.gmail.com>
References: <7528bcdd0904030353v303635fem1e757a5d6d625f8f@mail.gmail.com>
	<200904031215.21277.list@qtrac.plus.com>
	<5d1a32000904031033x36eb2fafqb395ac793c36161c@mail.gmail.com>
Message-ID: <200904050829.25050.list@qtrac.plus.com>

On 2009-04-03, Gerald Britton wrote:
> About your example, what is the advantage over inheriting from list?
> I did this myself to build a kind of treed list class that supports
> nested lists:
>
> Class TreeList(list):
>  # uses most list methods
>    def __iter__:
>    # does a recursive descent through nested lists.
>
> I only had to implement methods that I wanted to add some extra sauce to.

Yes, but sometimes you need something that offers _less_ functionality
than a list (e.g., you could create a stack or queue by aggregating a
list), in which case it is easier to aggregate and just add or delegate
the methods you want without having to worry about "unimplementing"
those that you don't want---although unimplementing is possible of
course.

>
> On Fri, Apr 3, 2009 at 7:15 AM, Mark Summerfield <list at qtrac.plus.com> 
wrote:
> > On 2009-04-03, Andre Roberge wrote:
> >> Hi everyone,
> >>
> >> On the general Python list, a suggestion was made to add a clear()
> >> method to list, as the "obvious" way to do
> >> del some_list[:]
> >> or
> >> some_list[:] = []
> >>
> >> since the clear() method is currently the obvious way to remove all
> >> elements from dict and set objects.
> >>
> >> I believe that this would be a lot more intuitive to beginners learning
> >> the language, making Python more uniform.
> >>
> >> André
> >
> > Hi,
> >
> > I have a use case for list.clear() (might be a bit obscure though).
> >
> > If you have a class that includes a list as an attribute (e.g., a list
> > "subclass" that uses aggregation rather than inheritance), you might
> > want to delegate many list methods to the list attribute and only
> > implement those you want to treat specially. I show an example of this
> > in "Programming in Python 3" (pages 367/8) where I have a @delegate
> > decorator that accepts an attribute name and a tuple of methods to
> > delegate to, e.g.:
> >
> >    @delegate("__list", ("pop", "__delitem__", "__getitem_", ...))
> >    class MyList:
> >        ...
> >        def clear(self):
> >            self.__list = []
> >
> > But because there is no list.clear(), the clear() method must be
> > implemented rather than delegated even though it doesn't do anything
> > special.
> >
> >
> > +1
> >
> > --
> > Mark Summerfield, Qtrac Ltd, www.qtrac.eu
> >    C++, Python, Qt, PyQt - training and consultancy
> >        "Programming in Python 3" - ISBN 0137129297
> >
> > _______________________________________________
> > Python-ideas mailing list
> > Python-ideas at python.org
> > http://mail.python.org/mailman/listinfo/python-ideas
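A hypothetical reconstruction of such a @delegate class decorator (a sketch of
the pattern described in the quoted message, not the book's actual code) could
look like this:

```python
def delegate(attribute_name, method_names):
    """Class decorator: forward the named methods to an attribute."""
    def decorator(cls):
        # Apply the same name mangling the class body would use
        # for double-underscore attribute names.
        attr = attribute_name
        if attr.startswith("__") and not attr.endswith("__"):
            attr = "_%s%s" % (cls.__name__, attr)
        def make_forwarder(name):
            def forwarder(self, *args, **kwargs):
                return getattr(getattr(self, attr), name)(*args, **kwargs)
            forwarder.__name__ = name
            return forwarder
        for name in method_names:
            setattr(cls, name, make_forwarder(name))
        return cls
    return decorator

@delegate("__list", ("pop", "append", "__len__", "__getitem__"))
class MyList(object):
    def __init__(self):
        self.__list = []
    def clear(self):
        # Written out by hand because list has no clear() method -
        # which is the point being made above.
        self.__list = []
```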


-- 
Mark Summerfield, Qtrac Ltd, www.qtrac.eu
    C++, Python, Qt, PyQt - training and consultancy
        "C++ GUI Programming with Qt 4" - ISBN 0132354160



From alexandre at peadrop.com  Sun Apr  5 09:38:50 2009
From: alexandre at peadrop.com (Alexandre Vassalotti)
Date: Sun, 5 Apr 2009 03:38:50 -0400
Subject: [Python-ideas] Adding a functional list type to the standard
	library.
Message-ID: <acd65fa20904050038q4297c346ge06c3f1b16cd3b96@mail.gmail.com>

Hello,

I would like to have your opinion on whether a functional list type would
be welcomed in the standard library. A functional list type would
provide an efficient data-structure for keeping many lists with
partial modifications. In particular, such a data-structure would
make obsolete many, if not all, use-cases of copy.deepcopy on lists. Here
is a simple example of how such a list would be used:

  # The constructor does a deep conversion of the list to its internal
  # representation.
  state = FunctionalList([[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12],
                          [13, 14, 15, None]])
  print(state[0])  # This would print a FunctionalList object, i.e.,
                   # FunctionalList([1, 2, 3, 4])

  # Do a lazy copy of the list.
  new_state = state.copy()

  # Modify the list - many (possibly all) unchanged items are shared between
  # the original list and its copy. The original list remains unchanged.
  new_state[3][3] = new_state[2][3]
  new_state[2][3] = None

  # Show the unchanged list. This prints:
  #   FunctionalList([[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12],
  #                   [13, 14, 15, None]])
  print(state)

  # Show the modified list. This prints:
  #   FunctionalList([[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, None],
  #                   [13, 14, 15, 12]])
  print(new_state)

Attentive readers will remark that the above example comes from an
attempt at a 15-puzzle solver. There are many use-cases for such a
data-structure. In particular, any problem that requires backtracking
search would benefit from having a functional list. Peter Norvig
[1], for example, had to resort to using strings to implement his sudoku
solver efficiently. Fortunately for him, the state of a sudoku puzzle
fits well in a string, as each of the puzzle's cells can be stored in a
single character. For more complex problems, using strings is not
always a suitable option.
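To make the structural-sharing idea concrete, here is a toy persistent list
(a sketch of the concept only; the in-place assignment syntax shown in the
proposal would need more machinery than this, and the set() method here is an
illustrative API, not part of the proposal):

```python
class FunctionalList(object):
    """Immutable list: 'modifying' returns a new list that shares
    all unchanged elements with the old one."""
    def __init__(self, items=()):
        # Deep conversion, as in the proposed constructor.
        self._items = tuple(
            FunctionalList(x) if isinstance(x, (list, tuple)) else x
            for x in items
        )
    def __getitem__(self, idx):
        return self._items[idx]
    def __len__(self):
        return len(self._items)
    def set(self, idx, value):
        # Copies one level of references; every other element is shared.
        new = FunctionalList()
        new._items = self._items[:idx] + (value,) + self._items[idx + 1:]
        return new
    def __repr__(self):
        return "FunctionalList(%r)" % (list(self._items),)
```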

Also, it would also be interesting to consider the addition of
functional dictionaries and sets as well. These twos would probably
more useful to have since the most common usage of copy.deepcopy seems
to be on dictionaries. [2] Finally, I would like to say that I am
willing to write a Python and an optimized C implementation of my
proposal.

Regards,
-- Alexandre

[1]: http://norvig.com/sudoku.html
[2]: http://google.com/codesearch?q=copy\.deepcopy+lang:python


From denis.spir at free.fr  Sun Apr  5 10:01:49 2009
From: denis.spir at free.fr (spir)
Date: Sun, 5 Apr 2009 10:01:49 +0200
Subject: [Python-ideas] list.index() extension
In-Reply-To: <loom.20090405T003847-380@post.gmane.org>
References: <loom.20090404T210511-367@post.gmane.org>
	<gr8nv5$lk3$1@ger.gmane.org> <49D7EF13.6030001@canterbury.ac.nz>
	<loom.20090404T235013-362@post.gmane.org>
	<cc7430500904041657u785898cdm31352bb996989c8c@mail.gmail.com>
	<loom.20090405T003847-380@post.gmane.org>
Message-ID: <20090405100149.2228edf5@o>

Le Sun, 5 Apr 2009 00:39:24 +0000 (UTC),
Benjamin Peterson <benjamin at python.org> s'exprima ainsi:

> 
> 
> 2009/4/4 Leif Walsh <leif.walsh at gmail.com>:
> > Maybe you didn't understand what I meant.  Try the following code,
> > with your favorite list and object.  I promise it works.
> >
> > try:
> >  print "The index of the first element that 'is obj' is %d." % (idx
> > for idx, elt in enumerate(lst) if elt is obj).next()
> > except StopIteration:
> >  print "obj is not in lst"
> 
> Yes, I saw that but it still strikes me as much uglier than necessary.

An issue imo is that (idx for idx, elt in enumerate(lst) if elt is obj) builds a generator able to yield every index of an element that satisfies the condition, while all you want is the first one (as expressed by ".next()").

On one hand, I have very few use cases for such a "conditional find". On the other hand, it's simple, practical and consistent. Adding an optional parameter, with '==' as default, as originally proposed by Benjamin, does no harm in the common case and does not break any existing code.
If it were accepted, I would rather put it in the built-in index() and count() methods, for I see this as a semantic extension, not a fully different feature.

+0.5

Denis
------
la vita e estrany


From leif.walsh at gmail.com  Sun Apr  5 11:11:06 2009
From: leif.walsh at gmail.com (Leif Walsh)
Date: Sun, 5 Apr 2009 05:11:06 -0400 (EDT)
Subject: [Python-ideas] list.index() extension
In-Reply-To: <20090405100149.2228edf5@o>
Message-ID: <ft5ikfto8zthx74tlfUYAxe124vaj_firegpg@mail.gmail.com>

On Sun, Apr 5, 2009 at 4:01 AM, spir <denis.spir at free.fr> wrote:
> An issue is imo that (idx for idx, elt in enumerate(lst) if elt is obj) builds a generator able to yield every indexes of elements that satisfy the condition; while all you want is the first one (as expressed by ".next()").

It appears you're right, but the pain is only felt when the item is very close to the front of the list (and note that the ~2s penalty is over a million runs).

I've attached my benchmark, and the results are below.  For a quick idea of what I did, I made a simple global find(lst, obj) that has the 'is' condition hard-coded (for my own sanity).  I also used the generator example I posted above.  Each version was asked to find the last element of a list.  Times were averaged over 5 runs for each size, and the number of calls varied inversely to the size of the list (just so my computer doesn't burn out).

"""
Running tests, each output in secs, average of 5 calls.
10 item list, 1000000 calls:
  gen:  5.534338
  fun:  3.620335
100 item list, 100000 calls:
  gen:  2.331217
  fun:  2.258001
1000 item list, 10000 calls:
  gen:  1.842553
  fun:  2.014453
10000 item list, 1000 calls:
  gen:  1.760938
  fun:  1.925302
100000 item list, 100 calls:
  gen:  1.783202
  fun:  1.921334
1000000 item list, 10 calls:
  gen:  1.798051
  fun:  1.955751
"""
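The attached test.py was scrubbed from the archive, but a benchmark of the
same shape can be reconstructed roughly as follows (find and find_gen are
illustrative names; the timings above came from Leif's run, not this sketch):

```python
import timeit

def find(lst, obj):
    # Plain function with the 'is' test hard-coded.
    for idx, elt in enumerate(lst):
        if elt is obj:
            return idx
    raise ValueError("obj is not in lst")

def find_gen(lst, obj):
    # Generator-expression version from the thread.
    return next(idx for idx, elt in enumerate(lst) if elt is obj)

if __name__ == "__main__":
    for size, calls in [(10, 100000), (1000, 1000), (100000, 10)]:
        lst = [object() for _ in range(size)]
        target = lst[-1]  # worst case: target is the last element
        t_fun = timeit.timeit(lambda: find(lst, target), number=calls)
        t_gen = timeit.timeit(lambda: find_gen(lst, target), number=calls)
        print("%7d items, %6d calls: fun=%.3fs gen=%.3fs"
              % (size, calls, t_fun, t_gen))
```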

-- 
Cheers,
Leif

-------------- next part --------------
A non-text attachment was scrubbed...
Name: test.py
Type: text/x-python
Size: 916 bytes
Desc: not available
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090405/31586e86/attachment.py>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 270 bytes
Desc: OpenPGP digital signature
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090405/31586e86/attachment.pgp>

From g.brandl at gmx.net  Sun Apr  5 12:00:29 2009
From: g.brandl at gmx.net (Georg Brandl)
Date: Sun, 05 Apr 2009 12:00:29 +0200
Subject: [Python-ideas] list.index() extension
In-Reply-To: <loom.20090404T210511-367@post.gmane.org>
References: <loom.20090404T210511-367@post.gmane.org>
Message-ID: <gr9vfv$qus$1@ger.gmane.org>

Benjamin Peterson schrieb:
> I would like to propose an extra parameter `predicate` to list.index() like this:
> 
> def index(self, obj, predicate=operator.eq):
>     for idx, item in enumerate(self):
>         if predicate(item, obj):
>             return idx
>     raise IndexError
> 
> My main use-case is 2to3 where a node has to locate itself in the parent's node
> list by identity. Instead of writing out the search manually as is done now, it
> would be nice to just write `self.parent.nodes.index(self, operator.is_)`.
> 
> I can also imagine this might be useful:
> 
> print "The first number less than 5 in this list is %s" % (my_list.index(5,
> operator.lt),)

-1.  The list API is the prototype of all mutable sequence APIs (as now fixed
in the corresponding abc class).  It is meant to be simple and easy to
understand.  Adding features to it should be done very carefully, and this seems
like a random one of ten similar features that could be added.

There's nothing wrong with a three-line helper function, or a list subclass
that has the additional method you're looking for.
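(The helper Georg has in mind might look something like this — a sketch, with a name of my choosing, not anything from the stdlib:)

```python
import operator

def index_where(seq, obj, predicate=operator.eq):
    # Return the first index whose item satisfies predicate(item, obj).
    for idx, item in enumerate(seq):
        if predicate(item, obj):
            return idx
    raise ValueError("no matching item")

assert index_where([3, 7, 2, 7], 7) == 1                 # plain equality
assert index_where([7, 9, 2, 8], 5, operator.lt) == 2    # first item < 5

# The 2to3 use case: find a node by identity.
x = object()
assert index_where([object(), x], x, operator.is_) == 1
```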

cheers,
Georg

-- 
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.



From fredrik.johansson at gmail.com  Sun Apr  5 12:31:30 2009
From: fredrik.johansson at gmail.com (Fredrik Johansson)
Date: Sun, 5 Apr 2009 12:31:30 +0200
Subject: [Python-ideas] list.index() extension
In-Reply-To: <loom.20090404T210511-367@post.gmane.org>
References: <loom.20090404T210511-367@post.gmane.org>
Message-ID: <3d0cebfb0904050331y68d64d00u8ae04a75f2010bc8@mail.gmail.com>

On Sat, Apr 4, 2009 at 11:38 PM, Benjamin Peterson <benjamin at python.org> wrote:
> I would like to propose an extra parameter `predicate` to list.index() like this:
>
> def index(self, obj, predicate=operator.eq):
>     for idx, item in enumerate(self):
>         if predicate(item, obj):
>             return idx
>     raise IndexError
>
> My main use-case is 2to3 where a node has to locate itself in the parent's node
> list by identity. Instead of writing out the search manually as is done now, it
> would be nice to just write `self.parent.nodes.index(self, operator.is_)`.
>
> I can also imagine this might be useful:
>
> print "The first number less than 5 in this list is %s" % (my_list.index(5,
> operator.lt),)

It would be more natural to pass a single predicate function than an argument
and a binary operator, as this is more general. I wouldn't mind predicate-based
index and count somewhere in the standard library.

Meanwhile, here is yet another solution (although not so efficient).

class match:
    def __init__(self, predicate):
        self.__eq__ = predicate

range(10,-10,-1).index(match(lambda x: x < 5))
range(10,-10,-1).count(match(lambda x: x < 5))
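(Fredrik's trick relies on old-style classes looking up `__eq__` on the instance, which works in Python 2 but not for new-style classes or Python 3, where special methods are looked up on the type. A variant that works in both worlds — `Match` is my name, not part of any proposal — stores the predicate and delegates from a class-level `__eq__`:)

```python
class Match:
    # list.index(x) compares item == x; for an int vs. a Match instance the
    # comparison falls back to Match.__eq__(item), which runs the predicate.
    def __init__(self, predicate):
        self.predicate = predicate
    def __eq__(self, other):
        return self.predicate(other)

# First index where x < 5 in [10, 9, 8, 7, 6, 5, 4, ...] is 6;
# there are 14 elements (4 down to -9) satisfying x < 5.
assert list(range(10, -10, -1)).index(Match(lambda x: x < 5)) == 6
assert list(range(10, -10, -1)).count(Match(lambda x: x < 5)) == 14
```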

Fredrik


From g.brandl at gmx.net  Sun Apr  5 14:55:28 2009
From: g.brandl at gmx.net (Georg Brandl)
Date: Sun, 05 Apr 2009 14:55:28 +0200
Subject: [Python-ideas] list.index() extension
In-Reply-To: <3d0cebfb0904050331y68d64d00u8ae04a75f2010bc8@mail.gmail.com>
References: <loom.20090404T210511-367@post.gmane.org>
	<3d0cebfb0904050331y68d64d00u8ae04a75f2010bc8@mail.gmail.com>
Message-ID: <gra9o2$jbv$1@ger.gmane.org>

Fredrik Johansson schrieb:
> On Sat, Apr 4, 2009 at 11:38 PM, Benjamin Peterson <benjamin at python.org> wrote:
>> I would like to propose an extra parameter `predicate` to list.index() like this:
>>
>> def index(self, obj, predicate=operator.eq):
>>    for idx, item in enumerate(self):
>>        if predicate(item, obj):
>>            return idx
>>    raise IndexError
>>
>> My main use-case is 2to3 where a node has to locate itself in the parent's node
>> list by identity. Instead of writing out the search manually as is done now, it
>> would be nice to just write `self.parent.nodes.index(self, operator.is_)`.
>>
>> I can also imagine this might be useful:
>>
>> print "The first number less than 5 in this list is %s" % (my_list.index(5,
>> operator.lt),)
> 
> It would be more natural to pass a single predicate function than an argument
> and a binary operator, as this is more general. I wouldn't mind predicate-based
> index and count somewhere in the standard library.

Or we could finally give in and realize the need for fully generalized list
operations, which were already proposed over 10 years ago.  Sadly, the idea was
never followed through and implemented: <http://tinyurl.com/d8xwee>.

cheers,
Georg

-- 
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.



From aahz at pythoncraft.com  Sun Apr  5 15:57:31 2009
From: aahz at pythoncraft.com (Aahz)
Date: Sun, 5 Apr 2009 06:57:31 -0700
Subject: [Python-ideas] Adding a functional list type to the
	standard	library.
In-Reply-To: <acd65fa20904050038q4297c346ge06c3f1b16cd3b96@mail.gmail.com>
References: <acd65fa20904050038q4297c346ge06c3f1b16cd3b96@mail.gmail.com>
Message-ID: <20090405135731.GA28212@panix.com>

On Sun, Apr 05, 2009, Alexandre Vassalotti wrote:
>
> I would like to have your opinion whether a functional list type would
> be welcomed in the standard library. 

My head hurts.  Are you sure you didn't intend to send this four days
ago?  ;-)

>   # The constructor does a deep conversion of the list to its internal
> representation.
>   state = FunctionalList([[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12],
> [13, 14, 15, None]])
>   print(goal[0])   # This would print a FunctionalList object, i.e.,
> FunctionalList([1, 2, 3, 4])

Where is ``goal`` defined?  At this point, I'm unable to make any
headway on understanding what you're trying here.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"Debugging is twice as hard as writing the code in the first place.
Therefore, if you write the code as cleverly as possible, you are, by
definition, not smart enough to debug it."  --Brian W. Kernighan


From daniel at stutzbachenterprises.com  Sun Apr  5 16:20:51 2009
From: daniel at stutzbachenterprises.com (Daniel Stutzbach)
Date: Sun, 5 Apr 2009 09:20:51 -0500
Subject: [Python-ideas] Adding a functional list type to the standard
	library.
In-Reply-To: <acd65fa20904050038q4297c346ge06c3f1b16cd3b96@mail.gmail.com>
References: <acd65fa20904050038q4297c346ge06c3f1b16cd3b96@mail.gmail.com>
Message-ID: <eae285400904050720q37bc5f2atd27bed9a0b01dc2c@mail.gmail.com>

On Sun, Apr 5, 2009 at 2:38 AM, Alexandre Vassalotti
<alexandre at peadrop.com>wrote:

> I would like to have your opinion whether a functional list type would
> be welcomed in the standard library. A functional list type would
> provide an efficient data-structure for keeping many lists with
> partial modifications. In particular, such a data structure would
> obsolete many, if not all, use cases of copy.deepcopy on lists. Here
> is a simple example of how such a list would be used:
>

If you would find it useful, write it and distribute it as an extension
module on PyPi.

--
Daniel Stutzbach, Ph.D.
President, Stutzbach Enterprises, LLC <http://stutzbachenterprises.com>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090405/c043f3e1/attachment.html>

From jh at improva.dk  Sun Apr  5 16:54:58 2009
From: jh at improva.dk (Jacob Holm)
Date: Sun, 05 Apr 2009 16:54:58 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
Message-ID: <49D8C642.3080308@improva.dk>

Hi Guido

I like the way you are building the description up from the simple case, 
but I think you are missing a few details along the way.
Those details are what has been driving the discussion, so I think it is 
important to get them handled.  I'll comment on each point as I get to it.


Guido van Rossum wrote:
> I want to name the new exception ReturnFromGenerator to minimize the 
> similarity with GeneratorExit [...]

Fine with me, assuming we can't get rid of it altogether. 

[Snipped description of close() and __del__(), which I intend to comment 
on in the other thread]

> [Guido]
> >> Oh, and "yield from" competes with @couroutine over
> >> when the initial next() call is made, which again suggests the two
> >> styles (yield-from and coroutines) are incompatible.
> >
> > It is a serious problem, because one of the major points of the PEP 
> is that
> > it should be useful for refactoring coroutines.  As a matter of fact, I
> > started another thread on this specific issue earlier today which 
> only Nick
> > has so far responded to.  I think it is solvable, but requires some more
> > work.
>
> I think that's the thread where I asked you and Nick to stop making 
> more proposals. I am worried that a solution would become too complex, 
> and I want to keep the "naive" interpretation of "yield from EXPR" to 
> be as close as possible to "for x in EXPR: yield x". I think the 
> @coroutine generator (whether built-in or not) or explicit "priming" 
> by a next() call is fine.

I think it is important to be able to use yield-from with a @coroutine, 
but I'll wait a bit before I do more on that front (except for a few 
more comments in this mail).  There are plenty of other issues to tackle.


> So now let me develop my full thoughts on yield-from. This is 
> unfortunately long, because I want to show some intermediate stages. I 
> am using a green font for new code. I am using stages, where each 
> stage provides a better approximation of the desired semantics. Note 
> that each stage *adds* some semantics for corner cases that weren't 
> handled the same way in the previous stage. Each stage proposes an 
> expansion for "RETVAL = yield from EXPR". I am using Py3k syntax.
[snip stage 1-3]
> 4. Stage four adds handling for ReturnFromGenerator, in both places 
> where next() is called:
>
> it = iter(EXPR)
> try:
>   x = next(it)
> except StopIteration:
>   RETVAL = e.value
> except ReturnFromGenerator as e:
>   RETVAL = e.value; break
> else:
>   while True:
>     yield x
>     try:
>       x = next(it)
>     except StopIteration:
>       RETVAL = None; break
>     except ReturnFromGenerator as e:
>       RETVAL = e.value; break
>  yield x

(There are two cut'n'paste errors here.  The first "break" and the 
second "yield x" shouldn't be there.  Just wanted to point it out in 
case this derivation makes it to the PEP)

>
> 5. Stage five shows what should happen if "yield x" above returns a 
> value: it is passed into the subgenerator using send(). I am ignoring 
> for now what happens if it is not a generator; this will be cleared up 
> later. Note that the initial next() call does not change into a send() 
> call, because there is no value to send before we have yielded:
>
[snipped code for stage 5]

The argument that we have no value to send before we have yielded is 
wrong.  The generator containing the "yield-from" could easily have a 
value to send (or throw), and if iter(EXPR) returns a coroutine or a 
non-generator it could easily be ready to accept it.  That is the idea 
behind my attempted fixes to the @coroutine issue.

> 6. Stage six adds more refined semantics for when "yield x" raises an 
> exception: it is thrown into the generator, except if it is 
> GeneratorExit, in which case we close() the generator and re-raise it 
> (in this case the loop cannot continue so we do not set RETVAL):
>
[snipped code for stage 6]

This is where the fun begins.  In an earlier thread we concluded that if 
the thrown exception is a StopIteration and the *same* StopIteration 
instance escapes the throw() call, it should be reraised rather than 
caught and turned into a RETVAL.  The reasoning was the following example:

def inner():
    for i in xrange(10):
        yield i

def outer():
    yield from inner()
    print "if StopIteration is thrown in we shouldn't get here" 


Which we wanted to be equivalent to:

def outer():
    for i in xrange(10):
        yield i
    print "if StopIteration is thrown in we shouldn't get here" 


The same argument goes for ReturnFromGenerator, so the expansion at this 
stage should be more like:

it = iter(EXPR)
try:
  x = next(it)
except StopIteration:
  RETVAL = None
except ReturnFromGenerator as e:
  RETVAL = e.value
else:
  while True:
    try:
      v = yield x
    except GeneratorExit:
      it.close()
      raise
    except BaseException as e:
      try:
        x = it.throw(e)  # IIRC this includes the correct traceback in 3.x so we don't need to use sys.exc_info
      except StopIteration as r:
        if r is e:
          raise
        RETVAL = None; break
      except ReturnFromGenerator as r:
        if r is e:
          raise
        RETVAL = r.value; break
    else:
      try:
        x = it.send(v)
      except StopIteration:
        RETVAL = None; break
      except ReturnFromGenerator as e:
        RETVAL = e.value; break


Next issue is that the value returned by it.close() is thrown away by 
yield-from.  Here is a silly example:

def inner():
    i = 0
    while True:
        try:
            yield
        except GeneratorExit:
            return i
        i += 1

def outer():
    try:
        yield from inner()
    except GeneratorExit:
        # nothing I can write here will get me the value returned from inner()


Also the trivial:

def outer():
    return yield from inner()


Would swallow the return value as well.

I have previously suggested attaching the return value to the (re)raised 
GeneratorExit, and/or saving the return value on the generator and 
making close return the value each time it is called.  We could also 
choose to define this as broken behavior and raise a RuntimeError, 
although it seems a bit strange to have yield-from treat it as an error 
when close doesn't.  Silently having the yield-from construct swallow 
the returned value is my least favored option.

>
> 7. In stage 7 we finally ask ourselves what should happen if it is not 
> a generator (but some other iterator). The best answer seems subtle: 
> send() should degenerate to next(), and all exceptions should simply 
> be re-raised. We can conceptually specify this by simply re-using the 
> for-loop expansion:
>
> it = iter(EXPR)
> if <it is not a generator>:
>   for x in it:
>     yield x
>   RETVAL = None
> else:
>   try:
>     x = next(it)
>   except StopIteration:
>     RETVAL = None
>   except ReturnFromGenerator as e:
>     RETVAL = e.value
>   else:
>     while True:
>       try:
>         v = yield x
>       except GeneratorExit:
>         it.close()
>         raise
>       except:
>         try:
>           x = it.throw(*sys.exc_info())
>         except StopIteration:
>           RETVAL = None; break
>         except ReturnFromGenerator as e:
>           RETVAL = e.value; break
>       else:
>         try:
>           x = it.send(v)
>         except StopIteration:
>           RETVAL = None; break
>         except ReturnFromGenerator as e:
>           RETVAL = e.value; break
>
> Note: I don't mean that we literally should have a separate code path 
> for non-generators. But writing it this way adds the generator test to 
> one place in the spec, which helps understanding why I am choosing 
> these semantics. The entire code of stage 6 degenerates to stage 1 if 
> we make the following substitutions:
>
> it.send(v)               -> next(it)
> it.throw(sys.exc_info()) -> raise
> it.close()               -> pass
>
> (Except for some edge cases if the incoming exception is StopIteration 
> or ReturnFromGenerator, so we'd have to do the test before entering 
> the try/except block around the throw() or send() call.)
>
> We could do this based on the presence or absence of the 
> send/throw/close attributes: this would be duck typing. Or we could 
> use isinstance(it, types.GeneratorType). I'm not sure there are strong 
> arguments for either interpretation. The type check might be a little 
> faster. We could even check for an exact type, since GeneratorType is 
> final. Perhaps the most important consideration is that if EXPR 
> produces a file stream object (which has a close() method), it would 
> not consistently be closed: it would be closed if the outer generator 
> was closed before reaching the end, but not if the loop was allowed to 
> run until the end of the file. So I'm leaning towards only making the 
> generator-specific method calls if it is really a generator.

Like Greg, I am in favor of duck-typing this as closely as possible.  My 
preferred treatment for converting stage 6 to stage 7 goes like this:

x = it.close() -->

  m = getattr(it, 'close', None)
  if m is not None:
      x = it.close()
  else:
      x = None

x = it.send(v) -->

  if v is None:
      x = next(it)
  else:
      try:
          m = it.send
      except AttributeError:
          m = getattr(it, 'close', None)
          if m is not None:
              it.close()  # in this case I think it is ok to ignore the return value
          raise
      else:
          x = m(v)

x = it.throw(e) -->

  m = getattr(it, 'throw', None)
  if m is not None:
      x = m(e)
  else:
      m = getattr(it, 'close', None)
      if m is not None:
          it.close()  # in this case I think it is ok to ignore the return value
      raise e


In this version it is easy enough to wrap the final iterator if you want 
different behavior.  With your version it becomes difficult to replace a 
generator that is used in a yield-from with an iterator.  (You would 
have to wrap the iterator in a generator that mostly consisted of the 
expansion from this PEP with the above substitution).

I don't think we need to worry about performance at this stage.  AFAICT 
from the patch I was working on, the cost of a few extra checks is 
negligible compared to the savings you get from using yield-from in the 
first place.
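(The wrapping mentioned above can be as small as the following sketch — `as_generator` is a hypothetical name. It adapts a plain iterator to the full generator protocol, which also illustrates the "send() degenerates to next()" rule being discussed:)

```python
def as_generator(it):
    # Hypothetical adapter: drive a plain iterator through the generator
    # protocol so it can stand in where a generator is expected.  Values
    # sent in are simply ignored, since there is no send() to forward
    # them to -- i.e. send() degenerates to next().
    for x in it:
        yield x

g = as_generator(iter([1, 2, 3]))
assert next(g) == 1
assert g.send("ignored") == 2   # send() behaves like next()
g.close()                       # close()/throw() now work as on any generator
```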

Best regards
- Jacob

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090405/50cf5475/attachment.html>

From dangyogi at gmail.com  Sun Apr  5 18:35:15 2009
From: dangyogi at gmail.com (Bruce Frederiksen)
Date: Sun, 05 Apr 2009 12:35:15 -0400
Subject: [Python-ideas] Adding a functional list type to the standard
 library.
In-Reply-To: <acd65fa20904050038q4297c346ge06c3f1b16cd3b96@mail.gmail.com>
References: <acd65fa20904050038q4297c346ge06c3f1b16cd3b96@mail.gmail.com>
Message-ID: <49D8DDC3.6000407@gmail.com>

Alexandre Vassalotti wrote:
> I would like to have your opinion whether a functional list type would
> be welcomed in the standard library. 
I'm wondering how FunctionalLists compare to extending tuples to provide 
all of the list capabilities, but returning new (immutable) objects 
rather than modifying (or seeming to modify) the object in place.

For example, if two new implementations of tuple were added: 
slice_tuple(parent_tuple, slice) and subst_tuple(parent_tuple, slice, 
inserted_tuple) to obviate copying the needed parts of the parent_tuple, 
then:

tuple[slice] => return slice_tuple(tuple, slice)
tuple.delete(slice) => return subst_tuple(tuple, slice, ())
tuple.with_append(x) => return subst_tuple(tuple, slice(len(tuple), len(tuple)), (x,))
tuple.extended_by(tuple_b) => tuple + tuple_b => return subst_tuple(tuple, slice(len(tuple), len(tuple)), tuple_b)
tuple.with_subst(slice, tuple_b) => return subst_tuple(tuple, slice, tuple_b)
tuple.reversed() => return slice_tuple(tuple, slice(None, None, -1))

If it's not clear, I'm imagining that slice_tuple (for example) be 
implemented something like:

class slice_tuple(base_tuple):
    def __init__(self, parent_tuple, slice):
        self.parent = parent_tuple
        self.step = slice.step or 1
        # Convert negative and out of bound slice.starts to make
        # 0 <= self.start < len(self.parent).
        if slice.start is not None:
            self.start = slice.start if slice.start >= 0 else slice.start + len(self.parent)
            if self.start < 0: self.start = 0
            elif self.start > len(self.parent): self.start = len(self.parent)
        else:
            self.start = 0 if self.step > 0 else len(self.parent) - 1
        # Convert negative and out of bound slice.stops to make
        # -1 <= self.stop <= len(self.parent)
        # and (self.stop >= self.start if self.step > 0 else self.stop <= self.start).
        if slice.stop is not None:
            self.stop = slice.stop if slice.stop >= 0 else slice.stop + len(self.parent)
            if self.step > 0:
                if self.stop < self.start: self.stop = self.start
                elif self.stop > len(self.parent): self.stop = len(self.parent)
            else:
                if self.stop > self.start: self.stop = self.start
                elif self.stop < -1: self.stop = -1
        else:
            self.stop = len(self.parent) if self.step > 0 else -1

    def __len__(self):
        if self.step > 0:
            return (self.stop - self.start + self.step - 1) // self.step
        return (self.stop - self.start + self.step + 1) // self.step

    def __getitem__(self, i):
        if i < 0: i += len(self)
        if i < 0 or i >= len(self): raise IndexError("tuple index out of range")
        return self.parent[self.start + i * self.step]
    ... etc ...

-bruce frederiksen


From guido at python.org  Sun Apr  5 18:38:29 2009
From: guido at python.org (Guido van Rossum)
Date: Sun, 5 Apr 2009 09:38:29 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D8C43E.5010704@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com> 
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com> 
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com> 
	<49D8C43E.5010704@improva.dk>
Message-ID: <ca471dc20904050938j4bd866b6ga4ed671195549428@mail.gmail.com>

On Sun, Apr 5, 2009 at 7:46 AM, Jacob Holm <jh at improva.dk> wrote:
> The argument that we have no value to send before we have yielded is wrong.
> The generator containing the "yield-from" could easily have a value to send
> (or throw), and if iter(EXPR) returns a coroutine or a non-generator it
> could easily be ready to accept it. That is the idea behind my attempted
> fixes to the @coroutine issue.

I think it's simpler to refrain from yield-from in that case and spell
it out. If the value to send doesn't come from outside the outer
generator, yield-from is not the solution.

> This is where the fun begins. In an earlier thread we concluded that if the
> thrown exception is a StopIteration and the *same* StopIteration instance
> escapes the throw() call, it should be reraised rather than caught and
> turned into a RETVAL. The reasoning was the following example:
>
> def inner():
>     for i in xrange(10):
>         yield i
>
> def outer():
>     yield from inner()
>     print "if StopIteration is thrown in we shouldn't get here"
>
> Which we wanted to be equivalent to:
>
> def outer():
>     for i in xrange(10):
>         yield i
>     print "if StopIteration is thrown in we shouldn't get here"
>
> The same argument goes for ReturnFromGenerator, so the expansion at this
> stage should be more like:
[snip]

This example and reasoning are invalid. You shouldn't be throwing
StopIteration (or ReturnFromGenerator) *into* a generator. That's
something that should only come *out*.

> Next issue is that the value returned by it.close() is thrown away by
> yield-from. Here is a silly example:
>
> def inner():
>     i = 0
>     while True:
>         try:
>             yield
>         except GeneratorExit:
>             return i
>         i += 1
>
> def outer():
>     try:
>         yield from inner()
>     except GeneratorExit:
>         # nothing I can write here will get me the value returned from inner()
>
> Also the trivial:
>
> def outer():
>     return yield from inner()
>
> Would swallow the return value as well.
>
> I have previously suggested attaching the return value to the (re)raised
> GeneratorExit, and/or saving the return value on the generator and making
> close return the value each time it is called. We could also choose to
> define this as broken behavior and raise a RuntimeError, although it seems a
> bit strange to have yield-from treat it as an error when close doesn't.
> Silently having the yield-from construct swallow the returned value is my
> least favored option.

Attaching it to the GeneratorExit is just plain wrong -- this is an
exception you throw *in*, not something that is thrown out (except
when it bounces back).

One solution is not to use yield-from but write it out using yield and
send (just like the full expansion, but you can probably drop most of
the complexity for any particular example).

Another solution is not to use close() and GeneratorExit but some
application-specific exception to signal the end.

But perhaps it would be okay to change the GeneratorExit handler in
the expansion so that it passes through the return value with a
StopIteration exception:

rv = it.close()
if rv is not None:
  raise StopIteration(rv)    # Or ReturnFromGenerator(rv)
else:
  raise

Alternatively, simpler:

it.throw(GeneratorExit)
# We only get here if it yielded a value
raise RuntimeError(...)

(Though this isn't exactly what we'd do if we were to use duck typing.)

We could then write the first version of outer() like this:

def outer():
  try:
    yield from inner()
  except StopIteration as e:
    ...access return value as e.value...

and I think the second (trivial) outer() will return inner()'s return
value just fine, since it just passes through as a StopIteration
value.

> Like Greg, I am in favor of duck-typing this as closely as possible.

OK, noted. I think it's probably fine.

FWIW, I'm beginning to think that ReturnFromGenerator is a bit of a
nuisance, and that it's actually fine to allow "return value" inside a
generator to mean "raise StopIteration(value)" (well not quite at that
point in the code but once we are about to clean up the frame). Maybe
I've overstated the case for preventing beginners' mistakes. After all
they'll notice that their generator returns prematurely when they
include any kind of return value. Also if the StopIteration ends up being
printed as a traceback the value will be printed, which is the kind of
hint newbies love.
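(Sketched concretely, the semantics described here would mean the following — written as if "return value" were already legal in a generator, and assuming the value is exposed as an attribute, here called .value, on the StopIteration:)

```python
def gen():
    yield 1
    return 42  # would mean: raise StopIteration(42) as the frame is cleaned up

g = gen()
assert next(g) == 1
try:
    next(g)
    value = None
except StopIteration as e:
    value = e.value  # the returned value rides along on the exception
assert value == 42
```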

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From jh at improva.dk  Sun Apr  5 18:40:50 2009
From: jh at improva.dk (Jacob Holm)
Date: Sun, 05 Apr 2009 18:40:50 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49D7ECC4.1030008@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>
	<49D6AC52.4010608@gmail.com>	<49D6BC46.9000808@improva.dk>	<ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com>
	<49D7ECC4.1030008@canterbury.ac.nz>
Message-ID: <49D8DF12.50208@improva.dk>

Greg Ewing wrote:
> Guido van Rossum wrote:
>> I don't believe that once the generator has raised StopIteration or
>> ReturnFromGenerator, the return value should be saved somewhere to be
>> retrieved with an explicit close() call -- I want to be able to free
>> all resources once the generator frame is dead.
>
> I agree with that.

I don't think it is common to keep the generator object alive long after 
the generator is closed, so I don't see the problem in keeping the value 
so it can be returned by the next close() call.

>
> As a corollary, I *don't* think that close() should
> return the value of a ReturnFromGenerator even if it
> gets one, because unless the value is stored, you'll
> only get it the first time close() is called, and
> only if the generator has not already completed
> normally. That would make it too unreliable for any
> practical use as far as I can see.
>

And I think it is a mistake to have close() swallow the return value. If 
it catches ReturnFromGenerator, it should also return the value or raise 
a RuntimeError.

In my order of preference:

   1. "return value" in a generator raises StopIteration(value). Any
      exception raised by next, send or throw sets a return value on the
      generator on the way out. If the exception is a StopIteration or
      GeneratorExit (see below) that was not the argument to throw, the
      value is taken from there, else it is None. Any next(), send(), or
      throw() operation on a closed generator raises a new StopIteration
      using the saved value. When close catches a StopIteration or
      GeneratorExit it returns the value. After yield-from calls close
      as part of its GeneratorExit handling, it raises a new
      GeneratorExit with the returned value.

      The GeneratorExit part lets "def outer(): return yield from
      inner()" behave as expected.

   2. Same as #1 but using ReturnFromGenerator(value) instead of
      StopIteration.

   3. Same as #1 but without attaching return value to GeneratorExit in
      yield-from.

   4. Same as #3 but using ReturnFromGenerator(value) instead of
      StopIteration.

   5. Same as #1 but without storing the value on the generator.

   6. Same as #5 but using ReturnFromGenerator(value) instead of
      StopIteration.

   7. "return value" in a generator raises ReturnFromGenerator(value).
      close() doesn't catch it.

   8. "return value" in a generator raises ReturnFromGenerator(value).
      close() catches ReturnFromGenerator and raises a RuntimeError.

   9. Anything else that has been suggested. In particular anything
      where close() catches ReturnFromGenerator without either returning
      the value or raising another exception.

Too many options? This is just a small subset of what has been discussed.

- Jacob



From guido at python.org  Sun Apr  5 18:43:26 2009
From: guido at python.org (Guido van Rossum)
Date: Sun, 5 Apr 2009 09:43:26 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D7ECC4.1030008@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com> 
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com> 
	<49D69AFA.5070600@improva.dk> <49D6AC52.4010608@gmail.com> 
	<49D6BC46.9000808@improva.dk>
	<ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com> 
	<49D7ECC4.1030008@canterbury.ac.nz>
Message-ID: <ca471dc20904050943r5eac8c3endb3bae3461d013b8@mail.gmail.com>

On Sat, Apr 4, 2009 at 4:27 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Let's not lose sight of the fundamental motivation
> for all this, the way I see it at least: yield-from is
> primarily to permit factoring of generator code. Any
> proposals for enhancements or extensions ought to be
> justified in relation to that.

I still don't think that refactoring should drive the design
exclusively. Refactoring is *one* thing that becomes easier with
yield-from. But I want the design to look pretty from as many angles
as possible.

>> I don't believe that once the generator has raised StopIteration or
>> ReturnFromGenerator, the return value should be saved somewhere to be
>> retrieved with an explicit close() call -- I want to be able to free
>> all resources once the generator frame is dead.
>
> I agree with that.
>
> As a corollary, I *don't* think that close() should
> return the value of a ReturnFromGenerator even if it
> gets one, because unless the value is stored, you'll
> only get it the first time close() is called, and
> only if the generator has not already completed
> normally. That would make it too unreliable for any
> practical use as far as I can see.

Throwing in GeneratorExit and catching the ReturnFromGenerator
exception would have the same problem though, so I'm not sure I buy
this argument.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From denis.spir at free.fr  Sun Apr  5 19:43:46 2009
From: denis.spir at free.fr (spir)
Date: Sun, 5 Apr 2009 19:43:46 +0200
Subject: [Python-ideas] about repr
Message-ID: <20090405194346.22c6ff55@o>

Hello,

-1- repr writing function

I wonder if you would like a statement (py2) or function (py3) similar to print, except that it would output the repr form instead of the str form. I would enjoy having such a nicety, to avoid "print repr(x)" or "print(repr(x))".

As a side note, there is imo something missing in the parallel below:
   __str__    str()    %s   print
   __repr__   repr()   %r   (show?)
If only for consistency... it wouldn't hurt.

This seems easier for python 3: repr output would then be implemented as a function, like print, so that we wouldn't need a dedicated keyword.


-2- use for test purpose 

I just read the following by Ka-Ping Yee on python-dev (http://mail.python.org/pipermail/python-dev/2000-April/003238.html):
"""
repr() is for the human, not for the machine.  [Serialization]
is for the machine.  repr() is: "Please show me as much information
as you reasonably can about this object in an accurate and unambiguous
way, but if you can't readably show me everything, make it obvious
that you're not."
"""

I see str() as intended to produce a view of an object that has a kind of "natural" (read: cultural) textual form, like a date, aimed rather at the user. repr() instead seems to me more "raw", informative, and programmer-oriented.
For this reason, I use repr heavily for testing, and usually have a __repr__ method on custom classes only for that purpose.

I would like to know how much debug work is done without the help of a debugger. I suspect it's a rather high percentage, for several reasons, especially with a clear and friendly language like Python. At test/debug/setup time it's often enough to have some variables output at the proper time & place. But we often need several values at several points.
The one thing missing then is the names of the variables shown, so that we have to insert e.g.
	print "x:%r" % x

You will probably find it weird, but I would really like a way to have a variable name automatically written together with its value. I can see 2 ways for that:

~ Either repr is really considered as programmer test output. Then when its argument is a simple name instead of another kind of expression, this name would be printed to stdout together with the value. (*)

~ Or a brand new pair of functions/methods: one to produce the string, and one to print it out.

(*) This special casing of identifier alone, as opposed to general expression, may seem strange; but it already exists in python assignments in order to alias instead of yielding a new value.

Denis

PS:
I cannot simply write a tool func, for ordinary objects do not know their own name: only functions, methods and classes have a __name__. I can nevertheless simulate it using eval():
   def show(name):
      sys.stdout.write( "%s:%r" % (name, eval(name)) )
but then show() must be called as "show('x')", with extra quotes. Also, this cannot work as a method, so it cannot be specialized.
What I wish is instead:
   def show(obj):
      sys.stdout.write( "%s:%r" % (obj.__name__, obj) )
that can be called with "show(x)". Why do only funcs & types know their names?
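
For what it's worth, something close to the wished-for show() can be approximated in modern Python by inspecting the caller's frame. This is only a sketch (the show and demo names are illustrative, not an existing API), and the name lookup is heuristic:

```python
import inspect

def show(value):
    """Print (and return) 'name:repr' for value, guessing the name
    by scanning the caller's local variables.  The lookup is by object
    identity, so shared or interned values (small ints, short strings)
    may be misreported -- a debugging aid, not a reliable API."""
    caller = inspect.currentframe().f_back
    names = [n for n, v in caller.f_locals.items() if v is value]
    text = "%s:%r" % (names[0] if names else "<expr>", value)
    print(text)
    return text

def demo():
    x = [1, 2, 3]
    return show(x)   # prints and returns "x:[1, 2, 3]"
```

This avoids both the extra quotes and the eval(), at the price of being identity-based rather than exact.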

------
la vita e estrany


From g.brandl at gmx.net  Sun Apr  5 19:57:37 2009
From: g.brandl at gmx.net (Georg Brandl)
Date: Sun, 05 Apr 2009 19:57:37 +0200
Subject: [Python-ideas] about repr
In-Reply-To: <20090405194346.22c6ff55@o>
References: <20090405194346.22c6ff55@o>
Message-ID: <grarek$26u$1@ger.gmane.org>

spir schrieb:
> Hello,
> 
> -1- repr writing function
> 
> Wonder if you would like a statement (py2) or function (py3) similar to print, except that it would output the repr form instead of str. I would enjoy having such a nicety to avoid "print repr(x)" or "print(repr(x))".
> 
> As a side note, there is imo something missing in the // below:
>    __str__    str()    %s   print
>    __repr__   repr()   %r   (show?)
> If only for consistency... it wouldn't hurt.

Use ``from sys import displayhook as show`` ;)

> -2- use for test purpose 
> 
> I just read the following by Ka-Ping Yee on python-dev (http://mail.python.org/pipermail/python-dev/2000-April/003238.html):
> """
> repr() is for the human, not for the machine.  [Serialization]
> is for the machine.  repr() is: "Please show me as much information
> as you reasonably can about this object in an accurate and unambiguous
> way, but if you can't readably show me everything, make it obvious
> that you're not."
> """
> 
> I see str() as intended to produce a view of an object that has a kind of "natural" (read: cultural) textual form, like a date, and rather for the user. repr() instead seems to me more "rough", informative, and programmer-oriented. 
> For this reason, I heavily use repr for test, usually have a __repr__ method on custom classes only for that purpose.
> 
> I would like to know how much debug work is done without help of a debugger. I suspect it's a rather high percentage for several reasons, especially with a clear and friendly language like python. At test/debug/setup time it's often enough to have some variable output at the proper time & place. But we often need several values at several points. 
> The one thing missing is then the names of the variables showed. So that we have to insert eg
> 	print "x:%r" % x

Use ``from pprint import pprint; pprint(vars())``.
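
For example (a sketch; inside a function, vars() with no argument is the local namespace, and pprint sorts the names):

```python
from pprint import pformat, pprint

def compute():
    x = 42
    items = ["a", "b"]
    pprint(vars())          # dumps every local as name: repr, sorted by name
    return pformat(vars())  # same text, returned so it can be inspected

snapshot = compute()
```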

Georg

-- 
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.



From jh at improva.dk  Sun Apr  5 20:22:16 2009
From: jh at improva.dk (Jacob Holm)
Date: Sun, 05 Apr 2009 20:22:16 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904050938j4bd866b6ga4ed671195549428@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
	<49D8C43E.5010704@improva.dk>
	<ca471dc20904050938j4bd866b6ga4ed671195549428@mail.gmail.com>
Message-ID: <49D8F6D8.50806@improva.dk>

Guido van Rossum wrote:
> On Sun, Apr 5, 2009 at 7:46 AM, Jacob Holm <jh at improva.dk> wrote:
>   
>> The argument that we have no value to send before we have yielded is wrong.
>> The generator containing the "yield-from" could easily have a value to send
>> (or throw), and if iter(EXPR) returns a coroutine or a non-generator it
>> could easily be ready to accept it.  That is the idea behind my attempted
>> fixes to the @coroutine issue.
>>     
>
> I think it's simpler to refrain from yield-from in that case and spell
> it out. If the value to send doesn't come from outside the outer
> generator, yield-from is not the solution.
>   

But it *could* come from outside.  If it is a coroutine calling another 
coroutine, it could have done any number of yields first, the last of 
which would return the value to be sent to the inner one.

It bothers me a lot if you cannot use yield-from with coroutines, 
because most other uses I can see are just as easily written as 
for-loops.  I'll think a bit more about this.

>> This is where the fun begins.  In an earlier thread we concluded that if the
>> thrown exception is a StopIteration and the *same* StopIteration instance
>> escapes the throw() call, it should be reraised rather than caught and
>> turned into a RETVAL.  The reasoning was the following example:
>>
>> def inner():
>>     for i in xrange(10):
>>         yield i
>>
>> def outer():
>>     yield from inner()
>>     print "if StopIteration is thrown in we shouldn't get here"
>>
>> Which we wanted to be equivalent to:
>>
>> def outer():
>>     for i in xrange(10):
>>         yield i
>>     print "if StopIteration is thrown in we shouldn't get here"
>>
>> The same argument goes for ReturnFromGenerator, so the expansion at this
>> stage should be more like:
>>     
> [snip]
>
> This example and reasoning are invalid. You shouldn't be throwing
> StopIteration (or ReturnFromGenerator) *into* a generator. That's
> something that should only come *out*.
>   

I am not claiming that you *should* be throwing StopIteration to a 
generator, just that there is nothing that prevents you from doing it, 
so we need to consider what should happen if you do.  The above 
reasoning based on the refactoring principle led to one choice, which I 
happen to like.  If you only focus on getting the expansion in the PEP 
as simple as possible you will probably make another choice.

Note that if you don't handle StopIteration this way but just treat it 
as a normal StopIteration, you open up interesting ways to abuse 
yield-from, exactly by throwing StopIteration.   In particular, if you 
use an iterator without throw or close methods you can break out of the 
innermost yield-from and even set the value to be returned.

I don't mind either way.  I just thought I would mention this in case 
you missed it.


[snip my examples where the return value was swallowed]
>> I have previously suggested attaching the return value to the (re)raised
>> GeneratorExit, and/or saving the return value on the generator and making
>> close return the value each time it is called.  We could also choose to
>> define this as broken behavior and raise a RuntimeError, although it seems a
>> bit strange to have yield-from treat it as an error when close doesn't.
>> Silently having the yield-from construct swallow the returned value is my
>> least favored option.
>>     
>
> Attaching it to the GeneratorExit is just plain wrong -- this is an
> exception you throw *in*, not something that is thrown out (except
> when it bounces back).
>   

I expanded a little bit on the idea in my reply to Greg, which must have 
crossed your mail.  It listed a number of possible solutions that had been 
discussed, in my order of preference.  I don't see a problem in having 
the language construct "yield-from" re-raise GeneratorExit with a value 
attached as a result of a close().

> One solution is not to use yield-from but write it out using yield and
> send (just like the full expansion, but you can probably drop most of
> the complexity for any particular example).
>
> Another solution is not to use close() and GeneratorExit but some
> application-specific exception to signal the end.
>   

That doesn't really solve the issue of what should happen if you write 
such code.  What bothers me most is that the return value is silently 
swallowed.

> But perhaps it would be okay to change the GeneratorExit handler in
> the expansion so that it passes through the return value with a
> StopIteration exception:
>
> rv = it.close()
> if rv is not None:
>   raise StopIteration(rv)    # Or ReturnFromGenerator(rv)
> else:
>   raise
>
> Alternatively, simpler:
>
> it.throw(GeneratorExit)
> # We only get here if it yielded a value
> raise RuntimeError(...)
>
> (Though this isn't exactly right if we were to use duck typing.)
>
> We could then write the first version of outer() like this:
>
> def outer():
>   try:
>     yield from inner()
>   except StopIteration as e:
>     ...access return value as e.value...
>
> and I think the second (trivial) outer() will return inner()'s return
> value just fine, since it just passes through as a StopIteration
> value.
>   

I don't mind catching an exception to get the value in this case.  I 
just think it should be GeneratorExit that I catch.

This is related to the question of what should happen if you throw 
StopIteration.   If you don't special-case StopIteration in throw, using 
StopIteration for this is fine.

> FWIW, I'm beginning to think that ReturnFromGenerator is a bit of a
> nuisance, and that it's actually fine to allow "return value" inside a
> generator to mean "raise StopIteration(value)" (well not quite at that
> point in the code but once we are about to clean up the frame). Maybe
> I've overstated the case for preventing beginners' mistakes. After all
> they'll notice that their generator returns prematurely when they
> include any kind of return value. Also, if the StopIteration ends up being
> printed as a traceback the value will be printed, which is the kind of
> hint newbies love.
>   
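
This is in fact the semantics PEP 380 ended up standardizing (Python 3.3+): "return value" in a generator raises StopIteration with the value attached, and both yield-from and a manual next() can observe it. A minimal sketch:

```python
def inner():
    yield 1
    return 42                     # raises StopIteration(42) when reached

def outer():
    result = yield from inner()   # receives inner()'s return value
    yield result

assert list(outer()) == [1, 42]

# Driving inner() by hand shows the value riding on StopIteration:
g = inner()
assert next(g) == 1
try:
    next(g)
except StopIteration as e:
    captured = e.value
assert captured == 42
```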

Ok, then I will stop worrying about ReturnFromGenerator until it is 
brought up again.

Best regards
- Jacob


From guido at python.org  Sun Apr  5 20:31:49 2009
From: guido at python.org (Guido van Rossum)
Date: Sun, 5 Apr 2009 11:31:49 -0700
Subject: [Python-ideas] list.index() extension
In-Reply-To: <20090405015534.GA19165@panix.com>
References: <loom.20090404T210511-367@post.gmane.org>
	<20090405015534.GA19165@panix.com>
Message-ID: <ca471dc20904051131l2c330927of88e067c16b47912@mail.gmail.com>

On Sat, Apr 4, 2009 at 6:55 PM, Aahz <aahz at pythoncraft.com> wrote:
> On Sat, Apr 04, 2009, Benjamin Peterson wrote:
>>
>> I would like to propose an extra parameter `predicate` to list.index()
>> like this:
>>
>> def index(self, obj, predicate=operator.eq):
>>     for idx, item in enumerate(self):
>>         if predicate(item, obj):
>>             return idx
>>     raise IndexError
>>
>> My main use-case is 2to3, where a node has to locate itself in
>> the parent's node list by identity. Instead of writing out the
>> search manually as is done now, it would be nice to just write
>> `self.parent.nodes.index(self, operator.is_)`.
>
> -1 -- it complicates the documentation too much for a small feature.
> Your function looks just fine, and I see no reason to add a new method.

Ditto.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From guido at python.org  Sun Apr  5 22:55:40 2009
From: guido at python.org (Guido van Rossum)
Date: Sun, 5 Apr 2009 13:55:40 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D8F6D8.50806@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com> 
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com> 
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com> 
	<49D8C43E.5010704@improva.dk>
	<ca471dc20904050938j4bd866b6ga4ed671195549428@mail.gmail.com> 
	<49D8F6D8.50806@improva.dk>
Message-ID: <ca471dc20904051355o531ef62es2bb6a7c3f415988b@mail.gmail.com>

I'm all out of round tuits for a while, so I recommend that you all (and
whoever else wants to join) find agreement on a next version of the
PEP.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From ncoghlan at gmail.com  Mon Apr  6 00:46:25 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 06 Apr 2009 08:46:25 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49D8DF12.50208@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<49D6AC52.4010608@gmail.com>	<49D6BC46.9000808@improva.dk>	<ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com>	<49D7ECC4.1030008@canterbury.ac.nz>
	<49D8DF12.50208@improva.dk>
Message-ID: <49D934C1.2010504@gmail.com>

Jacob Holm wrote:
> Greg Ewing wrote:
>> Guido van Rossum wrote:
>>> I don't believe that once the generator has raised StopIteration or
>>> ReturnFromGenerator, the return value should be saved somewhere to be
>>> retrieved with an explicit close() call -- I want to be able to free
>>> all resources once the generator frame is dead.
>>
>> I agree with that.
> 
> I don't think it is common to keep the generator object alive long after
> the generator is closed, so I don't see the problem in keeping the value
> so it can be returned by the next close() call.

I don't think close() means to me what it means to you... close() to me
means "I'm done with this, it should have been exhausted already, but
just to make sure all the resources held by the internal frame are
released properly, I'm shutting it down explicitly"

In other words, the *normal* flow for close() should be the "frame has
already terminated, so just return immediately" path, not the "frame
hasn't terminated yet, so throw GeneratorExit in and complain if the
frame doesn't terminate" path.

You're trying to move throwing GeneratorExit into the internal frame
from the exceptional path to the normal path and I don't think that is a
good idea. Far better to attach the return value to StopIteration as in
Greg's original proposal (since Guido no longer appears to be advocating
a separate exception for returning a value, we should be able to just go
back to Greg's original approach) and use a normal next(), send() or
throw() call along with a helper function to catch the StopIteration.

Heck, with that approach, you can even write a context manager to catch
the result for you:

  class cr_result(object):
    def __init__(self, cr):
      self.coroutine = cr
      self.result = None
    def __enter__(self):
      return self
    def __exit__(self, et, ev, tb):
      if et is StopIteration:
        self.result = ev.value
        return True # Trap StopIteration
      # Anything else is propagated

  with cr_result(a) as x:
    # Call a.next()/a.send()/a.throw() as you like
  # Use x.result to retrieve the coroutine's value
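
In modern Python (3.3+, where StopIteration carries the return value as .value), a runnable rendering of this helper could look like the following sketch; the adder coroutine is just an illustrative stand-in:

```python
class cr_result:
    """Context manager that traps a StopIteration raised while driving a
    coroutine inside the with-block and records its return value."""
    def __init__(self, cr):
        self.coroutine = cr
        self.result = None
    def __enter__(self):
        return self
    def __exit__(self, et, ev, tb):
        if et is StopIteration:
            self.result = ev.value
            return True       # trap StopIteration
        return False          # anything else is propagated

def adder():
    total = 0
    while True:
        x = yield
        if x is None:
            return total      # becomes StopIteration(total)
        total += x

a = adder()
next(a)                       # prime the coroutine
with cr_result(a) as box:
    a.send(1)
    a.send(2)
    a.send(None)              # StopIteration(3) escapes the send -> trapped
assert box.result == 3
```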

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jimjjewett at gmail.com  Mon Apr  6 00:48:51 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Sun, 5 Apr 2009 18:48:51 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D6B4CF.8080608@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D5B98A.20700@canterbury.ac.nz>
	<fb6fbf560904030942m731f4bfbr709f59fc309b26b2@mail.gmail.com>
	<49D6B4CF.8080608@canterbury.ac.nz>
Message-ID: <fb6fbf560904051548g3b5a4ecej24ea10011dd8e01b@mail.gmail.com>

On Fri, Apr 3, 2009 at 9:15 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Jim Jewett wrote:

>> err... I didn't mean both directions, I meant "from the callee to the
>> caller as a yielded value" and "from the callee to the caller as a
>> final return value that can't be yielded normally."

> I think perhaps we're misunderstanding each other. You
> seemed to be saying that the only time you would want
> a generator to return a value is when you were also
> using it to either send or receive values using yield,

Almost exactly the opposite.  I can see cases where you want
the interim yields, and I can see cases where you want the final
result -- but I'm asking how common it is to need *both*, and
whether we should really be going to such effort for it.

If you only need one or the other, I think the PEP can be greatly simplified.

-jJ

> If that's not what you meant, you'll have to explain
> more clearly what you do mean.
>
> --
> Greg
>
>


From ncoghlan at gmail.com  Mon Apr  6 01:04:33 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 06 Apr 2009 09:04:33 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet
 another	alternative name for yield-from]
In-Reply-To: <fb6fbf560904051548g3b5a4ecej24ea10011dd8e01b@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D5B98A.20700@canterbury.ac.nz>	<fb6fbf560904030942m731f4bfbr709f59fc309b26b2@mail.gmail.com>	<49D6B4CF.8080608@canterbury.ac.nz>
	<fb6fbf560904051548g3b5a4ecej24ea10011dd8e01b@mail.gmail.com>
Message-ID: <49D93901.9080604@gmail.com>

Jim Jewett wrote:
> Almost exactly the opposite.  I can see cases where you want
> the interim yields, and I can see cases where you want the final
> result -- but I'm asking how common it is to need *both*, and
> whether we should really be going to such effort for it.
> 
> If you only need one or the other, I think the PEP can be greatly simplified.

One of the things you may want to use coroutines with is a trampoline
scheduler for handling asynchronous IO. In that case, the inner
coroutine may want to yield an IO wait object so the scheduler can add
the relevant descriptor to the main select() loop.

In such a case, the interim yielded values would go back to the
scheduler, while the final result would go back to the calling coroutine.
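
A minimal sketch of such a trampoline (the Return and run names are invented for illustration): interim yields come back to the scheduler loop, while a Return sentinel unwinds one level of the coroutine stack and hands its value to the caller.

```python
import types

class Return:
    """Sentinel a coroutine yields to hand a final value to its caller."""
    def __init__(self, value):
        self.value = value

def run(root):
    """Drive a stack of coroutines: yielding a generator means 'call it';
    yielding Return(v) means 'return v to whoever called me'; anything
    else stands in for an IO-wait request the scheduler would park on."""
    stack = [root]
    value = None
    while stack:
        gen = stack[-1]
        try:
            result = gen.send(value)
        except StopIteration:
            stack.pop()
            value = None
            continue
        if isinstance(result, types.GeneratorType):
            stack.append(result)     # interim "call": handled by scheduler
            value = None
        elif isinstance(result, Return):
            stack.pop()              # final result: unwinds to the caller
            gen.close()
            value = result.value
        else:
            value = result           # pretend the IO completed immediately
    return value

def average(seq):
    total = 0
    for i, x in enumerate(seq, 1):
        yield ("wait-for-io", x)     # interim yield: seen only by run()
        total += x
    yield Return(total / i)

def diff():
    a = yield average([1, 2, 3])     # final result: seen by this caller
    b = yield average([10, 20, 30])
    yield Return(b - a)
```

Here run(diff()) evaluates to 18.0: the "wait-for-io" tuples never reach diff(), while the averages do.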

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From cmjohnson.mailinglist at gmail.com  Mon Apr  6 01:06:11 2009
From: cmjohnson.mailinglist at gmail.com (Carl Johnson)
Date: Sun, 5 Apr 2009 13:06:11 -1000
Subject: [Python-ideas] list.index() extension
In-Reply-To: <3d0cebfb0904050331y68d64d00u8ae04a75f2010bc8@mail.gmail.com>
References: <loom.20090404T210511-367@post.gmane.org>
	<3d0cebfb0904050331y68d64d00u8ae04a75f2010bc8@mail.gmail.com>
Message-ID: <3bdda690904051606x14312ebqb17f47683be8d195@mail.gmail.com>

Fredrik Johansson wrote:

> Meanwhile, here is yet another solution (although not so efficient).
>
> class match:
>     def __init__(self, predicate):
>         self.__eq__ = predicate
>
> range(10,-10,-1).index(match(lambda x: x < 5))
> range(10,-10,-1).count(match(lambda x: x < 5))

I find that sort of elegant, but it won't work with new style classes
as written. It needs to have the __eq__ on the class instead of on the
instance:

>>> class match(object):
...     def __init__(self, predicate):
...         self.predicate = predicate
...     def __eq__(self, item):
...         return self.predicate(item)
...
>>> range(10,-10,-1).index(match(lambda x: x < 5))
6
>>> range(10,-10,-1).count(match(lambda x: x < 5))
14

Also, the examples given before using .next() look slightly less bad
when written with the 2.6/3.0 next function:

index = next(i for i, v in enumerate(range(-10, 10)) if v is 5)

-- Carl Johnson


From jimjjewett at gmail.com  Mon Apr  6 01:16:03 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Sun, 5 Apr 2009 19:16:03 -0400
Subject: [Python-ideas] Are we hammering thumbtacks? [was: Re: x=(yield
	from) confusion]
Message-ID: <fb6fbf560904051616q32519c9r67133db22121397@mail.gmail.com>

On Fri, Apr 3, 2009 at 8:32 PM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> That is, yield, send() and throw() involve communication between the
> currently active subcoroutine and the client of the whole coroutine.
> They bypass the current stack in the coroutine itself. The return
> values, on the other hand, *do* involve unwinding the coroutine stack,
> just like they do with normal function calls.
...
>  # Note how similar this is to the normal version above
>  def average_diff_cr(start):
>    avg1 = yield from average_cr(start, seq1)
>    avg2 = yield from average_cr(start, seq2)
>    return avg2 - avg1

Yes, this explanation and example finally made that clear to me.

But it also makes me wonder if we aren't using the wrong tool.

It seems that you want the yields to pass opaquely through
average_diff -- possibly without any actual value; just a
go-ahead-and-suspend-me flag.

Meanwhile, you do want some values to be shared between
average_diff and average_cr.  average_diff can send in some
initial values when it creates/initializes average_cr, but there
isn't any way to get information back without hijacking the
stream of yields.

The proposals below are more ambitious than the current PEP,
but they still somehow feel less contorted by special cases:

Option 1:  Recognize that yield is serving two purposes, and
find another way to spell go-ahead-and-suspend-me.  It may
be too late to do this cleanly.

Option 2:  Find another way for average_diff and average_cr
to share information.

    Option 2a:  Add another set of methods, similar to send,
    but specific to co-routines.  This starts to look like the
    Actor that Bruce Eckel asked about a while ago, and
    Kamaelia may have a good pattern.

    Option 2b:  Let the caller and callee share a scope,
    similar to exec in locals. (Previous discussions talked
    about thunks or macros.)

-jJ


From greg.ewing at canterbury.ac.nz  Mon Apr  6 01:38:30 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 06 Apr 2009 11:38:30 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49D8C642.3080308@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
	<49D8C642.3080308@improva.dk>
Message-ID: <49D940F6.60107@canterbury.ac.nz>

Jacob Holm wrote:

> The argument that we have no value to send before we have yielded is 
> wrong.  The generator containing the "yield-from" could easily have a 
> value to send (or throw)

No, Guido is right here. You *can't* send a value (other
than None) into a generator that hasn't reached its first
yield (try it and you'll get an exception). The first
call has to be next().
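
This is easy to verify:

```python
def gen():
    x = yield "ready"
    yield x

g = gen()
try:
    g.send(42)     # not allowed: no yield has been reached yet
    primed_early = True
except TypeError:  # "can't send non-None value to a just-started generator"
    primed_early = False
assert primed_early is False

assert next(g) == "ready"    # the first call must be next() (or send(None))
assert g.send(42) == 42      # now a suspended yield can receive the value
```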

> and if iter(EXPR) returns a coroutine or a 
> non-generator it could easily be ready to accept it.

If it's ready to accept a send, it must have already
yielded a value, which has been lost, when it should have
been yielded to the caller of the delegating generator.

 > In an earlier thread we concluded that if
> the thrown exception is a StopIteration and the *same* StopIteration 
> instance escapes the throw() call, it should be reraised rather than 
> caught and turned into a RETVAL.

That part is right.

> Next issue is that the value returned by it.close() is thrown away by 
> yield-from.

Since I don't believe that close() should be expected to
return a useful value anyway, that's not a problem.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Mon Apr  6 02:35:15 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 06 Apr 2009 12:35:15 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904050938j4bd866b6ga4ed671195549428@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
	<49D8C43E.5010704@improva.dk>
	<ca471dc20904050938j4bd866b6ga4ed671195549428@mail.gmail.com>
Message-ID: <49D94E43.5000802@canterbury.ac.nz>

Guido van Rossum wrote:

> This example and reasoning are invalid. You shouldn't be throwing
> StopIteration (or ReturnFromGenerator) *into* a generator. That's
> something that should only come *out*.

Okay, if you're happy to take that view, then so
am I. It'll make my expansion simpler.

-- 
Greg



From greg.ewing at canterbury.ac.nz  Mon Apr  6 03:02:43 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 06 Apr 2009 13:02:43 +1200
Subject: [Python-ideas] about repr
In-Reply-To: <20090405194346.22c6ff55@o>
References: <20090405194346.22c6ff55@o>
Message-ID: <49D954B3.2060303@canterbury.ac.nz>

spir wrote:

> Wonder if you would like a statement (py2) or function (py3)
> similar to print, except that it would output the repr form
> instead of str.

I don't think I would use it much. Usually when I want
to print the repr of something, it's mixed in with other
things that I *don't* want the repr of, e.g.

   print "Foo =", repr(foo)

I want the string printed as a plain string, not as
a repr.

> I see str() as intended to produce a view of an object that has
> a kind of "natural" (read: cultural) textual form, like a date, 
> and rather for the user. repr() instead seems to me more "rough",
> informative, and programmer-oriented. 

The way I like to characterize it is:

* str() is for normal output
* repr() is for debugging output

> You will probably find it weird, but I would really like a way
> to have a variable name automatically written together with its
> value.

I understand your motivation, but this would require more than
a function, it would need special syntax. That's a very big
thing to ask for.

> (*) This special casing of identifier alone, as opposed to general
> expression, may seem strange; but it already exists in python
> assignments in order to alias instead of yielding a new value.

Not sure what you mean by that. If you think that in

   a = b

the expression 'b' is somehow treated specially because
it's a bare name, you're wrong. The right hand side of
an assignment is treated just like any other expression.

-- 
Greg



From greg.ewing at canterbury.ac.nz  Mon Apr  6 04:20:37 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 06 Apr 2009 14:20:37 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904051355o531ef62es2bb6a7c3f415988b@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
	<49D8C43E.5010704@improva.dk>
	<ca471dc20904050938j4bd866b6ga4ed671195549428@mail.gmail.com>
	<49D8F6D8.50806@improva.dk>
	<ca471dc20904051355o531ef62es2bb6a7c3f415988b@mail.gmail.com>
Message-ID: <49D966F5.3010203@canterbury.ac.nz>

Guido van Rossum wrote:

> I'm all out of round tuits for a while, so I recommend that you all (and
> whoever else wants to join) find agreement on a next version of the
> PEP.

Just one thing before you go -- did you reach a
decision on whether you want a ReturnFromGenerator
exception?

-- 
Greg




From greg.ewing at canterbury.ac.nz  Mon Apr  6 04:39:53 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 06 Apr 2009 14:39:53 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49D8F6D8.50806@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
	<49D8C43E.5010704@improva.dk>
	<ca471dc20904050938j4bd866b6ga4ed671195549428@mail.gmail.com>
	<49D8F6D8.50806@improva.dk>
Message-ID: <49D96B79.4040606@canterbury.ac.nz>

Jacob Holm wrote:

> That doesn't really solve the issue of what should happen if you write 
> such code.

I think the point is that if it's something you
shouldn't be doing in the first place, it doesn't
really matter what happens if you do.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Mon Apr  6 04:40:07 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 06 Apr 2009 14:40:07 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904050943r5eac8c3endb3bae3461d013b8@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk> <49D6AC52.4010608@gmail.com>
	<49D6BC46.9000808@improva.dk>
	<ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com>
	<49D7ECC4.1030008@canterbury.ac.nz>
	<ca471dc20904050943r5eac8c3endb3bae3461d013b8@mail.gmail.com>
Message-ID: <49D96B87.5020803@canterbury.ac.nz>

Guido van Rossum wrote:

> Throwing in GeneratorExit and catching the ReturnFromGenerator
> exception would have the same problem though, so I'm not sure I buy
> this argument.

I'm not advocating doing that. My view is that both
calling close() on the generator and throwing
GeneratorExit into it are things you only do to
make sure the generator cleans up. You can't
expect to get a meaningful return value either
way.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Mon Apr  6 04:41:48 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 06 Apr 2009 14:41:48 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <fb6fbf560904051548g3b5a4ecej24ea10011dd8e01b@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D5B98A.20700@canterbury.ac.nz>
	<fb6fbf560904030942m731f4bfbr709f59fc309b26b2@mail.gmail.com>
	<49D6B4CF.8080608@canterbury.ac.nz>
	<fb6fbf560904051548g3b5a4ecej24ea10011dd8e01b@mail.gmail.com>
Message-ID: <49D96BEC.7050603@canterbury.ac.nz>

Jim Jewett wrote:
> I can see cases where you want
> the interim yields, and I can see cases where you want the final
> result -- but I'm asking how common it is to need *both*, and
> whether we should really be going to such effort for it.
> 
> If you only need one or the other, I think the PEP can be greatly simplified.

How, exactly? I'd need convincing that you wouldn't
just end up with the same amount of complexity
arranged differently.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Mon Apr  6 04:53:18 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 06 Apr 2009 14:53:18 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904050943r5eac8c3endb3bae3461d013b8@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk> <49D6AC52.4010608@gmail.com>
	<49D6BC46.9000808@improva.dk>
	<ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com>
	<49D7ECC4.1030008@canterbury.ac.nz>
	<ca471dc20904050943r5eac8c3endb3bae3461d013b8@mail.gmail.com>
Message-ID: <49D96E9E.6030702@canterbury.ac.nz>

Guido van Rossum wrote:

> I still don't think that refactoring should drive the design
> exclusively.  Refactoring is *one* thing that becomes easier with
> yield-from. But I want the design to look pretty from as many angles
> as possible.

Certainly. But I feel that any extra features should
be in some sense extensions or generalizations of
what is needed to support refactoring. Otherwise the
scope of the proposal can expand without bound.

-- 
Greg



From talin at acm.org  Mon Apr  6 10:17:15 2009
From: talin at acm.org (Talin)
Date: Mon, 06 Apr 2009 01:17:15 -0700
Subject: [Python-ideas] Historical revisionism and source file formatting
Message-ID: <49D9BA8B.4000807@acm.org>

As you know, the current Python source tree contains a mixture of 
various different tab/space conventions and source formatting styles. 
The Python community has opted not to perform any sort of mass 
conversion of source files to a uniform style, because the community 
values time-wise consistency (of versions and branches) over space-wise 
consistency (between files in the same revision).

However, I have often wondered if it would be reasonable to reformat 
*all prior* versions of the source code. Imagine some kind of tool that 
would iterate over all previous versions, copying each version from one 
repository to another, and doing an automatic reformat on each version.

Of course, in order for this to work, the automatic reformatter would 
have to be very, very trustworthy, since any errors would likely go 
undetected for a long time.

One objection to this plan is that the history thus created would be 
"revisionist" - that it would be, in fact, an unauthentic historical 
record of what had happened. I don't have a good answer to that other 
than to say that this is an argument of theory over practicality.

In any case, I don't mean this as a serious suggestion that we reformat 
all Python revisions in the current SVN repository.

However - since we're going to be setting up a new DVCS anyway, I wonder 
if something along these lines would make sense as part of the 
migration. It might be a one-time opportunity to deal with this matter 
once and for all.

-- Talin


From denis.spir at free.fr  Mon Apr  6 12:46:29 2009
From: denis.spir at free.fr (spir)
Date: Mon, 6 Apr 2009 12:46:29 +0200
Subject: [Python-ideas] string.swap()
Message-ID: <20090406124629.5aef56ce@o>

Hello,

Some time ago, swapping sub-strings inside a string was mentioned on this list. A short while later I had a use case for that feature; this led me to explore the topic in a general case defined as:

* Sub-strings to be swapped must not be single characters.
* The string can hold any character or sequence of characters: there is thus no safe choice for a temporary placeholder, so the common scheme s1->temp / s2->s1 / temp->s2 using string.replace() cannot be used.
* The issue of overlapping sub-strings, however, is not taken into account.

I asked for algorithms on the tutor mailing list (see below). In addition to the 'naive', but tricky, step-by-step process, there were:
~ A proposal with regex, 'cheating' with the sub() method (I had a similar method, but Kent's is much smarter than mine).
~ A very clever proposal that splits the string into a list and joins it back.

My conclusions about this feature are as follows:
-0- It seems rarely needed.
-1- It is of general use, as opposed to domain-specific.
-2- It is much less trivial/obvious than it seems.
-3- It maps conceptually to a single, whole, operation.

What do you think? Is it worth having in Python?


Below are example implementations. Note that behind the scenes the regex method must perform something analogous to the naive one.
Timing produced the following results:
~ For some reason, the regex version's time is very unstable on smaller strings / numbers of replacements / numbers of trials (up to ~10000): it varies from 60% to 120% of the naive version's time.
~ On bigger trials (~100000), the regex and naive versions are about equally (in)efficient.
~ The list version is about 5 times faster in all cases.

def swapNaive(text, s1, s2):
    new_text = ""
    l1, l2 = len(s1), len(s2)
    # repeatedly take the earliest occurrence of s1 or s2 and append
    # the swapped replacement to new_text
    while (text.count(s1) > 0) or (text.count(s2) > 0):
        i1, i2 = text.find(s1), text.find(s2)
        if i1 >= 0 and (i2 < 0 or i1 < i2):
            new_text += text[:i1] + s2
            text = text[i1+l1:]
        else:
            new_text += text[:i2] + s1
            text = text[i2+l2:]
    new_text += text
    return new_text

### proposed by Albert T. Hofkamp
### on the tutor mailing list
def swapList(text, s1, s2):
    pieces = text.split(s1)
    pieces = [p.replace(s2, s1) for p in pieces]
    return s2.join(pieces)

### proposed by Kent Johnson
### on the tutor mailing list
import re
def swapRegex(text, s1, s2):
    def replace(m):
        return s2 if m.group()==s1 else s1
    matcher = re.compile('%s|%s' % (re.escape(s1), re.escape(s2)))
    return matcher.sub(replace, text)
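As a quick sanity check (an editor's addition, not part of the original post), the list-based and regex-based versions can be compared on a few inputs; the definitions are repeated here in slightly condensed form so the snippet is self-contained:

```python
import re

def swap_list(text, s1, s2):
    # split on s1, swap s2 -> s1 inside the pieces, rejoin with s2
    pieces = text.split(s1)
    return s2.join(p.replace(s2, s1) for p in pieces)

def swap_regex(text, s1, s2):
    # match either sub-string, replace each match with the other one
    matcher = re.compile('%s|%s' % (re.escape(s1), re.escape(s2)))
    return matcher.sub(lambda m: s2 if m.group() == s1 else s1, text)

cases = [("spam eggs spam", "spam", "eggs"),
         ("aXbYc", "X", "Y"),
         ("abcabc", "ab", "c")]
for text, s1, s2 in cases:
    assert swap_list(text, s1, s2) == swap_regex(text, s1, s2)

print(swap_list("spam eggs spam", "spam", "eggs"))  # eggs spam eggs
```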

Denis
------
la vita e estrany


From ncoghlan at gmail.com  Mon Apr  6 12:58:22 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 06 Apr 2009 20:58:22 +1000
Subject: [Python-ideas] Historical revisionism and source file formatting
In-Reply-To: <49D9BA8B.4000807@acm.org>
References: <49D9BA8B.4000807@acm.org>
Message-ID: <49D9E04E.3070702@gmail.com>

Talin wrote:
> However - since we're going to be setting up a new DVCS anyway, I wonder
> if something along these lines would make sense as part of the
> migration. It might be a one-time opportunity to deal with this matter
> once and for all.

I think the folks working out the mechanics of the migration are going
to have enough to sort out without worrying about tab/space differences
in the C code.

Now, if someone could work out a way to do a hg diff that paid attention
to leading whitespace differences in .py and .rst files (where it
matters), but ignored all other whitespace only changes, then that would
be a useful tool in its own right (since it would be applicable to any
project with a mixed Python/non-Python source tree).
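A rough sketch (editor's code, not something Nick proposed concretely) of the per-line test such a filter would need. Everything here is an assumption, including which suffixes count as indentation-sensitive; it is not an hg API, just the line-classification logic:

```python
import re

# Assumed list of suffixes where leading whitespace is significant
INDENT_SENSITIVE = ('.py', '.rst')

def leading_ws(line):
    """Return the leading whitespace of a line."""
    return line[:len(line) - len(line.lstrip(' \t'))]

def change_matters(path, old, new):
    """Should an old -> new line change survive the filtered diff?"""
    # any non-whitespace difference always matters
    if re.sub(r'\s+', '', old) != re.sub(r'\s+', '', new):
        return True
    # whitespace-only change: keep it only when it touches leading
    # whitespace in an indentation-sensitive file
    if path.endswith(INDENT_SENSITIVE):
        return leading_ws(old) != leading_ws(new)
    return False

print(change_matters('x.c', 'int  a;', 'int a;'))      # False: ignorable
print(change_matters('x.py', '    a = 1', '\ta = 1'))  # True: indentation
```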

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Mon Apr  6 13:01:23 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 06 Apr 2009 21:01:23 +1000
Subject: [Python-ideas] string.swap()
In-Reply-To: <20090406124629.5aef56ce@o>
References: <20090406124629.5aef56ce@o>
Message-ID: <49D9E103.9090309@gmail.com>

spir wrote:
> ### proposed by Albert T. Hofkamp
> ### on the tutor mailing list
> def swapList(text, s1, s2):
>     pieces = text.split(s1)
>     pieces = [p.replace(s2, s1) for p in pieces]
>     return s2.join(pieces)

At least putting that on ASPN as a cookbook recipe seems like a good
idea. Not so sure about making it a method though.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Mon Apr  6 14:51:53 2009
From: jh at improva.dk (Jacob Holm)
Date: Mon, 06 Apr 2009 14:51:53 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49D934C1.2010504@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<49D6AC52.4010608@gmail.com>	<49D6BC46.9000808@improva.dk>	<ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com>	<49D7ECC4.1030008@canterbury.ac.nz>
	<49D8DF12.50208@improva.dk> <49D934C1.2010504@gmail.com>
Message-ID: <49D9FAE9.9060403@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> I don't think it is common to keep the generator object alive long after
>> the generator is closed, so I don't see the problem in keeping the value
>> so it can be returned by the next close() call.
>>     
>
> I don't think close() means to me what it means to you... close() to me
> means "I'm done with this, it should have been exhausted already, but
> just to make sure all the resources held by the internal frame are
> released properly, I'm shutting it down explicitly"
>
> In other words, the *normal* flow for close() should be the "frame has
> already terminated, so just return immediately" path, not the "frame
> hasn't terminated yet, so throw GeneratorExit in and complain if the
> frame doesn't terminate" path.
>   

That is why #1-6 in my list took care to extract the value from 
StopIteration and attach it to the generator.  Doing it like that allows 
you to ask for the value after the generator is exhausted normally, 
using either next() or close().  This is interesting because it allows 
you to loop over the generator with a normal for-loop and *still* get 
the return value after the loop if you want it.  (You have to construct 
the generator before the loop instead of in the for-loop statement 
itself,  and call close() on it afterwards, but that is easy).  It also 
makes it possible for close to reliably return the value.

The idea of saving the value on the generator is more basic than the 
idea of having close return a value.  It means that calling next on an 
exhausted generator will keep raising StopIteration with the same 
value.  If you don't save the return value on the generator, only the 
first StopIteration will have a value, the rest will always have None as 
their value.
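This behaviour can be illustrated with today's Python 3 semantics (an editor's demo; at the time of this thread, `return value` inside a generator was still hypothetical): the return value rides on the first StopIteration only.

```python
def averager():
    a = yield
    b = yield
    return (a + b) / 2   # travels on StopIteration.value

g = averager()
next(g)          # prime the generator
g.send(1.0)
try:
    g.send(3.0)
except StopIteration as e:
    first = e.value      # 2.0: the return value rides on StopIteration
try:
    next(g)
except StopIteration as e:
    later = e.value      # None: later StopIterations carry no value

print(first, later)      # 2.0 None
```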

> You're trying to move throwing GeneratorExit into the internal frame
> from the exceptional path to the normal path and I don't think that is a
> good idea. 

I think it is exactly the right thing for the use cases I have.  
Anything else requires extra support code to get a similar API (extra 
exceptions to throw in and/or out, an alternative close function to 
catch the extra exceptions, and probably other things as well).

Whether or not it is a good idea to use GeneratorExit for this, I think 
it is important that a "return value from GeneratorExit" does not 
silently throw away the value.  In other words, if close does *not* 
return the value it gets from StopIteration, it should raise an 
exception if that value is not None. 

One option is to let close() reraise the StopIteration if it has a 
non-None value.  This matches Guido's suggestion for a way to access the 
return value after a GeneratorExit in yield-from without changing his 
suggested expansion.  If the return value from the generator isn't 
stored and I can't have close() return the value, this would be my 
preference.

Another option (if you insist that it is an error to return a value 
after a GeneratorExit) is to let close() raise a RuntimeError when it 
catches a StopIteration with a non-None value.

- Jacob


From aahz at pythoncraft.com  Mon Apr  6 15:09:24 2009
From: aahz at pythoncraft.com (Aahz)
Date: Mon, 6 Apr 2009 06:09:24 -0700
Subject: [Python-ideas] string.swap()
In-Reply-To: <20090406124629.5aef56ce@o>
References: <20090406124629.5aef56ce@o>
Message-ID: <20090406130924.GE19296@panix.com>

On Mon, Apr 06, 2009, spir wrote:
> 
> Some time ago on this list was mentionned swapping sub-strings inside
> a string. A short while later I had a use case case for that feature;
> this led me to explore this topic in a general case defined as:
>
> * Sub-strings to be swapped must not be single characters.

This seems like an odd requirement.  In any case, posting as a recipe
seems the way to go.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"...string iteration isn't about treating strings as sequences of strings, 
it's about treating strings as sequences of characters.  The fact that
characters are also strings is the reason we have problems, but characters 
are strings for other good reasons."  --Aahz


From jh at improva.dk  Mon Apr  6 15:41:27 2009
From: jh at improva.dk (Jacob Holm)
Date: Mon, 06 Apr 2009 15:41:27 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49D940F6.60107@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>	<49D8C642.3080308@improva.dk>
	<49D940F6.60107@canterbury.ac.nz>
Message-ID: <49DA0687.9010100@improva.dk>

Greg Ewing wrote:
> Jacob Holm wrote:
>
>> The argument that we have no value to send before we have yielded is 
>> wrong. The generator containing the "yield-from" could easily have a 
>> value to send (or throw)
>
> No, Guido is right here. You *can't* send a value (other
> than None) into a generator that hasn't reached its first
> yield (try it and you'll get an exception). The first
> call has to be next().

The whole idea of the coroutine pattern is to replace this restriction 
on the caller with a restriction about the first yield in the coroutine. 
This would probably be a lot clearer if the coroutine decorator was 
written as:

def coroutine(func):
    def start(*args,**kwargs):
        cr = func(*args,**kwargs)
        v = cr.next()
        if v is not None:
            raise RuntimeError('first yield in coroutine was not None')
        return cr
    return start


The first call *from user code* to a generator decorated with @coroutine 
*can* be a send() or throw(), and in most cases probably should be.

>> and if iter(EXPR) returns a coroutine or a non-generator it could 
>> easily be ready to accept it.
>
> If it's ready to accept a send, it must have already
> yielded a value, which has been lost, when it should have
> been yielded to the caller of the delegating generator.

No, in the coroutine pattern it absolutely should not. The first value 
yielded by the generator of every coroutine is None and should be thrown 
away.

>> Next issue is that the value returned by it.close() is thrown away by 
>> yield-from.
>
> Since I don't believe that close() should be expected to
> return a useful value anyway, that's not a problem.
>

It is a problem in the sense that it is surprising behavior. If close() 
doesn't return the value, it should at least raise some exception. I am 
warming to the idea of reraising the StopIteration if it has a non-None 
value. This matches Guido's suggestion for how to retrieve the value 
after a yield-from that was thrown a GeneratorExit. If you insist it is 
an error to return a value as response to GeneratorExit, raise 
RuntimeError. But *please* don't just swallow the value.

- Jacob



From bruce at leapyear.org  Mon Apr  6 17:21:35 2009
From: bruce at leapyear.org (Bruce Leban)
Date: Mon, 6 Apr 2009 08:21:35 -0700
Subject: [Python-ideas] string.swap()
In-Reply-To: <20090406130924.GE19296@panix.com>
References: <20090406124629.5aef56ce@o> <20090406130924.GE19296@panix.com>
Message-ID: <cf5b87740904060821l25155368ra78962d1dd0b6cb4@mail.gmail.com>

On Mon, Apr 6, 2009 at 6:09 AM, Aahz <aahz at pythoncraft.com> wrote:

> On Mon, Apr 06, 2009, spir wrote:
> >
> > Some time ago on this list was mentionned swapping sub-strings inside
> > a string....
> >
> > * Sub-strings to be swapped must not be single characters.
>
> This seems like an odd requirement.  In any case, posting as a recipe
> seems the way to go.


I think the meaning of this was that sub-strings must not be limited to
single characters.

There's a more general operation here that I've used on occasion:

        multiReplace(text, {"old value": "new value", "old2" : "new2", ...})

which replaces all the old values with the new values. In a sense this is a
generalized form of translate. Swap is a special case of this. In the case
of overlapping strings, the earliest match wins. If two strings share a
common prefix, then there needs to be a tiebreaking rule for that: shortest
or longest wins.
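One possible sketch of such a multiReplace (editor's code; Bruce only named the operation) does a single regex pass so the earliest match wins, with the alternatives sorted longest-first to break prefix ties:

```python
import re

def multi_replace(text, mapping):
    # Longest-first ordering makes the longer of two keys sharing a
    # prefix win the tiebreak; re.sub scans left to right, so the
    # earliest match wins overall.
    pattern = re.compile('|'.join(
        re.escape(k) for k in sorted(mapping, key=len, reverse=True)))
    return pattern.sub(lambda m: mapping[m.group()], text)

print(multi_replace("old value and old2", {"old value": "new value",
                                           "old2": "new2"}))
# new value and new2

# swap as a special case
print(multi_replace("spam eggs", {"spam": "eggs", "eggs": "spam"}))
# eggs spam
```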

--- Bruce

From ncoghlan at gmail.com  Mon Apr  6 23:36:08 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 07 Apr 2009 07:36:08 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49D9FAE9.9060403@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<49D6AC52.4010608@gmail.com>	<49D6BC46.9000808@improva.dk>	<ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com>	<49D7ECC4.1030008@canterbury.ac.nz>
	<49D8DF12.50208@improva.dk> <49D934C1.2010504@gmail.com>
	<49D9FAE9.9060403@improva.dk>
Message-ID: <49DA75C8.3080603@gmail.com>

Jacob Holm wrote:
> Another option (if you insist that it is an error to return a value
> after a GeneratorExit) is to let close() raise a RuntimeError when it
> catches a StopIteration with a non-None value.

Why do you consider it OK for close() to throw away all of the values
the generator might have yielded in the future, but not OK for it to
throw away the generator's return value? The objection I have to having
close() return a value is that it encourages people to start using
GeneratorExit in their normal generator control flow and I think that's
a really bad idea (on par with calling sys.exit() and then trapping
SystemExit to terminate a search loop - perfectly legal from a language
point of view, but a really bad plan nonetheless).

Now, the fact that repeatedly calling next()/send()/throw() on a
finished generator is meant to keep reraising the same StopIteration
that was thrown when the generator first terminated is a *much* better
justification for preserving the return value on the generator object.
But coupling that with the idea of close() doing anything more than
giving an unfinished generator a final chance to release any resources
it is holding is mixing two completely different ideas.

Better to just add a "value" property to generators that raises a
RuntimeError if the generator frame hasn't terminated yet (probably
along with a "finished" property to allow LBYL interrogation of the
generator state).
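The kind of wrapper Nick's "value"/"finished" properties suggest can be sketched like this (editor's code, using today's Python 3 semantics where a generator's return value travels on StopIteration.value; the class and names are invented for illustration):

```python
class ResultGenerator:
    """Wrap a generator, exposing 'finished' and 'value' properties."""

    def __init__(self, gen):
        self._gen = gen
        self._finished = False
        self._value = None

    def __iter__(self):
        return self

    def __next__(self):
        try:
            return next(self._gen)
        except StopIteration as e:
            # capture the return value when the frame terminates
            self._finished = True
            self._value = e.value
            raise

    @property
    def finished(self):
        return self._finished

    @property
    def value(self):
        if not self._finished:
            raise RuntimeError('generator frame has not terminated')
        return self._value


def total(items):
    t = 0
    for x in items:
        t += x
        yield t
    return t

g = ResultGenerator(total([1, 2, 3]))
print(list(g))     # [1, 3, 6]
print(g.value)     # 6
```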

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From guido at python.org  Mon Apr  6 23:37:31 2009
From: guido at python.org (Guido van Rossum)
Date: Mon, 6 Apr 2009 14:37:31 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49D966F5.3010203@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com> 
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com> 
	<49D8C43E.5010704@improva.dk>
	<ca471dc20904050938j4bd866b6ga4ed671195549428@mail.gmail.com> 
	<49D8F6D8.50806@improva.dk>
	<ca471dc20904051355o531ef62es2bb6a7c3f415988b@mail.gmail.com> 
	<49D966F5.3010203@canterbury.ac.nz>
Message-ID: <ca471dc20904061437l2915c094rb5184613aea8656@mail.gmail.com>

On Sun, Apr 5, 2009 at 7:20 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Guido van Rossum wrote:
>
>> I'm all out of round tuits for a while, so I recommend that you all (and
>> whoever else wants to join) find agreement on a next version of the
>> PEP.
>
> Just one thing before you go -- did you reach a
> decision on whether you want a ReturnFromGenerator
> exception?

Let's do without it.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From ncoghlan at gmail.com  Mon Apr  6 23:40:17 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 07 Apr 2009 07:40:17 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DA0687.9010100@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>	<49D8C642.3080308@improva.dk>	<49D940F6.60107@canterbury.ac.nz>
	<49DA0687.9010100@improva.dk>
Message-ID: <49DA76C1.7050204@gmail.com>

Jacob Holm wrote:
> If you insist it is
> an error to return a value as response to GeneratorExit, raise
> RuntimeError. But *please* don't just swallow the value.

As I asked in the other thread (but buried in a longer message): why do
you see it as OK for close() to throw away every later value a generator
may have yielded, but not OK for it to throw away the return value?

close() is for finalisation so that generators can have a __del__ method
and hence we can allow yield inside try-finally. That's it. Don't break
that by trying to turn close() into something it isn't.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Tue Apr  7 00:30:12 2009
From: jh at improva.dk (Jacob Holm)
Date: Tue, 07 Apr 2009 00:30:12 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DA75C8.3080603@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<49D6AC52.4010608@gmail.com>	<49D6BC46.9000808@improva.dk>	<ca471dc20904041348u4ea98d60u3a023e8262563df2@mail.gmail.com>	<49D7ECC4.1030008@canterbury.ac.nz>
	<49D8DF12.50208@improva.dk> <49D934C1.2010504@gmail.com>
	<49D9FAE9.9060403@improva.dk> <49DA75C8.3080603@gmail.com>
Message-ID: <49DA8274.7050708@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> Another option (if you insist that it is an error to return a value
>> after a GeneratorExit) is to let close() raise a RuntimeError when it
>> catches a StopIteration with a non-None value.
>>     
>
> Why do you consider it OK for close() to throw away all of the values
> the generator might have yielded in the future, but not OK for it to
> throw away the generator's return value? 

Because the return value is actually computed and returned from the 
generator, *then* thrown away.  If there is no way to access the value, 
it should be considered an error to return it and flagged as such.  What 
the generator might have yielded if close() wasn't called doesn't 
interest me in the slightest.

> The objection I have to having
> close() return a value is that it encourages people to start using
> GeneratorExit in their normal generator control flow and I think that's
> a really bad idea (on par with calling sys.exit() and then trapping
> SystemExit to terminate a search loop - perfectly legal from a language
> point of view, but a really bad plan nonetheless).
>   

Yes, I understand that this is how you think of GeneratorExit.

> Now, the fact that repeatedly calling next()/send()/throw() on a
> finished generator is meant to keep reraising the same StopIteration
> that was thrown when the generator first terminated is a *much* better
> justification for preserving the return value on the generator object.
>   

Ok

> But coupling that with the idea of close() doing anything more than
> giving an unfinished generator a final chance to release any resources
> it is holding is mixing two completely different ideas.
>
> Better to just add a "value" property to generators that raises a
> RuntimeError if the generator frame hasn't terminated yet (probably
> along with a "finished" property to allow LBYL interrogation of the
> generator state).
>   

Why not a single property raising AttributeError until the frame is 
terminated?  (Not that I really care as long as I can access the value 
without having access to the original StopIteration).

If the value is stored, I am fine with close not returning it or raising 
an exception.

- Jacob


From greg.ewing at canterbury.ac.nz  Tue Apr  7 00:37:23 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 07 Apr 2009 10:37:23 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DA0687.9010100@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
	<49D8C642.3080308@improva.dk> <49D940F6.60107@canterbury.ac.nz>
	<49DA0687.9010100@improva.dk>
Message-ID: <49DA8423.9070101@canterbury.ac.nz>

Jacob Holm wrote:

> No, in the coroutine pattern it absolutely should not. The first value 
> yielded by the generator of every coroutine is None and should be thrown 
> away.

That only applies to the *top level* of a coroutine. If
you factor some code out of a coroutine and call it using
yield-from, the first value yielded by the factored-out
code is needed and mustn't be thrown away.

So I stand by what I said before. If you're using such
a decorator, you only apply it to the top level generator
of the coroutine, and you don't call the top level
using yield-from.

-- 
Greg




From jh at improva.dk  Tue Apr  7 04:59:31 2009
From: jh at improva.dk (Jacob Holm)
Date: Tue, 07 Apr 2009 04:59:31 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DA8423.9070101@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>	<49D8C642.3080308@improva.dk>
	<49D940F6.60107@canterbury.ac.nz>	<49DA0687.9010100@improva.dk>
	<49DA8423.9070101@canterbury.ac.nz>
Message-ID: <49DAC193.3030008@improva.dk>

Greg Ewing wrote:
> Jacob Holm wrote:
>
>> No, in the coroutine pattern it absolutely should not. The first 
>> value yielded by the generator of every coroutine is None and should 
>> be thrown away.
>
> That only applies to the *top level* of a coroutine. If
> you factor some code out of a coroutine and call it using
> yield-from, the first value yielded by the factored-out
> code is needed and mustn't be thrown away.

One major reason for factoring something out of a coroutine would be if 
the factored-out code was independently useful as a coroutine. But I 
cannot actually *make* it a coroutine if I want to call it using 
yield-from because it is not always at the "top level".

>
> So I stand by what I said before. If you're using such
> a decorator, you only apply it to the top level generator
> of the coroutine, and you don't call the top level
> using yield-from.
>

So one @coroutine can't call another using yield-from. Why shouldn't it 
be possible? All we need is a way to avoid the first next() call and 
substitute some other value.

Here is a silly example of two coroutines calling each other using one 
of the syntax-based ideas I have for handling this. (I don't care about 
the actual syntax, just about the ability to do this)

@coroutine
def avg2():
    a = yield
    b = yield
    return (a+b)/2

@coroutine
def avg_diff():
    a = yield from avg2() start None  # "start EXPR" means use EXPR for first value to yield instead of next()
    b = yield from avg2() start a     # "start EXPR" means use EXPR for first value to yield instead of next()
    yield b
    return a-b

a = avg2()
a.send(41)
a.send(43)   # raises StopIteration(42)

d = avg_diff()
d.send(1.0)  
d.send(2.0)  # returns from first yield-from, yields 1.5 as part of starting second yield-from
d.send(3.0)
d.send(4.0)  # returns from second yield-from. yields 3.5
d.next()     # returns from avg_diff. raises StopIteration(-2.0)


The important things to note here are that both avg2 and avg_diff are 
independently useful coroutines (yeah ok, not that useful), and that the 
"natural" value to yield from the "d.send(2.0)" line does not come from 
calling next() on the subgenerator, but rather from the outer generator.

I don't think there is any way to achieve this without some way of 
substituting the initial next() call in yield-from.

Regards
- Jacob



From greg.ewing at canterbury.ac.nz  Tue Apr  7 07:42:48 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 07 Apr 2009 17:42:48 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DAC193.3030008@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
	<49D8C642.3080308@improva.dk> <49D940F6.60107@canterbury.ac.nz>
	<49DA0687.9010100@improva.dk> <49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk>
Message-ID: <49DAE7D8.1040801@canterbury.ac.nz>

Jacob Holm wrote:

> One major reason for factoring something out of a coroutine would be if 
> the factored-out code was independently useful as a coroutine.

So provide another entry point for using that part
as a top level, or manually apply the wrapper when
it's appropriate.

I find it extraordinary that people seem to have
latched onto David Beazley's idiosyncratic definition
of a "coroutine" and decided that it's written on
a stone tablet from Mt. Sinai that we must always
wrap them in his decorator.

> @coroutine
> def avg_diff():
>    a = yield from avg2() start None  # "start EXPR" means use EXPR for 
> first value to yield instead of next()
>    b = yield from avg2() start a     # "start EXPR" means use EXPR for 
> first value to yield instead of next()

I'm going to need a less abstract example to see
why you might want to do something like that.

-- 
Greg


From denis.spir at free.fr  Tue Apr  7 15:03:41 2009
From: denis.spir at free.fr (spir)
Date: Tue, 7 Apr 2009 15:03:41 +0200
Subject: [Python-ideas] why not "name = value if condition"?
Message-ID: <20090407150341.6a097ac1@o>

Hello,

What's the reason why
   name = value if condition
is invalid? Meaning: there _must_ be an else clause.

[I imagine this has been discussed and consciously rejected, but I couldn't find it in PEP 308, nor in the archives.]

It would be practical in many situations, e.g. to give default values to parameters:

def writeFile(text, title, format=STANDARD, fileName=None):
   fileName = title+".py" if fileName is None
   ...

Denis

PS:
Sure, one can write
   if condition:
      name = value
but the rationale in favour of, or against, a one-liner shortcut is the same as for the ternary case (with else).

One can also write:
   name = value if condition else name
but... <no comment>.

I find the present situation a bit frustrating, like sitting between two chairs.

------
la vita e estrany


From george.sakkis at gmail.com  Tue Apr  7 15:10:33 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Tue, 7 Apr 2009 09:10:33 -0400
Subject: [Python-ideas] why not "name = value if condition"?
In-Reply-To: <20090407150341.6a097ac1@o>
References: <20090407150341.6a097ac1@o>
Message-ID: <91ad5bf80904070610w40b9e65j95e6725d0fd6e316@mail.gmail.com>

On Tue, Apr 7, 2009 at 9:03 AM, spir <denis.spir at free.fr> wrote:

> Hello,
>
> What's the reason why
>    name = value if condition
> is invalid? Meaning: there _must_ be an else clause.
>
> [I imagine this has been discussed and consciously rejected, but I couldn't find it in PEP 308, nor in the archives.]
>
> It would be practical in many situations, e.g. to give default values to parameters:
>
> def writeFile(text, title, format=STANDARD, fileName=None):
>    fileName = title+".py" if fileName is None
>    ...
>
> Denis
>
> PS:
> Sure, one can write
>    if condition:
>       name = value
> but the rationale in favour of, or against, a one-liner shortcut is the same as for the ternary case (with else).

You do realize that "if condition: name = value" is a valid one-liner, right?

if/else on the other hand is a two-liner at least.

George


From steve at pearwood.info  Tue Apr  7 15:38:21 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Tue, 7 Apr 2009 23:38:21 +1000
Subject: [Python-ideas] why not "name = value if condition"?
In-Reply-To: <20090407150341.6a097ac1@o>
References: <20090407150341.6a097ac1@o>
Message-ID: <200904072338.22025.steve@pearwood.info>

On Tue, 7 Apr 2009 11:03:41 pm spir wrote:
> Hello,
>
> What's the reason why
>    name = value if condition
> is invalid? Meaning: there _must_ be an else clause.


Because "value if condition else other" is an expression, not a 
statement, and thus *must* have a value.

    name = value if condition

only has a value sometimes, and so is invalid for the same reason that:

    name = 

is invalid -- you need to have a value to bind to the name.

    if condition:
        name = value

is different. If condition is false, the branch "name = value" isn't 
taken at all. The entire block isn't an expression, so it doesn't have, 
or need, a value.


> [I imagine this has been discussed and consciously rejected, but I
> couldn't find it in PEP 308, nor in the archives.]
>
> It would be practical in many situations, e.g. to give default values
> to parameters:
>
> def writeFile(text, title, format=STANDARD, fileName=None):
>    fileName = title+".py" if fileName is None

Maybe so, but this can just as easily be written:

    if fileName is None:
        fileName = title+".py"

And if you're concerned about this being a two-liner (perhaps because 
the Enter key on your keyboard is broken *wink*) you can write it as a 
one-liner:

    if fileName is None: fileName = title+".py"


-- 
Steven D'Aprano


From denis.spir at free.fr  Tue Apr  7 16:55:10 2009
From: denis.spir at free.fr (spir)
Date: Tue, 7 Apr 2009 16:55:10 +0200
Subject: [Python-ideas] why not "name = value if condition"?
In-Reply-To: <200904072338.22025.steve@pearwood.info>
References: <20090407150341.6a097ac1@o>
	<200904072338.22025.steve@pearwood.info>
Message-ID: <20090407165510.12b1c82a@o>

Le Tue, 7 Apr 2009 23:38:21 +1000,
Steven D'Aprano <steve at pearwood.info> s'exprima ainsi:

> > What's the reason why
> >    name = value if condition
> > is invalid? Meaning: there _must_ be an else clause.  

> Because "value if condition else other" is an expression, not a 
> statement, and thus *must* have a value.
> 
>     name = value if condition
> 
> only has a value sometimes, and so is invalid [...]

Yes, thank you! That's the explanation I was looking for.
I meant to use the above formulation for updating already defined variables. Indeed, Python does not make the difference I have in mind ;-).

As a side note,
   if condition: name = value
(where 'name' is already registered) is the only case when I do not indent a block, for the reason that it means for me the same as the above invalid one-liner.

Denis
------
la vita e estrany


From jh at improva.dk  Tue Apr  7 17:00:27 2009
From: jh at improva.dk (Jacob Holm)
Date: Tue, 07 Apr 2009 17:00:27 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DAE7D8.1040801@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
	<49D8C642.3080308@improva.dk> <49D940F6.60107@canterbury.ac.nz>
	<49DA0687.9010100@improva.dk> <49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DAE7D8.1040801@canterbury.ac.nz>
Message-ID: <49DB6A8B.1010709@improva.dk>

Greg Ewing wrote:
> Jacob Holm wrote:
>
>> One major reason for factoring something out of a coroutine would be 
>> if the factored-out code was independently useful as a coroutine.
>
> So provide another entry point for using that part
> as a top level, or manually apply the wrapper when
> it's appropriate.

It is inconvenient to have to keep separate versions around, and it 
doesn't solve the problem.  See example below.

>
> I find it extraordinary that people seem to have
> latched onto David Beazley's idiosyncratic definition
> of a "coroutine" and decided that it's written on
> a stone tablet from Mt. Sinai that we must always
> wrap them in his decorator.

I find it extraordinary that my critical perspective on this issue makes 
you think I am taking his tutorial as dogma.  There is a real issue here 
IMNSHO, and the @coroutine examples just happen to be the easiest way I 
can see of explaining it.

>
>> @coroutine
>> def avg_diff():
>>    a = yield from avg2() start None  # "start EXPR" means use EXPR 
>> for first value to yield instead of next()
>>    b = yield from avg2() start a     # "start EXPR" means use EXPR 
>> for first value to yield instead of next()
>
> I'm going to need a less abstract example to see
> why you might want to do something like that.
>

Ok,  below you will find a modified version of your parser example taken 
from 
http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/parser.txt

The modification consists of applying the @coroutine decorator to 
parse_items and parse_elem and changing them to yield a stream of 
2-tuples describing how each token sent to the coroutine was interpreted.

The expected output is:

Feeding: '<foo>'
Yielding: ('Open', 'foo')
Feeding: 'This'
Yielding: ('Data', 'This')
Feeding: 'is'
Yielding: ('Data', 'is')
Feeding: 'a'
Yielding: ('Data', 'a')
Feeding: '<b>'
Yielding: ('Open', 'b')
Feeding: 'foo'
Yielding: ('Data', 'foo')
Feeding: 'file'
Yielding: ('Data', 'file')
Feeding: '</b>'
Yielding: ('Close', 'b')
Feeding: 'you'
Yielding: ('Data', 'you')
Feeding: 'know.'
Yielding: ('Data', 'know.')
Feeding: '</foo>'
Yielding: ('Close', 'foo')
[('foo', ['This', 'is', 'a', ('b', ['foo', 'file']), 'you', 'know.'])]


I can't see a nice way to get the same sequence of send() calls to yield 
the same values without the ability to override the way the value for 
the initial yield in yield-from is computed.  Even avoiding the use of 
@coroutine, I will still need to pass extra arguments to the generator 
functions to control the initial yield.

- Jacob
------------------------------------------------------------------------

# Support code from example at http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/parser.txt

import re
pat = re.compile(r"(\S+)|(<[^>]*>)")
  
def scanner(text):
    for m in pat.finditer(text):
        token = m.group(0)
        print "Feeding:", repr(token)
        yield token
    yield None # to signal EOF
  
text = "<foo> This is a <b> foo file </b> you know. </foo>"
token_stream = scanner(text)

def is_opening_tag(token):
    return token.startswith("<") and not token.startswith("</")


# Coroutine decorator copied from earlier mail, based on the one in http://dabeaz.com/coroutines/

def coroutine(func):
    def start(*args, **kwargs):
        cr = func(*args, **kwargs)
        v = cr.next()
        if v is not None:
            raise RuntimeError('first yield from coroutine was not None')
        return cr
    return start


# Runner modified from example at http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/parser.txt
# to also print the yielded values.

def run():
    parser = parse_items()
    # The original forgot to call next() here.  That is not necessary in this version since parse_items uses
    # the @coroutine decorator.
    try:
        for m in pat.finditer(text):
            token = m.group(0)
            print "Feeding:", repr(token)
            v = parser.send(token)
            print "Yielding:", v
        parser.send(None) # to signal EOF
    except StopIteration, e:
        tree = e.args[0]
        print tree


# parse_elem modified from example at http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/parser.txt
# to make it a coroutine and to yield a sequence of 2-tuples describing how the received data was interpreted.
# (Does not yield the final ('Close', <tagname>) because it returns the tree instead). 

@coroutine
def parse_elem():
    opening_tag = yield
    name = opening_tag[1:-1]
    closing_tag = "</%s>" % name
    items = yield from parse_items(closing_tag) start ('Open', name)
    return (name, items)


# parse_items modified from example at http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/parser.txt
# to make it a coroutine and to yield a sequence of 2-tuples describing how the received data was interpreted.

@coroutine
def parse_items(closing_tag = None):
    elems = []
    token = yield
    while token != closing_tag:
        if is_opening_tag(token):
            subtree = yield from parse_elem() as p start p.send(token)
            elems.append(subtree)
            out = ('Close', subtree[0])
        else:
            elems.append(token)
            out = ('Data', token)
        token = yield out
    return elems





From ziade.tarek at gmail.com  Tue Apr  7 20:05:51 2009
From: ziade.tarek at gmail.com (=?ISO-8859-1?Q?Tarek_Ziad=E9?=)
Date: Tue, 7 Apr 2009 20:05:51 +0200
Subject: [Python-ideas] registery system in Python ?
Message-ID: <94bdd2610904071105p718c1f72md0fbb6e8e59d9663@mail.gmail.com>

Hello

I am working on a plugin system for Distutils, inspired by what
setuptools provides (entry_points),
so I am trying to describe how a generic registry could work.

But, as discussed with some people at PyCon, this is a general need.
What about adding a simple, generic registry system to the Python stdlib?

The APIs I was thinking about would register plugins under group names
for easy classification:

- get_plugin(group, name) : returns an object for (group, name)
- register_plugin(group, name, object): register an object, for (group, name)
- unregister_plugin(group, name): removes an object for (group, name)
- list_plugins(group=None, doc=False): returns a list of all objects
for the given group.
- list_groups(): return a list of all groups

Having groups makes it simpler to classify plugins. In my use case, a
group could be 'distutils:filelist',
to list all plugins that know how to build a file list. (See
http://wiki.python.org/moin/Distutils/ManifestPluginSystem)
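For concreteness, the API above could be sketched as a toy in-memory implementation (illustrative only; the function names follow the proposal, but nothing here is an actual stdlib interface):

```python
# Hypothetical in-memory registry, keyed by (group, name).
_registry = {}

def register_plugin(group, name, obj):
    """Register an object under (group, name)."""
    _registry.setdefault(group, {})[name] = obj

def get_plugin(group, name):
    """Return the object registered for (group, name)."""
    return _registry[group][name]

def unregister_plugin(group, name):
    """Remove the object registered for (group, name)."""
    del _registry[group][name]

def list_plugins(group):
    """Return all objects registered under the given group."""
    return list(_registry.get(group, {}).values())

def list_groups():
    """Return all known group names."""
    return list(_registry)
```

A real version would presumably load entries from installed package metadata rather than requiring explicit registration calls.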

Regards
Tarek

-- 
Tarek Ziadé | Association AfPy | www.afpy.org
Blog FR | http://programmation-python.org
Blog EN | http://tarekziade.wordpress.com/


From ncoghlan at gmail.com  Tue Apr  7 23:59:34 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 08 Apr 2009 07:59:34 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DAC193.3030008@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>	<49D8C642.3080308@improva.dk>	<49D940F6.60107@canterbury.ac.nz>	<49DA0687.9010100@improva.dk>	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk>
Message-ID: <49DBCCC6.1080601@gmail.com>

Jacob Holm wrote:
> So one @coroutine can't call another using yield-from. Why shouldn't it
> be possible? All we need is a way to avoid the first next() call and
> substitute some other value.
> 
> Here is a silly example of two coroutines calling each other using one
> of the syntax-based ideas I have for handling this. (I don't care about
> the actual syntax, just about the ability to do this)
> 
> @coroutine
> def avg2():
>    a = yield
>    b = yield
>    return (a+b)/2
> 
> @coroutine
> def avg_diff():
>    a = yield from avg2() start None  # "start EXPR" means use EXPR for
> first value to yield instead of next()
>    b = yield from avg2() start a     # "start EXPR" means use EXPR for
> first value to yield instead of next()
>    yield b
>    return a-b

You can fix this without syntax by changing the way avg2 is written.

@coroutine
def avg2(start=None):
   a = yield start
   b = yield
   return (a+b)/2

@coroutine
def avg_diff(start=None):
   a = yield from avg2(start)
   b = yield from avg2(a)
   yield b
   return a-b

a = avg2()
a.send(41)
a.send(43)   # raises StopIteration(42)

d = avg_diff()
d.send(1.0)
d.send(2.0)  # returns from first yield-from, yields 1.5 as part of starting second yield-from
d.send(3.0)
d.send(4.0)  # returns from second yield-from. yields 3.5
d.next()     # returns from avg_diff. raises StopIteration(-2.0)

So it just becomes a new rule of thumb for coroutines: a yield-from
friendly coroutine will accept a "start" argument that defaults to None
and is returned from the first yield call.

And just like threading.Thread, it will leave the idea of defining the
coroutine and starting the coroutine as separate activities.
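A condensed, runnable form of this "start argument" pattern (it requires an interpreter that implements the proposed yield-from with generator return values, so it could not run on the Python of the time):

```python
def coroutine(func):
    # Wrapper that creates the generator and advances it to its first yield.
    def start(*args, **kwargs):
        cr = func(*args, **kwargs)
        next(cr)              # prime the coroutine
        return cr
    return start

def avg2(start=None):
    a = yield start           # first yield produces the caller-chosen value
    b = yield
    return (a + b) / 2

@coroutine
def avg_diff(start=None):
    a = yield from avg2(start)   # no spurious None in the yield stream
    b = yield from avg2(a)
    yield b
    return a - b
```

With this, `d.send(2.0)` yields 1.5 (the first average) while starting the second yield-from, exactly as described in the comments above.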

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From greg.ewing at canterbury.ac.nz  Wed Apr  8 00:38:02 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 08 Apr 2009 10:38:02 +1200
Subject: [Python-ideas] why not "name = value if condition"?
In-Reply-To: <20090407150341.6a097ac1@o>
References: <20090407150341.6a097ac1@o>
Message-ID: <49DBD5CA.5060601@canterbury.ac.nz>

spir wrote:
> Hello,
> 
> What's the reason why
>    name = value if condition
> is invalid?
> 
> Sure, one can write
>    if condition:
>       name = value

The 'a if b else c' construct is an expression. The
transformation you want can't be implemented by treating
it as an expression, since you want the 'if' to apply
to the assignment, not just the RHS.

 > but the rationale in favour of, or against, a one-liner shortcut
 > is the same as for the ternary case (with else).

No, it's not. The ternary case can be used as an
expression anywhere, often saving an assignment to
an intermediate variable. And when used with an
assignment, it avoids repeating the LHS, i.e.

   a = b if c else d

as opposed to

   if b:
     a = c
   else:
     a = d

Your proposed shortcut doesn't save anything except
a small amount of whitespace, so the justification
is much weaker.

-- 
Greg


From tleeuwenburg at gmail.com  Wed Apr  8 00:36:53 2009
From: tleeuwenburg at gmail.com (Tennessee Leeuwenburg)
Date: Wed, 8 Apr 2009 08:36:53 +1000
Subject: [Python-ideas] why not "name = value if condition"?
In-Reply-To: <20090407165510.12b1c82a@o>
References: <20090407150341.6a097ac1@o>
	<200904072338.22025.steve@pearwood.info> <20090407165510.12b1c82a@o>
Message-ID: <43c8685c0904071536k5ea90861mfb8244cb4df593cf@mail.gmail.com>

As an aside only, it would be pretty reasonable IMO to have

name = value if condition

and set name to None otherwise

But I'm -1 on the idea as I think it's a bit redundant.

Cheers,
-T

On Wed, Apr 8, 2009 at 12:55 AM, spir <denis.spir at free.fr> wrote:

> Le Tue, 7 Apr 2009 23:38:21 +1000,
> Steven D'Aprano <steve at pearwood.info> s'exprima ainsi:
>
> > > What's the reason why
> > >    name = value if condition
> > > is invalid? Meaning: there _must_ be an else clause.
>
> > Because "value if condition else other" is an expression, not a
> > statement, and thus *must* have a value.
> >
> >     name = value if condition
> >
> > only has a value sometimes, and so is invalid [...]
>
> Yes, thank you! That's the explanation I was looking for.
> I meant to use the above formulation for updating already defined
> variables. Indeed, Python does not make the difference I have in mind ;-).
>
> As a side note,
>   if condition: name = value
> (where 'name' is already registered) is the only case when I do not indent
> a block, for the reason that it means for me the same as the above invalid
> one-liner.
>
> Denis
> ------
> la vita e estrany
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
--------------------------------------------------
Tennessee Leeuwenburg
http://myownhat.blogspot.com/
"Don't believe everything you think"

From lists at cheimes.de  Wed Apr  8 03:12:20 2009
From: lists at cheimes.de (Christian Heimes)
Date: Wed, 08 Apr 2009 03:12:20 +0200
Subject: [Python-ideas] Add setpriority / getpriority to os module.
Message-ID: <grgtlk$pdt$1@ger.gmane.org>

Hello,

I would like to add straightforward wrappers for the setpriority and
getpriority functions to posixmodule.c for the os module. The functions
allow retrieving and modifying the niceness of a process. They are
particularly useful for multiprocessing, to implement low-priority
processes. The functionality is well known from the 'nice' and 'renice'
commands.

The two functions and the constants PRIO_PROCESS, PRIO_PGRP, PRIO_USER,
PRIO_MIN and PRIO_MAX should be available on most POSIX operating
systems. A patch would come with autoconf tests for the feature.

A while ago somebody suggested a more high-level wrapper for scheduling
niceness that abstracts away the niceness level. Windows (through
pywin32) has a different set of values. Such a wrapper is out of scope
for my proposal. Once the os module has the POSIX wrappers it's going to
be easy to create a pure Python abstraction.
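Usage of such wrappers might look like this (illustrative; `os.getpriority`/`os.setpriority` and the PRIO_* constants mirror getpriority(2)/setpriority(2) and would only exist on POSIX systems):

```python
import os

# Read the niceness of the current process; for PRIO_PROCESS, an id of 0
# means "this process", mirroring getpriority(2).
current = os.getpriority(os.PRIO_PROCESS, 0)

# Lower our own priority, like "renice +5". An unprivileged process may
# only raise its niceness, so clamp at the POSIX maximum of 19.
os.setpriority(os.PRIO_PROCESS, 0, min(current + 5, 19))
```

The same PRIO_PGRP and PRIO_USER constants would address a whole process group or all processes of a user, respectively.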

I also have some code lying around to modify the I/O priority and CPU
affinity on Linux. However, that code is more complex and won't work on
other Unix operating systems. If somebody can implement the feature for
*BSD, I'm willing to give it a try.

Christian



From jh at improva.dk  Wed Apr  8 03:59:19 2009
From: jh at improva.dk (Jacob Holm)
Date: Wed, 08 Apr 2009 03:59:19 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DBCCC6.1080601@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>	<49D8C642.3080308@improva.dk>	<49D940F6.60107@canterbury.ac.nz>	<49DA0687.9010100@improva.dk>	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
Message-ID: <49DC04F7.5080707@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> So one @coroutine can't call another using yield-from. Why shouldn't it
>> be possible? All we need is a way to avoid the first next() call and
>> substitute some other value.
>>
>> [snip code]
> You can fix this without syntax by changing the way avg2 is written.
>
> [snip code]
>
> So it just becomes a new rule of thumb for coroutines: a yield-from
> friendly coroutine will accept a "start" argument that defaults to None
> and is returned from the first yield call.
>   

That is not quite enough.  Your suggested rule of thumb is to replace

def example(*args, **kw):
    ...
    x = yield   # first yield
 


with

def example(start=None, *args, **kw):
    ...
    x = yield start  # first yield


That would have been correct if my statement about what was needed 
hadn't been missing a piece.  What you actually need to replace it with 
is something like:

def example(start, *args, **kw):
    ...
    if 'throw' in start:
        raise start['throw'] # simulate a throw() on first next()
    elif 'send' in start:
        x = start['send']    # simulate a send() on first next()
    else:
        x = yield start.get('yield')  # use specified value for first next()


This allows you to set up so the first next() call skips the yield and 
acts like a send() or a throw() was called.  Actually, I think that can 
be refactored to:

def cr_init(start):
    if 'throw' in start:
        raise start['throw']
    if 'send' in start:
        return start['send']
    return (yield start.get('yield'))

def example(start, *args, **kw):
    ...
    x = yield from cr_init(start)


Which makes it almost bearable.

It is also possible to write a @coroutine decorator that can be used 
with this; the trick is to make the undecorated function available as an 
attribute of the wrapper so it can be used in yield-from.  The 
wrapper can also hide the existence of the start argument from top-level 
users.

def coroutine(func):
    def start(*args, **kwargs):
        cr = func({}, *args, **kwargs)
        v = cr.next()
        if v is not None:
            raise RuntimeError('first yield from coroutine was not None')
        return cr
    start.raw = func
    return start


Using such a coroutine in yield-from then becomes:

# Yield None as first value
yield from example.raw({}, *args, **kwargs)

# Yield 42 as first value
yield from example.raw({'yield':42}, *args, **kwargs)

# Skip the first yield and treat the first next() as a send(42)
yield from example.raw({'send':42},  *args, **kwargs)

# Skip the first yield and treat the first next() as a throw(ValueError(42))
yield from example.raw({'throw':ValueError(42)}, *args, **kwargs)


While using it in other contexts is exactly like people are used to.

So it turns out a couple of support routines and a simple convention can 
work around most of the problems with using @coroutines in yield-from.   
I still think it would be nice if yield-from didn't insist on treating 
its iterator as if it was new.


- Jacob



From ncoghlan at gmail.com  Wed Apr  8 13:34:40 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 08 Apr 2009 21:34:40 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DC04F7.5080707@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>	<49D8C642.3080308@improva.dk>	<49D940F6.60107@canterbury.ac.nz>	<49DA0687.9010100@improva.dk>	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk>
Message-ID: <49DC8BD0.4080303@gmail.com>

Jacob Holm wrote:
> That would have been correct if my statement about what was needed
> hadn't been missing a piece.  What you actually need to replace it with
> is something like:
> 
> def example(start, *args, **kw):
>    ...
>    if 'throw' in start:
>        raise start['throw'] # simulate a throw() on first next()
>    elif 'send' in start:
>        x = start['send']    # simulate a send() on first next()
>    else:
>        x = yield start.get('yield')  # use specified value for first next()

This elaboration strikes me as completely unnecessary, since next() or
the equivalent send(None) are the only legitimate ways to start a generator:

>>> def gen():
...   print "Generator started"
...   yield
...
>>> g = gen()
>>> g.next()
Generator started

A generator that receives a throw() first thing never executes at all:

>>> g = gen()
>>> g.throw(AssertionError)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 1, in gen
AssertionError

Similarly, sending a non-None value first thing triggers an exception
since there is nowhere for the value to go:

>>> g = gen()
>>> g.send(42)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: can't send non-None value to a just-started generator

For the PEP, I think the solution to this issue is a couple of conventions:

1. Either don't implicitly call next() when creating coroutines, or else
make that behaviour easy to bypass. This is a bad idea for the same
reason that implicitly calling start() on threads is a bad idea:
sometimes the user will want to separate definition from activation, and
it is a pain when the original author makes that darn near impossible in
order to save one line in the normal case.

2. Coroutines intended for use with yield-from should take a "start"
argument that is used for the value of their first yield. This allows
coroutines to be nested without introducing spurious "None" values into
the yield stream.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Wed Apr  8 14:39:04 2009
From: jh at improva.dk (Jacob Holm)
Date: Wed, 08 Apr 2009 14:39:04 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DC8BD0.4080303@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>	<49D8C642.3080308@improva.dk>	<49D940F6.60107@canterbury.ac.nz>	<49DA0687.9010100@improva.dk>	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
Message-ID: <49DC9AE8.8020108@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> That would have been correct if my statement about what was needed
>> hadn't been missing a piece.  What you actually need to replace it with
>> is something like:
>>
>> def example(start, *args, **kw):
>>    ...
>>    if 'throw' in start:
>>        raise start['throw'] # simulate a throw() on first next()
>>    elif 'send' in start:
>>        x = start['send']    # simulate a send() on first next()
>>    else:
>>        x = yield start.get('yield')  # use specified value for first next()
>>     
>
> This elaboration strikes me as completely unecessary, since next() or
> the equivalent send(None) are the only legitimate ways to start a generator:
>
>   
Correct, but missing the point.  Maybe I explained the "throw" and 
"send" parts badly.  The point is that the following two examples have 
the same effect:

g = example({'send':42})
g.next()

g = example({})
g.next()
g.send(42)


Same effect, meaning the same last value returned and the same internal 
state.  You can't achieve this just by providing a value to yield on the 
first next().  The modified parser example I sent to Greg shows there is 
a use case for it (although it is written using one of the syntax-based 
ideas).
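
To make this concrete, here is a self-contained Python 3 sketch of the 
equivalence (the echo loop after the start handling is my own stand-in 
body, not from the earlier mail):

```python
def example(start):
    # Start-dict pattern from the quoted code; the echo loop below is a
    # stand-in body so the equivalence of the two paths can be observed.
    if 'throw' in start:
        raise start['throw']          # simulate a throw() on first next()
    elif 'send' in start:
        x = start['send']             # simulate a send() on first next()
    else:
        x = yield start.get('yield')  # value yielded by the first next()
    while True:
        x = yield ('got', x)

# Path 1: the dict makes the first next() behave like send(42).
g1 = example({'send': 42})
r1 = next(g1)

# Path 2: a plain first next() followed by an explicit send(42).
g2 = example({})
next(g2)
r2 = g2.send(42)

print(r1 == r2)  # True: same last value and same internal state
```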

> For the PEP, I think the solution to this issue is a couple of conventions:
>
> 1. Either don't implicitly call next() when creating coroutines or else
> make that behaviour easy to bypass). This is a bad idea for the same
> reason that implicitly calling start() on threads is a bad idea:
> sometimes the user will want to separate definition from activation, and
> it is a pain when the original author makes that darn near impossible in
> order to save one line in the normal case.
>   

I don't agree that it is a bad idea to call next automatically.  I can 
see that it is necessary to keep a version around that doesn't do it, 
but that is because of limitations in yield-from.

> 2. Coroutines intended for use with yield-from should take a "start"
> argument that is used for the value of their first yield. This allows
> coroutines to be nested without introducing spurious "None" values into
> the yield stream.
>   

For the coroutine writer, it is just as easy to write:

x = yield start


As it is to write:

x = yield from cr_init(start)


The difference is that the second version is much more useful in yield-from.
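
For concreteness, a minimal sketch of what a cr_init helper along these 
lines could look like (only the name comes from the thread; this exact 
definition is my assumption, runnable on Python 3.3+):

```python
def cr_init(start):
    # Yield the caller-chosen first value; whatever is then sent in
    # becomes the result of 'yield from cr_init(...)'.
    return (yield start)

def averager(start=None):
    # A coroutine using the pattern: its first yield goes through
    # cr_init, so a delegating generator can substitute the first value.
    total = count = 0
    x = yield from cr_init(start)
    while True:
        total += x
        count += 1
        x = yield total / count

g = averager()
print(next(g))      # first yield: the 'start' value (None here)
print(g.send(10))   # 10.0
print(g.send(20))   # 15.0
```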

Even assuming every relevant object implemented the pattern I suggest, 
it is still not possible to use yield-from to write something like 
itertools.dropwhile and have it delegate all send and throw calls 
correctly.  To make that possible, you need exactly the same thing that 
you need for pre-started coroutines: The ability to replace the next() 
call made by the yield-from expression with something else.  Give me 
that, and you will also have removed the need for a special pattern for 
coroutines that should be usable with yield-from.

Still-hoping-to-avoid-the-need-for-a-special-pattern-ly yours
- Jacob


From jh at improva.dk  Wed Apr  8 17:04:34 2009
From: jh at improva.dk (Jacob Holm)
Date: Wed, 08 Apr 2009 17:04:34 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DC9AE8.8020108@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>	<49D8C642.3080308@improva.dk>	<49D940F6.60107@canterbury.ac.nz>	<49DA0687.9010100@improva.dk>	<49DA8423.9070101@canterbury.ac.nz>	<49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com>	<49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk>
Message-ID: <49DCBD02.8030806@improva.dk>

Jacob Holm wrote:
> Even assuming every relevant object implemented the pattern I suggest, 
> it is still not possible to use yield-from to write something like 
> itertools.dropwhile and have it delegate all send and throw calls 
> correctly. To make that possible, you need exactly the same thing that 
> you need for pre-started coroutines: The ability to replace the next() 
> call made by the yield-from expression with something else. Give me 
> that, and you will also have removed the need for a special pattern 
> for coroutines that should be usable with yield-from.

To be clear, I think the best way of handling this is to add a read-only 
property to generator objects holding the latest value yielded, and let 
yield-from use that when present instead of calling next(). (This is not 
a new idea, I am just explaining the consequences as I see them). The 
property can be cleared when the frame is released, so there should be 
no issues with that.

With that property, the dropwhile example becomes trivial:

def dropwhile(predicate, iterable):
    it = iter(iterable)
    v = next(it)
    while predicate(v):
        v = next(it)
    # With the proposed property, the yield-from starts by yielding the
    # last value checked (v) instead of calling next().  Note a yield
    # expression in a return statement needs parentheses.
    return (yield from it)


More interesting (to me) is that the following helpers allow you to call 
a pre-started generator using yield-from in the 3 special ways I 
mentioned *without* needing the generator constructor to take any magic 
arguments.

def first_yielding(value, iterable):
    it = iter(iterable)
    try:
        s = yield value
    except GeneratorExit:
        it.close()
        raise       # finalize the iterator, then let GeneratorExit propagate
    except BaseException as e:
        it.throw(e) # sets the property so yield-from will use that first
    else:
        it.send(s)  # sets the property so yield-from will use that first
    return (yield from it)

def first_sending(value, iterable):
    it = iter(iterable)
    it.send(value)  # sets the property so yield-from will use that first
    return (yield from it)

def first_throwing(exc, iterable):
    it = iter(iterable)
    it.throw(exc)   # sets the property so yield-from will use that first
    return (yield from it)


# Yield None (first value yielded by a @coroutine) as first value
yield from example(*args, **kwargs)

# Yield 42 as first value
yield from first_yielding(42, example(*args, **kwargs))

# Treat the first next() as a send(42)
yield from first_sending(42, example(*args, **kwargs))

# Treat the first next() as a throw(ValueError(42))
yield from first_throwing(ValueError(42), example(*args, **kwargs))


So no new syntax needed, and coroutines are easily callable without the 
constructor needing extra magic arguments. Also, I am sure the property 
has other unrelated uses. What's not to like?


- Jacob



From python at rcn.com  Wed Apr  8 19:24:11 2009
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 8 Apr 2009 10:24:11 -0700
Subject: [Python-ideas] Custom format() proposal redux
Message-ID: <59A8C10011314B0480169C2568685CFA@RaymondLaptop1>

The original proposal was well-received but it didn't make provisions to handle str.format().
Here is the revised proposal.  Only the last paragraph is new.


Raymond

-------------------------------------------------------------------


Mark Dickinson's decimal test code suggested a good, extensible approach to the problem.  Here's the idea in a nutshell:

  format(value, format_spec='', conventions=None)
     'calls value.__format__(format_spec, conventions)'

Where conventions is an optional dictionary of formatting control values.  Any value object can accept custom controls, but the 
names for the standard ones would be taken from those defined by localeconv():

  {
   'decimal_point': '.',
   'grouping': [3, 0],
   'negative_sign': '-',
   'positive_sign': '',
   'thousands_sep': ','}

This would let you store several locales via localeconv() and use them at will, thus solving the global-variable and threading 
problems with the locale module:

     import locale
     loc = locale.getlocale() # get current locale
     locale.setlocale(locale.LC_ALL, 'de_DE')
     DE = locale.localeconv()
     locale.setlocale(locale.LC_ALL, 'en_US')
     US = locale.localeconv()
     locale.setlocale(locale.LC_ALL, loc) # restore saved locale

     . . .

     format(x, '8,.f', DE)
     format(y, '8,d', US)

It also lets you write your own conventions on the fly:

     DEBUG = dict(thousands_sep='_')       # style for debugging
     EXTERN = dict(thousands_sep=',')      # style for external display

     . . .

     format(x, '8.1f', DEBUG)
     format(y, '8d', EXTERN)

The dictionaries can be registered for use with the mini-formatting language:

    locale.setlocale(locale.LC_ALL, 'en_US')
    str.format.register(US=locale.localeconv())
    str.format.register(HY=dict(thousands_sep='-'))

     . . .

    'Nigerian President will forward {0:,d!US} to your account'.format(10000000)
    format(y, ',d!HY')
    format(z, ',d!US')


From guido at python.org  Wed Apr  8 20:21:21 2009
From: guido at python.org (Guido van Rossum)
Date: Wed, 8 Apr 2009 11:21:21 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49DCBD02.8030806@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49D940F6.60107@canterbury.ac.nz> <49DA0687.9010100@improva.dk> 
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk> 
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk> 
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk> 
	<49DCBD02.8030806@improva.dk>
Message-ID: <ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>

On Wed, Apr 8, 2009 at 8:04 AM, Jacob Holm <jh at improva.dk> wrote:
> Jacob Holm wrote:
>> Even assuming every relevant object implemented the pattern I suggest, it
>> is still not possible to use yield-from to write something like
>> itertools.dropwhile and have it delegate all send and throw calls correctly.
>> To make that possible, you need exactly the same thing that you need for
>> pre-started coroutines: The ability to replace the next() call made by the
>> yield-from expression with something else. Give me that, and you will also
>> have removed the need for a special pattern for coroutines that should be
>> usable with yield-from.
>
> To be clear, I think the best way of handling this is to add a read-only
> property to generator objects holding the latest value yielded, and let
> yield-from use that when present instead of calling next(). (This is not a
> new idea, I am just explaining the consequences as I see them). The property
> can be cleared when the frame is released, so there should be no issues with
> that.

Let me just respond with the recommendation that you stop pushing for
features that require storing state on the generator object. Quite
apart from implementation issues (which may well be non-existent) I
think it's a really scary thing to add any "state" to a generator that
isn't contained naturally in its stack frame.

If this means that you can't use yield-from for some of your use
cases, well, so be it. It has plenty of other use cases.

And no, I am not prepared to defend this recommendation. But I feel
very strongly about it. So don't challenge me -- it's just going to be
a waste of everyone's time to continue this line of thought.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From aahz at pythoncraft.com  Wed Apr  8 21:40:56 2009
From: aahz at pythoncraft.com (Aahz)
Date: Wed, 8 Apr 2009 12:40:56 -0700
Subject: [Python-ideas] Draft PEP (version 0.4):
	Standard	daemon?process?library
In-Reply-To: <loom.20090129T220650-495@post.gmane.org>
References: <87wscj11fl.fsf@benfinney.id.au> <87iqnywsx2.fsf@benfinney.id.au>
	<loom.20090129T161440-214@post.gmane.org>
	<87wscdvmyr.fsf@benfinney.id.au>
	<loom.20090129T220650-495@post.gmane.org>
Message-ID: <20090408194055.GA21266@panix.com>

[responding very late]

On Thu, Jan 29, 2009, Antoine Pitrou wrote:
> Ben Finney <ben+python at ...> writes:
>> 
>> What would be appropriate behaviour in the case of a stale PID file?
>> Abort the daemonisation attempt? Delete the stale lock file silently
>> and continue as though it didn't exist?
> 
> Delete the stale lock file silently and continue as though it didn't exist.
> (and, of course, create another one for the current process)

This should be determined by the API -- for example, a process running
against an NFS disk that may be run from any of several servers but
should only run on one server at any time ought to abort if there's a
stale PID file.
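
To make the policy question concrete, a hedged POSIX-only sketch of the
local-host staleness check such an API might expose (the function name is
mine, not from PEP 3143):

```python
import errno
import os

def pid_is_stale(pid):
    """Best-effort check whether a recorded PID still names a live process.

    Only meaningful on the host that wrote the PID file: on a shared NFS
    disk the PID may belong to another server, in which case this check
    says nothing and aborting is the safe policy.
    """
    try:
        os.kill(pid, 0)  # signal 0 probes existence without delivering anything
    except OSError as e:
        if e.errno == errno.ESRCH:   # no such process: the file is stale
            return True
        if e.errno == errno.EPERM:   # process exists but is not ours
            return False
        raise
    return False

print(pid_is_stale(os.getpid()))  # False: our own process is alive
```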
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"...string iteration isn't about treating strings as sequences of strings, 
it's about treating strings as sequences of characters.  The fact that
characters are also strings is the reason we have problems, but characters 
are strings for other good reasons."  --Aahz


From ben+python at benfinney.id.au  Thu Apr  9 00:22:37 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Thu, 09 Apr 2009 08:22:37 +1000
Subject: [Python-ideas] Draft PEP (version 0.4):
	Standard	daemon?process?library
References: <87wscj11fl.fsf@benfinney.id.au> <87iqnywsx2.fsf@benfinney.id.au>
	<loom.20090129T161440-214@post.gmane.org>
	<87wscdvmyr.fsf@benfinney.id.au>
	<loom.20090129T220650-495@post.gmane.org>
	<20090408194055.GA21266@panix.com>
Message-ID: <878wma945u.fsf@benfinney.id.au>

Aahz <aahz at pythoncraft.com> writes:

> [responding very late]
> 
> On Thu, Jan 29, 2009, Antoine Pitrou wrote:
> > Delete the stale lock file silently and continue as though it
> > didn't exist. (and, of course, create another one for the current
> > process)
> 
> This should be determined by the API -- for example, a process
> running against an NFS disk that may be run from any of several
> servers but should only run on one server at any time ought to abort
> if there's a stale PID file.

This and several other issues make PID file handling quite a thorny
subject, and I'm working with Skip Montanaro on an implementation
separated from the daemon PEP 3143.

The current PEP 3143 delegates all these decisions (by not mentioning
them at all) to the context manager '__enter__' and '__exit__' methods of
an optional 'pidfile' parameter.

-- 
 \               "There's no excuse to be bored. Sad, yes. Angry, yes. |
  `\    Depressed, yes. Crazy, yes. But there's no excuse for boredom, |
_o__)                                        ever." --Viggo Mortensen |
Ben Finney



From eric at trueblade.com  Thu Apr  9 01:15:47 2009
From: eric at trueblade.com (Eric Smith)
Date: Wed, 08 Apr 2009 19:15:47 -0400
Subject: [Python-ideas] Custom format() proposal redux
In-Reply-To: <59A8C10011314B0480169C2568685CFA@RaymondLaptop1>
References: <59A8C10011314B0480169C2568685CFA@RaymondLaptop1>
Message-ID: <49DD3023.2040605@trueblade.com>

My general thought is that I like the mechanism Mark uses to get the 
parameters into the __format__ function (the conventions dict). I'm just 
not sure where it needs to be specified in order to get the data into 
__format__.

>     format(x, '8.1f', DEBUG)
>     format(y, '8d', EXTERN)
> 
> The dictionaries can be registered for use with the mini-formatting 
> language:
> 
>    locale.setlocale(locale.LC_ALL, 'en_US')
>    str.format.register(US=locale.localeconv())
>    str.format.register(HY=dict(thousands_sep='-'))

I'm not sure you want to use the word "register", as we might want to 
register other things in the future. Maybe "register_convention"? I 
realize it's a little long.

Also, I don't like the **kwargs functionality here; why not specify it 
as two explicit parameters?
str.format.register_convention('HY', dict(thousands_sep='-'))

>    'Nigerian President will forward {0:,d!US} to your 
> account'.format(10000000)
>    format(y, ',d!HY')
>    format(z, ',d!US')

What happens if both a "conventions" parameter and a 
"!<registered-convention>" specifier are present? An error? Does one 
win? Are they merged?

What happens if you specify a convention that's not registered?

I'm not sure I like using the "!<registered-convention>" syntax. I think 
it means you couldn't have both a "!s" or "!r" and a "!<convention>" in 
the same format string, unless we allow "!s!HY". OTOH, "!s" and "!r" 
always yield strings, and I can't see how strings would need a 
convention, since they don't really do much formatting. But I haven't 
given that part much thought, and I don't have a counter-proposal.

Using "!" also means we should probably require that conventions not be 
named "s" or "r", and we might want to reserve all single character 
lower case strings.
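
To make the discussion concrete, a plain-Python sketch of the registry
semantics in question; register_convention, format_with, and the '!NAME'
suffix are hypothetical names from this thread, not any real API:

```python
# Module-level registry mapping convention names to conventions dicts.
_conventions = {}

def register_convention(name, conv):
    # Two explicit parameters, per the suggestion above.
    _conventions[name] = conv

def format_with(value, spec):
    # Split a trailing '!NAME' off the format spec, if present,
    # then apply the registered thousands separator afterwards.
    spec, sep, name = spec.partition('!')
    conv = _conventions[name] if sep else {}
    text = format(value, spec)
    ts = conv.get('thousands_sep')
    if ts is not None:
        text = text.replace(',', ts)
    return text

register_convention('HY', dict(thousands_sep='-'))
print(format_with(10000000, ',d!HY'))  # 10-000-000
```

An unregistered name raises KeyError here; whether that should instead be
a ValueError (or fall back to defaults) is exactly the open question above.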

A larger concern is libraries. If I'm a library author, there's no way I 
can know what conventions the application has registered. And even if I 
could inspect them, I'm not sure what I'd do with the knowledge. I 
either would have to register my own private conventions (like 
"libraryname.HY"), or we'd need to agree on conventions we're expecting 
to be available and what they mean. What are your thoughts on what a 
library author should do?

Eric.


From greg.ewing at canterbury.ac.nz  Thu Apr  9 01:34:34 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 09 Apr 2009 11:34:34 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DC9AE8.8020108@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
	<49D8C642.3080308@improva.dk> <49D940F6.60107@canterbury.ac.nz>
	<49DA0687.9010100@improva.dk> <49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk>
Message-ID: <49DD348A.4010409@canterbury.ac.nz>

Jacob Holm wrote:

> You can't do this by just providing a value to yield on the first 
> next().

I don't see how providing an initial yield value directly
to the yield-from expression can give you any greater
functionality, though. Can you post a complete example
of that so I don't have to paste code from several messages
together?

> The modified parser example I sent to Greg shows that there is 
> a use case for it

FWIW, that example doesn't fit David Beazley's definition of
a coroutine, since it uses yields to both send and receive
values. He doesn't think that's a sane thing to do.

> I don't agree that it is a bad idea to call next automatically.  I can 
> see that it is necessary to keep a version around that doesn't do it, 
> but that is because of limitations in yield-from.

An alternative viewpoint would be that the idea that a coroutine
should always start itself automatically is too simplistic.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Thu Apr  9 01:45:51 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 09 Apr 2009 11:45:51 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DCBD02.8030806@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D60B81.6060209@gmail.com>
	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>
	<49D64BA7.2000006@improva.dk>
	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>
	<49D69AFA.5070600@improva.dk>
	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>
	<49D8C642.3080308@improva.dk> <49D940F6.60107@canterbury.ac.nz>
	<49DA0687.9010100@improva.dk> <49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
Message-ID: <49DD372F.6020803@canterbury.ac.nz>

Jacob Holm wrote:
> I think the best way of handling this is to add a read-only 
> property to generator objects holding the latest value yielded... The 
> property can be cleared when the frame is released, so there should be 
> no issues with that.

It will still keep the value alive longer than it would
be otherwise. Some people might take issue with that.

> def dropwhile(predicate, iterable):
>    it = iter(iterable)
>    v = next(it)
>    while predicate(v):
>        v = next(it)
>    return yield from it  # Starts by yielding the last value checked, 
> which is v.

In my view this constitutes a shared iterator, and is
therefore outside the scope of yield-from.

I also don't think this generalizes well enough to be
worth going out of our way to support. It only works
because you're making a "tail call" to the iterator,
which is a rather special case. Most itertools-style
functions won't have that property.

> What's not to like?

The fact that yield-from and/or generator behaviour
is being complexified to support things that are
outside the scope of my proposal.

This is why I want to keep focused on refactoring,
to prevent this kind of feature creep.

-- 
Greg




From jh at improva.dk  Thu Apr  9 04:52:42 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 09 Apr 2009 04:52:42 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49D940F6.60107@canterbury.ac.nz> <49DA0687.9010100@improva.dk>
	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
Message-ID: <49DD62FA.9080504@improva.dk>

Guido van Rossum wrote:
> On Wed, Apr 8, 2009 at 8:04 AM, Jacob Holm <jh at improva.dk> wrote:
>   
>> Jacob Holm wrote:
>>     
>>> Even assuming every relevant object implemented the pattern I suggest, it
>>> is still not possible to use yield-from to write something like
>>> itertools.dropwhile and have it delegate all send and throw calls correctly.
>>> To make that possible, you need exactly the same thing that you need for
>>> pre-started coroutines: The ability to replace the next() call made by the
>>> yield-from expression with something else. Give me that, and you will also
>>> have removed the need for a special pattern for coroutines that should be
>>> usable with yield-from.
>>>       
>> To be clear, I think the best way of handling this is to add a read-only
>> property to generator objects holding the latest value yielded, and let
>> yield-from use that when present instead of calling next(). (This is not a
>> new idea, I am just explaining the consequences as I see them). The property
>> can be cleared when the frame is released, so there should be no issues with
>> that.
>>     
>
> Let me just respond with the recommendation that you stop pushing for
> features that require storing state on the generator object. Quite
> apart from implementation issues (which may well be non-existent) I
> think it's a really scary thing to add any "state" to a generator that
> isn't contained naturally in its stack frame.
>   

Does storing it as part of the frame object count as "naturally in the 
stack frame"?  Because that is probably the most natural place to put 
this, implementation-wise.  If storing on the frame object is also out, 
I will have to start thinking about new syntax again.  Oh well.

I was going to push for saving the final return value from a generator 
somewhere so that each StopIteration raised by an operation on the 
closed generator could have the same value as the StopIteration that 
closed it (or None if it was closed by another exception).  If that 
value can't live on the generator object, I guess that idea is dead.  Am 
I right?

> If this means that you can't use yield-from for some of your use
> cases, well, so be it. It has plenty of other use cases.
>   

That is exactly what I am worried about.  I think the number of use 
cases will be severely limited if we don't have a way to replace the 
initial next() made by yield-from.  This is different from the close() 
issues we debated earlier, where there is a relatively simple 
workaround.  The full workaround for the "initial next()" issue is big, 
ugly, and slow.

Anyway, I still hope the workaround won't be needed.

> And no, I am not prepared to defend this recommendation. But I feel
> very strongly about it. So don't challenge me -- it's just going to be
> a waste of everyone's time to continue this line of thought.
>   

No, I am not going to challenge you on this.   Once I have your answer 
to the questions at the beginning of this mail, I will try to adjust my 
future proposals accordingly.

Best regards
- Jacob


From guido at python.org  Thu Apr  9 05:55:41 2009
From: guido at python.org (Guido van Rossum)
Date: Wed, 8 Apr 2009 20:55:41 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49DD62FA.9080504@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk> 
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk> 
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk> 
	<49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com> 
	<49DD62FA.9080504@improva.dk>
Message-ID: <ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>

On Wed, Apr 8, 2009 at 7:52 PM, Jacob Holm <jh at improva.dk> wrote:
> Guido van Rossum wrote:
>>
>> On Wed, Apr 8, 2009 at 8:04 AM, Jacob Holm <jh at improva.dk> wrote:
>>
>>>
>>> Jacob Holm wrote:
>>> To be clear, I think the best way of handling this is to add a read-only
>>> property to generator objects holding the latest value yielded, and let
>>> yield-from use that when present instead of calling next(). (This is not
>>> a
>>> new idea, I am just explaining the consequences as I see them). The
>>> property
>>> can be cleared when the frame is released, so there should be no issues
>>> with
>>> that.
>>>
>>
>> Let me just respond with the recommendation that you stop pushing for
>> features that require storing state on the generator object. Quite
>> apart from implementation issues (which may well be non-existent) I
>> think it's a really scary thing to add any "state" to a generator that
>> isn't contained naturally in its stack frame.
>>
>
> Does storing it as part of the frame object count as "naturally in the stack
> frame"?  Because that is probably the most natural place to put this,
> implementation-wise.

No, that's out too. My point is that I don't want to add *any* state
beyond what the user thinks of as the "normal" state in the frame
(i.e. local variables, where it is suspended, and the expression stack
and try-except stack). Nothing else.

> If storing on the frame object is also out, I will
> have to start thinking about new syntax again.  Oh well.

Sorry, no go. New syntax is also out.

> I was going to push for saving the final return value from a generator
> somewhere so that each StopIteration raised by an operation on the closed
> generator could have the same value as the StopIteration that closed it (or
> None if it was closed by another exception).  If that value can't live on
> the generator object, I guess that idea is dead.  Am I right?

Right.

>> If this means that you can't use yield-from for some of your use
>> cases, well, so be it. It has plenty of other use cases.
>>
>
> That is exactly what I am worried about.  I think the number of use cases
> will be severely limited if we don't have a way to replace the initial
> next() made by yield-from.

It doesn't matter if there is only one use case, as long as it is a
common one. And we already have that: Greg Ewing's "refactoring".

Remember, Dave Beazley in his coroutine tutorial expresses doubts
about whether it is really that useful to use generators as
coroutines.

You are fighting a losing battle here, and it would be better if we
stopped short of trying to attain perfection, and instead accepted an
imperfect solution that may be sufficient, or may have to be extended
in the future. If your hypothetical use cases really become important
you can write another PEP.

> This is different from the close() issues we
> debated earlier, where there is a relatively simple workaround.  The full
> workaround for the "initial next()" issue is big, ugly, and slow.
>
> Anyway, I still hope the workaround won't be needed.

Not if you give up now.

>> And no, I am not prepared to defend this recommendation. But I feel
>> very strongly about it. So don't challenge me -- it's just going to be
>> a waste of everyone's time to continue this line of thought.
>>
>
> No, I am not going to challenge you on this.  Once I have your answer to
> the questions at the beginning of this mail, I will try to adjust my future
> proposals accordingly.

Great.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From ncoghlan at gmail.com  Thu Apr  9 12:09:52 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 09 Apr 2009 20:09:52 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DD348A.4010409@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49D60B81.6060209@gmail.com>	<fb6fbf560904030948t23ad6b08t77e71017e7c62853@mail.gmail.com>	<49D64BA7.2000006@improva.dk>	<ca471dc20904031421j2c6c8f49kece03fe24ca539ee@mail.gmail.com>	<49D69AFA.5070600@improva.dk>	<ca471dc20904041329p31571e51o44d0ea8bcd74f96f@mail.gmail.com>	<49D8C642.3080308@improva.dk>
	<49D940F6.60107@canterbury.ac.nz>	<49DA0687.9010100@improva.dk>
	<49DA8423.9070101@canterbury.ac.nz>	<49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com>	<49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com>	<49DC9AE8.8020108@improva.dk>
	<49DD348A.4010409@canterbury.ac.nz>
Message-ID: <49DDC970.4030604@gmail.com>

Greg Ewing wrote:
> An alternative viewpoint would be that the idea that a coroutine
> should always start itself automatically is too simplistic.

That's the angle I've been taking. I can see why it can be convenient to
start a coroutine automatically, but only in the same way that a thread
creation function that also starts the thread for you can be convenient.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Thu Apr  9 14:13:33 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 09 Apr 2009 14:13:33 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
Message-ID: <49DDE66D.9080401@improva.dk>

Guido van Rossum wrote:
> On Wed, Apr 8, 2009 at 7:52 PM, Jacob Holm <jh at improva.dk> wrote:
>   
>> Does storing it as part of the frame object count as "naturally in the stack
>> frame"?  Because that is probably the most natural place to put this,
>> implementation-wise.
>>     
>
> No, that's out too. My point is that I don't want to add *any* state
> beyond what the user thinks of as the "normal" state in the frame
> (i.e. local variables, where it is suspended, and the expression stack
> and try-except stack). Nothing else.
>   

That rules out Greg's patch and mine as well.  They both need extra state 
on the frame object to be able to implement yield-from in the first place.


>> If storing on the frame object is also out, I will
>> have to start thinking about new syntax again.  Oh well.
>>     
>
> Sorry, no go. New syntax is also out.
>   

In that case, I give up.  There is no possible way to fix the "initial 
next()" issue of yield-from without one or the other.


>> I was going to push for saving the final return value from a generator
>> somewhere so that each StopIteration raised by an operation on the closed
>> generator could have the same value as the StopIteration that closed it (or
>> None if it was closed by another exception).  If that value can't live on
>> the generator object, I guess that idea is dead.  Am I right?
>>     
>
> Right.
>   

Then let me revisit my earlier statement that when close() catches a 
StopIteration with a non-None value, it should either return it or raise 
an exception.  Since the value is not saved, a second close() will 
neither be able to return it, nor raise a StopIteration with it.  
Therefore I now think that raising a RuntimeError in that case is the 
only right thing to do.
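
To make the discarding concrete, here is a sketch using the generator-return-value semantics the PEP proposes (runnable on a Python that implements them):

```python
def gen():
    try:
        yield
    except GeneratorExit:
        return 42   # a "return value" in response to close()

g = gen()
next(g)
g.close()                  # the 42 is not saved anywhere on the generator
assert g.close() is None   # so a second close() cannot return it
try:
    next(g)
except StopIteration as exc:
    assert exc.value is None   # nor can it re-raise StopIteration with it
```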


>>> If this means that you can't use yield-from for some of your use
>>> cases, well, so be it. It has plenty of other use cases.
>>>
>>>       
>> That is exactly what I am worried about.  I think the number of use cases
>> will be severely limited if we don't have a way to replace the initial
>> next() made by yield-from.
>>     
>
> It doesn't matter if there is only one use case, as long as it is a
> common one. And we already have that: Greg Ewing's "refactoring".
>   

I remain unconvinced that the "initial next()" issue isn't also a 
problem for that use case, but I am not going to argue about this.


> Remember, Dave Beazley in his coroutine tutorial expresses doubts
> about whether it is really that useful to use generators as
> coroutines.
>
> You are fighting a losing battle here, and it would be better if we
> stopped short of trying to attain perfection, and instead accepted an
> imperfect solution that may be sufficient, or may have to be extended
> in the future. If your hypothetical use cases really become important
> you can write another PEP.
>
>   

Looks to me like the battle is not so much losing as already lost, and 
probably was from the start.  I just wish I had understood that earlier.

One final suggestion I have is to make yield-from raise a RuntimeError 
if used on a generator that already has a frame.  That would a) open 
some optimization possibilities, b) make it clear that the only intended 
use is with a *fresh* generator or a non-generator iterable, and c) 
allow us to change our minds about the initial next() later without 
changing the semantics of working code.
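
As a sketch of the check I have in mind (using inspect.getgeneratorstate as a stand-in for the C-level test; the helper name is illustrative):

```python
import inspect

def check_fresh(it):
    """Hypothetical guard: only fresh (not-yet-started) generators,
    or arbitrary non-generator iterables, may be delegated to."""
    if inspect.isgenerator(it) and inspect.getgeneratorstate(it) != 'GEN_CREATED':
        raise RuntimeError("yield-from requires a fresh generator")
    return it

def g():
    yield 1

fresh = g()
assert check_fresh(fresh) is fresh           # fresh generator: allowed
assert check_fresh([1, 2, 3]) == [1, 2, 3]   # non-generator iterable: allowed

started = g()
next(started)                # prime it ...
try:
    check_fresh(started)     # ... and now it is rejected
except RuntimeError:
    pass
```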

>> This is different from the close() issues we
>> debated earlier, where there is a relatively simple workaround.  The full
>> workaround for the "initial next()" issue is big, ugly, and slow.
>>
>> Anyway, I still hope the workaround won't be needed.
>>     
>
> Not if you give up now.
>   


Well, since I am giving up on fixing the "initial next()" issue in the 
core, the workaround *will* be needed.  Maybe not in the PEP, but 
certainly as a recipe somewhere.  If you don't mind, I will continue the 
discussion about the details of such a workaround that Nick started a 
couple of mails back in this thread.


>>> And no, I am not prepared to defend this recommendation. But I feel
>>> very strongly about it. So don't challenge me -- it's just going to be
>>> a waste of everyone's time to continue this line of thought.
>>>
>>>       
>> No, I am not going to challenge you on this.   Once I have your answer to
>> the questions at the beginning of this mail, I will try to adjust my future
>> proposals accordingly.
>>     
>
> Great.
>   

Frustrated-ly yours
- Jacob


From ncoghlan at gmail.com  Thu Apr  9 15:02:49 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 09 Apr 2009 23:02:49 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DDE66D.9080401@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49DA8423.9070101@canterbury.ac.nz>	<49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com>	<49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com>	<49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>	<49DD62FA.9080504@improva.dk>	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk>
Message-ID: <49DDF1F9.8060904@gmail.com>

Jacob Holm wrote:
> Then let me revisit my earlier statement that when close() catches a
> StopIteration with a non-None value, it should either return it or raise
> an exception.  Since the value is not saved, a second close() will
> neither be able to return it, nor raise a StopIteration with it. 
> Therefore I now think that raising a RuntimeError in that case is the
> only right thing to do.

Remember, close() is designed to be about finalization. So long as the
generator indicates that it has finished (i.e. by reraising
GeneratorExit or raising StopIteration with or without a value), the
method has done its job. Raising a RuntimeError for a successfully
closed generator doesn't make any sense.

So if someone wants the return value, they'll need to either use next(),
send() or throw() and catch the StopIteration themselves, or else use
'yield from'.

That said, creating your own stateful wrapper that preserves the last
yield value and the final return value of a generator iterator is also
perfectly possible:

  class CaptureGen(object):
    """Capture and preserve the last yielded value and the
       final return value of a generator iterator instance"""
    NOT_SET = object()

    def __init__(self, geniter):
      self.geniter = geniter
      self._last_yield = self.NOT_SET
      self._return_value = self.NOT_SET

    @property
    def last_yield(self):
      if self._last_yield is self.NOT_SET:
        raise RuntimeError("Generator has not yielded")
      return self._last_yield

    @property
    def return_value(self):
      if self._return_value is self.NOT_SET:
        raise RuntimeError("Generator has not returned")
      return self._return_value

    def _delegate(self, meth, *args):
      try:
        val = meth(*args)
      except StopIteration, ex:
        if self._return_value is self.NOT_SET:
          self._return_value = ex.value
          raise
        raise StopIteration(self._return_value)
      self._last_yield = val
      return val

    def __next__(self):
      return self._delegate(self.geniter.next)
    next = __next__

    def send(self, val):
      return self._delegate(self.geniter.send, val)

    def throw(self, et, ev=None, tb=None):
      return self._delegate(self.geniter.throw, et, ev, tb)

    def close(self):
      self.geniter.close()
      if self._return_value is self.NOT_SET:
        # closed without a return value (e.g. via GeneratorExit)
        return None
      return self._return_value

Something like that may actually turn out to be useful as the basis for
an enhanced coroutine decorator, similar to the way one uses
contextlib.contextmanager to turn a generator object into a context
manager. The PEP is quite usable for refactoring without it though.
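
For reference, the priming decorator I'm alluding to is essentially the recipe from Dave Beazley's coroutine tutorial (illustrative example below):

```python
import functools

def coroutine(func):
    """Prime a generator-based coroutine so callers can send() immediately
    (the classic recipe from Dave Beazley's coroutine tutorial)."""
    @functools.wraps(func)
    def primed(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)          # advance to the first yield, discarding its value
        return gen
    return primed

@coroutine
def accumulate():
    total = 0
    while True:
        total += yield total

acc = accumulate()
assert acc.send(5) == 5
assert acc.send(3) == 8
```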

>> It doesn't matter if there is only one use case, as long as it is a
>> common one. And we already have that: Greg Ewing's "refactoring".
>>   
> 
> I remain unconvinced that the "initial next()" issue isn't also a
> problem for that use case, but I am not going to argue about this.

For refactoring, the pattern of passing in a "start" value for use in
the first yield expression in the subiterator should be adequate. That's
enough to avoid injecting spurious "None" values into the yield sequence.
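
Concretely, the pattern I mean looks like this (illustrative names; Python 3 rendering of the proposed semantics):

```python
def sub(start):
    x = yield start        # the first yield re-uses the caller-supplied value
    yield x + 1

def outer(first):
    yield from sub(first)  # no spurious None: sub's first yield is `first`

g = outer(10)
assert next(g) == 10       # the initial next() surfaces `first`, not None
assert g.send(5) == 6      # and send() goes straight through to sub
```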

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Thu Apr  9 16:21:53 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 09 Apr 2009 16:21:53 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DDF1F9.8060904@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49DA8423.9070101@canterbury.ac.nz>	<49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com>	<49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com>	<49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>	<49DD62FA.9080504@improva.dk>	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
Message-ID: <49DE0481.1010309@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> Then let me revisit my earlier statement that when close() catches a
>> StopIteration with a non-None value, it should either return it or raise
>> an exception.  Since the value is not saved, a second close() will
>> neither be able to return it, nor raise a StopIteration with it. 
>> Therefore I now think that raising a RuntimeError in that case is the
>> only right thing to do.
>>     
>
> Remember, close() is designed to be about finalization. So long as the
> generator indicates that it has finished (i.e. by reraising
> GeneratorExit or raising StopIteration with or without a value), the
> method has done its job. Raising a RuntimeError for a successfully
> closed generator doesn't make any sense.
>   

Returning a value in response to GeneratorExit is what doesn't make any 
sense when there is no possible way to access that value later.  Raising 
a RuntimeError in this case will be a clear reminder of this.  Think 
newbie protection if you will.

> So if someone wants the return value, they'll need to either use next(),
> send() or throw() and catch the StopIteration themselves, or else use
> 'yield from'.
>   

Yes.  How is that an argument against making close raise a RuntimeError 
if it catches a StopIteration with a non-None value?

> That said, creating your own stateful wrapper that preserves the last
> yield value and the final return value of a generator iterator is also
> perfectly possible:
>
>   class CaptureGen(object):
>     """Capture and preserve the last yielded value and the
>        final return value of a generator iterator instance"""
>     NOT_SET = object()
>
>     def __init__(self, geniter):
>       self.geniter = geniter
>       self._last_yield = self.NOT_SET
>       self._return_value = self.NOT_SET
>
>     @property
>     def last_yield(self):
>       if self._last_yield is self.NOT_SET:
>         raise RuntimeError("Generator has not yielded")
>       return self._last_yield
>
>     @property
>     def return_value(self):
>       if self._return_value is self.NOT_SET:
>         raise RuntimeError("Generator has not returned")
>       return self._return_value
>
>     def _delegate(self, meth, *args):
>       try:
>         val = meth(*args)
>       except StopIteration, ex:
>         if self._return_value is self.NOT_SET:
>           self._return_value = ex.value
>           raise
>         raise StopIteration(self._return_value)
>       self._last_yield = val
>       return val
>
>     def __next__(self):
>       return self._delegate(self.geniter.next)
>     next = __next__
>
>     def send(self, val):
>       return self._delegate(self.geniter.send, val)
>
>     def throw(self, et, ev=None, tb=None):
>       return self._delegate(self.geniter.throw, et, ev, tb)
>
>     def close(self):
>       self.geniter.close()
>       return self._return_value
>
> Something like that may actually turn out to be useful as the basis for
> an enhanced coroutine decorator, similar to the way one uses
> contextlib.contextmanager to turn a generator object into a context
> manager. The PEP is quite usable for refactoring without it though.
>
>   

I know it is possible to create a complete workaround.  The problem is 
that any complete workaround will break the chain of yield-from calls, 
causing a massive overhead on next(), send(), and throw() compared to a 
partial workaround that doesn't break the yield-from chain.

> For refactoring, the pattern of passing in a "start" value for use in
> the first yield expression in the subiterator should be adequate. That's
> enough to avoid injecting spurious "None" values into the yield sequence.
>   
If you define "refactoring" narrowly enough, you are probably right.

Otherwise it depends on how you want the new subgenerator to work.  If 
you want it to be a @coroutine (with or without the decorator) there are 
good reasons for wanting to throw away the value from the first next() 
and provide an initial value to send() or throw() immediately after 
that, so the first value yielded by the yield-from becomes the result of 
the send() or throw().  Taking just a simple "start" value in the 
constructor and making the first yield an "x = yield start" doesn't 
support that use.  Taking a compound "start" value in the constructor 
and making the first yield an "x = yield from cr_init(start)" does, by 
skipping the first yield when necessary and simulating the send() or 
throw().
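
A sketch of the cr_init helper I have in mind (the (mode, value) encoding is just one possible convention; all names here are illustrative):

```python
def cr_init(start):
    """Hypothetical helper: `start` is a (mode, value) pair describing
    how the caller wants to kick off the subcoroutine."""
    mode, value = start
    if mode == 'next':
        # behave like a fresh coroutine: yield once, return what is sent
        return (yield)
    if mode == 'send':
        # skip the first yield entirely; pretend `value` was just sent
        return value
    if mode == 'throw':
        raise value
    raise ValueError(mode)

def sub(start):
    x = yield from cr_init(start)
    yield x * 2

g = sub(('send', 5))
assert next(g) == 10       # the first yield already reflects the sent value

g2 = sub(('next', None))
assert next(g2) is None    # behaves like a freshly primed coroutine
assert g2.send(7) == 14
```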


- Jacob


From jimjjewett at gmail.com  Thu Apr  9 18:47:18 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Thu, 9 Apr 2009 12:47:18 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49DDF1F9.8060904@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
Message-ID: <fb6fbf560904090947n9291841g5cdadb4322ad4b00@mail.gmail.com>

On 4/9/09, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Jacob Holm wrote:
>> Then let me revisit my earlier statement that when close() catches a
>> StopIteration with a non-None value, it should either return it or raise
>> an exception.

This implies that the value is always important; I write plenty of
functions that don't bother to return anything, and expect that would
still be common if I were factoring out loops.

>> Since the value is not saved, a second close() will
>> neither be able to return it, nor raise a StopIteration with it.

If close is called more than once, that suggests at least one of those
calls is just freeing resources, and doesn't need a value.

>>> It doesn't matter if there is only one use case, as long as it is a
>>> common one. And we already have that: Greg Ewing's "refactoring".

>> I remain unconvinced that the "initial next()" issue isn't also a
>> problem for that use case, but I am not going to argue about this.

That does suggest that yield-from *should* accept pre-started
generators, if only because the previous line (or a decorator) may
have primed it.

> For refactoring, the pattern of passing in a "start" value for use in
> the first yield expression in the subiterator should be adequate.

Only if you know what the first value should be.  If you're using a
sentinel that you plan to discard, then it would be way simpler to
just prime the generator before the yield-from.
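
Priming before the yield-from looks like this under the PEP's expansion semantics (Python 3 rendering; note what the delegation's implicit first resume sends into the primed generator):

```python
def sub():
    x = yield "ready"          # expects a real first value via send()
    yield ("got", x)

def outer():
    g = sub()
    sentinel = next(g)         # prime: consume the "ready" sentinel
    assert sentinel == "ready"
    yield from g               # delegation's first resume sends None into g

out = outer()
assert next(out) == ("got", None)   # x ended up None, not a caller value
```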

-jJ


From jimjjewett at gmail.com  Thu Apr  9 18:54:41 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Thu, 9 Apr 2009 12:54:41 -0400
Subject: [Python-ideas] yiled-from restrictions [was: x=(yield from)
	confusion]
Message-ID: <fb6fbf560904090954h4376743do576b5d7a2dfbd33a@mail.gmail.com>

On 4/9/09, Jacob Holm <jh at improva.dk> wrote:
> Guido van Rossum wrote:
>> On Wed, Apr 8, 2009 at 7:52 PM, Jacob Holm <jh at improva.dk> wrote:

>>> Does storing it as part of the frame object count as "naturally in the
>>> stack frame"?

>> No, that's out too. My point is that I don't want to add *any* state
>> beyond what the user thinks of as the "normal" state in the frame
>> (i.e. local variables, where it is suspended, and the expression stack
>> and try-except stack). Nothing else.

Do you mean nothing user-visible, or do you really mean nothing, not
even as an implementation detail?

> That rules out Gregs and my patches as well.  They both need extra state
> on the frame object to be able to implement yield-from in the first place.

Well, to do it efficiently anyhow... but it doesn't need to be
user-visible, which is why I asked for clarification.

> One final suggestion I have is to make yield-from raise a RuntimeError
> if used on a generator that already has a frame.

That would prevent priming the generator.  It would also prevent

"handled the header lines already; pass the data records off to a
different routine."

-jJ


From denis.spir at free.fr  Thu Apr  9 19:08:36 2009
From: denis.spir at free.fr (spir)
Date: Thu, 9 Apr 2009 19:08:36 +0200
Subject: [Python-ideas] accurate errors for "magic" methods
Message-ID: <20090409190836.6c1b0802@o>

Hello,

Below a post on python-tutor and my trial to explain what happens.

==============================================
> The error I get when running the above code:
> 
> Traceback (most recent call last):
> 
>   File "listinclass.py", line 34, in <module>
> 
>     test.load_ini()
> 
>   File "listinclass.py", line 17, in load_ini
> 
>     print self.opt['inifile']
> 
> AttributeError: Values instance has no attribute '__getitem__'

This means the following:
* You're trying to access an item inside something called self.opt, using the key 'inifile'.
* self.opt should therefore be a kind of container able to return an item for a given key; in other words, it is supposed to be an object that behaves like a dictionary, right?
* The "magic" method that allows this behaviour is called __getitem__ (the same as for list indexing, actually). So Python looks for this method attribute in the class of self.opt, which happens to be Values, and tells you...

Sure, not very clear!
=====================================================

Actually, I'm wrong: it's perfectly clear as long as the programmer is able to follow the whole necessary chain of reasoning; but such a programmer is then probably also able to solve the problem without any help from Python. 

The issue here is that a very specific (and meaningful) case (missing dict-like behaviour) is addressed with a very generic (and thus unhelpful) message (AttributeError).

I think errors involving "magic" methods, which implement conceptually meaningful behaviours, should have appropriate messages.  In the case above, maybe something like:
"Values instance is not an item container (no __getitem__ method found)."

There may be a sensible relationship with the ABC hierarchy, which precisely provides naming and specification for types implementing (some of) the magic methods.
I would love a BehaviourError ;-)

Denis
------
la vita e estrany


From jh at improva.dk  Thu Apr  9 19:32:07 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 09 Apr 2009 19:32:07 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <fb6fbf560904090947n9291841g5cdadb4322ad4b00@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>	
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>	
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>	
	<49DD62FA.9080504@improva.dk>	
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>	
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<fb6fbf560904090947n9291841g5cdadb4322ad4b00@mail.gmail.com>
Message-ID: <49DE3117.1030508@improva.dk>

Jim Jewett wrote:
> On 4/9/09, Nick Coghlan <ncoghlan at gmail.com> wrote:
>   
>> Jacob Holm wrote:
>>     
>>> Then let me revisit my earlier statement that when close() catches a
>>> StopIteration with a non-None value, it should either return it or raise
>>> an exception.
>>>       
>
> This implies that the value is always important; I write plenty of
> functions that don't bother to return anything, and expect that would
> still be common if I were factoring out loops.
>   

If you return None (or no value) you won't get an exception with my 
proposal.  Only if you handle a GeneratorExit by returning a non-None 
value.  Since that value is not going to be visible to any other code, 
it doesn't make sense to return it.  I am suggesting that you should get 
a RuntimeError so you become aware that returning the value doesn't make 
sense.

>>> Since the value is not saved, a second close() will
>>> neither be able to return it, nor raise a StopIteration with it.
>>>       
>
> If close is called more than once, that suggests at least one of those
> calls is just freeing resources, and doesn't need a value.
>
>   
>>>> It doesn't matter if there is only one use case, as long as it is a
>>>> common one. And we already have that: Greg Ewing's "refactoring".
>>>>         
>
>   
>>> I remain unconvinced that the "initial next()" issue isn't also a
>>> problem for that use case, but I am not going to argue about this.
>>>       
>
> That does suggest that yield-from *should* accept pre-started
> generators, if only because the previous line (or a decorator) may
> have primed it.
>
>   

That was my argument, but since there is no sane way of handling 
pre-primed generators without extending the PEP in a direction that 
Guido has forbidden, I suggest raising a RuntimeError in this case 
instead.   That allows us to add the necessary features later if the 
need is recognized.

>> For refactoring, the pattern of passing in a "start" value for use in
>> the first yield expression in the subiterator should be adequate.
>>     
>
> Only if you know what the first value should be.  If you're using a
> sentinel that you plan to discard, then it would be way simpler to
> just prime the generator before the yield-from.
>
>   

Yeah, but yield from with pre-primed generators just ain't gonna happen. :(


- Jacob



From jh at improva.dk  Thu Apr  9 19:50:05 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 09 Apr 2009 19:50:05 +0200
Subject: [Python-ideas] yiled-from restrictions [was: x=(yield from)
	confusion]
In-Reply-To: <fb6fbf560904090954h4376743do576b5d7a2dfbd33a@mail.gmail.com>
References: <fb6fbf560904090954h4376743do576b5d7a2dfbd33a@mail.gmail.com>
Message-ID: <49DE354D.2090100@improva.dk>

Jim Jewett wrote:
> On 4/9/09, Jacob Holm <jh at improva.dk> wrote:
>   
>> That rules out Gregs and my patches as well.  They both need extra state
>> on the frame object to be able to implement yield-from in the first place.
>>     
>
> Well, to do it efficiently anyhow... but it doesn't need to be
> user-visible, which is why I asked for clarification.
>
>   

Sure, if you change the patch to emit the same byte codes as the 
expansion in the PEP would, you can get by without.  Of course that 
would increase the overhead of each delegated next(), send() or throw() 
call by at least one order of magnitude.  (That is based on actual 
testing, where I compared my implementation of yield-from with a yield 
in a simple for-loop.  Since the PEP expansion is more complex than a 
for-loop, I expect the overhead to be somewhat greater).
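
A rough harness for the kind of comparison I mean (numbers vary by build; delegate_loop only approximates the next() path of the expansion, not send() or throw()):

```python
import timeit

def inner():
    for i in range(1000):
        yield i

def delegate_native():
    yield from inner()     # direct delegation

def delegate_loop():
    for x in inner():      # pure-Python delegation of the next() path only
        yield x

assert list(delegate_native()) == list(delegate_loop())

t_native = timeit.timeit(lambda: sum(delegate_native()), number=200)
t_loop = timeit.timeit(lambda: sum(delegate_loop()), number=200)
# absolute timings are machine-dependent; the ratio is what matters
```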

>> One final suggestion I have is to make yield-from raise a RuntimeError
>> if used on a generator that already has a frame.
>>     
>
> That would prevent priming the generator.  It would also prevent
>
> "handled the header lines already; pass the data records off to a
> different routine."
>   

Yes, but since there is no sane way this can work in the current 
proposal, I would rather raise a RuntimeError.


- Jacob


From g.brandl at gmx.net  Thu Apr  9 19:56:16 2009
From: g.brandl at gmx.net (Georg Brandl)
Date: Thu, 09 Apr 2009 17:56:16 +0000
Subject: [Python-ideas] accurate errors for "magic" methods
In-Reply-To: <20090409190836.6c1b0802@o>
References: <20090409190836.6c1b0802@o>
Message-ID: <grlcs0$p5c$1@ger.gmane.org>

spir schrieb:

> Actually, I'm wrong: it's perfectly clear as long as the programmer is able
> to follow all the necessary reflexion path; then probably also able to solve
> the problem without any help from python.
> 
> The issue here is that a very specific (and meaningful) case (dict-like
> behaviour missing) is adressed using a very generic (and thus helpless)
> message (attributeError).
> 
> I think error cases about "magic" methods, that implement conceptually
> meaningful behaviours, should have appropriate messages. In the case above,
> maybe something like: "Values instance is not an item container (no
> __getitem__ method found)."

The time machine strikes again:

>>> class A(object): pass
...
>>> A()['a']
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'A' object is unsubscriptable


(the difference being that A is new-style, while Values is old-style.)

Georg



From pyideas at rebertia.com  Thu Apr  9 20:33:46 2009
From: pyideas at rebertia.com (Chris Rebert)
Date: Thu, 9 Apr 2009 11:33:46 -0700
Subject: [Python-ideas] accurate errors for "magic" methods
In-Reply-To: <grlcs0$p5c$1@ger.gmane.org>
References: <20090409190836.6c1b0802@o> <grlcs0$p5c$1@ger.gmane.org>
Message-ID: <50697b2c0904091133j8f0f41cm4f9db20435a9822c@mail.gmail.com>

On Thu, Apr 9, 2009 at 10:56 AM, Georg Brandl <g.brandl at gmx.net> wrote:
> spir schrieb:
>
>> Actually, I'm wrong: it's perfectly clear as long as the programmer is able
>> to follow all the necessary reflexion path; then probably also able to solve
>> the problem without any help from python.
>>
>> The issue here is that a very specific (and meaningful) case (dict-like
>> behaviour missing) is adressed using a very generic (and thus helpless)
>> message (attributeError).
>>
>> I think error cases about "magic" methods, that implement conceptually
>> meaningful behaviours, should have appropriate messages. In the case above,
>> maybe something like: "Values instance is not an item container (no
>> __getitem__ method found)."
>
> The time machine strikes again:
>
>>>> class A(object): pass
> ...
>>>> A()['a']
> Traceback (most recent call last):
>  File "<stdin>", line 1, in <module>
> TypeError: 'A' object is unsubscriptable

And if PEP 3134 (http://www.python.org/dev/peps/pep-3134/) were
accepted+implemented, this could be made even clearer by using
exception chaining to indicate that the TypeError was caused (at least
semantically, disregarding optimizations) by the AttributeError.
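
A hand-rolled sketch of what that chaining looks like (explicit `raise ... from` standing in for interpreter-level chaining; the Values class is a stand-in for the one in the original traceback):

```python
class Values:
    """Stand-in for the class from the original traceback."""

try:
    try:
        Values().__getitem__          # the lookup that actually fails
    except AttributeError as exc:
        # what the interpreter could do: chain the friendly error
        raise TypeError("'Values' object is not subscriptable") from exc
except TypeError as err:
    chained = err

assert isinstance(chained, TypeError)
assert isinstance(chained.__cause__, AttributeError)
```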

Cheers,
Chris

--
I have a blog:
http://blog.rebertia.com


From benjamin at python.org  Thu Apr  9 22:45:22 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Thu, 9 Apr 2009 20:45:22 +0000 (UTC)
Subject: [Python-ideas] accurate errors for "magic" methods
References: <20090409190836.6c1b0802@o> <grlcs0$p5c$1@ger.gmane.org>
	<50697b2c0904091133j8f0f41cm4f9db20435a9822c@mail.gmail.com>
Message-ID: <loom.20090409T204452-389@post.gmane.org>

Chris Rebert <pyideas at ...> writes:
> And if PEP 3134 (http://www.python.org/dev/peps/pep-3134/) were
> accepted+implemented, this could be made even clearer by using
> exception chaining to indicate that the TypeError was caused (at least
> semantically, disregarding optimizations) by the AttributeError.

And the time machine hits again! PEP 3134 has been implemented in Py3k.






From greg.ewing at canterbury.ac.nz  Fri Apr 10 02:35:14 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 10 Apr 2009 12:35:14 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DDE66D.9080401@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk>
Message-ID: <49DE9442.50908@canterbury.ac.nz>

Jacob Holm wrote:

> That rules out Gregs and my patches as well.  They both need extra state 
> on the frame object to be able to implement yield-from in the first place.

But that state is obviously necessary in order to support
yield-from, and it goes away as soon as the yield-from
itself finishes.

Your proposals add non-obvious extra state that persists
longer than normally expected, to support obscure features
that will rarely be used.

> One final suggestion I have is to make yield-from raise a RuntimeError 
> if used on a generator that already has a frame.  That would ...
> b) make it clear that the only intended
> use is with a *fresh* generator or a non-generator iterable,

But there's no way of detecting a violation of that rule
for a non-generator iterator, and I want to avoid having
special cases for generators, since it goes against duck
typing.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Apr 10 03:11:55 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 10 Apr 2009 13:11:55 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <fb6fbf560904090947n9291841g5cdadb4322ad4b00@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<fb6fbf560904090947n9291841g5cdadb4322ad4b00@mail.gmail.com>
Message-ID: <49DE9CDB.4090703@canterbury.ac.nz>

Jim Jewett wrote:

> That does suggest that yield-from *should* accept pre-started
> generators, if only because the previous line (or a decorator) may
> have primed it.

If the previous line has primed it, then it's not a
fresh iterator, so all bets are off.

Same if a decorator has primed it in such a way that
it doesn't behave like a fresh iterator.

-- 
Greg


From ncoghlan at gmail.com  Fri Apr 10 03:54:44 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 10 Apr 2009 11:54:44 +1000
Subject: [Python-ideas] yiled-from restrictions [was: x=(yield
	from)	confusion]
In-Reply-To: <49DE354D.2090100@improva.dk>
References: <fb6fbf560904090954h4376743do576b5d7a2dfbd33a@mail.gmail.com>
	<49DE354D.2090100@improva.dk>
Message-ID: <49DEA6E4.1050508@gmail.com>

Jacob Holm wrote:
> Jim Jewett wrote:
> Yes, but since there is no sane way the this can work in the current
> proposal, I would rather raise a RuntimeError.

Sure there is.

E.g., suppose we have a simple file format with one header per line,
separated from the main body by a line starting with "~". Header lines
are processed on a per-line basis, but for the body, the unaltered lines
are passed back to the client:

  HeaderFinished = object()
  def process_file(f):
    for line in f:
      if line.startswith('~'):
        break
      yield process_header(line)
    yield HeaderFinished
    yield from f
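
Filling in a hypothetical process_header, the example is directly runnable once yield-from exists (self-contained version):

```python
import io

HeaderFinished = object()

def process_header(line):
    # hypothetical stand-in: split "Key: value" header lines
    key, _, value = line.strip().partition(':')
    return (key, value.strip())

def process_file(f):
    for line in f:
        if line.startswith('~'):
            break
        yield process_header(line)
    yield HeaderFinished
    yield from f

data = io.StringIO("Name: Ada\n~\nbody line 1\nbody line 2\n")
out = list(process_file(data))
assert out[0] == ('Name', 'Ada')
assert out[1] is HeaderFinished
assert out[2:] == ['body line 1\n', 'body line 2\n']
```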

For more complex cases, once caching of the methods in yield-from is
explicitly disallowed (as per my other message), you can do something like:

>>> def start(iterable, start):
...     itr = iter(iterable)
...     class TrapFirstNext(object):
...       def __iter__(self):
...         return self
...       def next(self):
...         TrapFirstNext.next = itr.next
...         return start
...     return TrapFirstNext()
...
>>> for val in start(range(2), "Hello World!"):
...   print val
...
Hello World!
0
1

So long as the PEP explicitly disallows caching of the bound methods
(which I now think it should do) you can get as creative as you like by
manipulating dynamically generated types at runtime.
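
For the record, here is the same trick rendered with Python 3 iterator protocol names (a sketch of the idea, not part of the PEP; it relies on __next__ not being cached):

```python
def start(iterable, first):
    """Yield `first`, then delegate to `iterable` (a Python 3 rendering
    of the TrapFirstNext trick above)."""
    it = iter(iterable)
    class TrapFirstNext:
        def __iter__(self):
            return self
        def __next__(self):
            # after the first call, rebind __next__ on the class so
            # later calls go straight to the underlying iterator
            TrapFirstNext.__next__ = lambda self: next(it)
            return first
    return TrapFirstNext()

assert list(start(range(2), "Hello World!")) == ["Hello World!", 0, 1]
```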

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Fri Apr 10 04:04:50 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 10 Apr 2009 12:04:50 +1000
Subject: [Python-ideas] accurate errors for "magic" methods
In-Reply-To: <loom.20090409T204452-389@post.gmane.org>
References: <20090409190836.6c1b0802@o>
	<grlcs0$p5c$1@ger.gmane.org>	<50697b2c0904091133j8f0f41cm4f9db20435a9822c@mail.gmail.com>
	<loom.20090409T204452-389@post.gmane.org>
Message-ID: <49DEA942.8040003@gmail.com>

Benjamin Peterson wrote:
> Chris Rebert <pyideas at ...> writes:
>> And if PEP 3134 (http://www.python.org/dev/peps/pep-3134/) were
>> accepted+implemented, this could be made even clearer by using
>> exception chaining to indicate that the TypeError was caused (at least
>> semantically, disregarding optimizations) by the AttributeError.
> 
> And the time machine hits again! PEP 3134 has been implemented in Py3k.

Not quite - I'm pretty sure 3134 only kicks in reliably if an exception
makes it back into a Python exception handler. For C code, it isn't
necessarily the case that __cause__ and __context__ will be hooked up
automatically (and in the specific case of TypeErrors coming from
abstract.c, there is never an AttributeError in the first place - the
lookup code is checking specific slots on the type object rather than
doing a normal attribute lookup).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jimjjewett at gmail.com  Fri Apr 10 04:41:59 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Thu, 9 Apr 2009 22:41:59 -0400
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49DE3117.1030508@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<fb6fbf560904090947n9291841g5cdadb4322ad4b00@mail.gmail.com>
	<49DE3117.1030508@improva.dk>
Message-ID: <fb6fbf560904091941r64b343aewf19bf1c1f76cdae1@mail.gmail.com>

On 4/9/09, Jacob Holm <jh at improva.dk> wrote:
> Jim Jewett wrote:
>> On 4/9/09, Nick Coghlan <ncoghlan at gmail.com> wrote:
>>> Jacob Holm wrote:

>>>> ... when close() catches a
>>>> StopIteration with a non-None value, it should either return it or raise

>> This implies that the value is always important; ...

> If you return None (or no value) you won't get an exception

Think of all the C functions that return a success status, or the
number of bytes read/written.  Checking the return code is good, but
not checking it isn't always worth an Exception.


>> ... does suggest that yield-from *should* accept pre-started
>> generators, if only because the previous line (or a decorator) may
>> have primed it.

> That was my argument, but since there is no sane way of handling
> pre-primed generators without extending the PEP in a direction that
> Guido has forbidden,

Huh?  All you need to do is to not care whether the generator is fresh
or not (let alone primed vs half-used).

If it won't be primed, *and* you can't afford to send back an extra
junk yield, then you need to prime it yourself.  That can be awkward,
but so are all the syntax extensions that boil down to "and implicitly
call next for me once".  (And the "oops, just this once, don't prime
it after all" extensions are even worse.)
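
The "prime it yourself" step Jim mentions is commonly packaged as a
decorator; a minimal sketch (the decorator name is invented here):

```python
import functools

def primed(genfunc):
    # hypothetical decorator: advance a fresh generator to its first
    # yield so the caller can start with send() immediately
    @functools.wraps(genfunc)
    def wrapper(*args, **kwargs):
        gen = genfunc(*args, **kwargs)
        next(gen)  # discard the initial junk yield
        return gen
    return wrapper

@primed
def echo():
    value = None
    while True:
        value = yield value

g = echo()
assert g.send('hi') == 'hi'  # no explicit priming next() needed
```
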

-jJ


From jimjjewett at gmail.com  Fri Apr 10 04:53:29 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Thu, 9 Apr 2009 22:53:29 -0400
Subject: [Python-ideas] yiled-from restrictions [was: x=(yield from)
	confusion]
In-Reply-To: <49DEA6E4.1050508@gmail.com>
References: <fb6fbf560904090954h4376743do576b5d7a2dfbd33a@mail.gmail.com>
	<49DE354D.2090100@improva.dk> <49DEA6E4.1050508@gmail.com>
Message-ID: <fb6fbf560904091953u6087d042t610a3bd8ac97fcc4@mail.gmail.com>

[deleted file processing example, but Greg may want to add it to the
PEP as a clear motivating use case]

On 4/9/09, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Jacob Holm wrote:
> For more complex cases, once caching of the methods in yield-from is
> explicitly disallowed (as per my other message), you can do something like:

I thought part of the reason to use a generator instead of an iterator
class was speed/efficiency, which that constraint would seem to block.
(name lookups at each level)

>>>> def start(iterable, start):
> ...     itr = iter(iterable)
> ...     class TrapFirstNext(object):
> ...       def __iter__(self):
> ...         return self
> ...       def next(self):
> ...         TrapFirstNext.next = itr.next
> ...         return start
> ...     return TrapFirstNext()

And if that becomes a standard idiom, was anything really simplified?

-jJ


From ncoghlan at gmail.com  Fri Apr 10 05:16:24 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Fri, 10 Apr 2009 13:16:24 +1000
Subject: [Python-ideas] yiled-from restrictions [was: x=(yield from)
 confusion]
In-Reply-To: <fb6fbf560904091953u6087d042t610a3bd8ac97fcc4@mail.gmail.com>
References: <fb6fbf560904090954h4376743do576b5d7a2dfbd33a@mail.gmail.com>	
	<49DE354D.2090100@improva.dk> <49DEA6E4.1050508@gmail.com>
	<fb6fbf560904091953u6087d042t610a3bd8ac97fcc4@mail.gmail.com>
Message-ID: <49DEBA08.6040309@gmail.com>

Jim Jewett wrote:
> [deleted file processing example, but Greg may want to add it to the
> PEP as a clear motivating use case]

It isn't really a good example - it's one of the old use cases that is
easily addressed by "for x in f: yield x" :)


> On 4/9/09, Nick Coghlan <ncoghlan at gmail.com> wrote:
>> Jacob Holm wrote:
>> For more complex cases, once caching of the methods in yield-from is
>> explicitly disallowed (as per my other message), you can do something like:
> 
> I thought part of the reason to use a generator instead of an iterator
> class was speed/efficiency, which that constraint would seem to block.
> (name lookups at each level)
> 
>>>>> def start(iterable, start):
>> ...     itr = iter(iterable)
>> ...     class TrapFirstNext(object):
>> ...       def __iter__(self):
>> ...         return self
>> ...       def next(self):
>> ...         TrapFirstNext.next = itr.next
>> ...         return start
>> ...     return TrapFirstNext()
> 
> And if that becomes a standard idiom, was anything really simplified?

Yes, because you only have to write start() once and then you can use it
as many times as you like. The big gain from yield-from is to allow code
containing a yield statement to be factored out correctly and relatively
easily even when you want to use send() and throw().

My start() example is merely intended to show that even with a
comparatively simplistic approach in the PEP that expects to be running
a fresh iterator to exhaustion every time that yield-from is used, you
can still support some fairly exotic use cases with low overhead by
dynamically updating the subiterator's methods.

The one change I am suggesting to Greg due to this is that we drop the
idea of allowing the bound methods to be cached by the implementation.
Since for loops don't cache the next() method, I no longer think the new
expression should cache any of next(), send() or throw().

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From guido at python.org  Fri Apr 10 06:19:09 2009
From: guido at python.org (Guido van Rossum)
Date: Thu, 9 Apr 2009 21:19:09 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49DE9442.50908@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com> 
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk> 
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com> 
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com> 
	<49DDE66D.9080401@improva.dk> <49DE9442.50908@canterbury.ac.nz>
Message-ID: <ca471dc20904092119x61d4d201n842a7203338498fa@mail.gmail.com>

On Thu, Apr 9, 2009 at 5:35 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Jacob Holm wrote:
>> That rules out Greg's and my patches as well.  They both need extra state
>> on the frame object to be able to implement yield-from in the first place.
>
> But that state is obviously necessary in order to support
> yield-from, and it goes away as soon as the yield-from
> itself finishes.

Another way to look at this is, "RETVAL = yield from EXPR" has an
expansion into source code where all that state is kept as either a
local variable or via the position in the code, and that's how we
define the semantics. I don't believe Jacob's proposal (the one that
doesn't require new syntax) works this way.

> Your proposals add non-obvious extra state that persists
> longer than normally expected, to support obscure features
> that will rarely be used.
>
>> One final suggestion I have is to make yield-from raise a RuntimeError if
>> used on a generator that already has a frame.  That would ...
>
>> b) make it clear that the only intended
>>
>> use is with a *fresh* generator or a non-generator iterable,
>
> But there's no way of detecting a violation of that rule
> for a non-generator iterator, and I want to avoid having
> special cases for generators, since it goes against duck
> typing.

It also goes against the "for x in EXPR: yield x" expansion, which I
would like to maintain as an anchor point for the semantics. To be
precise: when the .send(), .throw() and .close() methods of the outer
generator aren't used, *or* when iter(EXPR) returns a non-generator
iterator, these semantics should be maintained precisely, and the
extended semantics if .send()/.throw()/.close() are used on the outer
generator and iter(EXPR) is a generator should be natural extensions
of this, without semantic discontinuities.
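
For reference, the anchor expansion and the new syntax can be compared
directly for the simple case where send()/throw()/close() are unused
(Python 3.3+ spelling):

```python
def inner():
    yield 1
    yield 2

def outer_for():
    # the anchor expansion Guido refers to: "for x in EXPR: yield x"
    for x in inner():
        yield x

def outer_yf():
    # the PEP 380 form; behaves identically when the extra
    # generator methods are not used on the outer generator
    yield from inner()

assert list(outer_for()) == list(outer_yf()) == [1, 2]
```
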

PS. On Jacob's complaint that he didn't realize earlier that his
proposal was dead on arrival: I didn't either. It took me all this
time to translate my gut feelings into rules.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From steve at pearwood.info  Fri Apr 10 06:45:12 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Fri, 10 Apr 2009 14:45:12 +1000
Subject: [Python-ideas] accurate errors for "magic" methods
In-Reply-To: <grlcs0$p5c$1@ger.gmane.org>
References: <20090409190836.6c1b0802@o> <grlcs0$p5c$1@ger.gmane.org>
Message-ID: <200904101445.13348.steve@pearwood.info>

On Fri, 10 Apr 2009 03:56:16 am Georg Brandl wrote:
> spir schrieb:
> > Actually, I'm wrong: it's perfectly clear as long as the programmer
> > is able to follow all the necessary reflexion path; then probably
> > also able to solve the problem without any help from python.
> >
> > The issue here is that a very specific (and meaningful) case
> > (dict-like behaviour missing) is adressed using a very generic (and
> > thus helpless) message (attributeError).
> >
> > I think error cases about "magic" methods, that implement
> > conceptually meaningful behaviours, should have appropriate
> > messages. In the case above, maybe something like: "Values instance
> > is not an item container (no __getitem__ method found)."
>
> The time machine strikes again:
> >>> class A(object): pass
>
> ...
>
> >>> A()['a']
>
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
> TypeError: 'A' object is unsubscriptable
>
>
> (the difference being that A is new-style, while Values is
> old-style.)


Except that the error "object is unsubscriptable" might as well be in 
Klingon to most people, particularly newbies. 

(1) It's easy to misread it as "unscriptable", which is even more 
mysterious.

(2) As far as I know, there's no tradition of describing key or index 
lookup as "subscripting" in Python. I've never noticed it in doc 
strings or the online docs, and after hanging around comp.lang.python 
extensively for years, I feel safe to say it's not a common term among 
even experienced Python developers. I suppose that there's a weak 
connection between index lookup and subscripts in mathematics.

(3) The classic error message tells the newbie exactly what the error 
is: the object has no __getitem__ method. The new error message tells 
the newbie nothing useful. Given that obj is unsubscriptable, what 
needs to be done to make it subscriptable?


-- 
Steven D'Aprano


From denis.spir at free.fr  Fri Apr 10 08:52:38 2009
From: denis.spir at free.fr (spir)
Date: Fri, 10 Apr 2009 08:52:38 +0200
Subject: [Python-ideas] accurate errors for "magic" methods
In-Reply-To: <200904101445.13348.steve@pearwood.info>
References: <20090409190836.6c1b0802@o> <grlcs0$p5c$1@ger.gmane.org>
	<200904101445.13348.steve@pearwood.info>
Message-ID: <20090410085238.60ccb60a@o>

Le Fri, 10 Apr 2009 14:45:12 +1000,
Steven D'Aprano <steve at pearwood.info> s'exprima ainsi:

> On Fri, 10 Apr 2009 03:56:16 am Georg Brandl wrote:
> > spir schrieb:
> > > Actually, I'm wrong: it's perfectly clear as long as the programmer
> > > is able to follow all the necessary reflexion path; then probably
> > > also able to solve the problem without any help from python.
> > >
> > > The issue here is that a very specific (and meaningful) case
> > > (dict-like behaviour missing) is adressed using a very generic (and
> > > thus helpless) message (attributeError).
> > >
> > > I think error cases about "magic" methods, that implement
> > > conceptually meaningful behaviours, should have appropriate
> > > messages. In the case above, maybe something like: "Values instance
> > > is not an item container (no __getitem__ method found)."
> >
> > The time machine strikes again:
> > >>> class A(object): pass
> >
> > ...
> >
> > >>> A()['a']
> >
> > Traceback (most recent call last):
> >   File "<stdin>", line 1, in <module>
> > TypeError: 'A' object is unsubscriptable
> >
> >
> > (the difference being that A is new-style, while Values is
> > old-style.)
> 
> 
> Except that the error "object is unsubscriptable" might as well be in 
> Klingon to most people, particularly newbies. 
> 
> (1) It's easy to misread it as "unscriptable", which is even more 
> mysterious.
> 
> (2) As far as I know, there's no tradition of describing key or index 
> lookup as "subscripting" in Python. I've never noticed it in doc 
> strings or the online docs, and after hanging around comp.lang.python 
> extensively for years, I feel safe to say it's not a common term among 
> even experienced Python developers. I suppose that there's a weak 
> connection between index lookup and subscripts in mathematics.
> 
> (3) The classic error message tells the newbie exactly what the error 
> is: the object has no __getitem__ method. The new error message tells 
> the newbie nothing useful. Given that obj is unsubscriptable, what 
> needs to be done to make it subscriptable?
 
I do agree with all of these comments. A useful error message in such a
case should
(1) Speak first at the conceptual level (if it's not a container of
individual elements, it can't be indexed nor "key-ed").
(2) Do this using an idiom that a newbie has a good chance of figuring
out -- even if with some mental effort.
(3) Point at the precise issue from which Python itself could figure out
there is an error (no __getitem__).

In this specific case, an additional difficulty comes from the common
__getitem__ for both index and key lookup (and even slicing).

I have no real clue how to make things better generally. I do not mean
that something like
   "Values instance is not an item container (no __getitem__ method found)."
is for the best.
But I'm sure there is an issue. ABCs bring a common and consistent
organisation. We could start from there, establish a standard vocabulary
(including e.g. "subscript" as a superclass of "key" & "index") that
would be reused in all docs and tutorials, and use this system for error
messages instead of ad-hoc (& unhelpful) formulations.
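
A rough sketch of what such a message-building helper could look like
(the helper name and the message wording are invented for illustration):

```python
def describe_item_access_error(obj):
    # hypothetical helper: phrase the failure in terms of the missing
    # special method, roughly as suggested above
    cls = type(obj)
    if not hasattr(cls, '__getitem__'):
        return ("%s instance is not an item container "
                "(no __getitem__ method found)" % cls.__name__)
    return None

class Plain(object):
    pass

msg = describe_item_access_error(Plain())
assert 'no __getitem__ method found' in msg
assert describe_item_access_error({}) is None  # dicts support __getitem__
```
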

Denis
------
la vita e estrany


From jh at improva.dk  Fri Apr 10 09:42:57 2009
From: jh at improva.dk (Jacob Holm)
Date: Fri, 10 Apr 2009 09:42:57 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <fb6fbf560904091941r64b343aewf19bf1c1f76cdae1@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>	
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>	
	<49DD62FA.9080504@improva.dk>	
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>	
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>	
	<fb6fbf560904090947n9291841g5cdadb4322ad4b00@mail.gmail.com>	
	<49DE3117.1030508@improva.dk>
	<fb6fbf560904091941r64b343aewf19bf1c1f76cdae1@mail.gmail.com>
Message-ID: <49DEF881.804@improva.dk>

Jim Jewett wrote:
> On 4/9/09, Jacob Holm <jh at improva.dk> wrote:
>   
>>> ... does suggest that yield-from *should* accept pre-started
>>> generators, if only because the previous line (or a decorator) may
>>> have primed it.
>>>       
>
>   
>> That was my argument, but since there is no sane way of handling
>> pre-primed generators without extending the PEP in a direction that
>> Guido has forbidden,
>>     
>
> Huh?  All you need to do is to not care whether the generator is fresh
> or not (let alone primed vs half-used).
>   

By "no sane way" I mean that there is no way to avoid the first call made
by the yield-from construct being a next().  If the pre-primed generator
is expecting a send() at that point, you are screwed.

> If it won't be primed, *and* you can't afford to send back an extra
> junk yield, then you need to prime it yourself.  

And then the yield-from is still starting out by calling next() so you 
are still screwed.
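
The behaviour Jacob describes is easy to observe with a non-generator
subiterator that logs which protocol method gets called (Python 3.3+
spelling; the thread itself predates "yield from" landing):

```python
class Recording:
    # logs which iterator-protocol method yield-from invokes
    def __init__(self):
        self.calls = []
    def __iter__(self):
        return self
    def __next__(self):
        self.calls.append('next')
        return 'ready'
    def send(self, value):
        self.calls.append('send')
        return value

def delegate(sub):
    yield from sub

r = Recording()
g = delegate(r)
assert next(g) == 'ready'   # the first resume always hits __next__
assert g.send(42) == 42     # only later values travel via send()
assert r.calls == ['next', 'send']
```
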

> That can be awkward,
> but so are all the syntax extensions that boil down to "and implicitly
> call next for me once".  (And the "oops, just this once, don't prime
> it after all" extensions are even worse.)
>
>   

The only syntax extension that was really interesting was intended to 
*avoid* this call to next() by providing a different value to yield the 
first time.  Avoiding the call to next() allows you to create all kinds 
of wrappers that manipulate the start of the sequence, and covers all 
other cases I had considered syntax for.

Anyway, this syntax discussion is moot since Guido has already ruled 
that we will have no syntax for this.


- Jacob


From jh at improva.dk  Fri Apr 10 10:06:12 2009
From: jh at improva.dk (Jacob Holm)
Date: Fri, 10 Apr 2009 10:06:12 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet
 another	alternative name for yield-from]
In-Reply-To: <ca471dc20904092119x61d4d201n842a7203338498fa@mail.gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DE9442.50908@canterbury.ac.nz>
	<ca471dc20904092119x61d4d201n842a7203338498fa@mail.gmail.com>
Message-ID: <49DEFDF4.1040804@improva.dk>

Guido van Rossum wrote:
> On Thu, Apr 9, 2009 at 5:35 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
>   
>> Jacob Holm wrote:
>>     
>>> That rules out Gregs and my patches as well.  They both need extra state
>>> on the frame object to be able to implement yield-from in the first place.
>>>       
>> But that state is obviously necessary in order to support
>> yield-from, and it goes away as soon as the yield-from
>> itself finishes.
>>     
>
> Another way to look at this is, "RETVAL = yield from EXPR" has an
> expansion into source code where all that state is kept as either a
> local variable or via the position in the code, and that's how we
> define the semantics. I don't believe Jacob's proposal (the one that
> doesn't require new syntax) works this way.
>
>   

My proposal was to extend the lifetime of the yielded value until the 
yield expression returned, and to make that value available on the 
frame.  The "keep the value alive" part can easily be written as an 
expansion based on the existing yield semantics, and the "make it 
available on the frame" is similar to the need that yield-from has.  
That doesn't look all that different to me, but I'll let you be the 
judge of that.


- Jacob


From greg.ewing at canterbury.ac.nz  Fri Apr 10 12:42:57 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 10 Apr 2009 22:42:57 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DEA06A.9030102@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
Message-ID: <49DF22B1.4090505@canterbury.ac.nz>

Nick Coghlan wrote:

>>>>class Tricky(object):
> 
> ...   def __iter__(self):
> ...     return self
> ...   def next1(self):
> ...     print "Next 1"
> ...     Tricky.next = Tricky.next2

I just tried using a class like that in a for loop, and
it doesn't work. The next() method has to be defined in
the class, because it's actually a special method (it
has a slot in the type object).

> Greg - looks to me like you're going to have to disallow caching the
> method lookup in the PEP.

Even if it would work, I don't want to preclude a
valuable optimization for the sake of such an obscure
use case.

-- 
Greg



From greg.ewing at canterbury.ac.nz  Fri Apr 10 14:01:21 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 11 Apr 2009 00:01:21 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DEEE8D.8060906@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DEEE8D.8060906@improva.dk>
Message-ID: <49DF3511.9050307@canterbury.ac.nz>

Jacob Holm wrote:
> I 
> am saying that there are examples where it is desirable to move one of 
> the arguments that this form of refactoring forces you to put in the 
> constructor so it instead becomes the argument of the first send.

I'm having trouble seeing circumstances in which you
would need to do that. Can you provide an example
in the form of

(a) a piece of unfactored code

(b) a desired refactoring

(c) an explanation of why the desired refactoring
     can't conveniently be done using an unprimed
     generator and plain yield-from.

-- 
Greg


From rdmurray at bitdance.com  Fri Apr 10 15:52:20 2009
From: rdmurray at bitdance.com (R. David Murray)
Date: Fri, 10 Apr 2009 13:52:20 +0000 (UTC)
Subject: [Python-ideas] accurate errors for "magic" methods
References: <20090409190836.6c1b0802@o> <grlcs0$p5c$1@ger.gmane.org>
	<200904101445.13348.steve@pearwood.info>
Message-ID: <grniuk$3ks$1@ger.gmane.org>

Steven D'Aprano <steve at pearwood.info> wrote:
> On Fri, 10 Apr 2009 03:56:16 am Georg Brandl wrote:
> > spir schrieb:
> > > Actually, I'm wrong: it's perfectly clear as long as the programmer
> > > is able to follow all the necessary reflexion path; then probably
> > > also able to solve the problem without any help from python.
> > >
> > > The issue here is that a very specific (and meaningful) case
> > > (dict-like behaviour missing) is adressed using a very generic (and
> > > thus helpless) message (attributeError).
> > >
> > > I think error cases about "magic" methods, that implement
> > > conceptually meaningful behaviours, should have appropriate
> > > messages. In the case above, maybe something like: "Values instance
> > > is not an item container (no __getitem__ method found)."
> >
> > The time machine strikes again:
> > >>> class A(object): pass
> >
> > ...
> >
> > >>> A()['a']
> >
> > Traceback (most recent call last):
> >   File "<stdin>", line 1, in <module>
> > TypeError: 'A' object is unsubscriptable
> >
> >
> > (the difference being that A is new-style, while Values is
> > old-style.)
> 
> 
> Except that the error "object is unsubscriptable" might as well be in 
> Klingon to most people, particularly newbies. 
> 
> (1) It's easy to misread it as "unscriptable", which is even more 
> mysterious.
> 
> (2) As far as I know, there's no tradition of describing key or index 
> lookup as "subscripting" in Python. I've never noticed it in doc 
> strings or the online docs, and after hanging around comp.lang.python 
> extensively for years, I feel safe to say it's not a common term among 
> even experienced Python developers. I suppose that there's a weak 
> connection between index lookup and subscripts in mathematics.
> 
> (3) The classic error message tells the newbie exactly what the error 
> is: the object has no __getitem__ method. The new error message tells 
> the newbie nothing useful. Given that obj is unsubscriptable, what 
> needs to be done to make it subscriptable?

+1

And not just the newbie, either.  The experienced python programmer
looks at the original message and goes "ah ha".  The experienced python
programmer looks at the new message and has to _think about it_ before
understanding arrives.  I think that would be true even if you found a
better word than 'unsubscriptable'.  "Does not implement an item lookup
method" might work.  Or how about, 'Does not implement the Sequence or
Mapping interface'?

But you know what?  Doing a google search for 'python mapping' gets you
the description of dictionaries, while doing a google search for 'python
__getitem__' gets you to the data model chapter that describes how the
mapping/sequence behavior is implemented via __getitem__.  The latter
is more useful even to the newbie, I think.

So even if it looks ugly, I think the error message should mention
__getitem__.
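
For illustration, both sides of the story in one sketch: defining
__getitem__ is what makes an object subscriptable, and the TypeError for
the missing case (wording shown is modern CPython's and varies between
versions) still centres on the word "subscriptable":

```python
class Squares:
    # defining __getitem__ is all it takes to support obj[i]
    def __getitem__(self, index):
        return index * index

assert Squares()[3] == 9

# the no-__getitem__ case, as reported by a modern CPython
try:
    object()[0]
except TypeError as exc:
    message = str(exc)
assert 'subscriptable' in message
```
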

--
R. David Murray



From rdmurray at bitdance.com  Fri Apr 10 15:58:36 2009
From: rdmurray at bitdance.com (R. David Murray)
Date: Fri, 10 Apr 2009 13:58:36 +0000 (UTC)
Subject: [Python-ideas] Add setpriority / getpriority to os module.
References: <grgtlk$pdt$1@ger.gmane.org>
Message-ID: <grnjac$4ap$1@ger.gmane.org>

Christian Heimes <lists at cheimes.de> wrote:
> Hello,
> 
> I would like to add straightforward wrappers for the setpriority and
> getpriority functions to posixmodule.c for the os module. The functions
> allow retrieving and modifying the niceness of a process. They are
> particularly useful for multiprocessing to implement low-priority
> processes. The functionality is well known from the 'nice' and 'renice'
> commands.
> 
> The two functions and the constants PRIO_PROCESS, PRIO_PGRP, PRIO_USER,
> PRIO_MIN and PRIO_MAX should be available on most POSIX operating
> systems. A patch would come with autoconf tests for the feature.

I don't think there's likely to be much controversy about putting
these in.
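
These wrappers did eventually land (Python 3.3 added os.getpriority()
and os.setpriority() on POSIX); a minimal usage sketch:

```python
import os

# niceness of the current process; 0 means "this process" (POSIX only)
niceness = os.getpriority(os.PRIO_PROCESS, 0)
assert -20 <= niceness <= 19

# re-setting the same niceness is permitted without privileges;
# actually *raising* priority (lowering niceness) would need them
os.setpriority(os.PRIO_PROCESS, 0, niceness)
```
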

--
R. David Murray             http://www.bitdance.com



From ncoghlan at gmail.com  Fri Apr 10 16:57:04 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 11 Apr 2009 00:57:04 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DF22B1.4090505@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk>	<49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk>	<49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk>	<49DCBD02.8030806@improva.dk>	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>	<49DD62FA.9080504@improva.dk>	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>	<49DDE66D.9080401@improva.dk>
	<49DDF1F9.8060904@gmail.com>	<49DE0481.1010309@improva.dk>
	<49DEA06A.9030102@gmail.com> <49DF22B1.4090505@canterbury.ac.nz>
Message-ID: <49DF5E40.6020107@gmail.com>

Greg Ewing wrote:
> Nick Coghlan wrote:
> 
>>>>> class Tricky(object):
>>
>> ...   def __iter__(self):
>> ...     return self
>> ...   def next1(self):
>> ...     print "Next 1"
>> ...     Tricky.next = Tricky.next2
> 
> I just tried using a class like that in a for loop, and
> it doesn't work. The next() method has to be defined in
> the class, because it's actually a special method (it
> has a slot in the type object).

Note that my example actually *does* modify the class as it goes along
(for exactly the reason you give - next() has to be defined on the class
or the interpreter will ignore it).

Is it possible you assigned to "self.next" instead of to "Tricky.next"
when trying it out?
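
The type-slot behaviour can be demonstrated directly (Python 3 spelling;
the thread above uses Python 2's next()):

```python
class It:
    def __iter__(self):
        return self
    def __next__(self):
        return 1

it = It()
it.__next__ = lambda: 2        # instance attribute: ignored by the protocol
assert next(it) == 1           # special-method lookup goes via the type slot

It.__next__ = lambda self: 2   # class attribute: this updates the slot
assert next(it) == 2           # the TrapFirstNext trick relies on this
```
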

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From scott+python-ideas at scottdial.com  Fri Apr 10 18:01:10 2009
From: scott+python-ideas at scottdial.com (Scott Dial)
Date: Fri, 10 Apr 2009 12:01:10 -0400
Subject: [Python-ideas] accurate errors for "magic" methods
In-Reply-To: <200904101445.13348.steve@pearwood.info>
References: <20090409190836.6c1b0802@o> <grlcs0$p5c$1@ger.gmane.org>
	<200904101445.13348.steve@pearwood.info>
Message-ID: <49DF6D46.1020607@scottdial.com>

Steven D'Aprano wrote:
> Except that the error "object is unsubscriptable" might as well be in 
> Klingon to most people, particularly newbies. 
> 
> (1) It's easy to misread it as "unscriptable", which is even more 
> mysterious.
> 
> (2) As far as I know, there's no tradition of describing key or index 
> lookup as "subscripting" in Python. I've never noticed it in doc 
> strings or the online docs, and after hanging around comp.lang.python 
> extensively for years, I feel safe to say it's not a common term among 
> even experienced Python developers. I suppose that there's a weak 
> connection between index lookup and subscripts in mathematics.
> 
> (3) The classic error message tells the newbie exactly what the error 
> is: the object has no __getitem__ method. The new error message tells 
> the newbie nothing useful. Given that obj is unsubscriptable, what 
> needs to be done to make it subscriptable?
> 
> 

This exact topic came up almost exactly a year ago:

http://mail.python.org/pipermail/python-dev/2008-April/078744.html

It seems the only thing that came out of that discussion was that set()
now has a new error message (that is still inconsistent):

Python 2.7a0 (trunk:71448, Apr 10 2009, 11:49:22)
[GCC 4.1.2 (Gentoo 4.1.2 p1.1)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> set([1,2,3])[0]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'set' object does not support indexing

Perhaps the most relevant part though is GvR saying:

"""
I wouldn't bet my life on that. __getitem__ overloading for all sorts
of nefarious purposes has quite a following. I'd prefer a message that
doesn't get into what x "is" but sticks to the point at hand, which is
that it doesn't support __getitem__.
"""

-- 
Scott Dial
scott at scottdial.com
scodial at cs.indiana.edu


From jimjjewett at gmail.com  Fri Apr 10 18:11:19 2009
From: jimjjewett at gmail.com (Jim Jewett)
Date: Fri, 10 Apr 2009 12:11:19 -0400
Subject: [Python-ideas] accurate errors for "magic" methods
In-Reply-To: <49DF6D46.1020607@scottdial.com>
References: <20090409190836.6c1b0802@o> <grlcs0$p5c$1@ger.gmane.org>
	<200904101445.13348.steve@pearwood.info>
	<49DF6D46.1020607@scottdial.com>
Message-ID: <fb6fbf560904100911q294b35cav6d1382d45c28344d@mail.gmail.com>

On 4/10/09, Scott Dial <scott+python-ideas at scottdial.com> wrote:
> Steven D'Aprano wrote:
>> Except that the error "object is unsubscriptable" might as well be in
>> Klingon to most people, particularly newbies.

And newbies shouldn't be worrying about __double_underscore__ names.

> [discussion around http://mail.python.org/pipermail/python-dev/2008-April/078744.html ]
> Perhaps the most relevant part though is GvR saying:
>
> """
> I wouldn't bet my life on that. __getitem__ overloading for all sorts
> of nefarious purposes has quite a following. I'd prefer a message that
> doesn't get into what x "is" but sticks to the point at hand, which is
> that it doesn't support __getitem__.
> """

Could the message at least mention [] as a possible reason that
subscripting or __getitem__ was even called in the first place?

-jJ


From jh at improva.dk  Fri Apr 10 20:58:37 2009
From: jh at improva.dk (Jacob Holm)
Date: Fri, 10 Apr 2009 20:58:37 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DF3511.9050307@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DEEE8D.8060906@improva.dk> <49DF3511.9050307@canterbury.ac.nz>
Message-ID: <49DF96DD.4040406@improva.dk>

Greg Ewing wrote:
> Jacob Holm wrote:
>> I am saying that there are examples where it is desirable to move one 
>> of the arguments that this form of refactoring forces you to put in 
>> the constructor so it instead becomes the argument of the first send.
>
> I'm having trouble seeing circumstances in which you
> would need to do that. Can you provide an example
> in the form of
>
> (a) a piece of unfactored code

Ok, once again based on your own parser example.   The parse_items 
generator could have been written as:

def parse_items(closing_tag = None):
    elems = []
    token = yield
    while token != closing_tag:
        if is_opening_tag(token):
            name = token[1:-1]
            items = yield from parse_items("</%s>" % name)
            elems.append((name, items))
        else:
            elems.append(token)
        token = yield
    return elems


>
> (b) a desired refactoring

I would like to split off a function for parsing a single element.  And 
I would like it to look like this:

def parse_elem():
    opening_tag = yield
    name = opening_tag[1:-1]
    items = yield from parse_items("</%s>" % name)
    return (name, items)


This differs from the version in your example by taking all the tags as 
arguments to send() instead of having the opening tag as an argument to 
the constructor.

Unfortunately, there is no way to actually use this version in the 
implementation of parse_items.

>
> (c) an explanation of why the desired refactoring
>     can't conveniently be done using an unprimed
>     generator and plain yield-from.
>

The suggested subroutine cannot be used, because parse_items already has 
the value that should go as the argument to the first send().

It is easy to rewrite it to the version you used in the example, but 
that requires you to make the opening_tag an argument to the 
constructor, whereas I want it as an argument to the first send.  You 
can of course make that argument optional and adjust the function to 
only do the first yield if the argument is not given.  That is 
essentially what my "cr_init()" pattern does.  Using that pattern, the 
refactoring looks like this:

def cr_init(start):
    if start is None:
        return (yield)
    if 'send' in start:
        return start['send']
    if 'throw' in start:
        raise start['throw']
    return (yield start.get('yield'))

def parse_elem(start=None):
    opening_tag = yield from cr_init(start)
    name = opening_tag[1:-1]
    items = yield from parse_items("</%s>" % name)
    return (name, items)

def parse_items(closing_tag=None, start=None):
    elems = []
    token = yield from cr_init(start)
    while token != closing_tag:
        if is_opening_tag(token):
            elems.append((yield from parse_elem(start={'send': token})))
        else:
            elems.append(token)
        token = yield
    return elems
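
Putting the pieces together (with the bare yields parenthesized so the
code compiles, and with a stand-in is_opening_tag, which the thread
never defines), a self-contained run of the pattern looks like this:

```python
def is_opening_tag(token):
    # Stand-in predicate; the thread never defines it.
    return token.startswith("<") and not token.startswith("</")

def cr_init(start):
    if start is None:
        return (yield)
    if 'send' in start:
        return start['send']
    if 'throw' in start:
        raise start['throw']
    return (yield start.get('yield'))

def parse_elem(start=None):
    opening_tag = yield from cr_init(start)
    name = opening_tag[1:-1]
    items = yield from parse_items("</%s>" % name)
    return (name, items)

def parse_items(closing_tag=None, start=None):
    elems = []
    token = yield from cr_init(start)
    while token != closing_tag:
        if is_opening_tag(token):
            elems.append((yield from parse_elem(start={'send': token})))
        else:
            elems.append(token)
        token = yield
    return elems

g = parse_items()              # closing_tag=None: a None token ends it
next(g)                        # run to the first yield inside cr_init
for tok in ("<a>", "x", "</a>"):
    g.send(tok)
try:
    g.send(None)
except StopIteration as e:
    result = e.value
```

Note that parse_elem is entered with start={'send': token}, so its
cr_init returns immediately without yielding; only the plain-yield
branches ever suspend.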


As you see it *can* be done, but I would hardly call it convenient.  The 
main problem is that the coroutine you want to call must be written with 
this in mind or you are out of luck.  While it *is* possible to write a 
wrapper that lets you call the unmodified parse_elem, that wrapper 
cannot use yield-from to call it, so you get a rather large overhead 
that way.
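
A minimal sketch of such a call-site wrapper (the helper and toy
coroutine names are mine, and it assumes Python 3.3+ generator return
values; without yield-from, every value is shuttled through an explicit
loop):

```python
def call_with_first_send(gen, first):
    # Hypothetical helper (name is mine): advance `gen` to its first
    # bare `yield`, deliver `first` as the value of that yield, then
    # delegate by hand.  Every later value passes through this loop,
    # which is the overhead being discussed.
    next(gen)                      # run to the first bare yield
    try:
        value = gen.send(first)
        while True:
            value = gen.send((yield value))
    except StopIteration as e:
        return e.value

# A toy coroutine in the parse_elem style: the caller already holds
# the value that must go into the first send().
def adder():
    total = 0
    while True:
        x = yield
        if x is None:
            return total
        total += x

w = call_with_first_send(adder(), 1)   # 1 is the already-seen value
next(w)                                # start the wrapper
w.send(2)
try:
    w.send(None)                       # None ends the toy coroutine
except StopIteration as e:
    result = e.value
```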

A convention like Nick suggested where all coroutines take an optional 
"start" argument with the first value to yield doesn't help, because it 
is not the value to yield that is the problem.

I hope this helps to explain why the cr_init pattern is needed even for 
relatively simple refactoring now that it seems we are not fixing the 
"initial next()" issue.

- Jacob


From tjreedy at udel.edu  Sat Apr 11 00:31:03 2009
From: tjreedy at udel.edu (Terry Reedy)
Date: Fri, 10 Apr 2009 18:31:03 -0400
Subject: [Python-ideas] accurate errors for "magic" methods
In-Reply-To: <200904101445.13348.steve@pearwood.info>
References: <20090409190836.6c1b0802@o> <grlcs0$p5c$1@ger.gmane.org>
	<200904101445.13348.steve@pearwood.info>
Message-ID: <grohb7$qt6$1@ger.gmane.org>

Steven D'Aprano wrote:
>
> 
> (1) It's easy to misread it as "unscriptable", which is even more 
> mysterious.

'Cannot be subscripted' might be clearer.

> (2) As far as I know, there's no tradition of describing key or index 
> lookup as "subscripting" in Python....

Reference / Expressions

"Subscriptions
A subscription selects an item of a sequence (string, tuple or list) or 
mapping (dictionary) object:..."

> (3) The classic error message tells the newbie exactly what the error 
> is: the object has no __getitem__ method.
 > The new error message tells
> the newbie nothing useful.

It says "Do not try to subscript this object".

 > Given that obj is unsubscriptable, what
> needs to be done to make it subscriptable?

Typically, nothing.  Usually, there is no __getitem__ because there 
should not be.  How would you make a number subscriptable?

To me, saying 'Int cannot be subscripted' (don't do it) is more helpful 
to newbies than 'Int has no __getitem__ method' (so it cannot be 
subscripted, so stop trying).
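
For what it's worth, the behaviour under discussion is easy to poke at
(the exact wording has varied across versions, and Python 3 eventually
settled on "not subscriptable", so only the TypeError itself should be
relied on):

```python
# Probe the message for subscripting a non-subscriptable object; the
# wording differs between versions, so only the exception type is a
# stable contract.
try:
    (1)[0]
    msg = None
except TypeError as e:
    msg = str(e)
```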

tjr



From greg.ewing at canterbury.ac.nz  Sat Apr 11 03:08:14 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 11 Apr 2009 13:08:14 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DF5E40.6020107@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DF22B1.4090505@canterbury.ac.nz> <49DF5E40.6020107@gmail.com>
Message-ID: <49DFED7E.9050706@canterbury.ac.nz>

Nick Coghlan wrote:

> Note that my example actually *does* modify the class as it goes along
> (for exactly the reason you give - next() has to be defined on the class
> or the interpreter will ignore it).

Sorry, I didn't notice you were doing that.

But this doesn't seem useful to me. What if you
have two instances of Tricky in use at the same
time? They're going to interfere with each other.

I suppose you could make it work if you created
a new class each time. But my earlier comment
stands -- I don't want to preclude optimizing the
common case for the sake of an uncommon case.

-- 
Greg


From josiah.carlson at gmail.com  Sat Apr 11 04:44:21 2009
From: josiah.carlson at gmail.com (Josiah Carlson)
Date: Fri, 10 Apr 2009 19:44:21 -0700
Subject: [Python-ideas] Add setpriority / getpriority to os module.
In-Reply-To: <grnjac$4ap$1@ger.gmane.org>
References: <grgtlk$pdt$1@ger.gmane.org> <grnjac$4ap$1@ger.gmane.org>
Message-ID: <e6511dbf0904101944u64d55eb4y8c116271a8985e8f@mail.gmail.com>

On Fri, Apr 10, 2009 at 6:58 AM, R. David Murray <rdmurray at bitdance.com> wrote:
> Christian Heimes <lists at cheimes.de> wrote:
>> Hello,
>>
>> I would like to add a straightforward wrapper for the setpriority and
>> getpriority functions to posixmodule.c for the os module. The functions
>> allow you to retrieve and modify the niceness of a process. They are
>> particularly useful for multiprocessing to implement low-priority
>> processes. The functions are well known as the 'nice' and 'renice' commands.
>>
>> The two functions and the constants PRIO_PROCESS, PRIO_PGRP, PRIO_USER,
>> PRIO_MIN and PRIO_MAX should be available on most POSIX operating
>> systems. A patch would come with autoconf tests for the feature.
>
> I don't think there's likely to be much controversy about putting
> these in.

Variants for Windows with roughly equivalent meanings would be nice.
It looks pretty straightforward with pywin32, but someone with ctypes
experience may be preferred.

 - Josiah


From greg.ewing at canterbury.ac.nz  Sat Apr 11 06:22:28 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 11 Apr 2009 16:22:28 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DF96DD.4040406@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DEEE8D.8060906@improva.dk> <49DF3511.9050307@canterbury.ac.nz>
	<49DF96DD.4040406@improva.dk>
Message-ID: <49E01B04.6010401@canterbury.ac.nz>

Jacob Holm wrote:

> I would like to split off a function for parsing a single element.  And 
> I would like it to look like this:
> 
> def parse_elem():
>    opening_tag = yield
>    name = opening_tag[1:-1]
>    items = yield from parse_items("</%s>" % name)
>    return (name, items)

I don't see what you gain by writing it like that, though.
You don't even know whether you want to call this function
until you've seen the first token and realized that it's
a tag.

In other words, you need a one-token lookahead. A more
conventional parser would use a scanner that lets you
peek at the next token without absorbing it, but that's
not an option when you're receiving the tokens via
yield, so another solution must be found.

The solution I chose was to keep the lookahead token
as state in the parsing functions, and pass it to
wherever it's needed. Your parse_elem() function clearly
needs it, so it should take it as a parameter.

If there's some circumstance in which you know for
certain that there's an elem coming up, you can always
write another parsing function for dealing with that,
e.g.

   def expect_elem():
     first = yield
     return (yield from parse_elem(opening_tag=first))

I don't think there's anything inconvenient about that.

> A convention like Nick suggested where all coroutines take an
 > optional "start" argument with the first value to yield doesn't
 > help, because it is not the value to yield that is the problem.

I think you've confused the issue a bit yourself, because
you started out by asking for a way of specifing the first
value to yield in the yield-from expression. But it seems
that what you really want is to specify the first value
to *send* into the subiterator.

I haven't seen anything so far that convinces me it would
be a serious inconvenience not to have such a feature.

Also, it doesn't seem to generalize. What if your parser
needs a two-token lookahead? Then you'll be asking for a
way to specify the first *two* values to send in. Where
does it end?

-- 
Greg


From ncoghlan at gmail.com  Sat Apr 11 08:53:20 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 11 Apr 2009 16:53:20 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49DFED7E.9050706@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DF22B1.4090505@canterbury.ac.nz> <49DF5E40.6020107@gmail.com>
	<49DFED7E.9050706@canterbury.ac.nz>
Message-ID: <49E03E60.4090202@gmail.com>

Greg Ewing wrote:
> Nick Coghlan wrote:
> 
>> Note that my example actually *does* modify the class as it goes along
>> (for exactly the reason you give - next() has to be defined on the class
>> or the interpreter will ignore it).
> 
> Sorry, I didn't notice you were doing that.
> 
> But this doesn't seem useful to me. What if you
> have two instances of Tricky in use at the same
> time? They're going to interfere with each other.

> I suppose you could make it work if you created
> a new class each time.

I provided a more complete example in another message that showed how to
put the "start" value idiom in a helper function so it could be used
with any iterator:

 def start(iterable, start):
     itr = iter(iterable)
     class TrapFirstNext(object):
         def __iter__(self):
             return self
         def next(self):
             TrapFirstNext.next = itr.next
             return start
         # Should also set send and throw to
         # itr.send and itr.throw if they exist
     return TrapFirstNext()


>>> for val in start(range(2), "Hello World!"):
...   print val
...
Hello World!
0
1
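
The same idiom rendered for Python 3 (my transcription, not from the
thread: the method is spelled __next__ there, and the rebound attribute
must be a plain function so the descriptor machinery binds it to the
instance):

```python
def start(iterable, first):
    itr = iter(iterable)
    class TrapFirstNext:
        def __iter__(self):
            return self
        def __next__(self):
            # After yielding the injected value once, rebind the
            # class's __next__ so later calls go straight through.
            TrapFirstNext.__next__ = lambda self: next(itr)
            return first
        # A fuller version would also forward send() and throw()
        # when the wrapped iterable is a generator.
    return TrapFirstNext()

values = list(start(range(2), "Hello World!"))
```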

> But my earlier comment
> stands -- I don't want to preclude optimizing the
> common case for the sake of an uncommon case.

My main concern with allowing next() to be cached is that none of the
existing looping constructs (for loop, comprehensions, generator
expressions) have ever cached the bound method, so doing it in
yield-from would make the new expression an odd special case (e.g. the
for loop above would work, but using start() in a yield-from that cached
the bound method would result in an infinite loop).

That said, the existing language reference doesn't actually *say* that
an implementation isn't allowed to cache the bound next() method in a
for loop - it's just a long-standing convention that CPython has done
the lookup every time around the loop.

If this turns out to be "too slow" for coroutine setups that use
send(val) rather than next(), then a better optimisation may be to turn
send() and throw() into proper optional elements of the iterator
protocol and give them their own slots on the type object. Deviating
from existing looping behaviour in order to allow caching seems like a
worse option, even if the details of the current behaviour were
originally just an implementation accident.

Also, for the simple case where send() and throw() aren't used we want:

  yield from subiter

to do the same thing as:

  for x in subiter:
    yield x

For CPython, that currently means no caching of the next() method in
yield-from, since caching it would break the equivalence with the latter
expansion.
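
That degenerate-case equivalence is easy to check once both spellings
exist (yield-from needs Python 3.3+; at the time of this thread it was
only a proposal):

```python
# Both spellings of delegation; in the degenerate case (no send() or
# throw()) they should be indistinguishable.
def delegate_explicit(subiter):
    for x in subiter:
        yield x

def delegate_yield_from(subiter):
    yield from subiter

explicit = list(delegate_explicit([1, 2, 3]))
delegated = list(delegate_yield_from([1, 2, 3]))
```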

I think allowing next() to be cached in looping constructs should be
pulled out of the PEP and turned into a separate PEP seeking explicit
clarification from Guido as to whether the lack of caching of next() is
part of the definition of looping in the language, or whether it is just
an implementation detail of CPython that shouldn't be relied on.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Sat Apr 11 12:40:34 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 11 Apr 2009 12:40:34 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49E01B04.6010401@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DEEE8D.8060906@improva.dk> <49DF3511.9050307@canterbury.ac.nz>
	<49DF96DD.4040406@improva.dk> <49E01B04.6010401@canterbury.ac.nz>
Message-ID: <49E073A2.1080804@improva.dk>

Greg Ewing wrote:
> Jacob Holm wrote:
>
>> I would like to split off a function for parsing a single element.  
>> And I would like it to look like this:
>>
>> def parse_elem():
>>    opening_tag = yield
>>    name = opening_tag[1:-1]
>>    items = yield from parse_items("</%s>" % name)
>>    return (name, items)
>
> I don't see what you gain by writing it like that, though.

A more consistent api for calling it.  Instead of special-casing the 
first input, all input is provided the same way.  That makes it more 
similar to parse_items that is already called that way.

> You don't even know whether you want to call this function
> until you've seen the first token and realized that it's
> a tag.

Not when used from parse_items, but remember that my intention was to 
make parse_elem independently useful.  If you are just starting to parse 
a presumed valid xml stream (without self-closed tags), you know that it 
consists of a single element but won't know what it is.

>
> In other words, you need a one-token lookahead. A more
> conventional parser would use a scanner that lets you
> peek at the next token without absorbing it, but that's
> not an option when you're receiving the tokens via
> yield, so another solution must be found.
>
> The solution I chose was to keep the lookahead token
> as state in the parsing functions, and pass it to
> wherever it's needed. Your parse_elem() function clearly
> needs it, so it should take it as a parameter.
>
> If there's some circumstance in which you know for
> certain that there's an elem coming up, you can always
> write another parsing function for dealing with that,
> e.g.
>
>   def expect_elem():
>     first = yield
>     return (yield from parse_elem(opening_tag=first))
>
> I don't think there's anything inconvenient about that.
>

Except you now have one extra function for exactly the same task, just 
with a different calling convention.  And this doesn't handle an initial 
throw() correctly.  Not that I see any reason to use throw in the parser 
example.  I'm just saying an extra function wouldn't work in that case.

>> A convention like Nick suggested where all coroutines take an
> > optional "start" argument with the first value to yield doesn't
> > help, because it is not the value to yield that is the problem.
>
> I think you've confused the issue a bit yourself, because
> you started out by asking for a way of specifing the first
> value to yield in the yield-from expression. But it seems
> that what you really want is to specify the first value
> to *send* into the subiterator.

In this case, yes.  In other cases it really is the first value to yield 
from the subiterator, or the first value to throw into the subiterator.

At least part of the confusion comes from the fact that if yield-from 
could somehow suppress the initial next and yield a different value 
instead (either an extra expression in yield-from or the last value 
yielded by a primed generator), there would be a simple way to write 
wrappers that could be used at the call site to handle all those cases.  
So a feature that allowed specifying the first value to yield in the 
yield-from expression *would* be enough, but a start argument to the 
coroutine constructor isn't.

>
> I haven't seen anything so far that convinces me it would
> be a serious inconvenience not to have such a feature.
>
> Also, it doesn't seem to generalize. What if your parser
> needs a two-token lookahead? Then you'll be asking for a
> way to specify the first *two* values to send in. Where
> does it end?
>

The "suppress initial next()" feature *would* have helped, by enabling 
you to write a generic wrapper to use at the call site that could do 
exactly that.  The wrapper could use send() as many times as needed on 
the wrapped generator, then use yield-from to call it when done.  
Without that feature, the wrapper can't use yield-from to call the 
wrapped generator.   Of course there are (slower) ways to write such a 
wrapper without using yield-from.
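
A sketch of that slower wrapper (the names are mine): it primes the
already-started generator with an arbitrary lookahead via send() and
then delegates by hand, since yield-from cannot take over a generator
that has already been started.

```python
def call_with_lookahead(gen, lookahead):
    # Hypothetical sketch of the slow call wrapper: replay any number
    # of already-consumed tokens into `gen` via send(), then delegate
    # manually.
    value = next(gen)              # run to the first bare yield
    try:
        for token in lookahead:    # replay the variable lookahead
            value = gen.send(token)
        while True:
            value = gen.send((yield value))
    except StopIteration as e:
        return e.value

def summer():
    total = 0
    while True:
        x = yield
        if x is None:
            return total
        total += x

w = call_with_lookahead(summer(), [1, 2, 3])
next(w)                            # start the wrapper
try:
    w.send(None)                   # exhaust the wrapped generator
except StopIteration as e:
    result = e.value
```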

The alternative to using a call wrapper is to rewrite the subiterator to 
take the full lookahead as arguments, but how would you write functions 
like parse_elem and parse_items if the lookahead is variable?  (You can 
safely assume that the lookahead is no more than is needed to exhaust 
the generator)

I think I can probably generalize the cr_init() pattern to handle a 
variable lookahead, but I think even a slow call wrapper might be faster 
in that case (depending on the nesting level).

- Jacob


From ncoghlan at gmail.com  Sat Apr 11 13:18:22 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 11 Apr 2009 21:18:22 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49E073A2.1080804@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49DA8423.9070101@canterbury.ac.nz>	<49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com>	<49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com>	<49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>	<49DD62FA.9080504@improva.dk>	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>	<49DDE66D.9080401@improva.dk>
	<49DDF1F9.8060904@gmail.com>	<49DE0481.1010309@improva.dk>
	<49DEA06A.9030102@gmail.com>	<49DEEE8D.8060906@improva.dk>
	<49DF3511.9050307@canterbury.ac.nz>	<49DF96DD.4040406@improva.dk>
	<49E01B04.6010401@canterbury.ac.nz> <49E073A2.1080804@improva.dk>
Message-ID: <49E07C7E.4070802@gmail.com>

Jacob Holm wrote:
> At least part of the confusion comes from the fact that if yield-from
> could somehow suppress the initial next and yield a different value
> instead (either an extra expression in yield-from or the last value
> yielded by a primed generator), there would be a simple way to write
> wrappers that could be used at the call site to handle all those cases. 
> So a feature that allowed specifying the first value to yield in the
> yield-from expression *would* be enough, but a start argument to the
> coroutine constructor isn't.

I think leaving this task to wrapper classes in the initial version of
the PEP is the right way to go at this point. Adding a "skip the initial
next and yield <expr> instead" clause later will be much easier than
trying to undo something added now if it turns out to be a mistake.

Greg's basic proposal makes the easy things easy and the difficult
things possible, so it is a very good place to start. The main change I
would like from the original version of the PEP is for caching the bound
methods to be explicitly disallowed in order to match the behaviour of
normal for loops.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Sat Apr 11 14:03:39 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 11 Apr 2009 14:03:39 +0200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49E07C7E.4070802@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49DA8423.9070101@canterbury.ac.nz>	<49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com>	<49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com>	<49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>	<49DD62FA.9080504@improva.dk>	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>	<49DDE66D.9080401@improva.dk>
	<49DDF1F9.8060904@gmail.com>	<49DE0481.1010309@improva.dk>
	<49DEA06A.9030102@gmail.com>	<49DEEE8D.8060906@improva.dk>
	<49DF3511.9050307@canterbury.ac.nz>	<49DF96DD.4040406@improva.dk>
	<49E01B04.6010401@canterbury.ac.nz>
	<49E073A2.1080804@improva.dk> <49E07C7E.4070802@gmail.com>
Message-ID: <49E0871B.2050404@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>   
>> At least part of the confusion comes from the fact that if yield-from
>> could somehow suppress the initial next and yield a different value
>> instead (either an extra expression in yield-from or the last value
>> yielded by a primed generator), there would be a simple way to write
>> wrappers that could be used at the call site to handle all those cases. 
>> So a feature that allowed specifying the first value to yield in the
>> yield-from expression *would* be enough, but a start argument to the
>> coroutine constructor isn't.
>>     
>
> I think leaving this task to wrapper classes in the initial version of
> the PEP is the right way to go at this point. 

I have already given up on getting this feature in at this point.  The 
above paragraph was just meant to clear up some misunderstandings.

> Adding a "skip the initial
> next and yield <expr> instead" clause later will be much easier than
> trying to undo something added now if it turns out to be a mistake.
>   

Note that if we decide that it is OK to use yield-from with an 
already-started generator in this version, we can't later change 
yield-from to use the latest value yielded in place of the initial 
next().  That makes new syntax the only possibility for that future 
extension.  Not that this is necessarily a bad thing.

> Greg's basic proposal makes the easy things easy and the difficult
> things possible, so it is a very good place to start. 

Yes. You can even write the slow version of the call-wrappers I am 
talking about, and then replace them with the faster versions later if 
the feature becomes available.

> The main change I
> would like from the original version of the PEP is for caching the bound
> methods to be explicitly disallowed in order to match the behaviour of
> normal for loops.
>   

I am not really sure about this.  It looks very much like an 
implementation detail to me.  On the other hand, the ability to replace 
the methods mid-flight might give us a way to implement the 
call-wrappers with minimal overhead.  Since the current patches don't 
actually do any caching, this is something I should actually be able to try.


- Jacob


From ncoghlan at gmail.com  Sat Apr 11 14:32:17 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 11 Apr 2009 22:32:17 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49E0871B.2050404@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>	<49DA8423.9070101@canterbury.ac.nz>	<49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com>	<49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com>	<49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>	<49DD62FA.9080504@improva.dk>	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>	<49DDE66D.9080401@improva.dk>
	<49DDF1F9.8060904@gmail.com>	<49DE0481.1010309@improva.dk>
	<49DEA06A.9030102@gmail.com>	<49DEEE8D.8060906@improva.dk>
	<49DF3511.9050307@canterbury.ac.nz>	<49DF96DD.4040406@improva.dk>
	<49E01B04.6010401@canterbury.ac.nz>
	<49E073A2.1080804@improva.dk> <49E07C7E.4070802@gmail.com>
	<49E0871B.2050404@improva.dk>
Message-ID: <49E08DD1.20504@gmail.com>

Jacob Holm wrote:
> I am not really sure about this.  It looks very much like an
> implementation detail to me.  On the other hand, the ability to replace
> the methods mid-flight might give us a way to implement the
> call-wrappers with minimal overhead.  Since the current patches don't
> actually do any caching, this is something I should actually be able to
> try.

The part that makes me nervous is the fact that the PEP as it stands
gives the green light to an implementation having different bound method
caching behaviour between for loops and the yield-from expression.

That goes against Guido's request that the degenerate case of yield-from
have the same semantics as:

  for x in subiter:
    yield x

Since the language reference is actually silent on the topic of caching
the bound method when iterating over an object, I would phrase it along
the following lines:

 - if for loops in a Python implementation cache next(), then yield-from
in that implementation should also cache next()
 - if yield-from caches next(), it should also cache send() and throw()
 - Since CPython for loops don't cache the bound method for next(), it
won't cache the methods used by yield-from either

Who knows, maybe Guido will actually clarify the matter for us when he
gets back from his vacation :)

Cheers,
Nick.

P.S. Speaking of vacations, I'll also be offline for the next week or so
(starting tomorrow), and then my internet access for Python activities
will be sketchy for another couple of weeks as I move house. So I won't
be able to contribute much more to this discussion.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From guido at python.org  Sat Apr 11 17:58:52 2009
From: guido at python.org (Guido van Rossum)
Date: Sat, 11 Apr 2009 08:58:52 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49E08DD1.20504@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49DEA06A.9030102@gmail.com> <49DEEE8D.8060906@improva.dk> 
	<49DF3511.9050307@canterbury.ac.nz> <49DF96DD.4040406@improva.dk> 
	<49E01B04.6010401@canterbury.ac.nz> <49E073A2.1080804@improva.dk> 
	<49E07C7E.4070802@gmail.com> <49E0871B.2050404@improva.dk> 
	<49E08DD1.20504@gmail.com>
Message-ID: <ca471dc20904110858n349fd7f2tb4e4bb1fe04378a7@mail.gmail.com>

On Sat, Apr 11, 2009 at 5:32 AM, Nick Coghlan <ncoghlan at gmail.com> wrote:
> Jacob Holm wrote:
>> I am not really sure about this.  It looks very much like an
>> implementation detail to me.  On the other hand, the ability to replace
>> the methods mid-flight might give us a way to implement the
>> call-wrappers with minimal overhead.  Since the current patches don't
>> actually do any caching, this is something I should actually be able to
>> try.
>
> The part that makes me nervous is the fact that the PEP as it stands
> gives the green light to an implementation having different bound method
> caching behaviour between for loops and the yield-from expression.
>
> That goes against Guido's request that the degenerate case of yield-from
> have the same semantics as:
>
>   for x in subiter:
>     yield x
>
> Since the language reference is actually silent on the topic of caching
> the bound method when iterating over an object, I would phrase it along
> the following lines:
>
>  - if for loops in a Python implementation cache next(), then yield-from
> in that implementation should also cache next()
>  - if yield-from caches next(), it should also cache send() and throw()
>  - Since CPython for loops don't cache the bound method for next(), it
> won't cache the methods used by yield-from either

In ceval.c, the FOR_ITER opcode expects the iterator on top of the
stack and calls (v->ob_type->tp_iternext)(v).

You tell me whether that is caching or not. :-)
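Because FOR_ITER re-reads tp_iternext from the type on every pass, swapping
the class's __next__ mid-loop takes effect immediately, which is the sense in
which there is no bound-method caching. A small experiment makes this concrete
(an illustrative sketch only; the class and method names are invented):

```python
class Count3:
    """Yields 'old' up to three times; we swap the method mid-loop."""
    def __init__(self):
        self.count = 0

    def __iter__(self):
        return self

    def __next__(self):
        self.count += 1
        if self.count > 3:
            raise StopIteration
        return 'old'

def new_next(self):
    self.count += 1
    if self.count > 3:
        raise StopIteration
    return 'new'

out = []
for x in Count3():
    out.append(x)
    # FOR_ITER looks up tp_iternext on the type each iteration,
    # so this replacement is picked up on the very next pass.
    Count3.__next__ = new_next

print(out)  # ['old', 'new', 'new']
```

If the loop cached the bound method at the start, the output would instead be
three times 'old'.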

> Who knows, maybe Guido will actually clarify the matter for us when he
> gets back from his vacation :)

Or sooner. :-)

> P.S. Speaking of vacations, I'll also be offline for the next week or so
> (starting tomorrow), and then my internet access for Python activities
> will be sketchy for another couple of weeks as I move house. So I won't
> be able to contribute much more to this discussion.

Good, let this be a general trend. :-)

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From mrts.pydev at gmail.com  Sun Apr 12 00:22:30 2009
From: mrts.pydev at gmail.com (=?ISO-8859-1?Q?Mart_S=F5mermaa?=)
Date: Sun, 12 Apr 2009 01:22:30 +0300
Subject: [Python-ideas] Add OrderedSet now that OrderedDict is in collections
Message-ID: <ad1f81530904111522i54f8134dn6fb278ad480d346b@mail.gmail.com>

There was a somewhat ancient discussion on OrderedDict and OrderedSet
before: http://mail.python.org/pipermail/python-dev/2005-March/051915.html

The resolution seemed to be that neither of them should be in stdlib. Now
that OrderedDict is in and Raymond Hettinger has created a solid OrderedSet
implementation: http://code.activestate.com/recipes/576694/ , could the
latter also be included in collections?

Here's a very generic use-case:

def _update_key(dct, key, val):
    """
    Update a key in dict *dct*. If the key already exists in *dct* but
    maps to a different value, a set of the previous value(s) is created
    and the new value added to it.
    """
    if key in dct:
        existing = dct[key]
        if isinstance(existing, set):
            existing.add(val)
        elif existing != val:
            dct[key] = set([existing, val])
    else:
        dct[key] = val

The problem is that I both need to remove duplicates and retain insertion
order like list.append().
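In the meantime, the ordered de-duplication asked for here can be sketched on
top of OrderedDict (a minimal illustration only, not Raymond Hettinger's
recipe from the link above):

```python
from collections import OrderedDict

class OrderedSet:
    """Minimal sketch: discards duplicates while keeping insertion order."""
    def __init__(self, iterable=()):
        # fromkeys() drops duplicates and remembers first-seen order.
        self._d = OrderedDict.fromkeys(iterable)

    def add(self, item):
        # Re-adding an existing item keeps its original position.
        self._d.setdefault(item, None)

    def discard(self, item):
        self._d.pop(item, None)

    def __contains__(self, item):
        return item in self._d

    def __iter__(self):
        return iter(self._d)

    def __len__(self):
        return len(self._d)

s = OrderedSet(['b', 'a', 'b', 'c'])
s.add('a')
s.add('d')
print(list(s))  # ['b', 'a', 'c', 'd']
```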
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090412/708554fd/attachment.html>

From greg.ewing at canterbury.ac.nz  Sun Apr 12 01:15:24 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 12 Apr 2009 11:15:24 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49E073A2.1080804@improva.dk>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DEEE8D.8060906@improva.dk> <49DF3511.9050307@canterbury.ac.nz>
	<49DF96DD.4040406@improva.dk> <49E01B04.6010401@canterbury.ac.nz>
	<49E073A2.1080804@improva.dk>
Message-ID: <49E1248C.4090303@canterbury.ac.nz>

Jacob Holm wrote:

> Except you now have one extra function for exactly the same task, just 
> with a different calling convention.

I don't see anything wrong with that. If you look in the
stdlib, there are plenty of places where alternative APIs
are provided for the same functionality, e.g. in the re
module you have the module-level functions as well as the
match object methods.

I would rather have a couple of functions written in a
straightforward way than rely on a magic wrapper to
artificially munge them into one. Transparent is better
than opaque.

> The "suppress initial next()" feature *would* have helped, by enabling 
> you to write a generic wrapper to use at the call site that could do 
> exactly that.

Now you're just moving the wrappers from one place to
another. I can write a wrapper to convert any lookahead
taking parsing function into a non-lookahead one:

   def expect(f):
     first = yield
     return (yield from f(first))

So at the cost of just one extra function, I can call
any of my parsing functions using either style.
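To make the wrapper concrete, here is a runnable sketch (the parsing function
and driver are invented for illustration, and it needs the yield-from syntax
proposed in the PEP, i.e. Python 3.3 or later):

```python
def parse_rest(first):
    # A lookahead-style parsing function: takes the first token as an
    # argument and pulls the remaining tokens via bare yields.
    items = [first]
    while True:
        tok = yield
        if tok is None:
            return items
        items.append(tok)

def expect(f):
    # Adapts a lookahead-taking parser to the non-lookahead style:
    # the first token arrives via send() instead of as an argument.
    first = yield
    return (yield from f(first))

def run(gen, tokens):
    # Minimal driver: prime the generator, feed it tokens, and collect
    # the return value from the terminating StopIteration.
    next(gen)
    try:
        for tok in tokens:
            gen.send(tok)
    except StopIteration as e:
        return e.value

print(run(expect(parse_rest), [1, 2, 3, None]))  # [1, 2, 3]
```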

-- 
Greg



From greg.ewing at canterbury.ac.nz  Sun Apr 12 01:35:30 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 12 Apr 2009 11:35:30 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49E07C7E.4070802@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DEEE8D.8060906@improva.dk> <49DF3511.9050307@canterbury.ac.nz>
	<49DF96DD.4040406@improva.dk> <49E01B04.6010401@canterbury.ac.nz>
	<49E073A2.1080804@improva.dk> <49E07C7E.4070802@gmail.com>
Message-ID: <49E12942.8060701@canterbury.ac.nz>

Nick Coghlan wrote:
> The main change I
> would like from the original version of the PEP is for caching the bound
> methods to be explicitly disallowed in order to match the behaviour of
> normal for loops.

But if I let the expansion serve as a literal specification,
it won't match the behaviour of for-loops either, because
although it doesn't cache methods, PyIter_Next isn't
exactly the same as looking up next() on the instance
either.

I definitely don't want to preclude the implementation
from using PyIter_Next, as that would be a major
performance hit in the most common case.

I also don't want to preclude caching a send() method,
because in the absence of a __send__ typeslot it's the
only way we have of improving performance.

I don't care much about throw() or close(), because
they will rarely be called anyway. But by the same
token, little would be gained by a wrapper using fancy
tricks to redirect them.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Sun Apr 12 01:44:46 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 12 Apr 2009 11:44:46 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49E08DD1.20504@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DEEE8D.8060906@improva.dk> <49DF3511.9050307@canterbury.ac.nz>
	<49DF96DD.4040406@improva.dk> <49E01B04.6010401@canterbury.ac.nz>
	<49E073A2.1080804@improva.dk> <49E07C7E.4070802@gmail.com>
	<49E0871B.2050404@improva.dk> <49E08DD1.20504@gmail.com>
Message-ID: <49E12B6E.6080804@canterbury.ac.nz>

Nick Coghlan wrote:

> Since the language reference is actually silent on the topic of caching
> the bound method when iterating over an object,

Since it's silent about that, if you write a for-loop
that relies on the presence or absence of caching behaviour,
the result is undefined. The behaviour of yield-from
on the same iterator would also be undefined.

It's meaningless to talk about whether one undefined
construct has the same semantics as another.

-- 
Greg



From guido at python.org  Sun Apr 12 02:04:28 2009
From: guido at python.org (Guido van Rossum)
Date: Sat, 11 Apr 2009 17:04:28 -0700
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
	alternative name for yield-from]
In-Reply-To: <49E12B6E.6080804@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com> 
	<49DEEE8D.8060906@improva.dk> <49DF3511.9050307@canterbury.ac.nz> 
	<49DF96DD.4040406@improva.dk> <49E01B04.6010401@canterbury.ac.nz> 
	<49E073A2.1080804@improva.dk> <49E07C7E.4070802@gmail.com> 
	<49E0871B.2050404@improva.dk> <49E08DD1.20504@gmail.com>
	<49E12B6E.6080804@canterbury.ac.nz>
Message-ID: <ca471dc20904111704y242c42cby5cf27deb7f4c7ffb@mail.gmail.com>

On Sat, Apr 11, 2009 at 4:44 PM, Greg Ewing <greg.ewing at canterbury.ac.nz> wrote:
> Nick Coghlan wrote:
>> Since the language reference is actually silent on the topic of caching
>> the bound method when iterating over an object,
>
> Since it's silent about that, if you write a for-loop
> that relies on the presence or absence of caching behaviour,
> the result is undefined. The behaviour of yield-from
> on the same iterator would also be undefined.
>
> It's meaningless to talk about whether one undefined
> construct has the same semantics as another.

But I wouldn't claim that the language reference being silent means
that it's undefined. If I were asked for a clarification I would say
that caching shouldn't be allowed if it changes the meaning of the
program. Python in general favors *defined* semantics over leaving
things in the gray.

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From ncoghlan at gmail.com  Sun Apr 12 03:01:31 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 12 Apr 2009 11:01:31 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49E12B6E.6080804@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DEEE8D.8060906@improva.dk> <49DF3511.9050307@canterbury.ac.nz>
	<49DF96DD.4040406@improva.dk> <49E01B04.6010401@canterbury.ac.nz>
	<49E073A2.1080804@improva.dk> <49E07C7E.4070802@gmail.com>
	<49E0871B.2050404@improva.dk> <49E08DD1.20504@gmail.com>
	<49E12B6E.6080804@canterbury.ac.nz>
Message-ID: <49E13D6B.7030109@gmail.com>

Greg Ewing wrote:
> Nick Coghlan wrote:
> 
>> Since the language reference is actually silent on the topic of caching
>> the bound method when iterating over an object,
> 
> Since it's silent about that, if you write a for-loop
> that relies on the presence or absence of caching behaviour,
> the result is undefined. The behaviour of yield-from
> on the same iterator would also be undefined.
> 
> It's meaningless to talk about whether one undefined
> construct has the same semantics as another.

I agree that would be true in the absence of an accepted reference
implementation (i.e. CPython) that doesn't cache the bound methods
(hence allowing one to play games with the next() method definition
while looping over an iterator).

If I understand Guido's last message correctly, this is one of the cases
where he would like the existing behaviour of the CPython implementation
to be the defined behaviour for the language as well.

Cheers,
Nick.

P.S. I created http://bugs.python.org/issue5739 as a documentation bug
pointing back to this email thread in relation to whether it is OK for a
Python implementation to cache the next() method lookup in a for loop.

P.P.S. OK, stepping away from the computer and going on vacation now... :)

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Sun Apr 12 03:13:09 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 12 Apr 2009 11:13:09 +1000
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49E12942.8060701@canterbury.ac.nz>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz>
	<49DAC193.3030008@improva.dk> <49DBCCC6.1080601@gmail.com>
	<49DC04F7.5080707@improva.dk> <49DC8BD0.4080303@gmail.com>
	<49DC9AE8.8020108@improva.dk> <49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DEEE8D.8060906@improva.dk> <49DF3511.9050307@canterbury.ac.nz>
	<49DF96DD.4040406@improva.dk> <49E01B04.6010401@canterbury.ac.nz>
	<49E073A2.1080804@improva.dk> <49E07C7E.4070802@gmail.com>
	<49E12942.8060701@canterbury.ac.nz>
Message-ID: <49E14025.1020709@gmail.com>

Greg Ewing wrote:
> I definitely don't want to preclude the implementation
> from using PyIter_Next, as that would be a major
> performance hit in the most common case.

We already have a general caveat in the docs saying that an
implementation is allowed (or sometimes even required) to bypass normal
attribute lookup for special methods defined by the language. You may
want to point to that caveat from the PEP:

http://docs.python.org/reference/datamodel.html#special-method-lookup-for-new-style-classes
http://docs.python.org/3.0/reference/datamodel.html#special-method-lookup

Being able to use PyIter_Next and the various other typeslots is exactly
what that caveat is about.

One way you can make the expansion more explicit about bypassing the
instance is to write things like "type(itr).send(itr, val)" instead of
"itr.send(val)".
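The "bypass the instance" behaviour the caveat allows can be seen directly
with the iterator protocol (a hedged illustration; the class name is
invented):

```python
class Squares:
    """Special methods are looked up on the type, not the instance."""
    def __init__(self):
        self.i = 0

    def __iter__(self):
        return self

    def __next__(self):
        self.i += 1
        if self.i > 3:
            raise StopIteration
        return self.i ** 2

s = Squares()
# An instance-level __next__ is ignored by the iteration machinery,
# which is why writing the expansion as type(itr).send(itr, val)
# describes what actually happens more faithfully than itr.send(val).
s.__next__ = lambda: 'never called'
print(list(iter(s)))  # [1, 4, 9]
```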

> I also don't want to preclude caching a send() method,
> because in the absence of a __send__ typeslot it's the
> only way we have of improving performance.

Actually, we do have another way of improving performance - add a
typeslot for it :)

That can be left until we find out whether or not the lookup of send()
becomes a performance bottleneck for yield-from usage (which I doubt
will be the case).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From greg.ewing at canterbury.ac.nz  Sun Apr 12 08:02:16 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 12 Apr 2009 18:02:16 +1200
Subject: [Python-ideas] x=(yield from) confusion [was:Yet another
 alternative name for yield-from]
In-Reply-To: <49E14025.1020709@gmail.com>
References: <fb6fbf560904021843j5aeb131fv9f11d212c0674476@mail.gmail.com>
	<49DA8423.9070101@canterbury.ac.nz> <49DAC193.3030008@improva.dk>
	<49DBCCC6.1080601@gmail.com> <49DC04F7.5080707@improva.dk>
	<49DC8BD0.4080303@gmail.com> <49DC9AE8.8020108@improva.dk>
	<49DCBD02.8030806@improva.dk>
	<ca471dc20904081121x56e61879q27ae9da604e82dc7@mail.gmail.com>
	<49DD62FA.9080504@improva.dk>
	<ca471dc20904082055s5f838a43tc1c175c8bf02cad9@mail.gmail.com>
	<49DDE66D.9080401@improva.dk> <49DDF1F9.8060904@gmail.com>
	<49DE0481.1010309@improva.dk> <49DEA06A.9030102@gmail.com>
	<49DEEE8D.8060906@improva.dk> <49DF3511.9050307@canterbury.ac.nz>
	<49DF96DD.4040406@improva.dk> <49E01B04.6010401@canterbury.ac.nz>
	<49E073A2.1080804@improva.dk> <49E07C7E.4070802@gmail.com>
	<49E12942.8060701@canterbury.ac.nz> <49E14025.1020709@gmail.com>
Message-ID: <49E183E8.5030300@canterbury.ac.nz>

Nick Coghlan wrote:

> We already have a general caveat in the docs saying that an
> implementation is allowed (or sometimes even required) to bypass normal
> attribute lookup for special methods defined by the language.

However, in 2.x it's not obvious that next() is a special
method, because it doesn't have an __xxx__ name.

I think what I'll do is this:

* Use Python 3 syntax for the expansion, and write
   next(_i) instead of _i.next().

* Not say anything one way or the other about caching
   methods.

-- 
Greg


From mrts.pydev at gmail.com  Sun Apr 12 12:40:12 2009
From: mrts.pydev at gmail.com (=?ISO-8859-1?Q?Mart_S=F5mermaa?=)
Date: Sun, 12 Apr 2009 13:40:12 +0300
Subject: [Python-ideas] Proposed addtion to urllib.parse in 3.1 (and
	urlparse in 2.7)
In-Reply-To: <ad1f81530903300522m51fd1099s90c05983ae748fa3@mail.gmail.com>
References: <ad1f81530903260849k7867f8e7k45a558f3cb608dd3@mail.gmail.com>
	<49CD2930.4080307@cornell.edu> <gqjnti$qes$1@ger.gmane.org>
	<91ad5bf80903271728ka18360cpd514aa5dd93cd74a@mail.gmail.com>
	<ca471dc20903271926y61f16740h8c3f29e4a1e4c376@mail.gmail.com>
	<ad1f81530903300304m796e75dmc942d38c015e4fc6@mail.gmail.com>
	<49D09ECF.5090407@trueblade.com>
	<ad1f81530903300355g2e112cadwcf5250761d4e1f87@mail.gmail.com>
	<49D0ACD5.5090209@gmail.com>
	<ad1f81530903300522m51fd1099s90c05983ae748fa3@mail.gmail.com>
Message-ID: <ad1f81530904120340n675c03f3u51c573cd3a6df404@mail.gmail.com>

The general consensus in python-ideas is that the following is needed, so I
bring it to python-dev to final discussions before I file a feature request
in bugs.python.org.

Proposal: add add_query_params() for appending query parameters to an URL to
urllib.parse and urlparse.

Implementation:
http://github.com/mrts/qparams/blob/83d1ec287ec10934b5e637455819cf796b1b421c/qparams.py
(feel free to fork and comment).

Behaviour (longish, guided by "simple things are simple, complex things
possible"):

In the simplest form, parameters can be passed via keyword arguments:

    >>> add_query_params('foo', bar='baz')
    'foo?bar=baz'

    >>> add_query_params('http://example.com/a/b/c?a=b', b='d')
    'http://example.com/a/b/c?a=b&b=d'

Note that '/', if given in arguments, is encoded:

    >>> add_query_params('http://example.com/a/b/c?a=b', b='d', foo='/bar')
    'http://example.com/a/b/c?a=b&b=d&foo=%2Fbar'

Duplicates are discarded:

    >>> add_query_params('http://example.com/a/b/c?a=b', a='b')
    'http://example.com/a/b/c?a=b'

    >>> add_query_params('http://example.com/a/b/c?a=b&c=q', a='b', b='d',
    ...  c='q')
    'http://example.com/a/b/c?a=b&c=q&b=d'

But different values for the same key are supported:

    >>> add_query_params('http://example.com/a/b/c?a=b', a='c', b='d')
    'http://example.com/a/b/c?a=b&a=c&b=d'

Pass different values for a single key in a list (again, duplicates are
removed):

    >>> add_query_params('http://example.com/a/b/c?a=b', a=('q', 'b', 'c'),
    ... b='d')
    'http://example.com/a/b/c?a=b&a=q&a=c&b=d'

Keys with no value are respected, pass ``None`` to create one:

    >>> add_query_params('http://example.com/a/b/c?a', b=None)
    'http://example.com/a/b/c?a&b'

But if a value is given, the empty key is considered a duplicate (i.e. the
case of a&a=b is considered nonsensical):

    >>> add_query_params('http://example.com/a/b/c?a', a='b', c=None)
    'http://example.com/a/b/c?a=b&c'

If you need to pass in key names that are not allowed in keyword arguments,
pass them via a dictionary in second argument:

    >>> add_query_params('foo', {"+'|äüö": 'bar'})
    'foo?%2B%27%7C%C3%A4%C3%BC%C3%B6=bar'

Order of original parameters is retained, although similar keys are grouped
together. Order of keyword arguments is not (and cannot be) retained:

    >>> add_query_params('foo?a=b&b=c&a=b&a=d', a='b')
    'foo?a=b&a=d&b=c'

    >>> add_query_params('http://example.com/a/b/c?a=b&q=c&e=d',
    ... x='y', e=1, o=2)
    'http://example.com/a/b/c?a=b&q=c&e=d&e=1&x=y&o=2'

If you need to retain the order of the added parameters, use an
:class:`OrderedDict` as the second argument (*params_dict*):

    >>> from collections import OrderedDict
    >>> od = OrderedDict()
    >>> od['xavier'] = 1
    >>> od['abacus'] = 2
    >>> od['janus'] = 3
    >>> add_query_params('http://example.com/a/b/c?a=b', od)
    'http://example.com/a/b/c?a=b&xavier=1&abacus=2&janus=3'

If both *params_dict* and keyword arguments are provided, values from the
former are used before the latter:

    >>> add_query_params('http://example.com/a/b/c?a=b', od, xavier=1.1,
    ... zorg='a', alpha='b', watt='c', borg='d')
    'http://example.com/a/b/c?a=b&xavier=1&xavier=1.1&abacus=2&janus=3&zorg=a&borg=d&watt=c&alpha=b'

Do nothing with a single argument:

    >>> add_query_params('a')
    'a'

    >>> add_query_params('arbitrary strange stuff?öäüõ*()+-=42')
    'arbitrary strange stuff?\xc3\xb6\xc3\xa4\xc3\xbc\xc3\xb5*()+-=42'
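For reference, the core append behaviour can be sketched on top of existing
stdlib pieces (a minimal illustration only, not the implementation linked
above; in particular it keeps duplicates and the original parameter order
rather than pruning and grouping as described in some of the examples):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def add_query_params(url, params_dict=None, **kwargs):
    """Append query parameters to *url*, keeping the existing ones.

    A sketch of the proposed API: existing pairs come first, then pairs
    from *params_dict*, then keyword arguments. Duplicates are kept.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    pairs = parse_qsl(query, keep_blank_values=True)
    if params_dict:
        pairs.extend(params_dict.items())
    pairs.extend(kwargs.items())
    # urlencode percent-encodes values, so e.g. '/' becomes %2F.
    return urlunsplit((scheme, netloc, path, urlencode(pairs), fragment))

print(add_query_params('http://example.com/a/b/c?a=b', b='d'))
# http://example.com/a/b/c?a=b&b=d
```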
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090412/d48fe5c7/attachment.html>

From jh at improva.dk  Sun Apr 12 14:23:54 2009
From: jh at improva.dk (Jacob Holm)
Date: Sun, 12 Apr 2009 14:23:54 +0200
Subject: [Python-ideas] Proposed addtion to urllib.parse in 3.1
 (and	urlparse in 2.7)
In-Reply-To: <ad1f81530904120340n675c03f3u51c573cd3a6df404@mail.gmail.com>
References: <ad1f81530903260849k7867f8e7k45a558f3cb608dd3@mail.gmail.com>	<49CD2930.4080307@cornell.edu>
	<gqjnti$qes$1@ger.gmane.org>	<91ad5bf80903271728ka18360cpd514aa5dd93cd74a@mail.gmail.com>	<ca471dc20903271926y61f16740h8c3f29e4a1e4c376@mail.gmail.com>	<ad1f81530903300304m796e75dmc942d38c015e4fc6@mail.gmail.com>	<49D09ECF.5090407@trueblade.com>	<ad1f81530903300355g2e112cadwcf5250761d4e1f87@mail.gmail.com>	<49D0ACD5.5090209@gmail.com>	<ad1f81530903300522m51fd1099s90c05983ae748fa3@mail.gmail.com>
	<ad1f81530904120340n675c03f3u51c573cd3a6df404@mail.gmail.com>
Message-ID: <49E1DD5A.30405@improva.dk>

Hi Mart

I haven't really followed this thread closely, so I apologize if some of 
my comments below have already been addressed.

Mart Sõmermaa wrote:
> The general consensus in python-ideas is that the following is needed, 
> so I bring it to python-dev to final discussions before I file a 
> feature request in bugs.python.org <http://bugs.python.org>.
>
> Proposal: add add_query_params() for appending query parameters to an 
> URL to urllib.parse and urlparse.
>
> Implementation: 
> http://github.com/mrts/qparams/blob/83d1ec287ec10934b5e637455819cf796b1b421c/qparams.py 
> (feel free to fork and comment).
>
> Behaviour (longish, guided by "simple things are simple, complex 
> things possible"):
>
> In the simplest form, parameters can be passed via keyword arguments:
>
>     >>> add_query_params('foo', bar='baz')
>     'foo?bar=baz'
>
>     >>> add_query_params('http://example.com/a/b/c?a=b', b='d')
>     'http://example.com/a/b/c?a=b&b=d <http://example.com/a/b/c?a=b&b=d>'
>
> Note that '/', if given in arguments, is encoded:
>
>     >>> add_query_params('http://example.com/a/b/c?a=b', b='d', 
> foo='/bar')
>     'http://example.com/a/b/c?a=b&b=d&foo=%2Fbar 
> <http://example.com/a/b/c?a=b&b=d&foo=%2Fbar>'
>
> Duplicates are discarded:

Why discard duplicates?  They are valid and have a well-defined meaning.

>
>     >>> add_query_params('http://example.com/a/b/c?a=b', a='b')
>     'http://example.com/a/b/c?a=b'

I would prefer: 'http://example.com/a/b/c?a=b&a=b'

>
>     >>> add_query_params('http://example.com/a/b/c?a=b&c=q 
> <http://example.com/a/b/c?a=b&c=q>', a='b', b='d',
>     ...  c='q')
>     'http://example.com/a/b/c?a=b&c=q&b=d 
> <http://example.com/a/b/c?a=b&c=q&b=d>'
>

I would prefer: 'http://example.com/a/b/c?a=b&c=q&a=b&b=d'


> But different values for the same key are supported:
>
>     >>> add_query_params('http://example.com/a/b/c?a=b', a='c', b='d')
>     'http://example.com/a/b/c?a=b&a=c&b=d 
> <http://example.com/a/b/c?a=b&a=c&b=d>'
>
> Pass different values for a single key in a list (again, duplicates are
> removed):
>
>     >>> add_query_params('http://example.com/a/b/c?a=b', a=('q', 'b', 
> 'c'),
>     ... b='d')
>     'http://example.com/a/b/c?a=b&a=q&a=c&b=d 
> <http://example.com/a/b/c?a=b&a=q&a=c&b=d>'
>
> Keys with no value are respected, pass ``None`` to create one:
>
>     >>> add_query_params('http://example.com/a/b/c?a', b=None)
>     'http://example.com/a/b/c?a&b <http://example.com/a/b/c?a&b>'
>
> But if a value is given, the empty key is considered a duplicate (i.e. the
> case of a&a=b is considered nonsensical):

Again, it is a valid url and this will change its meaning.  Why?

>
>     >>> add_query_params('http://example.com/a/b/c?a', a='b', c=None)
>     'http://example.com/a/b/c?a=b&c <http://example.com/a/b/c?a=b&c>'
>
> If you need to pass in key names that are not allowed in keyword 
> arguments,
> pass them via a dictionary in second argument:
>
>     >>> add_query_params('foo', {"+'|äüö": 'bar'})
>     'foo?%2B%27%7C%C3%A4%C3%BC%C3%B6=bar'
>
> Order of original parameters is retained, although similar keys are 
> grouped
> together. 

Why the grouping?  Is it a side effect of your desire to discard 
duplicates?   Changing the order like that changes the meaning of the 
url.  A concrete case where the order of field names matters is the 
":records" converter in http://pypi.python.org/pypi/zope.httpform/1.0.1 
(a small independent package extracted from the form handling code in zope).

> Order of keyword arguments is not (and cannot be) retained:
>
>     >>> add_query_params('foo?a=b&b=c&a=b&a=d', a='b')
>     'foo?a=b&a=d&b=c'
>
>     >>> add_query_params('http://example.com/a/b/c?a=b&q=c&e=d 
> <http://example.com/a/b/c?a=b&q=c&e=d>',
>     ... x='y', e=1, o=2)
>     'http://example.com/a/b/c?a=b&q=c&e=d&e=1&x=y&o=2 
> <http://example.com/a/b/c?a=b&q=c&e=d&e=1&x=y&o=2>'
>
> If you need to retain the order of the added parameters, use an
> :class:`OrderedDict` as the second argument (*params_dict*):
>
>     >>> from collections import OrderedDict
>     >>> od = OrderedDict()
>     >>> od['xavier'] = 1
>     >>> od['abacus'] = 2
>     >>> od['janus'] = 3
>     >>> add_query_params('http://example.com/a/b/c?a=b', od)
>     'http://example.com/a/b/c?a=b&xavier=1&abacus=2&janus=3 
> <http://example.com/a/b/c?a=b&xavier=1&abacus=2&janus=3>'
>
> If both *params_dict* and keyword arguments are provided, values from the
> former are used before the latter:
>
>     >>> add_query_params('http://example.com/a/b/c?a=b', od, xavier=1.1,
>     ... zorg='a', alpha='b', watt='c', borg='d')
>     
> 'http://example.com/a/b/c?a=b&xavier=1&xavier=1.1&abacus=2&janus=3&zorg=a&borg=d&watt=c&alpha=b 
> <http://example.com/a/b/c?a=b&xavier=1&xavier=1.1&abacus=2&janus=3&zorg=a&borg=d&watt=c&alpha=b>'
>
> Do nothing with a single argument:
>
>     >>> add_query_params('a')
>     'a'
>
>     >>> add_query_params('arbitrary strange stuff?öäüõ*()+-=42')
>     'arbitrary strange stuff?\xc3\xb6\xc3\xa4\xc3\xbc\xc3\xb5*()+-=42'

If you change it to keep duplicates and not unnecessarily mangle the 
field order I am +1, else I am -0.

- Jacob


From mrts.pydev at gmail.com  Sun Apr 12 15:15:46 2009
From: mrts.pydev at gmail.com (=?ISO-8859-1?Q?Mart_S=F5mermaa?=)
Date: Sun, 12 Apr 2009 16:15:46 +0300
Subject: [Python-ideas] Proposed addtion to urllib.parse in 3.1 (and
	urlparse in 2.7)
In-Reply-To: <49E1DD5A.30405@improva.dk>
References: <ad1f81530903260849k7867f8e7k45a558f3cb608dd3@mail.gmail.com>
	<91ad5bf80903271728ka18360cpd514aa5dd93cd74a@mail.gmail.com>
	<ca471dc20903271926y61f16740h8c3f29e4a1e4c376@mail.gmail.com>
	<ad1f81530903300304m796e75dmc942d38c015e4fc6@mail.gmail.com>
	<49D09ECF.5090407@trueblade.com>
	<ad1f81530903300355g2e112cadwcf5250761d4e1f87@mail.gmail.com>
	<49D0ACD5.5090209@gmail.com>
	<ad1f81530903300522m51fd1099s90c05983ae748fa3@mail.gmail.com>
	<ad1f81530904120340n675c03f3u51c573cd3a6df404@mail.gmail.com>
	<49E1DD5A.30405@improva.dk>
Message-ID: <ad1f81530904120615o92e786cv184716098887c33a@mail.gmail.com>

On Sun, Apr 12, 2009 at 3:23 PM, Jacob Holm <jh at improva.dk> wrote:

> Hi Mart
>
>    >>> add_query_params('http://example.com/a/b/c?a=b', b='d', foo='/bar')
>>    'http://example.com/a/b/c?a=b&b=d&foo=%2Fbar <
>> http://example.com/a/b/c?a=b&b=d&foo=%2Fbar>'
>>
>> Duplicates are discarded:
>>
>
> Why discard duplicates?  They are valid and have a well-defined meaning.



The bad thing about reasoning about query strings is that there is no
comprehensive documentation about their meaning. Both RFC 1738 and RFC 3986
are rather vague in that matter. But I agree that duplicates actually have a
meaning (an ordered list of identical values), so I'll remove the bits that
prune them unless anyone opposes (which I doubt).


>> But if a value is given, the empty key is considered a duplicate (i.e. the
>> case of a&a=b is considered nonsensical):
>>
>
> Again, it is a valid url and this will change its meaning.  Why?


I'm uncertain whether a&a=b has a meaning, but don't see any harm in
supporting it, so I'll add the feature.


>>    >>> add_query_params('http://example.com/a/b/c?a', a='b', c=None)
>>    'http://example.com/a/b/c?a=b&c <http://example.com/a/b/c?a=b&c>'
>>
>> If you need to pass in key names that are not allowed in keyword
>> arguments,
>> pass them via a dictionary in second argument:
>>
>>    >>> add_query_params('foo', {"+'|???": 'bar'})
>>    'foo?%2B%27%7C%C3%A4%C3%BC%C3%B6=bar'
>>
>> Order of original parameters is retained, although similar keys are
>> grouped
>> together.
>>
>
> Why the grouping?  Is it a side effect of your desire to discard
> duplicates?   Changing the order like that changes the meaning of the url.
>  A concrete case where the order of field names matters is the ":records"
> converter in http://pypi.python.org/pypi/zope.httpform/1.0.1 (a small
> independent package extracted from the form handling code in zope).


It's partly related to duplicate handling, but it mostly stems from the data
structure used in the initial implementation (an OrderedDict). Re-grouping
is removed now and not having to deal with duplicates simplified the code
considerably (using a simple list of key-value tuples now).

If you change it to keep duplicates and not unnecessarily mangle the field
> order I am +1, else I am -0.


Thanks for your input! Changes pushed to github (see the updated behaviour
there as well):

http://github.com/mrts/qparams/blob/4f32670b55082f8d0ef01c33524145c3264c161a/qparams.py

MS
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090412/a322d5ec/attachment.html>

From cs at zip.com.au  Sun Apr 12 23:17:46 2009
From: cs at zip.com.au (Cameron Simpson)
Date: Mon, 13 Apr 2009 07:17:46 +1000
Subject: [Python-ideas] Proposed addtion to urllib.parse in 3.1 (and
	urlparse in 2.7)
In-Reply-To: <ad1f81530904120615o92e786cv184716098887c33a@mail.gmail.com>
Message-ID: <20090412211746.GA23767@cskk.homeip.net>

On 12Apr2009 16:15, Mart Sõmermaa <mrts.pydev at gmail.com> wrote:
| On Sun, Apr 12, 2009 at 3:23 PM, Jacob Holm <jh at improva.dk> wrote:
| > Hi Mart
| >    >>> add_query_params('http://example.com/a/b/c?a=b', b='d', foo='/bar')
| >>    'http://example.com/a/b/c?a=b&b=d&foo=%2Fbar <
| >> http://example.com/a/b/c?a=b&b=d&foo=%2Fbar>'
| >>
| >> Duplicates are discarded:
| >
| > Why discard duplicates?  They are valid and have a well-defined meaning.
| 
| The bad thing about reasoning about query strings is that there is no
| comprehensive documentation about their meaning. Both RFC 1738 and RFC 3986
| are rather vague in that matter. But I agree that duplicates actually have a
| meaning (an ordered list of identical values), so I'll remove the bits that
| prune them unless anyone opposes (which I doubt).

+1 from me, with the following suggestion: it's probably worth adding to
the doco that people working with dict-style query_string params should
probably go make a dict or OrderedDict and use:

  add_query_params(..., **the_dict)

just to make the other use case obvious.

An alternative would be to have add_ and append_ methods with set and
list behaviour. Feels a little like API bloat, though the convenience
function can be nice.
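
For illustration, a minimal sketch (not the actual qparams code; the helper
below is hypothetical, written with the 3.x urllib.parse names) of an
add_query_params that keeps duplicates and field order, and supports both
the dict-style second argument and the **the_dict keyword form:

    from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

    def add_query_params(url, *args, **kwargs):
        # Hypothetical sketch: keep the existing params verbatim
        # (duplicates and order included), then append the new ones.
        parts = urlparse(url)
        params = parse_qsl(parts.query, keep_blank_values=True)
        if args:
            params.extend(args[0].items())  # dict-style second argument
        params.extend(kwargs.items())       # **the_dict / keyword form
        return urlunparse(parts._replace(query=urlencode(params)))

    print(add_query_params('http://example.com/a?a=b', b='d'))
    # -> http://example.com/a?a=b&b=d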

Cheers,
-- 
Cameron Simpson <cs at zip.com.au> DoD#743
http://www.cskk.ezoshosting.com/cs/

The wonderous pulp and fibre of the brain had been substituted by brass and
iron; he had taught wheelwork to think. - Harry Wilmot Buxton 1832,
                referring to Charles Babbage and his difference engine.


From erik at cq2.org  Tue Apr 14 17:11:46 2009
From: erik at cq2.org (Erik Groeneveld)
Date: Tue, 14 Apr 2009 17:11:46 +0200
Subject: [Python-ideas] yield-from practical experience
Message-ID: <aaec99390904140811g3ac7990bnfcd4298e0ab19bfb@mail.gmail.com>

Andrew McNabb suggested I take a look at the python-ideas thread about
yield-from because I have written a little framework called Weightless
that supports decomposing a program into generators.  I am aware that
I am a bit late, but here is my two cents.

I read the PEP and the discussions (most of them, to be fair) and I
compared them to what I have been creating over the last couple of years.
For one thing, it is certainly not easy to create a comprehensible way
of decomposing programs into generators, especially in the presence of
exceptions.  It took me two years to understand it, and I have coded
it into a single generator called 'compose'
(http://www.weightless.io/compose).

As some people wanted to see an implementation of 'yield from': here
is one that actually uses the normal yield and just assumes any
generator is to be descended into.
(http://weightless.svn.sourceforge.net/viewvc/weightless/trunk/weightless/_compose.py?revision=17&view=markup)
'compose' allows one to decompose a program into generators, just like
it has been proposed.

It was also questioned why all the complexity is needed. I believe
it is necessary to create a clear, abstract, comprehensible and
consistent programming interface.  Consider these examples:

Many programmers would expect this to work:

       def myHandler():
               try:
                       request = yield from readRequest()
               except:
                       handle error

But it does not work unless the exceptions from readRequest() are
caught and forwarded to myHandler().

Again, many programmers would expect this to work:


       def a():
               yield from b()
       def b():
               yield from c()
       def c():
               raise Exception('b')
               yield 1

You would like to see a stack trace like:

       Traceback (most recent call last):
         File "example.py", line 34, in
           list(compose(a()))
         File "example.py", line 27, in a
           yield b()
         File "example.py", line 29, in b
           yield c()
         File "example.py", line 32, in c
           raise Exception('b')
       Exception: b

but you will see this, for more or less the same reason that you
weren't able to catch exceptions in the first place:

       Traceback (most recent call last):
         File "probeer.py", line 34, in
           list(compose(a()))
         File "probeer.py", line 32, in c
           raise Exception('b')
       Exception: b

We succeeded in creating a rather complicated 'compose' that gives
programmers an intuitive decomposition tool, and we are using it on a
daily basis (see link above).
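
To make the mechanism concrete, here is a toy trampoline in the spirit of
'compose' (the names are mine, not Weightless's API): it descends into any
yielded generator and uses throw() to forward a sub-generator's exception
back into the delegating frame, which is what makes the try/except in
myHandler work:

    def compose(gen):
        # Toy sketch, not the Weightless implementation: descend into any
        # yielded generator and forward exceptions raised inside it into
        # the delegating generator via throw().
        stack = [gen]
        to_send, error = None, None
        while stack:
            try:
                if error is not None:
                    value = stack[-1].throw(error)  # reraises at the yield
                    error = None
                else:
                    value = stack[-1].send(to_send)
            except StopIteration:
                stack.pop()
                to_send = None
                continue
            except BaseException as e:
                stack.pop()
                if not stack:
                    raise           # no delegator left to forward to
                error = e
                continue
            if hasattr(value, 'send'):  # treat yielded generators as sub-tasks
                stack.append(value)
                to_send = None
            else:
                to_send = yield value

    def readRequest():
        raise ValueError('bad request')
        yield  # makes this a generator

    def myHandler():
        try:
            yield readRequest()
        except ValueError:
            yield 'handled'

    print(list(compose(myHandler())))  # -> ['handled']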

One of the main advantages (besides lightweight threading) is the use
of JSP-style pipelines.  Things like an HTTP protocol handler, for
example, are much easier to create (they have a more friendly flow, as
Klaas van Schelven puts it) than with callbacks.  A simple example
can be seen here:
http://weightless.svn.sourceforge.net/viewvc/weightless/trunk/examples/callbacksvsgenerator.py?revision=9&view=markup
I've set this up for the Dutch Python Usergroup and it demonstrates
the idea.

I really would be glad if something like 'yield from' became part of
the language.  I would love to cooperate on its realization and
testing, so I can validate my own creations of the last years and
perhaps contribute something.


Erik Groeneveld


From benjamin at python.org  Wed Apr 15 00:08:00 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Tue, 14 Apr 2009 22:08:00 +0000 (UTC)
Subject: [Python-ideas] separating streams from their underlying buffer
Message-ID: <loom.20090414T215600-168@post.gmane.org>

Currently, if you want to write or read binary data from a stream you have
wrapped in a TextIOWrapper, you must keep the TextIOWrapper alive;
otherwise it will close the underlying buffer. I think we should provide a
disconnect() method for BufferedIOBase and TextIOBase that removes the
wrapper and returns the raw stream or buffer respectively.



From pyideas at rebertia.com  Wed Apr 15 08:35:22 2009
From: pyideas at rebertia.com (Chris Rebert)
Date: Tue, 14 Apr 2009 23:35:22 -0700
Subject: [Python-ideas] accurate errors for "magic" methods
In-Reply-To: <grniuk$3ks$1@ger.gmane.org>
References: <20090409190836.6c1b0802@o> <grlcs0$p5c$1@ger.gmane.org>
	<200904101445.13348.steve@pearwood.info> <grniuk$3ks$1@ger.gmane.org>
Message-ID: <50697b2c0904142335k57410de8h7a1a3f406ce4e570@mail.gmail.com>

On Fri, Apr 10, 2009 at 6:52 AM, R. David Murray <rdmurray at bitdance.com> wrote:
> Steven D'Aprano <steve at pearwood.info> wrote:
>> On Fri, 10 Apr 2009 03:56:16 am Georg Brandl wrote:
>> > spir schrieb:
>> > > Actually, I'm wrong: it's perfectly clear as long as the programmer
>> > > is able to follow all the necessary reflexion path; then probably
>> > > also able to solve the problem without any help from python.
>> > >
>> > > The issue here is that a very specific (and meaningful) case
>> > > (dict-like behaviour missing) is addressed using a very generic (and
>> > > thus unhelpful) message (AttributeError).
>> > >
>> > > I think error cases about "magic" methods, that implement
>> > > conceptually meaningful behaviours, should have appropriate
>> > > messages. In the case above, maybe something like: "Values instance
>> > > is not an item container (no __getitem__ method found)."
>> >
>> > The time machine strikes again:
>> > >>> class A(object): pass
>> >
>> > ...
>> >
>> > >>> A()['a']
>> >
>> > Traceback (most recent call last):
>> >   File "<stdin>", line 1, in <module>
>> > TypeError: 'A' object is unsubscriptable
>> >
>> >
>> > (the difference being that A is new-style, while Values is
>> > old-style.)
>>
>>
>> Except that the error "object is unsubscriptable" might as well be in
>> Klingon to most people, particularly newbies.
>>
>> (1) It's easy to misread it as "unscriptable", which is even more
>> mysterious.
>>
>> (2) As far as I know, there's no tradition of describing key or index
>> lookup as "subscripting" in Python. I've never noticed it in doc
>> strings or the online docs, and after hanging around comp.lang.python
>> extensively for years, I feel safe to say it's not a common term among
>> even experienced Python developers. I suppose that there's a weak
>> connection between index lookup and subscripts in mathematics.
>>
>> (3) The classic error message tells the newbie exactly what the error
>> is: the object has no __getitem__ method. The new error message tells
>> the newbie nothing useful. Given that obj is unsubscriptable, what
>> needs to be done to make it subscriptable?
>
> +1
>
> And not just the newbie, either.  The experienced python programmer
> looks at the original message and goes "ah ha".  The experienced python
> programmer looks at the new message and has to _think about it_ before
> understanding arrives.  I think that would be true even if you found a
> better word than 'unsubscriptable'.  "Does not implement an item lookup
> method" might work.  Or how about, 'Does not implement the Sequence or
> Mapping interface'?
>
> But you know what?  Doing a google search for 'python mapping' gets you
> the description of dictionaries, while doing a google search for 'python
> __getitem__' gets you to the data model chapter that describes how the
> mapping/sequence behavior is implemented via __getitem__.  The latter
> is more useful even to the newbie, I think.
>
> So even if it looks ugly, I think the error message should mention
> __getitem__.

I have gone ahead and filed a bug: http://bugs.python.org/issue5760

Cheers,
Chris
--
I have a blog:
http://blog.rebertia.com


From greg.ewing at canterbury.ac.nz  Wed Apr 15 08:36:38 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Wed, 15 Apr 2009 18:36:38 +1200
Subject: [Python-ideas] Revised**10 PEP on Yield-From
Message-ID: <49E58076.4060202@canterbury.ac.nz>

Draft 11 of the PEP.

Changes in this version:

- GeneratorExit always calls close() and is always
   reraised.

- Special handling of thrown-in StopIterations
   removed, since Guido doesn't think you should be
   doing that in the first place.

- Expansion uses next(_i) instead of _i.next() and
   doesn't mention cacheing of methods.

-- 
Greg
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: yield-from-rev11d.txt
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090415/cee11f31/attachment.txt>

From solipsis at pitrou.net  Wed Apr 15 11:43:09 2009
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Wed, 15 Apr 2009 09:43:09 +0000 (UTC)
Subject: [Python-ideas] separating streams from their underlying buffer
References: <loom.20090414T215600-168@post.gmane.org>
Message-ID: <loom.20090415T094109-712@post.gmane.org>

Benjamin Peterson <benjamin at ...> writes:
> 
> I think we should provide a disconnect() method for
> BufferedIOBase and TextIOBase that removes the wrapper and returns the raw
> stream or buffer respectively.

Good idea :)
disconnect() sounds network-y, how about detach()?
Also, it should probably do an implicit flush of the internal buffer.
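
If the method were called detach() as suggested, the behaviour might look
like this (a sketch of the proposed semantics only; detach() does not exist
at the time of this thread):

    import io

    # Sketch of the proposed detach() semantics: separating the wrapper
    # flushes it and hands back the underlying buffer without closing it.
    raw = io.BytesIO()
    text = io.TextIOWrapper(raw, encoding='utf-8')
    text.write('hi')
    buf = text.detach()     # implicit flush, returns the underlying buffer
    buf.write(b' bytes')    # the buffer stays usable after detaching
    assert raw.getvalue() == b'hi bytes'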




From denis.spir at free.fr  Wed Apr 15 11:50:37 2009
From: denis.spir at free.fr (spir)
Date: Wed, 15 Apr 2009 11:50:37 +0200
Subject: [Python-ideas] Why not Ruby -- and what else
Message-ID: <20090415115037.6fff0824@o>

I have just found the following from http://en.wikipedia.org/wiki/Iterator#Implicit_iterators

===================================
Some object-oriented languages such as Perl, Python, C#, Ruby and later versions of Java and Delphi provide an intrinsic way of iterating through the elements of a container object without the introduction of an explicit iterator object. An actual iterator object may exist in reality, but if it does it is not exposed within the source code of the language.

Implicit iterators are often manifested by a "foreach" statement (or equivalent), such as in the following Python example:

for value in iterable:
    print value

Or other times they may be created by the collection object itself, as in this Ruby example:

iterable.each do |value|
  puts value
end
==================================

For some weird reason, Ruby's famous PLP (principle of least surprise) translates for me into PHC (principle of highest confusion). As if adding colours, fonts and shapes made a better visual design.

Still, instead of Python's version, my favorite language would say:

(1) syntactic magic version

   traverse container with item
      console.write item
or
   through container with item
      console.write item

Which I find both more obvious and explicit.

(2) OO version (see the Io language for a worthwhile source of inspiration)

   container.traverse(item,
      console.write item
      )

The latter case intends to semantically express the fact that 'traverse' (or 'through', or whatever) is a method that takes an item _name_ and a code block as parameters. Actually, if consistency really counts, it should read:

   container.traverse("item",
      console.write item
      )
or even
   container.traverse("item",
      "console.write item"
      )

Which is ugly, isn't it? That's the reason why I still prefer syntactic magic ;-) Or we could consider another possibility:

(3) postfix version

   container traverse
      console write
or
   container.traverse
      console.write

which often simply gets rid of function parameter names. Whatever happens to be on the data stack will be written out -- in this case, each item provided by traverse. Another view of the same thing: 'traverse' is a higher-order function that takes 'console write' as a parameter and feeds it the data to write out.
Postfix notation is imo an endless source of elegant formulation -- as long as stack juggling can be avoided.

Denis
------
la vita e estrany


From jh at improva.dk  Wed Apr 15 12:53:52 2009
From: jh at improva.dk (Jacob Holm)
Date: Wed, 15 Apr 2009 12:53:52 +0200
Subject: [Python-ideas] Revised**10 PEP on Yield-From
In-Reply-To: <49E58076.4060202@canterbury.ac.nz>
References: <49E58076.4060202@canterbury.ac.nz>
Message-ID: <49E5BCC0.3060907@improva.dk>

Hi Greg

A few comments on the latest PEP 380 version (rev11d)...

1) IIRC, the use of sys.exc_info() is not needed in 3.x as all 
exceptions have a __traceback__ attribute.

2) The expansion is not handling StopIterations raised as a result of 
calling _i.throw().  They should be treated just like any other 
StopIteration that ends the yield-from.

A simpler expansion based on 1) and 2) but otherwise identical is:

    _i = iter(EXPR)
    try:
        _y = next(_i)
        while 1:
            try:
                _s = yield _y
            except GeneratorExit:
                _m = getattr(_i, 'close', None)
                if _m is not None:
                    _m()
                raise
            except BaseException as _e:
                _m = getattr(_i, 'throw', None)
                if _m is not None:
                    _y = _m(_e)
                else:
                    raise
            else:
                if _s is None:
                    _y = next(_i)
                else:
                    _y = _i.send(_s)
    except StopIteration as _e:
        _r = _e.value
    RESULT = _r


3) If the subiterator has a close() but doesn't have throw() it won't be 
closed when throw() is called on the outer generator.  This is fine with 
me, I am just not sure if it is intentional.

4) If the subiterator has a close() but doesn't have send() it won't be 
closed when a send() on the outer generator causes an AttributeError in 
the expansion.  Again this is fine with me, I am just not sure if it is 
intentional.

5) The last paragraph in the "Use of StopIteration to return values" 
section, seems to be a leftover from an earlier draft of the PEP that 
used a different exception.

6) Several of the issues we have been discussing on python-ideas are not 
mentioned at all:

    * The "initial next()" issue should at least be described and listed
      as out of scope.
    * The "what should close() do if it catches StopIteration with a
      value" issue I don't think we have resolved either way.  Since we
      are not going to store the value, only the first close() would be
      able to return it.  Under those conditions, I no longer think that
      returning the value is a good idea.  If we are not storing or
      returning the value, I think close() should raise an exception. 
      Either reraise the StopIteration, so that the caller has a chance
      to get the value that way, or raise a RuntimeError, because it is
      meaningless to return a value as response to a GeneratorExit when
      that value cannot later be accessed by anything and it is
      therefore most likely a bug.
    * The special-casing of StopIteration should probably be mentioned
      as a rejected idea.  Not special-casing it does break the
      refactoring principle, and I think it important to mention that in
      some way.
    * There may be other issues I have forgotten at the moment.

7) By not mentioning caching, you are effectively saying the methods 
won't be cached.  I have exactly one use for this.  The fastest 
pure-python "full" workaround I can find for the "initial next()" issue 
is a wrapper using Nick's self-modifying class hack.  With this the 
delegation cost is less than 1/3 of any other approach I have tried. 
(But still 13 times higher than a yield-from without the wrapper when 
using your patch).  All that means is that adding caching later would be 
likely to break some code that relied on the exact semantics as 
described in the PEP.

Other than that, everything looks fine.

Best regards
- Jacob


From denis.spir at free.fr  Wed Apr 15 12:57:14 2009
From: denis.spir at free.fr (spir)
Date: Wed, 15 Apr 2009 12:57:14 +0200
Subject: [Python-ideas] Fw:  accurate errors for "magic" methods
Message-ID: <20090415125714.6b3487e2@o>

Le Tue, 14 Apr 2009 23:35:22 -0700,
Chris Rebert <pyideas at rebertia.com> s'exprima ainsi:

[...]
> >> >
> >> > >>> A()['a']
> >> >
> >> > Traceback (most recent call last):
> >> >   File "<stdin>", line 1, in <module>
> >> > TypeError: 'A' object is unsubscriptable
> >> >
> >> >
> >> > (the difference being that A is new-style, while Values is
> >> > old-style.)
> >>
> >>
> >> Except that the error "object is unsubscriptable" might as well be in
> >> Klingon to most people, particularly newbies.
> >>
> >> (1) It's easy to misread it as "unscriptable", which is even more
> >> mysterious.
> >>
> >> (2) As far as I know, there's no tradition of describing key or index
> >> lookup as "subscripting" in Python. I've never noticed it in doc
> >> strings or the online docs, and after hanging around comp.lang.python
> >> extensively for years, I feel safe to say it's not a common term among
> >> even experienced Python developers. I suppose that there's a weak
> >> connection between index lookup and subscripts in mathematics.
> >>
> >> (3) The classic error message tells the newbie exactly what the error
> >> is: the object has no __getitem__ method. The new error message tells
> >> the newbie nothing useful. Given that obj is unsubscriptable, what
> >> needs to be done to make it subscriptable?
> >
> > +1
> >
> > And not just the newbie, either.  The experienced python programmer
> > looks at the original message and goes "ah ha".  The experienced python
> > programmer looks at the new message and has to _think about it_ before
> > understanding arrives.  I think that would be true even if you found a
> > better word than 'unsubscriptable'.  "Does not implement an item lookup
> > method" might work.  Or how about, 'Does not implement the Sequence or
> > Mapping interface'?
> >
> > But you know what?  Doing a google search for 'python mapping' gets you
> > the description of dictionaries, while doing a google search for 'python
> > __getitem__' gets you to the data model chapter that describes how the
> > mapping/sequence behavior is implemented via __getitem__.  The latter
> > is more useful even to the newbie, I think.
> >
> > So even if it looks ugly, I think the error message should mention
> > __getitem__.
> 
> I have gone ahead and filed a bug: http://bugs.python.org/issue5760
> 
> Cheers,
> Chris

Here's a copy of the suggestion:
================
Use exception chaining and rephrase the error message to get something like:

AttributeError: class 'A' has no attribute '__getitem__'
The above exception was the direct cause of the following exception:
TypeError: 'A' object does not support the 'get item' operator
================

Much better, I guess. But actually such errors often happen on objects for which a 'get item' operation simply makes no sense (because of another design or programming error). A typical case being None itself (see what I mean?). The remedy is rarely to add a __getitem__ method to a custom container class. Rather, it is to check whether a container was properly returned by a previous operation:
   cont = ...
   try:
      result = cont[0]
   except AttributeError:
      raise ValueError("Cannot find...")

As a consequence, I still think that mentioning the notion of container is helpful, eg:
   TypeError: 'A' object is not a container able to support the 'get item' operator.

Also, what do you think of "item extraction" as an understandable wording for "use of []" or "call to __getitem__" (*)?
   AttributeError: class 'A' has no attribute '__getitem__'
   The above exception was the direct cause of the following exception:
   TypeError: 'A' object is not a container able to support item extraction.

If ever such an idiom as "item extraction" is found correct, it could be reused in the ABC lexicon, for consistency. 

(*) An issue for finding a proper idiom is that getitem is also called for slicing. A single item is not a mini-slice.

Denis

------
la vita e estrany


From pyideas at rebertia.com  Wed Apr 15 13:00:59 2009
From: pyideas at rebertia.com (Chris Rebert)
Date: Wed, 15 Apr 2009 04:00:59 -0700
Subject: [Python-ideas] Fw: accurate errors for "magic" methods
In-Reply-To: <20090415125714.6b3487e2@o>
References: <20090415125714.6b3487e2@o>
Message-ID: <50697b2c0904150400o62d91f6sf34a95cd15387e35@mail.gmail.com>

> ================
> Use exception chaining and rephrase the error message to get something like:
>
> AttributeError: class 'A' has no attribute '__getitem__'
> The above exception was the direct cause of the following exception:
> TypeError: 'A' object does not support the 'get item' operator
> ================
>
> Much better, I guess. But actually such errors often happen on objects for which a 'get item' operation simply makes no sense (because of another design or programming error). A typical case beeing None itself (see what I mean?). The remedy is rarely to add a __getitem__ method to a custom container class. Rather it is to check whether a container was properly returned by a previous operation:

I don't think it really adds any confusion in the "you shouldn't have
tried subscripting this in the first place" case. The message seems
pretty clear: you can't subscript this thing; why?: it doesn't have a
__getitem__; why might that be?: it might not make sense to subscript
it.
If you get the error with None, it seems fairly clear that (A) you
can't modify NoneType and (B) trying to subscript None makes no sense. In
any case, it seems a significant improvement over the current error
message to me.

>   cont = ...
>   try:
>      result = cont[0]
>   except AttributeError:
>      raise ValueError("Cannot find...")
>
> As a consequence, I still think that mentioning the notion of container is helpful, eg:
>   TypeError: 'A' object is not a container able to support the 'get item' operator.

See the quote from Guido earlier in the thread. Not everything
defining the operator is necessarily a container.

>
> Also, what do you think of "item extraction" as an understandable wording for "use of []" or "call to __getitem__"(*).

Sounds okay. I was thinking "item access" personally. Feel free to
suggest it on the bug page.

>   AttributeError: class 'A' has no attribute '__getitem__'
>   The above exception was the direct cause of the following exception:
>   TypeError: 'A' object is not a container able to support item extraction.
>
> If ever such an idiom as "item extraction" is found correct, it could be reused in the ABC lexicon, for consistency.

Agreed. The x[y] operator needs a canonical name that makes sense for
common uses of the operator, and this name should be used consistently
throughout the docs.

> (*) An issue for finding a proper idiom is that getitem is also called for slicing. A single item is not a mini-slice.

Cheers,
Chris
--
I have a blog:
http://blog.rebertia.com


From daniel at stutzbachenterprises.com  Wed Apr 15 14:13:37 2009
From: daniel at stutzbachenterprises.com (Daniel Stutzbach)
Date: Wed, 15 Apr 2009 07:13:37 -0500
Subject: [Python-ideas] separating streams from their underlying buffer
In-Reply-To: <loom.20090414T215600-168@post.gmane.org>
References: <loom.20090414T215600-168@post.gmane.org>
Message-ID: <eae285400904150513k4996c8d6n40d8cc66f027034d@mail.gmail.com>

On Tue, Apr 14, 2009 at 5:08 PM, Benjamin Peterson <benjamin at python.org>wrote:

> Currently, if you want to write or read binary data from a stream you have
> wrapped in a TextIOWrapper, you must keep the TextIOWrapper alive;
> otherwise it will close the underlying buffer. I think we should provide a
> disconnect() method for BufferedIOBase and TextIOBase that removes the
> wrapper and returns the raw stream or buffer respectively.
>
>

I'd rather that wrapper objects not close the underlying object when garbage
collected.  Better to let the underlying object close itself when garbage
collected.  An explicit call to close() should still work, of course.

Basically, move the __del__ from IOBase to RawIOBase.  Maybe it's too late
to make that change, though. :-(

--
Daniel Stutzbach, Ph.D.
President, Stutzbach Enterprises, LLC <http://stutzbachenterprises.com>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090415/f7d1057e/attachment.html>

From aahz at pythoncraft.com  Wed Apr 15 15:25:13 2009
From: aahz at pythoncraft.com (Aahz)
Date: Wed, 15 Apr 2009 06:25:13 -0700
Subject: [Python-ideas] Why not Ruby -- and what else
In-Reply-To: <20090415115037.6fff0824@o>
References: <20090415115037.6fff0824@o>
Message-ID: <20090415132513.GA27323@panix.com>

On Wed, Apr 15, 2009, spir wrote:
>
> I have just found the following from
> http://en.wikipedia.org/wiki/Iterator#Implicit_iterators

Please don't post free-form rambles here; although there are blue-sky
ideas posted here, the focus should be on concrete proposals for changing
Python.  I don't even know what your point was, let alone any proposal.

If you're just wanting to have a conversation about Python design
philosophy, please use comp.lang.python.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

Why is this newsgroup different from all other newsgroups?


From adam at atlas.st  Wed Apr 15 19:20:15 2009
From: adam at atlas.st (Adam Atlas)
Date: Wed, 15 Apr 2009 13:20:15 -0400
Subject: [Python-ideas] 'default' keyword argument for max(), min()
Message-ID: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>

I propose adding a "default" keyword argument to max() and min(),  
which provides a value to return in the event that an empty iterable  
is passed. (If no "default" argument is provided, and the iterable is  
empty, it would raise ValueError as it does currently.) I find this to  
be a very common need when using those functions. Of course this is  
already possible with a bit more code, but it depends on what type of  
object the iterable is -- if it supports __len__ or __nonzero__, that  
can be used to check if it's empty beforehand, but if it is a  
generator, for instance, it would have to be converted to a list  
first, which might be undesirable if there is the possibility that it  
is a very large sequence. Adding a "default" keyword argument to max()  
and min() would be an elegant way to centralize and simplify this  
common and useful behaviour.

If there is support for this idea, I can submit a patch implementing it.
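
A rough pure-Python sketch of the proposed semantics (the name min_default
and the sentinel are mine, purely for illustration):

    _sentinel = object()

    def min_default(iterable, default=_sentinel, key=None):
        # Hypothetical sketch of min(iterable, default=...): return
        # `default` instead of raising ValueError on an empty iterable.
        it = iter(iterable)
        try:
            best = next(it)
        except StopIteration:
            if default is _sentinel:
                raise ValueError('min() arg is an empty sequence')
            return default
        best_key = best if key is None else key(best)
        for item in it:
            k = item if key is None else key(item)
            if k < best_key:
                best, best_key = item, k
        return best

    print(min_default([], default=0))   # -> 0
    print(min_default([3, 1, 2]))       # -> 1

This works for arbitrary iterators, including generators, without ever
materializing them into a list.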


From jh at improva.dk  Wed Apr 15 19:31:35 2009
From: jh at improva.dk (Jacob Holm)
Date: Wed, 15 Apr 2009 19:31:35 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
Message-ID: <49E619F7.6000009@improva.dk>

Adam Atlas wrote:
> I propose adding a "default" keyword argument to max() and min(), 
> which provides a value to return in the event that an empty iterable 
> is passed. (If no "default" argument is provided, and the iterable is 
> empty, it would raise ValueError as it does currently.) I find this to 
> be a very common need when using those functions. Of course this is 
> already possible with a bit more code, but it depends on what type of 
> object the iterable is -- if it supports __len__ or __nonzero__, that 
> can be used to check if it's empty beforehand, but if it is a 
> generator, for instance, it would have to be converted to a list 
> first, which might be undesirable if there is the possibility that it 
> is a very large sequence. Adding a "default" keyword argument to max() 
> and min() would be an elegant way to centralize and simplify this 
> common and useful behaviour.
>
> If there is support for this idea, I can submit a patch implementing it.

+1, I have often wanted that.

Cheers
- Jacob


From george.sakkis at gmail.com  Wed Apr 15 19:39:04 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Wed, 15 Apr 2009 13:39:04 -0400
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <49E619F7.6000009@improva.dk>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<49E619F7.6000009@improva.dk>
Message-ID: <91ad5bf80904151039v60e62d5k2d8301cfc089dee0@mail.gmail.com>

On Wed, Apr 15, 2009 at 1:31 PM, Jacob Holm <jh at improva.dk> wrote:
> Adam Atlas wrote:
>>
>> I propose adding a "default" keyword argument to max() and min(), which
>> provides a value to return in the event that an empty iterable is passed.
>> (If no "default" argument is provided, and the iterable is empty, it would
>> raise ValueError as it does currently.) I find this to be a very common need
>> when using those functions. Of course this is already possible with a bit
>> more code, but it depends on what type of object the iterable is -- if it
>> supports __len__ or __nonzero__, that can be used to check if it's empty
>> beforehand, but if it is a generator, for instance, it would have to be
>> converted to a list first, which might be undesirable if there is the
>> possibility that it is a very large sequence. Adding a "default" keyword
>> argument to max() and min() would be an elegant way to centralize and
>> simplify this common and useful behaviour.
>>
>> If there is support for this idea, I can submit a patch implementing it.
>
> +1, I have often wanted that.

Seconded; I've been bitten more than once by this.

George


From steve at pearwood.info  Wed Apr 15 19:55:18 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Thu, 16 Apr 2009 03:55:18 +1000
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
Message-ID: <200904160355.19848.steve@pearwood.info>

On Thu, 16 Apr 2009 03:20:15 am Adam Atlas wrote:
> I propose adding a "default" keyword argument to max() and min(),
> which provides a value to return in the event that an empty iterable
> is passed.

+1

-- 
Steven D'Aprano


From taleinat at gmail.com  Wed Apr 15 20:08:23 2009
From: taleinat at gmail.com (Tal Einat)
Date: Wed, 15 Apr 2009 21:08:23 +0300
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
Message-ID: <7afdee2f0904151108o1eb121f4wbb7ace61a7b71e9@mail.gmail.com>

On Wed, Apr 15, 2009 at 8:20 PM, Adam Atlas <adam at atlas.st> wrote:

> I propose adding a "default" keyword argument to max() and min(), which
> provides a value to return in the event that an empty iterable is passed.


+1

This has precedent in reduce()'s "initial" keyword argument; note that min()
and max() are really simple special cases of reduce().
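As a sketch of that comparison (``min_via_reduce`` is illustrative, not an
actual API):

```python
from functools import reduce

# min() expressed as a reduce(), per the observation above. Note that
# reduce()'s 'initial' argument takes part in the comparison, which is
# not the same thing as the proposed 'default' (returned only when the
# iterable is empty).
def min_via_reduce(iterable, initial):
    return reduce(lambda a, b: b if b < a else a, iterable, initial)

print(min_via_reduce([1, 2], 0))  # 0: the initial value joins the comparison
print(min_via_reduce([], 5))      # 5: it also acts as an empty-case fallback
```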

From python at rcn.com  Wed Apr 15 20:17:07 2009
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 15 Apr 2009 11:17:07 -0700
Subject: [Python-ideas] 'default' keyword argument for max(), min()
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
Message-ID: <0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>


[Adam Atlas]
>I propose adding a "default" keyword argument to max() and min(),  
> which provides a value to return in the event that an empty iterable  
> is passed. 

Could you write your proposal out in pure python so
we can see how it interacts with the key-keyword
argument and how it works when the number of
positional arguments is not one.

Will min(default=0) still return a TypeError?
Will min(1, 2, default=0) return 0 or 1?
Will min([1,2], default=0) return 1?  # different from min([0,1,2])

Also, can you post some snippets of real-world use cases.
Is the default value always zero (even for max)?
I'm wondering if there are any patterns to the use cases.
I don't doubt that the use cases exist, I'm just curious
what they are and what it says about how min() and max()
are being used.

Are the typical use cases occurring with iterables that are also
sequences?  If so, why would a default argument be better
than a conditional expression:

    x = min(s) if s else 0


Raymond




From john.arbash.meinel at gmail.com  Wed Apr 15 20:19:35 2009
From: john.arbash.meinel at gmail.com (John Arbash Meinel)
Date: Wed, 15 Apr 2009 13:19:35 -0500
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
Message-ID: <49E62537.8090807@gmail.com>


...

> Are the typical use cases occurring with iterables that are also
> sequences?  If so, why would a default argument be better
> than a conditional expression:
> 
>    x = min(s) if s else 0
> 
> 
> Raymond
> 

Because in "min(s) if s else 0", s could be a generator, which won't
evaluate to False even though it yields no entries.
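A quick demonstration of the point: an empty generator is still truthy.

```python
# A generator that yields nothing is nevertheless truthy, so
# ``min(s) if s else 0`` would still call min() and get a ValueError.
def empty():
    return
    yield  # unreachable; its presence makes this a generator function

g = empty()
print(bool(g))   # True: generator objects define no __bool__
print(list(g))   # []: yet it produces no items
```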

John
=:->


From python at rcn.com  Wed Apr 15 20:22:42 2009
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 15 Apr 2009 11:22:42 -0700
Subject: [Python-ideas] 'default' keyword argument for max(), min()
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<7afdee2f0904151108o1eb121f4wbb7ace61a7b71e9@mail.gmail.com>
Message-ID: <1E71D803A60846688839966C249271E5@RaymondLaptop1>


[Adam Atlas]
> > I propose adding a "default" keyword argument to max() and min(), 
> >which provides a value to return in the event that an empty iterable is passed.

[Tal Einat]
> +1
>
> This has precedent in reduce()'s "initial" keyword argument; note that min() 
> and max() are really simple special cases of reduce().

Of course, what he is proposing has completely different semantics
than an initial argument.  If anyone finds that distinction to be confusing,
then we would have a good reason not to accept the proposal.

   min([1,2], initial=0)    -->  0
   min([1,2], default=0)  --> 1


Raymond









From erik at cq2.org  Wed Apr 15 20:29:31 2009
From: erik at cq2.org (Erik Groeneveld)
Date: Wed, 15 Apr 2009 20:29:31 +0200
Subject: [Python-ideas] Revised**10 PEP on Yield-From
In-Reply-To: <49E58076.4060202@canterbury.ac.nz>
References: <49E58076.4060202@canterbury.ac.nz>
Message-ID: <aaec99390904151129r7e463fd4i93e9f7514c5aec05@mail.gmail.com>

Greg,

Please forgive me for hooking into this discussion so late.  Below are
my late comments to your original PEP, and below those some new stuff.
 I have been writing weightless/compose which does exactly what your
PEP is trying to accomplish.  I'll check my stuff against this PEP.

I really appreciate your initiative!  It helps me a lot.

2009/4/15 Greg Ewing <greg.ewing at canterbury.ac.nz>:
> Draft 11 of the PEP.
>
> Changes in this version:
>
> - GeneratorExit always calls close() and is always
>   reraised.
>
> - Special handling of thrown-in StopIterations
>   removed, since Guido doesn't think you should be
>   doing that in the first place.
>
> - Expansion uses next(_i) instead of _i.next() and
>   doesn't mention caching of methods.
>
> --
> Greg
>
> PEP: XXX
> Title: Syntax for Delegating to a Subgenerator
> Version: $Revision$
> Last-Modified: $Date$
> Author: Gregory Ewing <greg.ewing at canterbury.ac.nz>
> Status: Draft
> Type: Standards Track
> Content-Type: text/x-rst
> Created: 13-Feb-2009
> Python-Version: 3.x
> Post-History:
>
>
> Abstract
> ========
>
> A syntax is proposed for a generator to delegate part of its
> operations to another generator. This allows a section of code
> containing 'yield' to be factored out and placed in another
> generator. Additionally, the subgenerator is allowed to return with a
> value, and the value is made available to the delegating generator.
>
> The new syntax also opens up some opportunities for optimisation when
> one generator re-yields values produced by another.
>
>
> Motivation
> ==========
>
> A Python generator is a form of coroutine, but has the limitation that
> it can only yield to its immediate caller.  This means that a piece of
> code containing a ``yield`` cannot be factored out and put into a
> separate function in the same way as other code.  Performing such a
> factoring causes the called function to itself become a generator, and
> it is necessary to explicitly iterate over this second generator and
> re-yield any values that it produces.
>
> If yielding of values is the only concern, this can be performed without
> much difficulty using a loop such as
>
> ::
>
> ? ?for v in g:
> ? ? ? ?yield v
>
> However, if the subgenerator is to interact properly with the caller
> in the case of calls to ``send()``, ``throw()`` and ``close()``, things
> become considerably more difficult.  As will be seen later, the necessary
> code is very complicated, and it is tricky to handle all the corner cases
> correctly.
>
> A new syntax will be proposed to address this issue. In the simplest
> use cases, it will be equivalent to the above for-loop, but it will also
> handle the full range of generator behaviour, and allow generator code
> to be refactored in a simple and straightforward way.
>
>
> Proposal
> ========
>
> The following new expression syntax will be allowed in the body of a
> generator:
>
> ::
>
>     yield from <expr>
>

These are the exact problems I can't solve neatly in weightless/compose:

> where <expr> is an expression evaluating to an iterable, from which an
> iterator is extracted. The iterator is run to exhaustion, during which
> time it yields and receives values directly to or from the caller of
> the generator containing the ``yield from`` expression (the
> "delegating generator").

This allows a programmer to express the intention of just returning a
generator or wanting to delegate the work to a 'subgenerator'.
Weightless/compose now just descends into every generator, while this
is certainly not always wanted.  Great, I think; I like the syntax.

> Furthermore, when the iterator is another generator, the subgenerator
> is allowed to execute a ``return`` statement with a value, and that
> value becomes the value of the ``yield from`` expression.

In Weightless/compose, after several different tries I settled for
mimicking returning a value by using raise StopIteration(returnvalue).
As return in a generator raises StopIteration(), I think it is very
natural to use return like this in a generator (in fact I sometimes
wished it were possible, not being aware of python-ideas).  So I
like it too.
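The workaround Erik describes can be sketched as follows: an iterator
"returns" a value by attaching it to the StopIteration it raises, and the
driver fishes it out of ``exc.args``. A hand-written iterator class is used
here because current Python (PEP 479) no longer lets a generator body raise
StopIteration directly; in the Python of this thread one could do so from
the generator itself.

```python
class SubIter:
    """Yields 1, 2, then 'returns' 42 via StopIteration(42)."""
    def __init__(self):
        self._items = iter([1, 2])

    def __iter__(self):
        return self

    def __next__(self):
        try:
            return next(self._items)
        except StopIteration:
            raise StopIteration(42)  # mimic 'return 42'

def drive(it):
    # Collect yielded items and dig the "return value" out of the exception.
    results = []
    while True:
        try:
            results.append(next(it))
        except StopIteration as e:
            return results, (e.args[0] if e.args else None)

print(drive(SubIter()))  # ([1, 2], 42)
```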

> The full semantics of the ``yield from`` expression can be described
> in terms of the generator protocol as follows:
>
>     * Any values that the iterator yields are passed directly to the
>       caller.

Clear.

>     * Any values sent to the delegating generator using ``send()``
>       are passed directly to the iterator. If the sent value is None,
>       the iterator's ``next()`` method is called. If the sent value is
>       not None, the iterator's ``send()`` method is called. Any exception
>       resulting from attempting to call ``next`` or ``send`` is raised
>       in the delegating generator.

Clear. I have implemented this by just calling send(...) either with
None or with a value.  The VM dispatches that to next() when the value
is None, I assume.

>     * Exceptions other than GeneratorExit passed to the ``throw()`` method
>       of the delegating generator are forwarded to the ``throw()`` method of
>       the iterator. Any exception resulting from attempting to call ``throw()``
>       is propagated to the delegating generator.

I let any Exception propagate using the throw() method.  I believe
this will not correctly handle GeneratorExit as outlined in the
discussion before.  I'll have to change this I think.

>     * If a GeneratorExit exception is thrown into the delegating generator,
>       the ``close()`` method of the iterator is called if it has one. If this
>       call results in an exception, it is propagated to the delegating generator.
>       Otherwise, the GeneratorExit is reraised in the delegating generator.

I have a hard time understanding what this would mean in a pure python
implementation.  I added both bullets to my unittests to work it out
later.

>       The implicit GeneratorExit resulting from closing the delegating
>       generator is treated as though it were passed in using ``throw()``.

By "closing the delegating generator" you mean "from the outside, call
close() on it"?  It then will raise the GeneratorExit exception, and I
understand it. I added a unittest as well.

>
>     * The value of the ``yield from`` expression is the first argument
>       to the ``StopIteration`` exception raised by the iterator when it
>       terminates.
>
>     * ``return expr`` in a generator causes ``StopIteration(expr)`` to
>       be raised.

I assume that 'return 1 2 3' will have one return value being a tuple
(1,2,3) which is one argument to StopIteration(), and which is
unpacked when 'yield from' returns?

> Enhancements to StopIteration
> -----------------------------
>
> For convenience, the ``StopIteration`` exception will be given a
> ``value`` attribute that holds its first argument, or None if there
> are no arguments.

I am using StopIteration's 'args' attribute?  But after reading the
motivation below, it could indeed confuse other generators, and a
separate StopIteration would be better, I think.

>
> Formal Semantics
> ----------------
>
> Python 3 syntax is used in this section.
>
> 1. The statement
>
> ::
>
>     RESULT = yield from EXPR
>
> is semantically equivalent to
>
> ::
>
>     _i = iter(EXPR)
>     try:
>         _y = next(_i)
>     except StopIteration as _e:
>         _r = _e.value
>     else:
>         while 1:
>             try:
>                 _s = yield _y
>             except GeneratorExit:
>                 _m = getattr(_i, 'close', None)
>                 if _m is not None:
>                     _m()
>                 raise
>             except:
>                 _m = getattr(_i, 'throw', None)
>                 if _m is not None:
>                     _y = _m(*sys.exc_info())
>                 else:
>                     raise
>             else:
>                 try:
>                     if _s is None:
>                         _y = next(_i)
>                     else:
>                         _y = _i.send(_s)
>                 except StopIteration as _e:
>                     _r = _e.value
>                     break
>     RESULT = _r
>

I'll take this one with me, as I really need some time to compare it
to my own code. I'll come back to it later.


> 2. In a generator, the statement
>
> ::
>
>     return value
>
> is semantically equivalent to
>
> ::
>
>     raise StopIteration(value)
>
> except that, as currently, the exception cannot be caught by ``except``
> clauses within the returning generator.

Clear.


> 3. The StopIteration exception behaves as though defined thusly:
>
> ::
>
>     class StopIteration(Exception):
>
>         def __init__(self, *args):
>             if len(args) > 0:
>                 self.value = args[0]
>             else:
>                 self.value = None
>             Exception.__init__(self, *args)
>

I'm probably missing the point; could you explain why this is needed?


> Rationale
> =========
>
> The Refactoring Principle
> -------------------------
>
> The rationale behind most of the semantics presented above stems from
> the desire to be able to refactor generator code. It should be possible
> to take a section of code containing one or more ``yield`` expressions,
> move it into a separate function (using the usual techniques to deal
> with references to variables in the surrounding scope, etc.), and
> call the new function using a ``yield from`` expression.
>
> The behaviour of the resulting compound generator should be, as far as
> possible, exactly the same as the original unfactored generator in all
> situations, including calls to ``next()``, ``send()``, ``throw()`` and
> ``close()``.
>
> The semantics in cases of subiterators other than generators has been
> chosen as a reasonable generalization of the generator case.

Yes!  Exactly.  I just call this supporting 'program decomposition'.
For clarity, you could probably add the name of the refactoring; it
is called 'extract method', isn't it?


> Finalization
> ------------
>
> There was some debate as to whether explicitly finalizing the delegating
> generator by calling its ``close()`` method while it is suspended at a
> ``yield from`` should also finalize the subiterator. An argument against
> doing so is that it would result in premature finalization of the
> subiterator if references to it exist elsewhere.
>
> Consideration of non-refcounting Python implementations led to the
> decision that this explicit finalization should be performed, so that
> explicitly closing a factored generator has the same effect as doing
> so to an unfactored one in all Python implementations.
>
> The assumption made is that, in the majority of use cases, the subiterator
> will not be shared. The rare case of a shared subiterator can be
> accommodated by means of a wrapper that blocks ``throw()`` and ``close()``
> calls, or by using a means other than ``yield from`` to call the
> subiterator.


I agree completely.  I went through some length to get proper
clean-up, and I solved it similarly.


> Generators as Threads
> ---------------------
>
> A motivation for generators being able to return values concerns the
> use of generators to implement lightweight threads.  When using
> generators in that way, it is reasonable to want to spread the
> computation performed by the lightweight thread over many functions.
> One would like to be able to call a subgenerator as though it were an
> ordinary function, passing it parameters and receiving a returned
> value.
>
> Using the proposed syntax, a statement such as
>
> ::
>
>     y = f(x)
>
> where f is an ordinary function, can be transformed into a delegation
> call
>
> ::
>
>     y = yield from g(x)
>
> where g is a generator. One can reason about the behaviour of the
> resulting code by thinking of g as an ordinary function that can be
> suspended using a ``yield`` statement.
>
> When using generators as threads in this way, typically one is not
> interested in the values being passed in or out of the yields.
> However, there are use cases for this as well, where the thread is
> seen as a producer or consumer of items. The ``yield from``
> expression allows the logic of the thread to be spread over as
> many functions as desired, with the production or consumption of
> items occurring in any subfunction, and the items are automatically
> routed to or from their ultimate source or destination.
>
> Concerning ``throw()`` and ``close()``, it is reasonable to expect
> that if an exception is thrown into the thread from outside, it should
> first be raised in the innermost generator where the thread is suspended,
> and propagate outwards from there; and that if the thread is terminated
> from outside by calling ``close()``, the chain of active generators
> should be finalised from the innermost outwards.

Yes, I believe you make sure that:

try:
    x = yield from y()
except SomeError:
    return 'HELP'

actually does catch the SomeError exception when raised in y(), or one
of its descendants?


>
>
> Syntax
> ------
>
> The particular syntax proposed has been chosen as suggestive of its
> meaning, while not introducing any new keywords and clearly standing
> out as being different from a plain ``yield``.
>

I skipped the next section, if you don't mind.

>
> Optimisations
> -------------
>
> Using a specialised syntax opens up possibilities for optimisation
> when there is a long chain of generators.  Such chains can arise, for
> instance, when recursively traversing a tree structure.  The overhead
> of passing ``next()`` calls and yielded values down and up the chain
> can cause what ought to be an O(n) operation to become, in the worst
> case, O(n\*\*2).
>
> A possible strategy is to add a slot to generator objects to hold a
> generator being delegated to.  When a ``next()`` or ``send()`` call is
> made on the generator, this slot is checked first, and if it is
> nonempty, the generator that it references is resumed instead.  If it
> raises StopIteration, the slot is cleared and the main generator is
> resumed.
>
> This would reduce the delegation overhead to a chain of C function
> calls involving no Python code execution.  A possible enhancement would
> be to traverse the whole chain of generators in a loop and directly
> resume the one at the end, although the handling of StopIteration is
> more complicated then.
>
>
> Use of StopIteration to return values
> -------------------------------------
>
> There are a variety of ways that the return value from the generator
> could be passed back. Some alternatives include storing it as an
> attribute of the generator-iterator object, or returning it as the
> value of the ``close()`` call to the subgenerator. However, the proposed
> mechanism is attractive for a couple of reasons:
>
> * Using a generalization of the StopIteration exception makes it easy
>   for other kinds of iterators to participate in the protocol without
>   having to grow an extra attribute or a close() method.
>
> * It simplifies the implementation, because the point at which the
>   return value from the subgenerator becomes available is the same
>   point at which the exception is raised. Delaying until any later
>   time would require storing the return value somewhere.
>
> Originally it was proposed to simply extend StopIteration to accept
> a value. However, it was felt desirable by some to have a mechanism
> for detecting the erroneous use of a value-returning generator in a
> context that is not aware of generator return values. Using an
> exception that is a superclass of StopIteration means that code
> knowing about generator return values only has one exception to
> catch, and code that does not know about them will fail to catch
> the new exception.

I agree. And I begin to understand the need for that value attribute. Ok.

> [...]

For now I have one more fundamental question left over.

Will the delegating generator remain on the call-stack or not?

The current behaviour is that a stack-frame is created for a
generator, which is not disposed when next()/send() returns, but kept
somewhere.  When a new call to next()/send() happens, the same
stack-frame is put back on the call-stack.  This is crucial because it
acts as the oh-so-valuable closure.

I came across this problem because I was writing code that traverses
the call-stack in order to find some place to put 'generator-local'
variables (like thread-local).  My implementation in
weightless/compose does not physically keep the generators on the
call-stack (I don't know how to do that in Python), but keeps its own
stack.  I would have to extend 'compose' to not only search the
real call-stack but also traverse its own semi-call-stack if/when it
finds an instance of itself on the real call-stack.

I am writing code like:

def x():
    somevar = 10
    yield from y()

then if I write in y:

def y():
        frame = currentframe().f_back
        while 'somevar' not in frame.f_locals:
            frame = frame.f_back
        return frame.f_locals['somevar']

would this find the variable in x?
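The frame-walking idea above can be run today with an ordinary call in
place of the ``yield from`` (which does not exist yet); whether the
delegating generator's frame is also visible this way while suspended at a
``yield from`` is exactly the open question.

```python
# Runnable sketch of the frame-walking lookup, with y() called directly
# from x() rather than delegated to. 'somevar' is found in the caller's
# frame locals.
import inspect

def y():
    frame = inspect.currentframe().f_back
    while frame is not None and 'somevar' not in frame.f_locals:
        frame = frame.f_back
    return frame.f_locals['somevar'] if frame is not None else None

def x():
    somevar = 10
    return y()

print(x())  # 10
```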

Best regards,
Erik


From steve at pearwood.info  Wed Apr 15 21:11:28 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Thu, 16 Apr 2009 05:11:28 +1000
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
Message-ID: <200904160511.29682.steve@pearwood.info>

On Thu, 16 Apr 2009 04:17:07 am Raymond Hettinger wrote:
> [Adam Atlas]
>
> >I propose adding a "default" keyword argument to max() and min(),
> > which provides a value to return in the event that an empty
> > iterable is passed.
>
> Could you write your proposal out in pure python so
> we can see how it interacts with the key-keyword
> argument and how it works when the number of
> positional arguments is not one.
>
> Will min(default=0) still return a TypeError?
> Will min(1, 2, default=0) return 0 or 1?
> Will min([1,2], default=0) return 1?  # different from min([0,1,2])

I would expect the answers should be:

Yes, 1 and 1

but I'd be prepared to be talked into:

No, 1 and 1.


> Also, can you post some snippets of real-world use cases.
> Is the default value always zero (even for max)?

I don't believe there should be a default value for default. If you 
don't provide an explicit default, the current behaviour should remain 
unchanged.


> I'm wondering if there are any patterns to the use cases.
> I don't doubt that the use cases exist, I'm just curious
> what they are and what it says about how min() and max()
> are being used.
>
> Are the typical use cases occurring with iterables that are also
> sequences?  If so, why would a default argument be better
> than a conditional expression:
>
>     x = min(s) if s else 0

If s could be an arbitrary iterable rather than a sequence, you would
need to write that as:

s = list(s)
x = min(s) if s else 0

which turns a single conceptual operation into two operations.



-- 
Steven D'Aprano


From george.sakkis at gmail.com  Wed Apr 15 21:19:05 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Wed, 15 Apr 2009 15:19:05 -0400
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <200904160511.29682.steve@pearwood.info>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<200904160511.29682.steve@pearwood.info>
Message-ID: <91ad5bf80904151219g46bbfc5dgdd5ecbe32cfd842f@mail.gmail.com>

On Wed, Apr 15, 2009 at 3:11 PM, Steven D'Aprano <steve at pearwood.info> wrote:

> On Thu, 16 Apr 2009 04:17:07 am Raymond Hettinger wrote:
>> [Adam Atlas]
>>
>> >I propose adding a "default" keyword argument to max() and min(),
>> > which provides a value to return in the event that an empty
>> > iterable is passed.
>>
>> Could you write your proposal out in pure python so
>> we can see how it interacts with the key-keyword
>> argument and how it works when the number of
>> positional arguments is not one.
>>
>> Will min(default=0) still return a TypeError?
>> Will min(1, 2, default=0) return 0 or 1?
>> Will min([1,2], default=0) return 1?  # different from min([0,1,2])
>
> I would expect the answers should be:
>
> Yes, 1 and 1
>
> but I'd be prepared to be talked into:
>
> No, 1 and 1.

I think it would be counter-intuitive and error-prone if min(iterable,
default=0) was different from min(*iterable, default=0), so I'd say no
on the first one.

George


From denis.spir at free.fr  Wed Apr 15 21:24:05 2009
From: denis.spir at free.fr (spir)
Date: Wed, 15 Apr 2009 21:24:05 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
Message-ID: <20090415212405.04886939@o>

Le Wed, 15 Apr 2009 11:17:07 -0700,
"Raymond Hettinger" <python at rcn.com> s'exprima ainsi:

> 
> [Adam Atlas]
> >I propose adding a "default" keyword argument to max() and min(),  
> > which provides a value to return in the event that an empty iterable  
> > is passed. 
> 
> Could you write your proposal out in pure python so
> we can see how it interacts with the key-keyword
> argument and how it works when the number of
> positional arguments is not one.
> 
> Will min(default=0) still return a TypeError?
> Will min(1, 2, default=0) return 0 or 1?
> Will min([1,2], default=0) return 1?  # different from min([0,1,2])

While there has been quick support for the proposal, I do not find it as obvious as it seems.
I see an issue based on a possible confusion about "default". Actually, "default" is the name of a proposed optional argument for min() and max() -- but this does not mean that this argument, which happens to be called "default", itself has an evident default value ;-)
Even for min(), I really doubt 0 is a right choice as "default"'s default value; while for max() it's imo obviously wrong.

The issue as I see it is related to the fact that python does not allow optional arguments without default values -- which in most cases is not problematic. But here I would like a hypothetical
   min(s, optional default)
or
   min(s, ?default)

While this is not possible, I support the proposal with None as default value for "default", instead of an often wrong choice.
   min(s, default=None)
   max(s, default=None)

Maybe another word than "default" would help avoid confusion, too.

Denis
------
la vita e estrany


From qrczak at knm.org.pl  Wed Apr 15 22:55:57 2009
From: qrczak at knm.org.pl (Marcin 'Qrczak' Kowalczyk)
Date: Wed, 15 Apr 2009 22:55:57 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <91ad5bf80904151219g46bbfc5dgdd5ecbe32cfd842f@mail.gmail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st> 
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<200904160511.29682.steve@pearwood.info> 
	<91ad5bf80904151219g46bbfc5dgdd5ecbe32cfd842f@mail.gmail.com>
Message-ID: <3f4107910904151355w33b228b1s74b73512cd258609@mail.gmail.com>

2009/4/15 George Sakkis <george.sakkis at gmail.com>:

> I think it would be counter-intuitive and error-prone if min(iterable,
> default=0) was different from min(*iterable, default=0),

It is definitely different if iterable == [[7]].

Since min(*iterable) will break if iterable has length 1, it should
not be called this way at all. Values should be passed to min as
individual arguments only if their number is statically known, and in
this case there is no reason to use 'default'.
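Concretely, the length-1 pitfall Marcin describes:

```python
# min(*iterable) unpacks the iterable, so with a single element that
# element is itself treated as the thing to minimise over.
assert min([[7]]) == [7]   # smallest element of the outer list
assert min(*[[7]]) == 7    # same as min([7]): the inner list is iterated

try:
    min(*[5])              # same as min(5): an int is not iterable
except TypeError as exc:
    print(exc)
```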

-- 
Marcin Kowalczyk
qrczak at knm.org.pl
http://qrnik.knm.org.pl/~qrczak/


From steve at pearwood.info  Thu Apr 16 01:09:10 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Thu, 16 Apr 2009 09:09:10 +1000
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <91ad5bf80904151219g46bbfc5dgdd5ecbe32cfd842f@mail.gmail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<200904160511.29682.steve@pearwood.info>
	<91ad5bf80904151219g46bbfc5dgdd5ecbe32cfd842f@mail.gmail.com>
Message-ID: <200904160909.10575.steve@pearwood.info>

On Thu, 16 Apr 2009 05:19:05 am George Sakkis wrote:

> I think it would be counter-intuitive and error-prone if
> min(iterable, default=0) was different from min(*iterable,
> default=0), so I'd say no on the first one.

That's already the case.

>>> min( iter([1]) )
1
>>> min( *iter([1]) )
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'int' object is not iterable


-- 
Steven D'Aprano


From george.sakkis at gmail.com  Thu Apr 16 02:45:02 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Wed, 15 Apr 2009 20:45:02 -0400
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <200904160909.10575.steve@pearwood.info>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<200904160511.29682.steve@pearwood.info>
	<91ad5bf80904151219g46bbfc5dgdd5ecbe32cfd842f@mail.gmail.com>
	<200904160909.10575.steve@pearwood.info>
Message-ID: <91ad5bf80904151745j78c05c47q774736ef9f447c90@mail.gmail.com>

On Wed, Apr 15, 2009 at 7:09 PM, Steven D'Aprano <steve at pearwood.info> wrote:
> On Thu, 16 Apr 2009 05:19:05 am George Sakkis wrote:
>
>> I think it would be counter-intuitive and error-prone if
>> min(iterable, default=0) was different from min(*iterable,
>> default=0), so I'd say no on the first one.
>
> That's already the case.
>
>>>> min( iter([1]) )
> 1
>>>> min( *iter([1]) )
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
> TypeError: 'int' object is not iterable

Oh right, posted too fast; the single argument case is already
special, so we might as well make the zero arg case special.

George


From greg.ewing at canterbury.ac.nz  Thu Apr 16 04:35:02 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 16 Apr 2009 14:35:02 +1200
Subject: [Python-ideas] Revised**10 PEP on Yield-From
In-Reply-To: <aaec99390904151129r7e463fd4i93e9f7514c5aec05@mail.gmail.com>
References: <49E58076.4060202@canterbury.ac.nz>
	<aaec99390904151129r7e463fd4i93e9f7514c5aec05@mail.gmail.com>
Message-ID: <49E69956.20009@canterbury.ac.nz>

Erik Groeneveld wrote:

> By "closing the delegating generator" you mean "from the outside, call
> close() on it"?

Yes, that's right.

> I assume that 'return 1 2 3' will have one return value being a tuple
> (1,2,3)

Um, you can't write 'return 1 2 3'. You can write
'return 1, 2, 3' which returns a tuple.

>> For convenience, the ``StopIteration`` exception will be given a
>> ``value`` attribute that holds its first argument, or None if there
>> are no arguments.
> 
> I am using StopIteration's 'args' attribute?

The 'value' attribute is purely a convenience -- you
can get it from args[0] just as well.

> For clarity, you could probably add the name of the refactoring; it
> is called 'extract method', isn't it?

Quite likely it's called that in some book or other,
I wouldn't really know. I don't think of refactorings
as having names, I just do them.

> Yes, I believe you make sure that:
> 
> try:
>     x = yield from y()
> except SomeError:
>    return 'HELP'
> 
> actually does catch the SomeError exception when raised in y(), or one
> it its descendants?

That's the idea.

>> Originally it was proposed to simply extend StopIteration to accept
>> a value...

That whole paragraph shouldn't be there; it's left over
from an earlier version. Guido originally wanted to use
a different exception for returning with a value, but
he changed his mind.

> Will the delegating generator remain on the call-stack or not?

It certainly shows up in tracebacks, but I'm not
actually sure whether it will be seen by
sys._getframe() while a subiterator is running
in my current implementation. I'll try an
experiment to find out.

Even if it does, it's probably not something you
should rely on, since a different implementation
that optimizes more aggressively could behave
differently.

In the case of thread-local storage for generator
based threads, I don't think I would try to attach
them to a generator frame, because a thread isn't
just a single generator, it's a collection of
generators.

Rather I'd use the scheduler to manage them. The
scheduler knows when it's switching from one
thread to another and can switch in the appropriate
set of variables.
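As a rough sketch of that scheduler idea (all names here are hypothetical, just for illustration): the scheduler owns each "thread"'s set of local variables and switches in the right set whenever it switches generators.

```python
# Hypothetical trampoline-style scheduler managing per-thread locals.

current_locals = {}          # storage seen by the currently running thread

class GenThread:
    def __init__(self, gen):
        self.gen = gen
        self.locals = {}     # this thread's private variable set

def run(threads):
    global current_locals
    threads = list(threads)
    while threads:
        thread = threads.pop(0)
        current_locals = thread.locals   # switch in this thread's locals
        try:
            next(thread.gen)             # run until the next yield
        except StopIteration:
            continue                     # thread finished
        threads.append(thread)           # round-robin reschedule

def worker(name, count):
    for i in range(count):
        current_locals['last'] = (name, i)  # looks thread-local to the code
        yield

a = GenThread(worker('a', 2))
b = GenThread(worker('b', 3))
run([a, b])
print(a.locals['last'], b.locals['last'])  # ('a', 1) ('b', 2)
```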

-- 
Greg


From jared.grubb at gmail.com  Thu Apr 16 05:39:26 2009
From: jared.grubb at gmail.com (Jared Grubb)
Date: Wed, 15 Apr 2009 20:39:26 -0700
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
Message-ID: <2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>

On 15 Apr 2009, at 11:17, Raymond Hettinger wrote:
> [Adam Atlas]
>> I propose adding a "default" keyword argument to max() and min(),   
>> which provides a value to return in the event that an empty  
>> iterable  is passed.
>
> Could you write your proposal out in pure python so
> we can see how it interacts with the key-keyword
> argument and how it works when the number of
> positional arguments is not one.

Here's one option... I'm going to cheat a little here and just wrap  
the built-in min, but a quick/simple answer could be:

def min2(*vars, **kw):
      try:
          if 'key' in kw:
               return min(*vars, key=kw['key'])
          return min(*vars)
      except Exception:
          if 'default' in kw:
              return kw['default']
          raise

> Will min(default=0) still return a TypeError?
> Will min(1, 2, default=0) return 0 or 1?
> Will min([1,2], default=0) return 1?  # different from min([0,1,2])

# Your examples
min2() -> TypeError
min2(default=0) -> 0
min2(1,2,default=0) -> 1
min2([1,2], default=0) -> 1

# Iterator that yields things that are not comparable
min2([1, set()]) -> TypeError
min2([1, set()], default=7 ) -> 7

# Iterator that raises an exception
def foo():
    yield 1
    raise ValueError

min(foo()) -> ValueError
min2(foo()) -> ValueError
min2(foo(), default=None)  -> None

Jared


From arnodel at googlemail.com  Thu Apr 16 11:10:52 2009
From: arnodel at googlemail.com (Arnaud Delobelle)
Date: Thu, 16 Apr 2009 10:10:52 +0100
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
Message-ID: <2444F7FF-283C-4562-AE46-0F9F9709BB7D@googlemail.com>


On 16 Apr 2009, at 04:39, Jared Grubb wrote:

> def min2(*vars, **kw):
>     try:
>         if 'key' in kw:
>              return min(*vars, key=kw['key'])
>         return min(*vars)
>     except Exception:
>         if 'default' in kw:
>             return kw['default']
>         raise

Nitpick:

 >>> class Err(Exception): pass
...
 >>> def it():
...     raise Err()
...     yield 42
...
 >>> min(it())
Traceback (most recent call last):
   File "<stdin>", line 1, in <module>
   File "<stdin>", line 2, in it
__main__.Err
 >>> min2(it())
Traceback (most recent call last):
   File "<stdin>", line 1, in <module>
   File "<stdin>", line 5, in min2
   File "<stdin>", line 2, in it
__main__.Err
 >>> min2(it(), default=12)
12

Shouldn't the last one raise Err as well?

-- 
Arnaud



From denis.spir at free.fr  Thu Apr 16 11:13:38 2009
From: denis.spir at free.fr (spir)
Date: Thu, 16 Apr 2009 11:13:38 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
Message-ID: <20090416111338.0874720e@o>

Le Wed, 15 Apr 2009 20:39:26 -0700,
Jared Grubb <jared.grubb at gmail.com> s'exprima ainsi:

> On 15 Apr 2009, at 11:17, Raymond Hettinger wrote:
> > [Adam Atlas]
> >> I propose adding a "default" keyword argument to max() and min(),   
> >> which provides a value to return in the event that an empty  
> >> iterable  is passed.
> >
> > Could you write your proposal out in pure python so
> > we can see how it interacts with the key-keyword
> > argument and how it works when the number of
> > positional arguments is not one.
> 
> Here's one option... I'm going to cheat a little here and just wrap  
> the built-in min, but a quick/simple answer could be:
> 
> def min2(*vars, **kw):
>       try:
>           if 'key' in kw:
>                return min(*vars, key=kw['key'])
>           return min(*vars)
>       except Exception:
>           if 'default' in kw:
>               return kw['default']
>           raise

Is the purpose of this proposal really to return a default value in *any* case of exception?

> > Will min(default=0) still return a TypeError?
> > Will min(1, 2, default=0) return 0 or 1?
> > Will min([1,2], default=0) return 1?  # different from min([0,1,2])
> 
> # Your examples
> min2() -> TypeError
> min2(default=0) -> 0
> min2(1,2,default=0) -> 1
> min2([1,2], default=0) -> 1
> 
> # Iterator that yields things that are not comparable
> min2([1, set()]) -> TypeError
> min2([1, set()], default=7 ) -> 7   ***1***
> 
> # Iterator that raises an exception
> def foo():
>     yield 1
>     raise ValueError
> 
> min(foo()) -> ValueError
> min2(foo()) -> ValueError
> min2(foo(), default=None)  -> None  ***2***

In the case #2 above, maybe there is some utility to get a default.
I consider case #1 a real programming error that should be properly
signalled with an exception.

> Jared

Intuitively, I thought the proposal was rather about a possibly empty
iterable, the result of previous unpredictable computations; then calling:
   min(iterable, default=whatever)

>>> min([])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: min() arg is an empty sequence
>>> def min2(*vars, **kw):
...     # case vars holds (only) an empty iterable and "default" is provided
...     if len(vars) == 1:
...         thing = vars[0]
...         try:
...             if len(thing) == 0 and 'default' in kw:
...                 return kw['default']
...         except TypeError:
...             pass
...     # else normal min() behaviour
...     if 'default' in kw:
...         del kw['default']
...     return min(*vars, **kw)
>>> min2([], default='X')
'X'
>>> min2(1,set(), default='X')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 13, in min2
TypeError: can only compare to a set
>>> min2((1,set()), default='X')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 13, in min2
TypeError: can only compare to a set
>>> min(1)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'int' object is not iterable
>>> min2(1, default='X')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 13, in min2
TypeError: 'int' object is not iterable
>>> min((1))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'int' object is not iterable
>>> min2((1), default='X')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 13, in min2
TypeError: 'int' object is not iterable
>>> min([1])
1
>>> min2([1], default='X')
1

But, as said before, I have no clue about exceptions raised by a generator.

Denis
------
la vita e estrany


From jh at improva.dk  Thu Apr 16 12:33:05 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 16 Apr 2009 12:33:05 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
Message-ID: <49E70961.5020308@improva.dk>

Jared Grubb wrote:
> On 15 Apr 2009, at 11:17, Raymond Hettinger wrote:
>> [Adam Atlas]
>>> I propose adding a "default" keyword argument to max() and min(), 
>>> which provides a value to return in the event that an empty iterable 
>>> is passed.
>>
>> Could you write your proposal out in pure python so
>> we can see how it interacts with the key-keyword
>> argument and how it works when the number of
>> positional arguments is not one.
>
> Here's one option... I'm going to cheat a little here and just wrap 
> the built-in min, but a quick/simple answer could be:
>
> def min2(*vars, **kw):
> try:
> if 'key' in kw:
> return min(*vars, key=kw['key'])
> return min(*vars)
> except Exception:
> if 'default' in kw:
> return kw['default']
> raise

This will swallow exceptions that are completely unrelated to the number 
of values we are taking the min() of. I am -1 on that. Here is my 
version. It also includes an 'initial' argument (not part of Adams 
proposal) to illustrate the difference between the two concepts. I would 
be +1 to adding 'initial' as well, but that is less interesting than 
'default' because the 'initial' behavior is easy to get in other ways, 
e.g. "min(itertools.chain((initial,), values), key=func)".
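A quick sketch of that chain()-based 'initial' workaround (min_with_initial is just an illustrative name):

```python
from itertools import chain

def min_with_initial(values, initial, key=None):
    # Prepending initial guarantees min() never sees an empty iterable.
    it = chain((initial,), values)
    return min(it) if key is None else min(it, key=key)

print(min_with_initial([5, 7], 99))  # 5
print(min_with_initial([], 99))      # 99
```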

_marker = object()

def min(*args, **kwargs):
    # extract and validate kwargs
    initial = kwargs.pop('initial', _marker)
    default = kwargs.pop('default', _marker)
    key = kwargs.pop('key', _marker)
    if kwargs:
        raise TypeError('min() got an unexpected keyword argument')
    # validate args, this TypeError is needed for backwards compatibility
    if initial is _marker and default is _marker and not args:
        raise TypeError('min expected 1 arguments, got 0')
    # create iterator for the values
    if len(args) == 1:
        it = iter(args[0])
    else:
        it = iter(args)
    # extract first value if any and handle empty sequence
    if initial is not _marker:
        result = initial
    else:
        for result in it:
            break
        else:
            if default is _marker:
                raise ValueError('min() arg is an empty sequence')
            return default
    # handle remaining values
    if key is _marker:
        for value in it:
            if value < result:
                result = value
    else:
        resultkey = key(result)
        for value in it:
            valuekey = key(value)
            if valuekey < resultkey:
                result, resultkey = value, valuekey
    return result


And here is what I get with the examples so far:

>>> min()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 10, in min
TypeError: min expected 1 arguments, got 0
>>> min(default=0)
0
>>> min(1,2,default=0)
1
>>> min([1,2],default=0)
1
>>> min([1,set()])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 29, in min
TypeError: can only compare to a set
>>> min([1,set()], default=7)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 29, in min
TypeError: can only compare to a set
>>> def foo():
...     yield 1
...     raise ValueError
... 
>>> min(foo())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 28, in min
  File "<stdin>", line 3, in foo
ValueError
>>> min(foo(), default=None)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 28, in min
  File "<stdin>", line 3, in foo
ValueError



Cheers
- Jacob


From arnodel at googlemail.com  Thu Apr 16 13:44:51 2009
From: arnodel at googlemail.com (Arnaud Delobelle)
Date: Thu, 16 Apr 2009 12:44:51 +0100
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <49E70961.5020308@improva.dk>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
Message-ID: <0003247D-F139-4A4A-8F94-725B41872135@googlemail.com>


On 16 Apr 2009, at 11:33, Jacob Holm wrote:
>
> _marker = object()
>
> def min(*args, **kwargs):
>   # extract and validate kwargs
>   initial = kwargs.pop('initial', _marker)
>   default = kwargs.pop('default', _marker)
>   key = kwargs.pop('key', _marker)
>   if kwargs:
>       raise TypeError('min() got an unexpected keyword argument')
>   # validate args, this TypeError is needed for backwards  
> compatibility
>   if initial is _marker and default is _marker and not args:
>       raise TypeError('min expected 1 arguments, got 0')
>   # create iterator for the values
>   if len(args) == 1:
>       it = iter(args[0])
>   else:
>       it = iter(args)
>   # extract first value if any and handle empty sequence
>   if initial is not _marker:
>       result = initial
>   else:
>       for result in it:
>           break
>       else:
>           if default is _marker:
>               raise ValueError('min() arg is an empty sequence')
>           return default
>   # handle remaining values
>   if key is _marker:
>       for value in it:
>           if value < result:
>               result = value
>   else:
>       resultkey = key(result)
>       for value in it:
>           valuekey = key(value)
>           if valuekey < resultkey:
>               result, resultkey = value, valuekey
>   return result

I made a similar implementation, that I post here FWIW (using  
positional-only arguments makes it slightly more compact):

_not_provided = object()

def min(first, *rest, key=_not_provided, default=_not_provided):
     if not rest:
         rest = iter(first)
         for first in rest:
             break
         else:
             if default is _not_provided:
                 raise ValueError("min() arg is an empty sequence")
             else:
                 return default
     if key is _not_provided:
         for el in rest:
             if el < first:
                 first = el
     else:
         fkey = key(first)
         for el in rest:
             elkey = key(el)
             if elkey < fkey:
                 first, fkey = el, elkey
     return first




From steve at pearwood.info  Thu Apr 16 14:09:12 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Thu, 16 Apr 2009 22:09:12 +1000
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <2444F7FF-283C-4562-AE46-0F9F9709BB7D@googlemail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<2444F7FF-283C-4562-AE46-0F9F9709BB7D@googlemail.com>
Message-ID: <200904162209.13111.steve@pearwood.info>

On Thu, 16 Apr 2009 07:10:52 pm Arnaud Delobelle wrote:
> On 16 Apr 2009, at 04:39, Jared Grubb wrote:
> > def min2(*vars, **kw):
> >     try:
> >         if 'key' in kw:
> >              return min(*vars, key=kw['key'])
> >         return min(*vars)
> >     except Exception:
> >         if 'default' in kw:
> >             return kw['default']
> >         raise
>
> Nitpick:

I don't think pointing out that the proposed behaviour inappropriately 
swallows random exceptions is a nitpick. I think it's a valuable 
service :)

I think it is vital that min() and max() don't hide bugs by swallowing 
all exceptions. Here's my go at a pure Python version:


SENTINEL = object()

def is_iterable(obj):
    try:
        iter(obj)
    except TypeError:
        return False
    return True

def min(*vars, key=None, default=SENTINEL):
    if len(vars) == 1:
        if is_iterable(vars[0]):
            vars = iter(vars[0])
        else:
            raise TypeError
    else:
        vars = iter(vars)
    try:
        smallest = next(vars)
    except StopIteration:
        if default is SENTINEL:
            raise ValueError
        else:
            return default
    # track the key of the current minimum separately from the value,
    # so the raw value (not its key) is returned
    smallest_key = smallest if key is None else key(smallest)
    for value in vars:
        value_key = value if key is None else key(value)
        if value_key < smallest_key:
            smallest, smallest_key = value, value_key
    return smallest





-- 
Steven D'Aprano


From qrczak at knm.org.pl  Thu Apr 16 14:14:25 2009
From: qrczak at knm.org.pl (Marcin 'Qrczak' Kowalczyk)
Date: Thu, 16 Apr 2009 14:14:25 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <49E70961.5020308@improva.dk>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st> 
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com> 
	<49E70961.5020308@improva.dk>
Message-ID: <3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>

2009/4/16 Jacob Holm <jh at improva.dk>:

>   # validate args, this TypeError is needed for backwards compatibility
>   if initial is _marker and default is _marker and not args:
>       raise TypeError('min expected 1 arguments, got 0')

I would prefer to raise this error if not args, regardless of initial
and default. This makes the preconditions simpler, and the excluded
cases are never useful (the result value is statically known if the
length of args is statically known, which should always be true to
avoid the case when it has length 1).

It would even make sense to allow initial and default only if args has
length 1. This would again exclude cases where the arguments contain a
statically known redundancy, but this time the preconditions would be
more complicated.

-- 
Marcin Kowalczyk
qrczak at knm.org.pl
http://qrnik.knm.org.pl/~qrczak/


From jh at improva.dk  Thu Apr 16 14:20:01 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 16 Apr 2009 14:20:01 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <0003247D-F139-4A4A-8F94-725B41872135@googlemail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<0003247D-F139-4A4A-8F94-725B41872135@googlemail.com>
Message-ID: <49E72271.4050306@improva.dk>

Hi Arnaud

Arnaud Delobelle wrote:
>
> On 16 Apr 2009, at 11:33, Jacob Holm wrote:
>>
>> [snip my example]
>
> _not_provided = object()
>
> def min(first, *rest, key=_not_provided, default=_not_provided):
> if not rest:
> rest = iter(first)
> for first in rest:
> break
> else:
> if default is _not_provided:
> raise ValueError("min() arg is an empty sequence")
> else:
> return default
> if key is _not_provided:
> for el in rest:
> if el < first:
> first = el
> else:
> fkey = key(first)
> for el in rest:
> elkey = key(el)
> if elkey < fkey:
> first, fkey = el, elkey
> return first
>
>
Yes, that is indeed a bit easier on the eyes, but doesn't work on 2.6. 
Also your version treats the case "min(default=0)" differently from 
mine. That might be a good thing though :) The only reason I could see 
for anyone hitting that case would be a use like "min(*values, 
default=0)", and that is much better handled by dropping the star anyway 
because of the special case for sequences of length 1.

Cheers
- Jacob




From jh at improva.dk  Thu Apr 16 14:27:23 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 16 Apr 2009 14:27:23 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
Message-ID: <49E7242B.2070009@improva.dk>

Marcin 'Qrczak' Kowalczyk wrote:
> 2009/4/16 Jacob Holm <jh at improva.dk>:
>
>   
>>   # validate args, this TypeError is needed for backwards compatibility
>>   if initial is _marker and default is _marker and not args:
>>       raise TypeError('min expected 1 arguments, got 0')
>>     
>
> I would prefer to raise this error if not args, regardless of initial
> and default. This makes the preconditions simpler, and the excluded
> cases are never useful (the result value is statically known if the
> length of args is statically known, which should always be true to
> avoid the case when it has length 1).
>   

I agree.

> It would even make sense to allow initial and default only if args has
> length 1. This would again exclude cases where the arguments contain a
> statically known redundancy, but this time the preconditions would be
> more complicated.
>   

It would complicate the preconditions without much gain, but I don't 
really care either way.

- Jacob


From denis.spir at free.fr  Thu Apr 16 19:40:39 2009
From: denis.spir at free.fr (spir)
Date: Thu, 16 Apr 2009 19:40:39 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
Message-ID: <20090416194039.39d48ce8@o>

Le Thu, 16 Apr 2009 14:14:25 +0200,
"Marcin 'Qrczak' Kowalczyk" <qrczak at knm.org.pl> s'exprima ainsi:

> It would even make sense to allow initial and default only if args has
> length 1. This would again exclude cases where the arguments contain a
> statically known redundancy, but this time the preconditions would be
> more complicated.

As I understand it, anyway, default really makes sense only when args has length 1 and args[0] is an iterable.
Cannot really see the sense of initial for min()/max().

Denis
------
la vita e estrany


From jared.grubb at gmail.com  Thu Apr 16 20:19:32 2009
From: jared.grubb at gmail.com (Jared Grubb)
Date: Thu, 16 Apr 2009 11:19:32 -0700
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <2444F7FF-283C-4562-AE46-0F9F9709BB7D@googlemail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<2444F7FF-283C-4562-AE46-0F9F9709BB7D@googlemail.com>
Message-ID: <91B141F0-AFDB-4A83-AEA3-04D375E42483@gmail.com>


On 16 Apr 2009, at 02:10, Arnaud Delobelle wrote:
> On 16 Apr 2009, at 04:39, Jared Grubb wrote:
>> def min2(*vars, **kw):
>>    try:
>>        if 'key' in kw:
>>             return min(*vars, key=kw['key'])
>>        return min(*vars)
>>    except Exception:
>>        if 'default' in kw:
>>            return kw['default']
>>        raise
>
> Nitpick: [...]

Yes, the "except Exception" was intentional such that "default=..."  
gives a no-throw guarantee. (I originally had "except TypeError", but  
then that would swallow all TypeError, even those given by the  
iterator; and between THOSE two behaviors, no-throw seemed most  
intuitive)

I'm 0 on whether that should be the semantics or not, but as some have  
pointed out, swallowing bugs in iterators is not always a good thing.  
On the other hand, a no-throw min is kinda nice too...

I personally prefer some of the other posters' versions that basically  
do "default catches only the TypeError that min would have thrown if  
iterator was empty" semantics.

Jared


From python at rcn.com  Thu Apr 16 20:57:42 2009
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 16 Apr 2009 11:57:42 -0700
Subject: [Python-ideas] 'default' keyword argument for max(), min()
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st><0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1><2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com><49E70961.5020308@improva.dk><3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
Message-ID: <7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>


>> It would even make sense to allow initial and default only if args has
>> length 1. This would again exclude cases where the arguments contain a
>> statically known redundancy, but this time the preconditions would be
>> more complicated.
>
> As I understand it, anyway, default really makes sense only when args has length 1 and args[0] is an iterable.
> Cannot really see the sense of initial for min()/max().


So the motivating case for a default argument boils down to:

* The input is an iterable (otherwise the number of positional arguments is already known when the call is written).
* The input is not a sequence of known length (otherwise, you could just use "min(seq) if seq else default").
* The input is potentially long (otherwise you could trivially convert to a sequence with list(iterable)).
* The input is potentially empty (otherwise you wouldn't need a default).
* There is a semantically meaningful default case for an empty input.
* You only want the min or max but no other information from the iterable (otherwise you would need to convert it to a sequence so 
that min/max wouldn't consume all the data).

I think this is a YAGNI case.  Yes, it does come up every now and then
but I don't think it is worth complicating what should be a very simple function.

FWIW, we recently rejected a perfectly reasonable addition to operator.attrgetter()
and operator.itemgetter() that would have added a default value.  The problem
is that it didn't work well with the other extensions that had already been accepted
(like having multiple arguments such as itemgetter(1, 4, 7) or dotted attribute
chains like attrgetter("store.department.register")).  The concept behind the rejection
is that it isn't worthwhile to overload a function with too many alternative extensions
even if the extensions make sense taken individually.

I contend that min/max are already in that position.   They already have some
signature complexity with min()-->TypeError; min(x)-->where-x-is-iterable;
min(x,y,z)-->where-args-are-unrolled; and min(*args) changing behavior based
on the length of args.  On top of that, we've already extended min/max with a
key= argument, further adding to its complexity.

I think the default arg is a bridge too far.  This is evidenced by the
complexity of the suggested implementations (remember the zen of python).
It is also evidenced by the above discussion on signature complexity,
resulting from a kitchen sink full of individually simple extensions,
and by the disagreements in this thread about what the various corner
cases should do.  And at least one respondent significantly
misinterpreted the default-argument as being equivalent to an
initial-argument.

There has been almost zero discussion on use cases.  There is no doubt
that they exist, but no compelling cases have been presented (i.e. real-world
code that is much improved with the extension).   Likewise, there has been
no discussion of alternatives (it is not hard to write an itertool that wraps
an iterator with something that supplies a default when the iterator is empty).
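The itertool Raymond alludes to is easy to sketch (iter_default is just an illustrative name):

```python
from itertools import chain

def iter_default(iterable, default):
    """Return an iterator over iterable, or over just (default,) if empty."""
    it = iter(iterable)
    for first in it:
        # Not empty: put the first item back and pass everything through.
        return chain((first,), it)
    return iter((default,))

print(min(iter_default([3, 1, 2], 0)))  # 1
print(min(iter_default([], 0)))         # 0
```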

one-aspect-of-language-design-is-knowing-when-to-quit-ly yours,


Raymond








From aahz at pythoncraft.com  Thu Apr 16 21:34:58 2009
From: aahz at pythoncraft.com (Aahz)
Date: Thu, 16 Apr 2009 12:34:58 -0700
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
References: <20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
Message-ID: <20090416193458.GA22884@panix.com>

On Thu, Apr 16, 2009, Raymond Hettinger wrote:
>
> There has been almost zero discussion on use cases.  There is no doubt
> that they exist, but no compelling cases have been presented (i.e.
> real-world code that is much improved with the extension).  Likewise,
> there has been no discussion of alternatives (it is not hard to write
> an itertool that wraps an iterator with something that supplies a
> default when the iterator is empty).
>
> one-aspect-of-language-design-is-knowing-when-quit-ly yours,

+1
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"If you think it's expensive to hire a professional to do the job, wait
until you hire an amateur."  --Red Adair


From george.sakkis at gmail.com  Thu Apr 16 22:14:10 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Thu, 16 Apr 2009 16:14:10 -0400
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
Message-ID: <91ad5bf80904161314q5456910dp8cf07084024e54eb@mail.gmail.com>

On Thu, Apr 16, 2009 at 2:57 PM, Raymond Hettinger <python at rcn.com> wrote:

> So the motivating case for a default argument boils down to:
>
> * The input is an iterable (otherwise the number of positional arguments is
> already known when the call is written).

Yes.

> * The input is not a sequence of known length (otherwise, you could just use
> "min(seq) if seq else default").

True, but the latter is easy to forget, less succinct and easy to miss
when refactoring a function to work on iterables instead of sequences
only.

> * The input is potentially long (otherwise you could trivially convert to a
> sequence with list(iterable)).

That should be the default mentality; unless one *knows* that the
input is "short" (for some definition of "short"), one should assume
that it is potentially long. Regardless of the length, I don't think
it's the responsibility of the iterable's consumer to convert it; if
the input is always short, why isn't it a sequence in the first place?

> * The input is potentially empty (otherwise you wouldn't need a default).

Yes.

> * There is a semantically meaningful default case for an empty input.

Yes, typically 0 or None.

> * You only want the min or max but no other information from the iterable
> (otherwise you would need to convert it to a sequence so that min/max
> wouldn't consume all the data).

Yes, I often use min/max with generator expressions. Compare:
    if min(f(x) for x in iterable if x>0) > 0:
with
    _values = [f(x) for x in iterable if x>0]
    if _values and min(_values) > 0:

> I think this is a YAGNI case. ?Yes, it does come up every now and then
> but I don't think it is worth complicating what should be a very simple
> function.

The discussion has indeed sidetracked with handling the special cases,
signature definition and whatnot, but I believe meeting the conditions
you outlined above is not as rare as their number implies. I hope the
rest of the thread focuses on this motivating case so that this
proposal is not rejected due to excessive bikeshedding.

George


From george.sakkis at gmail.com  Thu Apr 16 23:06:17 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Thu, 16 Apr 2009 17:06:17 -0400
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
Message-ID: <91ad5bf80904161406x7e27e491m48e8ea74671fbd65@mail.gmail.com>

On Thu, Apr 16, 2009 at 2:57 PM, Raymond Hettinger <python at rcn.com> wrote:

> I think the default arg is a bridge too far. ?This is evidenced by the
> complexity of the suggested implementations (remember the zen of python).

FWIW, the following works good enough for my use cases:

def min2(*args, **kwds):
    if 'default' not in kwds:
        return min(*args, **kwds)
    default = kwds.pop('default')
    try: return min(*args, **kwds)
    except ValueError, ex:
        if 'arg is an empty sequence' in ex.message:
            return default
        raise

As an aside, it would be nice If min/max start raising a more narrow
ValueError subtype, say EmptyIterableError, so that hacks such as
checking the exception message are not necessary.
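
Such a wrapper can be sketched today (EmptyIterableError and min_or_raise
are hypothetical names of mine, and this is Python 3 syntax, unlike the
thread's Python 2 examples):

```python
class EmptyIterableError(ValueError):
    # Hypothetical narrower exception for min()/max() on an empty iterable.
    pass

def min_or_raise(iterable, **kwds):
    # Thin wrapper over the builtin that re-raises with the narrower type,
    # so callers can catch it without inspecting the exception message.
    # (Note: a key function raising ValueError would also be converted here.)
    try:
        return min(iterable, **kwds)
    except ValueError:
        raise EmptyIterableError("min() arg is an empty sequence")

try:
    result = min_or_raise([])
except EmptyIterableError:
    result = 0
print(result)   # 0
```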

George


From jess.austin at gmail.com  Fri Apr 17 00:35:59 2009
From: jess.austin at gmail.com (Jess Austin)
Date: Thu, 16 Apr 2009 17:35:59 -0500
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
In-Reply-To: <49E7A823.3060702@trueblade.com>
References: <b8ad139e0904161441g9782b75r1bd5811157eca079@mail.gmail.com>
	<49E7A823.3060702@trueblade.com>
Message-ID: <b8ad139e0904161535y3257a322wb1a31c93927ea3c7@mail.gmail.com>

On Thu, Apr 16, 2009 at 4:50 PM, Eric Smith <eric at trueblade.com> wrote:
> Jess Austin wrote:
>>
>> What other behavior options besides "last-valid-day-of-the-month"
>> would you like to see?
>
> - Add 30 days to the source date.
> I'm sure there are others.

Python can do this already:

dt + timedelta(30)
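
As a runnable illustration (the dates are my own, not from the thread):

```python
from datetime import date, timedelta

dt = date(2009, 4, 16)
# A fixed offset in days is already expressible with timedelta.
print(dt + timedelta(30))   # 2009-05-16
```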

> Followups to python-ideas.

This is fine with me, and I have joined -ideas now.  I do still hope
that an experienced core developer will have the time to comment on
the code itself, but it might be best to get discussion of alternate
functionality off -dev.

cheers,
Jess


From pyideas at rebertia.com  Fri Apr 17 00:41:07 2009
From: pyideas at rebertia.com (Chris Rebert)
Date: Thu, 16 Apr 2009 15:41:07 -0700
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
In-Reply-To: <b8ad139e0904161535y3257a322wb1a31c93927ea3c7@mail.gmail.com>
References: <b8ad139e0904161441g9782b75r1bd5811157eca079@mail.gmail.com>
	<49E7A823.3060702@trueblade.com>
	<b8ad139e0904161535y3257a322wb1a31c93927ea3c7@mail.gmail.com>
Message-ID: <50697b2c0904161541k613f8ad8jba3e3f553853dfb@mail.gmail.com>

On Thu, Apr 16, 2009 at 3:35 PM, Jess Austin <jess.austin at gmail.com> wrote:
> On Thu, Apr 16, 2009 at 4:50 PM, Eric Smith <eric at trueblade.com> wrote:
>> Jess Austin wrote:
>>>
>>> What other behavior options besides "last-valid-day-of-the-month"
>>> would you like to see?
>>
>> - Add 30 days to the source date.
>> I'm sure there are others.
>
> Python can do this already:
>
> dt + timedelta(30)
>
>> Followups to python-ideas.

You might want to look at this recent thread on the issue:
http://mail.python.org/pipermail/python-list/2009-March/704921.html

Apologies if it's already been mentioned on -dev; I don't subscribe to it.

Cheers,
Chris
-- 
I have a blog:
http://blog.rebertia.com


From python at rcn.com  Fri Apr 17 01:07:10 2009
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 16 Apr 2009 16:07:10 -0700
Subject: [Python-ideas] 'default' keyword argument for max(), min()
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
	<91ad5bf80904161314q5456910dp8cf07084024e54eb@mail.gmail.com>
Message-ID: <A571D09BBD22489CB4015BEA67E8AD94@RaymondLaptop1>

> Yes, I often use min/max with gen. expressions: Compare:
>     if min(f(x) for x in iterable if x>0) > 0:

Do you mean:
      if min((f(x) for x in iterable if x>0), default=0) > 0: ...

I don't find that to be a clear expression of what you're trying to do.
Too much logic forced into a one-liner (also note that the inner parens
are required).

> with
>   _values = [f(x) for x in iterable if x>0]
>   if _values and min(_values) > 0:

or with:
   if all(f(x)>0 for x in iterable if x>0):             ...

I think you're focusing on just one solution, one that involves piling-up
too many extensions in one function that should be dirt simple.  There
are many other approaches:  try/except, wrap the input in a default
itertool, use all(), use next(it, default) to test the first value, etc.
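
A couple of those alternatives in runnable form (f is a stand-in function
of my own; Python 3 syntax, while the thread's examples are Python 2):

```python
def f(x):
    return x * 2          # stand-in for the thread's f()

# 1) try/except around the builtin, handling the empty case explicitly
def min_positive(iterable, default=0):
    try:
        return min(f(x) for x in iterable if x > 0)
    except ValueError:    # empty sequence
        return default

# 2) all(): for the "is every f(x) > 0" question, skip min() entirely
#    (vacuously True when nothing passes the filter)
def all_positive(iterable):
    return all(f(x) > 0 for x in iterable if x > 0)

print(min_positive([1, 2, 3]))   # 2
print(min_positive([]))          # 0
print(all_positive([]))          # True
```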

>> The discussion has indeed sidetracked with handling the special cases,
>> signature definition and whatnot, but I believe meeting the conditions
>> you outlined above is not as rare as their number implies. 

This may be a symptom of a particular programming style.
I've found zero useful examples in scans of the standard library,
in my own personal code base, or third-party extensions that I use regularly.


>> I hope the
>> rest of the thread focuses on this motivating case so that this
>> proposal is not rejected due to excessive bikeshedding.

A discussion of use cases is always helpful, but the rest of
the discussion wasn't bikeshedding.  It revealed that the
default-argument doesn't make sense with non-iterable positional
arguments and that some were confusing it with an initial-argument.
No one yet has produced a clean, pure-python version that only 
affects a single iterable argument (ignoring positional cases where
a default doesn't make sense) and that doesn't wrap the existing 
min/max code (it is important to look at the fully spelled-out 
pure python code to see that the overall design, taking all features 
into account, isn't clean).

Also, I did a couple of quick checks on other languages to see
whether any use a default for an empty min(), but had no luck.  Do you
know of any languages where a min() with a default is a
proven best practice?

> As an aside, it would be nice If min/max start raising a more narrow
> ValueError subtype, say EmptyIterableError, so that hacks such as
> checking the exception message are not necessary.

I would support that proposal if it would end this effort to
complexify min/max.  


Raymond



FWIW, here's an itertool recipe that you may find useful.

from itertools import chain

def default(iterable, default=None):
    '''Yield elements of the iterable or, if it is empty, yield the default.

    default([1,2,3], default=0) --> 1 2 3
    default([], default=0)      --> 0

    '''
    it = iter(iterable)
    return chain([next(it, default)], it)
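
Composed with the builtin, the recipe handles the empty case without
touching min() itself (the recipe is restated so the snippet runs
standalone, in Python 3 syntax):

```python
from itertools import chain

def default(iterable, default=None):
    # Yield the iterable's elements, or just the default if it is empty.
    it = iter(iterable)
    return chain([next(it, default)], it)

print(min(default([], default=0)))         # 0
print(min(default([5, 3, 9], default=0)))  # 3
```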


From jess.austin at gmail.com  Fri Apr 17 01:32:10 2009
From: jess.austin at gmail.com (Jess Austin)
Date: Thu, 16 Apr 2009 18:32:10 -0500
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
In-Reply-To: <50697b2c0904161541k613f8ad8jba3e3f553853dfb@mail.gmail.com>
References: <b8ad139e0904161441g9782b75r1bd5811157eca079@mail.gmail.com>
	<49E7A823.3060702@trueblade.com>
	<b8ad139e0904161535y3257a322wb1a31c93927ea3c7@mail.gmail.com>
	<50697b2c0904161541k613f8ad8jba3e3f553853dfb@mail.gmail.com>
Message-ID: <b8ad139e0904161632v386610f5w3078a48d293169e7@mail.gmail.com>

On Thu, Apr 16, 2009 at 5:41 PM, Chris Rebert <pyideas at rebertia.com> wrote:
> You might want to look at this recent thread on the issue:
> http://mail.python.org/pipermail/python-list/2009-March/704921.html
>
> Apologies if it's already been mentioned on -dev; I don't subscribe to it.

Haha, and up until now I haven't subscribed to -ideas.  It seemed the
main complaint in the previous thread was that there are different
ways to handle the corner case of the end of the month, and we
certainly can't just choose one way.

My take on that is that if you want an exception for invalid dates,
use date.replace().  If you want an exact number of days offset, use
timedelta.  If you want the same date, some number of months offset,
while month-end issues are silently handled, you can use the
monthdelta patch I have at http://bugs.python.org/issue5434 and
introduced at http://mail.python.org/pipermail/python-dev/2009-April/088794.html
.
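
A rough pure-Python sketch of the "silently handled" month-end behaviour
described above (illustrative only; add_months is my own name, and this
is not the code from the issue-5434 patch):

```python
import calendar
from datetime import date

def add_months(d, months):
    # Shift the month, then clamp the day to the last valid day of the
    # target month ("last-valid-day-of-the-month" behaviour).
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return d.replace(year=year, month=month, day=day)

print(add_months(date(2009, 1, 31), 1))   # 2009-02-28
```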

I'm also aware of dateutil, but in its current form it couldn't be
integrated into core as "batteries included".  IMHO dateutil isn't as
pythonic as it could be either.  My philosophy when designing this was
to be as much like timedelta (which I think is a super API) as
possible.

cheers,
Jess


From python at rcn.com  Fri Apr 17 02:20:00 2009
From: python at rcn.com (Raymond Hettinger)
Date: Thu, 16 Apr 2009 17:20:00 -0700
Subject: [Python-ideas] 'default' keyword argument for max(), min()
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
	<91ad5bf80904161314q5456910dp8cf07084024e54eb@mail.gmail.com>
Message-ID: <70D16501CA1D434E8753548301227932@RaymondLaptop1>

> The discussion has indeed sidetracked with handling the special cases,
> signature definition and whatnot, 

Does the following code emit 10 or -10?

    print(min([], default=10, key=operator.neg))


Raymond




From jh at improva.dk  Fri Apr 17 02:31:39 2009
From: jh at improva.dk (Jacob Holm)
Date: Fri, 17 Apr 2009 02:31:39 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <70D16501CA1D434E8753548301227932@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>	<49E70961.5020308@improva.dk>	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>	<20090416194039.39d48ce8@o>	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>	<91ad5bf80904161314q5456910dp8cf07084024e54eb@mail.gmail.com>
	<70D16501CA1D434E8753548301227932@RaymondLaptop1>
Message-ID: <49E7CDEB.7090801@improva.dk>

Raymond Hettinger wrote:
>> The discussion has indeed sidetracked with handling the special cases,
>> signature definition and whatnot, 
>
> Does the following code emit 10 or -10?
>
> print(min([], default=10, key=operator.neg))
>

10, obviously. The result is always one of the provided values. The key
function is only used for selecting which one.
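
For example, with the existing builtin, key affects only the comparisons:

```python
# abs() is used for the comparisons, but the original element is returned.
print(min([3, -5, 4], key=abs))   # 3
```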


- Jacob


From tleeuwenburg at gmail.com  Fri Apr 17 02:52:55 2009
From: tleeuwenburg at gmail.com (Tennessee Leeuwenburg)
Date: Fri, 17 Apr 2009 10:52:55 +1000
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
In-Reply-To: <18919.51931.874515.848841@montanaro.dyndns.org>
References: <b8ad139e0904152318p5473cbe5yb5f55a19894cc834@mail.gmail.com>
	<18918.61476.980951.991275@montanaro.dyndns.org>
	<b8ad139e0904161131g2f2b84fbpd67952697952afa9@mail.gmail.com>
	<18919.51931.874515.848841@montanaro.dyndns.org>
Message-ID: <43c8685c0904161752i6a7f4a23o3ece8f5b71ec6dd8@mail.gmail.com>

My thoughts on balance regarding monthdeltas:
  -- Month operations are useful, people will want to do them
  -- I think having a monthdelta object rather than a method makes sense
     to me
  -- I think the documentation is severely underdone. The functionality
     is not intuitive and therefore the docs need a lot more detail than
     usual
  -- Can you specify "1 month plus 10 days", i.e. add a monthdelta to a
     timedelta?
  -- What about other cyclical periods (fortnights, 28 days, lunar
     cycles, high tides)?

Cheers,
-T
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090417/9a0a5f57/attachment.html>

From steve at pearwood.info  Fri Apr 17 03:35:34 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Fri, 17 Apr 2009 11:35:34 +1000
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <200904162209.13111.steve@pearwood.info>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<2444F7FF-283C-4562-AE46-0F9F9709BB7D@googlemail.com>
	<200904162209.13111.steve@pearwood.info>
Message-ID: <200904171135.34835.steve@pearwood.info>

On Thu, 16 Apr 2009 10:09:12 pm Steven D'Aprano wrote:

> I think it is vital that min() and max() don't hide bugs by
> swallowing all exceptions. Here's my go at a pure Python version:

As folks may have noticed by actually trying to run the damn thing, my 
first attempt hadn't been tested at all and failed miserably to work. 
Think of it as pseudo-code, which believe it or not I did intend to 
write but got distracted and forgot. *wry-grin*

(Thanks to Denis Spir for the polite way he pointed that out off-list.)

I agree with Raymond that in the absence of a compelling use-case, 
python-dev won't accept the proposal. (I'm still +1 on the idea, but I 
know when I'm licked.)

For the record, in case future generations want to re-visit the 
proposal, here's a pure Python version of min() plus default which I 
have tested in Python 2.6.1 and I think it should match the behaviour 
of the built-in min(). Anyone who wants to use it should feel free to 
do so (credit would be nice but not required).

def is_iterable(obj):
    try:
        iter(obj)
    except TypeError:
        return False
    return True

def min(*vars, **kwargs):
    SENTINEL = object()
    kw = {'key': None, 'default': SENTINEL}
    kw.update(kwargs)
    if len(kw) != 2:
        raise TypeError('min() got an unexpected keyword argument')
    key = kw['key']
    default = kw['default']
    if len(vars) == 1:
        if is_iterable(vars[0]):
            vars = iter(vars[0])
        else:
            raise TypeError(
            "'%s' object is not iterable" % type(vars[0]).__name__)
    else:
        vars = iter(vars)
    try:
        result = vars.next()
    except StopIteration:
        if default is SENTINEL:
            raise ValueError("min() arg is an empty sequence")
        else:
            return default
    compare_result = result if key is None else key(result)
    for value in vars:
        compare_value = value if key is None else key(value)
        if compare_value < compare_result:
            result = value
            compare_result = compare_value
    return result



-- 
Steven D'Aprano


From george.sakkis at gmail.com  Fri Apr 17 03:50:42 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Thu, 16 Apr 2009 21:50:42 -0400
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <A571D09BBD22489CB4015BEA67E8AD94@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
	<91ad5bf80904161314q5456910dp8cf07084024e54eb@mail.gmail.com>
	<A571D09BBD22489CB4015BEA67E8AD94@RaymondLaptop1>
Message-ID: <91ad5bf80904161850i46674ecal2d3a24d4ba50e28@mail.gmail.com>

On Thu, Apr 16, 2009 at 7:07 PM, Raymond Hettinger <python at rcn.com> wrote:

> I think you're focusing on just one solution, one that involves piling-up
> too many extensions in one function that should be dirt simple.  There
> are many other approaches:  try/except, wrap the input in a default
> itertool, use all(), use next(it, default) to test the first value, etc.

Right, I've now grown used to wrapping it in try/except in advance
just to be on the safe side; in the past I had to go back and fix it
after it had already bombed with a ValueError once. So yes, overall I
find the proposed parameter a handy convenience, not a fix to a
glaring omission.

I think what it boils down to is how exceptional the empty iterable is
for the overall task at hand. Does what follows the min/max
computation differ substantially when the input is empty from when
it's not, or can the code that follows remain blissfully ignorant
of how the value was determined? In the former case, raising an
exception is a feature since it forces the user to think about this
case and handle it explicitly. If not, then it's just extra
boilerplate we could do without.

A good example of the first case is division by zero: typically the
course of action is totally different if some intermediate computation
involves division by zero; you can't just replace it with some default
and continue with the rest of the computation as if nothing happened
(although a default might make sense as the "final" result, depending
on the application). OTOH, code that uses dict.get() or getattr() with
a default doesn't really care whether the key/attribute is actually in
the queried dict/object, the following logic remains the same.

So the bottom line is: are most use cases of taking the min/max of an
empty iterable closer to those that involve an intermediate
DivisionByZero or to those that involve a missing key/attribute? I
don't have a general answer, both make sense under different
circumstances.

George


PS: Thanks for the default() recipe, seems generally useful, not only
for min/max.


From aahz at pythoncraft.com  Fri Apr 17 03:52:35 2009
From: aahz at pythoncraft.com (Aahz)
Date: Thu, 16 Apr 2009 18:52:35 -0700
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
In-Reply-To: <43c8685c0904161752i6a7f4a23o3ece8f5b71ec6dd8@mail.gmail.com>
References: <b8ad139e0904152318p5473cbe5yb5f55a19894cc834@mail.gmail.com>
	<18918.61476.980951.991275@montanaro.dyndns.org>
	<b8ad139e0904161131g2f2b84fbpd67952697952afa9@mail.gmail.com>
	<18919.51931.874515.848841@montanaro.dyndns.org>
	<43c8685c0904161752i6a7f4a23o3ece8f5b71ec6dd8@mail.gmail.com>
Message-ID: <20090417015235.GA428@panix.com>

On Fri, Apr 17, 2009, Tennessee Leeuwenburg wrote:
>
>   -- What about other cyclical periods (fortnights, 28 days, lunar cycles,
> high tides)?

That reminds me that we need to add a units module so that we can specify
furlongs per fortnight!
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"If you think it's expensive to hire a professional to do the job, wait
until you hire an amateur."  --Red Adair


From tleeuwenburg at gmail.com  Fri Apr 17 04:08:08 2009
From: tleeuwenburg at gmail.com (Tennessee Leeuwenburg)
Date: Fri, 17 Apr 2009 12:08:08 +1000
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
In-Reply-To: <20090417015235.GA428@panix.com>
References: <b8ad139e0904152318p5473cbe5yb5f55a19894cc834@mail.gmail.com>
	<18918.61476.980951.991275@montanaro.dyndns.org>
	<b8ad139e0904161131g2f2b84fbpd67952697952afa9@mail.gmail.com>
	<18919.51931.874515.848841@montanaro.dyndns.org>
	<43c8685c0904161752i6a7f4a23o3ece8f5b71ec6dd8@mail.gmail.com>
	<20090417015235.GA428@panix.com>
Message-ID: <43c8685c0904161908g72941e75xfa11b345138654ad@mail.gmail.com>

On Fri, Apr 17, 2009 at 11:52 AM, Aahz <aahz at pythoncraft.com> wrote:

> On Fri, Apr 17, 2009, Tennessee Leeuwenburg wrote:
> >
> >   -- What about other cyclical periods (fortnights, 28 days, lunar
> cycles,
> > high tides)?
>
> That reminds me that we need to add a units module so that we can specify
> furlongs per fortnight!


The Python Papers carried an article about a year back from someone who had
developed a units module; it would be very useful in some circumstances. The
codebase I work on has heaps of faux-units functionality which would be
great to wrap up into more fundamental unit-aware classes.

Cheers,
-T
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090417/47282f09/attachment.html>

From greg.ewing at canterbury.ac.nz  Fri Apr 17 04:18:33 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 17 Apr 2009 14:18:33 +1200
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
In-Reply-To: <43c8685c0904161752i6a7f4a23o3ece8f5b71ec6dd8@mail.gmail.com>
References: <b8ad139e0904152318p5473cbe5yb5f55a19894cc834@mail.gmail.com>
	<18918.61476.980951.991275@montanaro.dyndns.org>
	<b8ad139e0904161131g2f2b84fbpd67952697952afa9@mail.gmail.com>
	<18919.51931.874515.848841@montanaro.dyndns.org>
	<43c8685c0904161752i6a7f4a23o3ece8f5b71ec6dd8@mail.gmail.com>
Message-ID: <49E7E6F9.8000905@canterbury.ac.nz>

Tennessee Leeuwenburg wrote:

>   -- What about other cyclical periods (fortnights, 28 days, lunar 
> cycles, high tides)?

Fortnights and 28 days are fixed numbers of days, so
they should be covered by the existing timedelta
object.

Lunar cycles and high tides would be interesting,
but they probably belong in a different module. Also,
being astronomically determined, they wouldn't exactly
fit into the datetime module's calendrical notion of
time, which ignores leap seconds and other such
inconvenient physical phenomena.

-- 
Greg


From ben+python at benfinney.id.au  Fri Apr 17 04:19:37 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Fri, 17 Apr 2009 12:19:37 +1000
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
References: <b8ad139e0904152318p5473cbe5yb5f55a19894cc834@mail.gmail.com>
	<18918.61476.980951.991275@montanaro.dyndns.org>
	<b8ad139e0904161131g2f2b84fbpd67952697952afa9@mail.gmail.com>
	<18919.51931.874515.848841@montanaro.dyndns.org>
	<43c8685c0904161752i6a7f4a23o3ece8f5b71ec6dd8@mail.gmail.com>
	<20090417015235.GA428@panix.com>
Message-ID: <87r5zsowcm.fsf@benfinney.id.au>

Aahz <aahz at pythoncraft.com> writes:

> That reminds me that we need to add a units module so that we can
> specify furlongs per fortnight!

+1, both to the humour and to the serious idea of such a module.

-- 
 \            "Conscience is the inner voice that warns us somebody is |
  `\                                       looking." -Henry L. Mencken |
_o__)                                                                  |
Ben Finney



From greg.ewing at canterbury.ac.nz  Fri Apr 17 04:25:27 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 17 Apr 2009 14:25:27 +1200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <91ad5bf80904161850i46674ecal2d3a24d4ba50e28@mail.gmail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
	<91ad5bf80904161314q5456910dp8cf07084024e54eb@mail.gmail.com>
	<A571D09BBD22489CB4015BEA67E8AD94@RaymondLaptop1>
	<91ad5bf80904161850i46674ecal2d3a24d4ba50e28@mail.gmail.com>
Message-ID: <49E7E897.8010903@canterbury.ac.nz>

George Sakkis wrote:

> So the bottom line is, are most use cases of taking the min/max of an
> empty iterable closer to these that involve an intermediate
> DivisionByZero or to those that involve a missing key/attribute ?

In my experience, it's always been much more like
DivisionByZero -- either it shouldn't happen in the
first place or something radically different must
be done.

-- 
Greg


From skip at pobox.com  Fri Apr 17 05:17:54 2009
From: skip at pobox.com (skip at pobox.com)
Date: Thu, 16 Apr 2009 22:17:54 -0500
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
In-Reply-To: <20090417015235.GA428@panix.com>
References: <20090417015235.GA428@panix.com>
Message-ID: <18919.62690.127063.932977@montanaro.dyndns.org>

    Aahz> That reminds me that we need to add a units module so that we can
    Aahz> specify furlongs per fortnight!

easy_install magnitude

Skip





From greg.ewing at canterbury.ac.nz  Fri Apr 17 06:52:20 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 17 Apr 2009 16:52:20 +1200
Subject: [Python-ideas] Revised**10 PEP on Yield-From
In-Reply-To: <49E5BCC0.3060907@improva.dk>
References: <49E58076.4060202@canterbury.ac.nz> <49E5BCC0.3060907@improva.dk>
Message-ID: <49E80B04.8040908@canterbury.ac.nz>

Jacob Holm wrote:

> 1) IIRC, the use of sys.exc_info() is not needed in 3.x as all 
> exceptions have a __traceback__ attribute.

The 3.0 docs still list the signature of throw() as
having 3 args, though, so I'll leave it that way for
now.

> 2) The expansion is not handling StopIterations raised as a result of 
> calling _i.throw().

Yes, I was a bit too aggressive in ripping out
StopIteration handling there. :-)

> A simpler expansion based on 1) and 2) but otherwise identical is:

I had a try block around everything in an earlier
version, but I changed it because it was prone to
catching too much. I think I'll stick with separate
targeted try blocks, because it's easier to make
sure I'm catching only what I intend to catch.

> 3) If the subiterator has a close() but doesn't have throw() it won't be 
> closed when throw() is called on the outer generator.  This is fine with 
> me, I am just not sure if it is intentional.
> 
> 4) If the subiterator has a close() but doesn't have send() it won't be 
> closed when a send() on the outer generator causes an AttributeError in 
> the expansion.  Again this is fine with me, I am just not sure if it is 
> intentional.

I'm not worried about those much. The important thing
is for explicit finalization to work as expected.

> 5) The last paragraph in the "Use of StopIteration to return values" 
> section, seems to be a leftover from an earlier draft of the PEP that 
> used a different exception.

Yes, I need to remove that.

> 6) Several of the issues we have been discussing on python-ideas are not 
> mentioned at all:

I'll add some discussion about these.

> 7) By not mentioning caching, you are effectively saying the methods 
> won't be cached.

That's what I mean to say. Guido has pointed out that
Python doesn't generally do cacheing if it would change
the behaviour of the code, which means we can't do
cacheing here.

There's still a fast path available if the subiterator
is a generator, which is the case I'm mostly concerned
about, so I'm not as worried as I was about not being
able to cache send().

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Apr 17 06:56:24 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 17 Apr 2009 16:56:24 +1200
Subject: [Python-ideas] Revised**11 PEP on Yield-From
Message-ID: <49E80BF8.4090803@canterbury.ac.nz>

Draft 12 of the PEP.

Fixed a bug in the expansion (didn't handle
StopIteration raised by throw).

Removed paragraph about StopIteration left over
from an earlier version.

Added some discussion about rejected ideas.

-- 
Greg
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: yield-from-rev12.txt
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090417/0debaceb/attachment.txt>

From jared.grubb at gmail.com  Fri Apr 17 09:33:48 2009
From: jared.grubb at gmail.com (Jared Grubb)
Date: Fri, 17 Apr 2009 00:33:48 -0700
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <A571D09BBD22489CB4015BEA67E8AD94@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
	<91ad5bf80904161314q5456910dp8cf07084024e54eb@mail.gmail.com>
	<A571D09BBD22489CB4015BEA67E8AD94@RaymondLaptop1>
Message-ID: <3C4CA5BF-C4FA-4181-B7AF-BFEED19FDF75@gmail.com>

On 16 Apr 2009, at 16:07, Raymond Hettinger wrote:
>>> The discussion has indeed sidetracked with handling the special  
>>> cases,
>>> signature definition and whatnot, but I believe meeting the  
>>> conditions
>>> you outlined above is not as rare as their number implies.
>
> This may be a symptom of a particular programming style.
> I've found zero useful examples in scans of the standard library,
> in my own personal code base, or third-party extensions that I use  
> regularly.

I did find one example in the stdlib in my scan (Python 2.6.1):

doctest.py:
     def _min_indent(self, s):
         "Return the minimum indentation of any non-blank line in `s`"
         indents = [len(indent) for indent in self._INDENT_RE.findall(s)]
         if len(indents) > 0:
             return min(indents)
         else:
             return 0

Note, however, that I only found 3 examples total that used an  
iterable at all (almost all of the cases in the stdlib were of the  
form "min(x,y)").

The other two examples were in timeit and urllib2, but each knew the  
iterables were non-empty because it had explicitly constructed them in  
such a way that they could not be empty.

Jared


From jared.grubb at gmail.com  Fri Apr 17 10:01:57 2009
From: jared.grubb at gmail.com (Jared Grubb)
Date: Fri, 17 Apr 2009 01:01:57 -0700
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <200904171135.34835.steve@pearwood.info>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<2444F7FF-283C-4562-AE46-0F9F9709BB7D@googlemail.com>
	<200904162209.13111.steve@pearwood.info>
	<200904171135.34835.steve@pearwood.info>
Message-ID: <828464C3-17E3-4E07-8DCA-64C17B17224F@gmail.com>

FWIW, I ran Steven's version against "test_min" from test_builtin.py,  
and it passed all those tests.

Jared

On 16 Apr 2009, at 18:35, Steven D'Aprano wrote:
> For the record, in case future generations want to re-visit the
> proposal, here's a pure Python version of min() plus default which I
> have tested in Python 2.6.1 and I think it should match the behaviour
> of the built-in min(). Anyone who wants to use it should feel free to
> do so (credit would be nice but not required).
>
> def is_iterable(obj):
>     try:
>         iter(obj)
>     except TypeError:
>         return False
>     return True
>
> def min(*vars, **kwargs):
>    SENTINEL = object()
>    kw = {'key': None, 'default': SENTINEL}
>    kw.update(kwargs)
>    if len(kw) != 2:
>        raise TypeError('min() got an unexpected key word argument')
>    key = kw['key']
>    default = kw['default']
>    if len(vars) == 1:
>        if is_iterable(vars[0]):
>            vars = iter(vars[0])
>        else:
>            raise TypeError(
>            "'%s' object is not iterable" % type(vars[0]).__name__)
>    else:
>        vars = iter(vars)
>    try:
>        result = vars.next()
>    except StopIteration:
>        if default is SENTINEL:
>            raise ValueError("min() arg is an empty sequence")
>        else:
>            return default
>    compare_result = result if key is None else key(result)
>    for value in vars:
>        compare_value = value if key is None else key(value)
>        if compare_value < compare_result:
>            result = value
>            compare_result = compare_value
>    return result
>
>
>
> -- 
> Steven D'Aprano


From denis.spir at free.fr  Fri Apr 17 11:29:39 2009
From: denis.spir at free.fr (spir)
Date: Fri, 17 Apr 2009 11:29:39 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <91ad5bf80904161406x7e27e491m48e8ea74671fbd65@mail.gmail.com>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
	<91ad5bf80904161406x7e27e491m48e8ea74671fbd65@mail.gmail.com>
Message-ID: <20090417112939.1d7b8031@o>

Le Thu, 16 Apr 2009 17:06:17 -0400,
George Sakkis <george.sakkis at gmail.com> s'exprima ainsi:

> As an aside, it would be nice If min/max start raising a more narrow
> ValueError subtype, say EmptyIterableError, so that hacks such as
> checking the exception message are not necessary.

+1

------
la vita e estrany


From denis.spir at free.fr  Fri Apr 17 11:36:03 2009
From: denis.spir at free.fr (spir)
Date: Fri, 17 Apr 2009 11:36:03 +0200
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <A571D09BBD22489CB4015BEA67E8AD94@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
	<91ad5bf80904161314q5456910dp8cf07084024e54eb@mail.gmail.com>
	<A571D09BBD22489CB4015BEA67E8AD94@RaymondLaptop1>
Message-ID: <20090417113603.35e6f1e3@o>

Le Thu, 16 Apr 2009 16:07:10 -0700,
"Raymond Hettinger" <python at rcn.com> s'exprima ainsi:

> > As an aside, it would be nice if min/max start raising a more narrow
> > ValueError subtype, say EmptyIterableError, so that hacks such as
> > checking the exception message are not necessary.
> 
> I would support that proposal if it would end this effort to
> complexify min/max.  

This would be a satisfying solution for me:

try:
    result = min(iterable)
except EmptyIterableError:
    result = default

Rather pythonic, I guess; and much simpler and easier to implement.
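For reference, the same fallback pattern already works today by catching the bare ValueError; only the narrower EmptyIterableError name is hypothetical:

```python
def min_with_default(iterable, default):
    """min() with a fallback for empty iterables.

    Today min() raises a plain ValueError when the iterable is empty;
    under the proposal, the except clause would name EmptyIterableError
    instead.
    """
    try:
        return min(iterable)
    except ValueError:
        return default

print(min_with_default([], 0))         # 0
print(min_with_default([3, 1, 2], 0))  # 1
```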

Denis
------
la vita e estrany


From jh at improva.dk  Fri Apr 17 14:25:32 2009
From: jh at improva.dk (Jacob Holm)
Date: Fri, 17 Apr 2009 14:25:32 +0200
Subject: [Python-ideas] Revised**11 PEP on Yield-From
In-Reply-To: <49E80BF8.4090803@canterbury.ac.nz>
References: <49E80BF8.4090803@canterbury.ac.nz>
Message-ID: <49E8753C.7010908@improva.dk>

Greg Ewing wrote:
> Draft 12 of the PEP.
>
> Fixed a bug in the expansion (didn't handle
> StopIteration raised by throw).
>

Just so you know, I now agree that a long expansion with multiple 
"try...except StopIteration" blocks is the right thing to do.  There are 
only two cases I can see where it makes a difference compared to what I 
suggested:

   1. Throwing StopIteration to an iterator without a throw() method.  I
      would prefer treating this case *exactly* as if the iterator had a
      trivial throw method:  "def throw(self, et, ev=None, tb=None):
      raise et, ev, tb".  Meaning that the StopIteration *should* be
      caught by yield-from.  Treating it like this makes it easier to
      write wrappers that don't accidentally change the semantics of an
      obscure corner case.  Principle of least surprise and all that... 
      It is easy enough to change the expansion to do this by expanding
      the try block around the throw() call or by actually using such a
      trivial throw method as a fallback.  Alternatively, the expansion
      can be rewritten using functools.partial as in the bottom of this
      mail.  It has identical semantics to draft 12 of the PEP, except
      for the handling of the missing throw method. I actually like that
      version because it is careful about what exceptions to catch, but
      still only has one "try...except StopIteration". YMMV.
   2. Calling an iterator.close() that raises a StopIteration. 
      Arguably, such a close() is an error, so getting an exception in
      the caller is better than swallowing it and turning it into a
      normal return.  Especially since we only called close() as part of
      handling GeneratorExit in the first place.

An unrelated question...  What should happen with an iterator that has a 
throw or close attribute that just happens to have the value None?  
Should that be treated as an error because None is not callable, or 
should it be treated as if the attribute wasn't there?  The expansion 
handles it as if the attribute wasn't there, but IIRC your patch will 
raise a TypeError trying to call None.

> Removed paragraph about StopIteration left over
> from an earlier version.
>
> Added some discussion about rejected ideas.
>

Looks good, except...

> Suggestion: If ``close()`` is not to return a value, then raise an
> exception if StopIteration with a non-None value occurs.
>
> Resolution: No clear reason to do so. Ignoring a return value is not
> considered an error anywhere else in Python.
>   

I may have been unclear about why I thought this should raise a 
RuntimeError.  As I see it there are only two code patterns in a 
generator that would have close() catch a StopIteration with a non-None 
value.

    * An explicit catch of GeneratorExit followed by "return Value". 
      This is harmless and potentially useful, although probably an
      abuse of GeneratorExit (that was one of the early arguments for
      not returning a value from close).  Not raising a RuntimeError in
      close makes it simpler to share a code path between the common and
      the forced exit.
    * An implicit catch of GeneratorExit, followed by "return Value". 
      By an "implicit catch", I mean either a catch of "BaseException"
      or a "finally" clause.  In both cases, "return Value" will hide
      the original exception and that is almost certainly a bug. 
      Raising a RuntimeError would let you discover this bug early.

The question now is whether it is better to catch n00b errors or to 
allow careful programmers a bit more freedom in how they structure their 
code.  When I started writing this mail I was leaning towards catching 
errors, but I have now changed my mind.  I think giving more power to 
experienced users is more important.

Best regards
- Jacob

------------------------------------------------------------------------

    import sys
    from functools import partial

    _i = iter(EXPR)
    _p = partial(next, _i)
    while 1:
        try:
            _y = _p()
        except StopIteration as _e:
            _r = _e.value
            break
        try:
            _s = yield _y
        except GeneratorExit:
            _m = getattr(_i, 'close', None)
            if _m is not None:
                _m()
            raise
        except:
            _m = getattr(_i, 'throw', None)
            if _m is None:
                def _m(et, ev, tb):
                    raise et, ev, tb
            _p = partial(_m, *sys.exc_info())
        else:
            if _s is None:
                _p = partial(next, _i)
            else:
                _p = partial(_i.send, _s)
    RESULT = _r




From aahz at pythoncraft.com  Fri Apr 17 16:01:44 2009
From: aahz at pythoncraft.com (Aahz)
Date: Fri, 17 Apr 2009 07:01:44 -0700
Subject: [Python-ideas] Revised**11 PEP on Yield-From
In-Reply-To: <49E80BF8.4090803@canterbury.ac.nz>
References: <49E80BF8.4090803@canterbury.ac.nz>
Message-ID: <20090417140144.GC17462@panix.com>

Looks good.  I can almost follow it!  IIRC, there were some suggestions
for motivating examples posted in the thread, but I don't see any of
them either in the PEP itself or at your URL with examples.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"If you think it's expensive to hire a professional to do the job, wait
until you hire an amateur."  --Red Adair


From george.sakkis at gmail.com  Fri Apr 17 17:11:12 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Fri, 17 Apr 2009 11:11:12 -0400
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <06830ED3DC2F4673B6483222962A829A@RaymondLaptop1>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>
	<2B979181-B4B0-4970-9A3E-2890A0C50F5F@gmail.com>
	<49E70961.5020308@improva.dk>
	<3f4107910904160514n1573dd7am79e179f0d6c47759@mail.gmail.com>
	<20090416194039.39d48ce8@o>
	<7CA8CB84F7E54CFD8F756774363ACB5C@RaymondLaptop1>
	<91ad5bf80904161314q5456910dp8cf07084024e54eb@mail.gmail.com>
	<A571D09BBD22489CB4015BEA67E8AD94@RaymondLaptop1>
	<91ad5bf80904161850i46674ecal2d3a24d4ba50e28@mail.gmail.com>
	<06830ED3DC2F4673B6483222962A829A@RaymondLaptop1>
Message-ID: <91ad5bf80904170811j71e84e5cr638bd548e6d2576c@mail.gmail.com>

On Thu, Apr 16, 2009 at 10:08 PM, Raymond Hettinger <python at rcn.com> wrote:

> I did take time to scan the standard library and my own code
> base for examples.  They are *very* hard to come by.
> I couldn't even contrive an example for max().
> The one use case presented so far is more cleanly coded
> with all() than with min().  The example isn't even remotely
> compelling.

Are you looking specifically for examples with non-sequence iterables?
As I mentioned, I would prefer "min(s, default=0)" over "min(s) if s
else 0" if the former were available, as it's more future-proof.

> The code with a min default is unattractive
> by comparison:
>
>   if min((f(x) for x in iterable if x>0), default=0) > 0:  ...
>
> It is much better with:
>
>   if all(f(x)>0 for x in iterable if x>0):  ...

Indeed, all() looks better in this case; maybe I was using the min
value later and did it this way. I'll search for actual examples, as I
recalled this one from memory.

> Besides use cases, my main objection is that it is poor design
> to overload too many features in one function.  If the existing
> signature for min() were just min(it), then it might not be an
> issue.  But the confluence of the multi-arg form, the exceptional
> zero-arg case, weirdness with *args, and the key= function
> make for little room to add new features.

That's a valid point, although the main mental burden lies in the
multi-arg vs iterable single-arg peculiarity; optional keyword
arguments scale better in general. As another aside, I'm curious about
the relative frequency of the single-arg vs the multi-arg form. I
personally use the former much more than the multi-arg form, and for
those it's almost always two args and very rarely three or more. Given
that min(x, y) can now be written as an equivalent if/else expression,
I'm wondering if the min/max signature would be the way it is if it
were proposed now for the first time.

> There is also the problem of confusing a default-argument
> with an initial start value.  When reading through code, my eyes
> would have a hard time accepting that min(it, default=0) could
> return something greater than zero.

Interesting; it took me a few seconds to see why you might read it
otherwise, since it's "obvious" to me what it means. I had never
thought of the initial-value semantics with respect to min/max before
this thread came up.

> Did you look at the itertool alternative?  To me, it seems
> to cleanly separate concerns (computing the min/max
> from providing an iterator with a default value):
>
>   min(default(iterable, 0))

Yes, that's neat, and default() may be a good candidate for addition
to itertools.

> I sense that you're going to try to force this through but wish

No, not at all. I'm not the OP and, as I said, I have accepted that I
should think of min/max more like division than like a missing
key/attribute. One can't blindly use "min(x)" without considering the
case where x is empty, any more than one can use 1/x without
considering the case where x is zero. It's slightly inconvenient if all
you have is a bunch of non-negative numbers and the "obvious" default
is zero, but generally explicit is better than implicit.

George


From jh at improva.dk  Fri Apr 17 18:15:32 2009
From: jh at improva.dk (Jacob Holm)
Date: Fri, 17 Apr 2009 18:15:32 +0200
Subject: [Python-ideas] Revised**11 PEP on Yield-From
In-Reply-To: <49E80BF8.4090803@canterbury.ac.nz>
References: <49E80BF8.4090803@canterbury.ac.nz>
Message-ID: <49E8AB24.9000402@improva.dk>

Trying again, as the last version was mangled. (Thanks to Aahz for 
pointing that out).  I hope this is better...


Greg Ewing wrote:
 > Draft 12 of the PEP.
 >
 > Fixed a bug in the expansion (didn't handle StopIteration raised by
 > throw).
 >

Just so you know, I now agree that a long expansion with multiple
"try...except StopIteration" blocks is the right thing to do.  There
are only two cases I can see where it makes a difference compared to
what I suggested:

   1. Throwing StopIteration to an iterator without a throw() method.

      I would prefer treating this case *exactly* as if the iterator
      had a trivial throw method:

         def throw(self, et, ev=None, tb=None):
             raise et, ev, tb

      In other words, I think the StopIteration *should* be caught by
      yield-from.

      Treating it like this makes it easier to write wrappers that
      don't accidentally change the semantics of an obscure corner
      case.  Principle of least surprise and all that...

      It is easy enough to change the expansion to do this by expanding
      the try block around the throw() call or by actually using such a
      trivial throw method as a fallback.  Alternatively, the expansion
      can be rewritten using functools.partial as in the bottom of this
      mail.  It has identical semantics to draft 12 of the PEP, except
      for the handling of the missing throw method. I actually like
      that version because it is careful about what exceptions to
      catch, but still only has one "try...except StopIteration". YMMV.

   2. Calling an iterator.close() that raises a StopIteration.

      Arguably, such a close() is an error, so getting an exception in
      the caller is better than swallowing it and turning it into a
      normal return.  Especially since we only called close() as part
      of handling GeneratorExit in the first place.

An unrelated question...  What should happen with an iterator that has
a throw or close attribute that just happens to have the value None?
Should that be treated as an error because None is not callable, or
should it be treated as if the attribute wasn't there?  The expansion
handles it as if the attribute wasn't there, but IIRC your patch will
raise a TypeError trying to call None.

 > Removed paragraph about StopIteration left over from an earlier
 > version.
 >
 > Added some discussion about rejected ideas.
 >

Looks good, except...

 > Suggestion: If ``close()`` is not to return a value, then raise an
 > exception if StopIteration with a non-None value occurs.
 >
 > Resolution: No clear reason to do so. Ignoring a return value is not
 > considered an error anywhere else in Python.
 >

I may have been unclear about why I thought this should raise a
RuntimeError.  As I see it there are only two code patterns in a
generator that would have close() catch a StopIteration with a non-None
value.

    * An explicit catch of GeneratorExit followed by "return Value".

      This is harmless and potentially useful, although probably an
      abuse of GeneratorExit (that was one of the early arguments for
      not returning a value from close).  Not raising a RuntimeError in
      close makes it simpler to share a code path between the common
      and the forced exit.

    * An implicit catch of GeneratorExit, followed by "return Value".

      By an "implicit catch", I mean either a catch of "BaseException"
      or a "finally" clause.  In both cases, "return Value" will hide
      the original exception and that is almost certainly a bug.
      Raising a RuntimeError would let you discover this bug early.
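A concrete sketch of the second pattern: the return inside the finally clause silently replaces the in-flight GeneratorExit, which is exactly the bug described above:

```python
def hides_exit():
    try:
        yield 1
    finally:
        # This return swallows the pending GeneratorExit -- almost
        # certainly a bug, and the case a RuntimeError would catch early.
        return 'oops'

g = hides_exit()
next(g)
g.close()   # completes silently; the forced exit was hidden by the return
```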

The question now is whether it is better to catch n00b errors or to
allow careful programmers a bit more freedom in how they structure
their code.  When I started writing this mail I was leaning towards
catching errors, but I have now changed my mind.  I think giving more
power to experienced users is more important.

Best regards
- Jacob

------------------------------------------------------------------------

    import sys
    from functools import partial

    _i = iter(EXPR)
    _p = partial(next, _i)
    while 1:
        try:
            _y = _p()
        except StopIteration as _e:
            _r = _e.value
            break
        try:
            _s = yield _y
        except GeneratorExit:
            _m = getattr(_i, 'close', None)
            if _m is not None:
                _m()
            raise
        except:
            _m = getattr(_i, 'throw', None)
            if _m is None:
                def _m(et, ev, tb):
                    raise et, ev, tb
            _p = partial(_m, *sys.exc_info())
        else:
            if _s is None:
                _p = partial(next, _i)
            else:
                _p = partial(_i.send, _s)
    RESULT = _r






From fuzzyman at gmail.com  Fri Apr 17 23:00:55 2009
From: fuzzyman at gmail.com (Michael Foord)
Date: Fri, 17 Apr 2009 22:00:55 +0100
Subject: [Python-ideas] Collect **kw arguments as an ordered dictionary
Message-ID: <6f4025010904171400l69ae8799mad5f88f251d03e9e@mail.gmail.com>

Hello all,

It would be nice if **kw arguments collected in a function / method
signature were collected as an ordered dictionary rather than an
ordinary dictionary.

Main use case: currently (I believe) the odict constructor doesn't
guarantee to preserve ordering if created with keyword arguments:

    odict(key=value, key2=value, key3=value)

My use case: I'd like to preserve the ordering so that the Mock module's
representation of calls reproduces exactly the order in which arguments
were passed. Because the order of iterating over keyword arguments isn't
the same as the order they were passed in, representations can't be
compared reliably in tests across Python versions / implementations.

All the best,

Michael Foord

-- 
http://www.ironpythoninaction.com/

From greg.ewing at canterbury.ac.nz  Fri Apr 17 23:26:00 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 18 Apr 2009 09:26:00 +1200
Subject: [Python-ideas] Revised**11 PEP on Yield-From
In-Reply-To: <49E8753C.7010908@improva.dk>
References: <49E80BF8.4090803@canterbury.ac.nz> <49E8753C.7010908@improva.dk>
Message-ID: <49E8F3E8.8090108@canterbury.ac.nz>

Jacob Holm wrote:

>   1. Throwing StopIteration to an iterator without a throw() method.

Guido seems happy not to care what happens if you
throw StopIteration in, so I'm happy to do so as
well -- it saves considerable complication.

>   2. Calling an iterator.close() that raises a StopIteration.
>      Arguably, such a close() is an error, so getting an exception in
>      the caller is better than swallowing it

There are plenty of other ways to get strange
results by raising StopIteration in places where
you shouldn't, so I'm not worried about this either.

> What should happen with an iterator that has a
> throw or close attribute that just happens to have the value None?
> The expansion handles it as if the attribute wasn't there

That's a good point -- I hadn't intended that.

>    * An implicit catch of GeneratorExit, followed by "return Value". 
>      By an "implicit catch", I mean either a catch of "BaseException"
>      or a "finally" clause.  In both cases, "return Value" will hide
>      the original exception and that is almost certainly a bug.

Doing either of those things *anywhere* is likely to hide
a bug. I don't see a strong reason to single out this
particular case and try to detect it.

> I have now changed my mind.  I think giving more power to
> experienced users is more important.

My reasoning is more along the lines that it's not worth
the bother of trying to detect this error, even if it's
an error at all, which isn't entirely certain.

-- 
Greg


From python at rcn.com  Sat Apr 18 00:09:51 2009
From: python at rcn.com (Raymond Hettinger)
Date: Fri, 17 Apr 2009 15:09:51 -0700
Subject: [Python-ideas] Collect **kw arguments as an ordered dictionary
References: <6f4025010904171400l69ae8799mad5f88f251d03e9e@mail.gmail.com>
Message-ID: <8BC265BD80374CE38D09B3F018DAF68C@RaymondLaptop1>


[Michael Foord]
> It would be nice if **kw arguments collected in a function / method signature
> were collected as an ordered dictionary rather  than ordinary dictionary.

I think that would be nice, but until ordered dicts get a C implementation,
it might greatly impair performance.


> Main use case, currently (I believe) the odict constructor doesn't guarantee 
> to preserve ordering if created with keyword arguments: 

That is correct.  And so noted in the docs:

     http://docs.python.org/dev/library/collections.html#ordereddict-objects


> My use case - I'd like to preserve the ordering to reproduce exactly the 
> order of arguments for the Mock module representation of the objects 
> are used. Because the order of iterating over arguments isn't the same 
> as they are passed in representations can't be compared in tests 
> reliably across Python versions / implementations.

Sounds reasonable.


Raymond

From jh at improva.dk  Sat Apr 18 00:39:21 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 18 Apr 2009 00:39:21 +0200
Subject: [Python-ideas] Revised**11 PEP on Yield-From
In-Reply-To: <49E8F3E8.8090108@canterbury.ac.nz>
References: <49E80BF8.4090803@canterbury.ac.nz> <49E8753C.7010908@improva.dk>
	<49E8F3E8.8090108@canterbury.ac.nz>
Message-ID: <49E90519.1030808@improva.dk>

Greg Ewing wrote:
> Jacob Holm wrote:
> 
>>   1. Throwing StopIteration to an iterator without a throw() method.
> 
> Guido seems happy not to care what happens if you
> throw StopIteration in, so I'm happy to do so as
> well -- it saves considerable complication.

I was about to reiterate how treating a missing throw like I suggested 
would make my wrappers simpler, but realized that it actually makes very 
little difference.

> 
>  > What should happen with an iterator that has a
>> throw or close attribute that just happens to have the value None?
>  > The expansion handles it as if the attribute wasn't there
> 
> That's a good point -- I hadn't intended that.

Ok.

Note that treating None as missing might actually be useful by making it 
easier to "hide" the throw() or close() method of a base class from 
yield-from.  IIRC there is a precedent for treating None this way in the 
handling of hash().

Even without handling None this way it is still possible to hide the 
base class methods by creating a property that raises AttributeError, so 
this is not all that important.

I just think that using None is slightly cleaner for this use.
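A sketch of the property trick in modern Python 3 spelling (the class name is invented for illustration). Because getattr(obj, 'throw', None) treats a raised AttributeError as "missing", the property hides the method from yield-from's lookup:

```python
class HideThrow:
    """Iterator wrapper that hides the underlying iterator's throw()."""
    def __init__(self, iterable):
        self._it = iter(iterable)

    def __iter__(self):
        return self

    def __next__(self):
        return next(self._it)

    @property
    def throw(self):
        # Raising AttributeError makes hasattr()/getattr() treat the
        # method as absent, so yield-from falls back to re-raising.
        raise AttributeError('throw')

w = HideThrow([1, 2, 3])
print(next(w))            # 1
print(hasattr(w, 'throw'))  # False: the property raises AttributeError
```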


Cheers
- Jacob


From greg.ewing at canterbury.ac.nz  Sat Apr 18 03:15:10 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 18 Apr 2009 13:15:10 +1200
Subject: [Python-ideas] Revised**11 PEP on Yield-From
In-Reply-To: <49E90519.1030808@improva.dk>
References: <49E80BF8.4090803@canterbury.ac.nz> <49E8753C.7010908@improva.dk>
	<49E8F3E8.8090108@canterbury.ac.nz> <49E90519.1030808@improva.dk>
Message-ID: <49E9299E.6050606@canterbury.ac.nz>

Jacob Holm wrote:

> Note that treating None as missing might actually be useful by making it 
> easier to "hide" the throw() or close() method of a base class from 
> yield-from.  IIRC there is a precedent for treating None this way in the 
> handling of hash().

In the case of hash() there's a specific reason for wanting
to be able to hide the base method. You'd have to justify
wanting to be able to do the same for send() and throw()
in particular.

-- 
Greg


From jh at improva.dk  Sat Apr 18 03:44:13 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 18 Apr 2009 03:44:13 +0200
Subject: [Python-ideas] Revised**11 PEP on Yield-From
In-Reply-To: <49E9299E.6050606@canterbury.ac.nz>
References: <49E80BF8.4090803@canterbury.ac.nz> <49E8753C.7010908@improva.dk>
	<49E8F3E8.8090108@canterbury.ac.nz> <49E90519.1030808@improva.dk>
	<49E9299E.6050606@canterbury.ac.nz>
Message-ID: <49E9306D.7010706@improva.dk>

Greg Ewing wrote:
> Jacob Holm wrote:
> 
>> Note that treating None as missing might actually be useful by making 
>> it easier to "hide" the throw() or close() method of a base class from 
>> yield-from.  IIRC there is a precedent for treating None this way in 
>> the handling of hash().
> 
> In the case of hash() there's a specific reason for wanting
> to be able to hide the base method. You'd have to justify
> wanting to be able to do the same for send() and throw()
> in particular.
> 

I don't care much either way.  Providing a property that raises 
AttributeError is almost as easy as using None if you really need it.

- Jacob


From castironpi at gmail.com  Sat Apr 18 11:56:07 2009
From: castironpi at gmail.com (Aaron Brady)
Date: Sat, 18 Apr 2009 02:56:07 -0700 (PDT)
Subject: [Python-ideas] python-like garbage collector & workaround
In-Reply-To: <59a221a0903301736y40a348b5ib18b938643fad98b@mail.gmail.com>
References: <49CEA57B.5070004@canterbury.ac.nz>
	<318523839.906611238334142946.JavaMail.root@sz0050a.emeryville.ca.mail.comcast.net>
	<59a221a0903301736y40a348b5ib18b938643fad98b@mail.gmail.com>
Message-ID: <3bdf40ad-ab75-454d-bff9-ab2aa54d6170@b16g2000yqb.googlegroups.com>

On Mar 30, 7:36 pm, Jan Kanis <jan.ka... at phil.uu.nl> wrote:
> 2009/3/29 <castironpi... at comcast.net>:
>
> > ----- Original Message -----
> > From: "Greg Ewing" <greg.ew... at canterbury.ac.nz>
> > To: castironpi... at comcast.net
> > Cc: Python-id... at python.org
> > Sent: Saturday, March 28, 2009 5:32:27 PM GMT -06:00 US/Canada Central
> > Subject: Re: [Python-ideas] python-like garbage collector & workaround
>
> > castironpi... at comcast.net wrote:
>
> >> I'm considering a workaround that performs GC in two steps.  First, it
> >> requests the objects to drop their references that participate in the
> >> cycle.  Then, it enqueues the decref'ed object for an unnested
> >> destruction.
>
> Castironpi,
> I don't think your solution solves the problem. In a single-stage
> finalization design, it is always possible to call the destructors of
> the objects in the cycle in random order. The problem is that now when
> A gets finalized, it cannot use its reference to B anymore because B
> may have already been finalized, and thus we cannot assume B can still
> be used for anything useful. The problem, of course, is that one of A
> or B may still need the other during its finalization.
>
> In your solution, the real question is what the state of an object is
> supposed to be when it is in between the two stages of finalization.
> Is it still supposed to be a fully functional object, that handles all
> operations just as if it were still fully alive? In that case the
> object can only drop the references that it doesn't actually need to
> perform any of its operations (not just finalization). But if we
> assume that an object has all its references for a reason, there is
> nothing it can drop. (except if it uses a reference for caching or
> similar things. But I think that is only a minority of all use cases.)
> If you propose an object counts as 'finalized' (or at least, no longer
> fully functional) when it is in between stages of finalization, we
> have the same problem as in the single-stage random-order
> finalization: other objects that refer to it can no longer use it for
> anything useful.
>
> The only option that is left is to have the object be in some
> in-between state. But that really complicates Python's object model,
> because every object now has two visible states: alive and
> about-to-die. So every object that wants to support this form of
> finalization has to specify what kind of operations are still
> available in its about-to-die state, and all destructors of all
> objects need to restrict themselves to only these kinds of operations.
> And then, of course, there is still the question of what to do if
> there are still cycles left after the first stage.
>
> If you still think your proposal is useful, you'll probably need to
> explain why these problems don't matter enough or whether there are
> important use cases that it solves.
> _______________________________________________
> Python-ideas mailing list
> Python-id... at python.org
> http://mail.python.org/mailman/listinfo/python-ideas

Hi, sorry for the delay.  Not to beat a dead horse, but I've tried
c-l-py for the technical discussion.

For user-defined types, the only option is currently to scan the
'gc.garbage' collection.  I want to provide more options for the users
of my system.

I was thinking about implementing weak-refs, so long as it's worth
it.  That's one approach that I understand doesn't have the same
problem.  However, that risks losing an object in the middle of
execution, not just when both have left reachability.

Another option is to add an 'is reachable' system call.  Weak-refs
would be akin to a 'still exists' call.

class StarCrossedLover:
    def __init__(self, other=None):
        self.other = other
    def __del__(self):
        kill(self.other)

romeo = StarCrossedLover()
juliet = StarCrossedLover(romeo)
romeo.other = juliet   # completes the reference cycle

In this example, I think weak-refs would be most accurate.  However,
that litters the entire class code with weak-ref testing.  Under my
model, self.other may or may not be valid in the destructor.  With
weak-refs, self.other may or may not be valid anywhere at all!
However, they are consistent and __del__ methods can always get
called.  Should I add them?

Thanks in advance.


From ncoghlan at gmail.com  Sat Apr 18 13:55:00 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 18 Apr 2009 21:55:00 +1000
Subject: [Python-ideas] Revised**10 PEP on Yield-From
In-Reply-To: <49E5BCC0.3060907@improva.dk>
References: <49E58076.4060202@canterbury.ac.nz> <49E5BCC0.3060907@improva.dk>
Message-ID: <49E9BF94.3050601@gmail.com>

Jacob Holm wrote:
>    * The "what should close() do if it catches StopIteration with a
>      value" issue I don't think we have resolved either way.  Since we
>      are not going to store the value, only the first close() would be
>      able to return it.  Under those conditions, I no longer think that
>      returning the value is a good idea.  If we are not storing or
>      returning the value, I think close() should raise an exception.
>      Either reraise the StopIteration, so that the caller has a chance
>      to get the value that way, or raise a RuntimeError, because it is
>      meaningless to return a value as response to a GeneratorExit when
>      that value cannot later be accessed by anything and it is
>      therefore most likely a bug.

If you care about a generator's return value *don't* just call "close()"
on it. Use "throw(GeneratorExit)" instead so you can catch the exception
and interrogate the return value yourself.
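Under the semantics being discussed (and as it works in today's Python 3), Nick's suggestion looks like this:

```python
def worker():
    try:
        yield 1
    except GeneratorExit:
        # Return a value in response to the forced exit.
        return 42

g = worker()
next(g)
try:
    g.throw(GeneratorExit)
except StopIteration as e:
    # The generator finished, so throw() raises StopIteration
    # carrying the return value.
    print(e.value)   # 42
```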

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Sat Apr 18 14:17:35 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 18 Apr 2009 22:17:35 +1000
Subject: [Python-ideas] 'default' keyword argument for max(), min()
In-Reply-To: <20090415212405.04886939@o>
References: <6AD4057E-7F0E-4282-A680-A603417086F5@atlas.st>	<0A5FAA88DBA0449591EA6C696C9EBDDE@RaymondLaptop1>
	<20090415212405.04886939@o>
Message-ID: <49E9C4DF.7090305@gmail.com>

spir wrote:
> The issue as I see it is related to the fact that python does not
> allow optional arguments without default values -- which in most
> cases is not problematic. But here I would like a hypothetical
> min(s, optional default) or min(s, ?default)

Actually, you can have true optional arguments in Python. They're
especially easy to do for functions written in C, but there are a
couple of common tricks for writing them in pure Python as well:

1. Extract the optional keyword-only argument manually (this most
closely mimics the approach used for optional arguments in C code, and
is also the only way to get keyword-only arguments in Python 2.x)

  def f(**kwds):
    try:
      arg = kwds.pop("optional")
      have_arg = True
    except KeyError:
      have_arg = False
    if kwds:
      raise TypeError("Unexpected keyword arguments")
    return have_arg

>>> f()
False
>>> f(optional=None)
True
>>> f(fred=1)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 8, in f
TypeError: Unexpected keyword arguments
>>> f(1)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: f() takes exactly 0 arguments (1 given)

The downside of that approach is that you have to check for unexpected
keyword arguments yourself, which leads directly to the second approach.

2. Use a custom sentinel value to indicate a missing keyword-only
argument (this only works in Python 3.x where keyword-only parameter
syntax is available)

MISSING = object()
def f(*, optional=MISSING):
    return optional is not MISSING

>>> f()
False
>>> f(optional=1)
True
>>> f(1)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: f() takes exactly 0 positional arguments (1 given)
>>> f(fred=1)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: f() got an unexpected keyword argument 'fred'

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Sat Apr 18 14:20:46 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 18 Apr 2009 14:20:46 +0200
Subject: [Python-ideas] Revised**10 PEP on Yield-From
In-Reply-To: <49E9BF94.3050601@gmail.com>
References: <49E58076.4060202@canterbury.ac.nz> <49E5BCC0.3060907@improva.dk>
	<49E9BF94.3050601@gmail.com>
Message-ID: <49E9C59E.4010306@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>>    * The "what should close() do if it catches StopIteration with a
>>      value" issue I don't think we have resolved either way.  Since we
>>      are not going to store the value, only the first close() would be
>>      able to return it.  Under those conditions, I no longer think that
>>      returning the value is a good idea.  If we are not storing or
>>      returning the value, I think close() should raise an exception.
>>      Either reraise the StopIteration, so that the caller has a chance
>>      to get the value that way, or raise a RuntimeError, because it is
>>      meaningless to return a value as response to a GeneratorExit when
>>      that value cannot later be accessed by anything and it is
>>      therefore most likely a bug.
> 
> If you care about a generator's return value *don't* just call "close()"
> on it. Use "throw(GeneratorExit)" instead so you can catch the exception
> and interrogate the return value yourself.
> 

I already know I can do that.  My point was that if close is defined to 
not return the value, then returning a value in response to a 
GeneratorExit is suspect and likely to be a bug in the generator.

I have since realised that there can be valid cases where being able to 
share the exit code path between the normal and GeneratorExit cases 
simplifies things.  Also, Greg has convinced me that the type of error I 
was worried about is not specific to generators, so there is no 
particular reason to do the extra work of detecting them here.

So I now agree we can swallow the return value without raising an exception.

Cheers
- Jacob


From facundobatista at gmail.com  Sat Apr 18 14:21:45 2009
From: facundobatista at gmail.com (Facundo Batista)
Date: Sat, 18 Apr 2009 09:21:45 -0300
Subject: [Python-ideas] Heap data type
Message-ID: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>

Using the module heapq, I found that it's not easy to use. Or, at
least, that it could be more straightforward to use.

My main issue is that for correct usage:

- the user should provide an external list, which shouldn't be used for
anything else, so as not to break the invariant

- to alter the heap queue, a bunch of functions must be used, always
passing the external list


I think that changing the "external list" into an "internal attribute",
and the "bunch of functions" into "methods", would make the module
easier to use, safer, and more object oriented.

So, I basically coded it. What do you think about including this class
in the heap module?

"""
import heapq

class Heap(object):
    '''Maintains a heapified list, always conserving the invariant.

    Heaps are arrays for which heap[k] <= heap[2*k+1] and
    heap[k] <= heap[2*k+2] for all k, counting elements from zero.
    '''

    def __init__(self, iterable=None):
        '''Initializes the heap from any iterable.

        >>> Heap([1, 2])
        Heap([1, 2])
        >>> Heap([])
        Heap([])
        >>> Heap()
        Heap([])
        >>> Heap((1, 2, 3))
        Heap([1, 2, 3])
        >>> Heap(x**2 for x in range(3))
        Heap([0, 1, 4])
        '''
        if iterable is None:
            self._queue = []
        else:
            self._queue = list(iterable)
            heapq.heapify(self._queue)

    def __repr__(self):
        return "Heap(%s)" % self._queue

    def push(self, item):
        '''Push the item to the heap queue.

        >>> h = Heap()
        >>> h.push(5)
        >>> h.push(4)
        >>> h.push(3)
        >>> h.push(2)
        >>> h.push(1)
        >>> h.push(0)
        >>> h
        Heap([0, 2, 1, 5, 3, 4])
        '''
        heapq.heappush(self._queue, item)

    def pop(self):
        '''Pop one item from the heap queue.

        >>> h = Heap([0, 2, 1, 5, 3, 4])
        >>> h.pop()
        0
        >>> h.pop()
        1
        >>> h.pop()
        2
        >>> h.pop()
        3
        >>> h.pop()
        4
        >>> h.pop()
        5
        >>> h
        Heap([])
        >>> h.pop()
        Traceback (most recent call last):
        ...
        IndexError: index out of range
        '''
        return heapq.heappop(self._queue)

    def pushpop(self, item):
        '''Push the item and pop one from the heap queue.

        Note that this method is more efficient than calling both
        push() and pop() separately.

        >>> h = Heap()
        >>> h.pushpop(7)
        7
        >>> h.push(5)
        >>> h.push(4)
        >>> h.push(3)
        >>> h.pushpop(7)
        3
        >>> h.pushpop(7)
        4
        >>> h
        Heap([5, 7, 7])
        '''
        return heapq.heappushpop(self._queue, item)

    def replace(self, item):
        '''Pop one item and push the received one into the heap queue

        Note that this method is more efficient than calling both
        pop() and push() separately.

        >>> h = Heap()
        >>> h.replace(3)
        Traceback (most recent call last):
        ...
        IndexError: index out of range
        >>> h.push(7)
        >>> h.replace(2)
        7
        >>> h
        Heap([2])
        >>> h.push(3)
        >>> h
        Heap([2, 3])
        >>> h.replace(1)
        2
        >>> h.replace(9)
        1
        >>> h
        Heap([3, 9])
        '''
        return heapq.heapreplace(self._queue, item)
"""

Regards,

-- 
.    Facundo

Blog: http://www.taniquetil.com.ar/plog/
PyAr: http://www.python.org/ar/


From aahz at pythoncraft.com  Sat Apr 18 14:43:58 2009
From: aahz at pythoncraft.com (Aahz)
Date: Sat, 18 Apr 2009 05:43:58 -0700
Subject: [Python-ideas] Heap data type
In-Reply-To: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
Message-ID: <20090418124357.GA8506@panix.com>

On Sat, Apr 18, 2009, Facundo Batista wrote:
>
> I think that changing "external list" for "internal attribute", and
> "bunch of functions " for "methods", it will leave the module easier
> to use, safer, and more object oriented.

+1 -- I recently did a short presentation on Big O notation and Python
containers, and heapq looked remarkably ugly.  
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"If you think it's expensive to hire a professional to do the job, wait
until you hire an amateur."  --Red Adair


From denis.spir at free.fr  Sat Apr 18 14:44:09 2009
From: denis.spir at free.fr (spir)
Date: Sat, 18 Apr 2009 14:44:09 +0200
Subject: [Python-ideas] Collect **kw arguments as an ordered dictionary
In-Reply-To: <8BC265BD80374CE38D09B3F018DAF68C@RaymondLaptop1>
References: <6f4025010904171400l69ae8799mad5f88f251d03e9e@mail.gmail.com>
	<8BC265BD80374CE38D09B3F018DAF68C@RaymondLaptop1>
Message-ID: <20090418144409.3efa13af@o>

Le Fri, 17 Apr 2009 15:09:51 -0700,
"Raymond Hettinger" <python at rcn.com> s'exprima ainsi:

> 
> [Michael Foord]
> > It would be nice if **kw arguments collected in a function / method
> > signature were collected as an ordered dictionary rather  than ordinary
> > dictionary.
> 
> I think that would be nice, but until ordered dicts get a C implementation,
> it might greatly impair performance.

What is/are the main issue(s) about that?

> > Main use case, currently (I believe) the odict constructor doesn't
> > guarantee to preserve ordering if created with keyword arguments: 
> 
> That is correct.  And so noted in the docs:
> 
>      http://docs.python.org/dev/library/collections.html#ordereddict-objects

More generally, no custom function/constructor can rely on the order of **kwargs. Any order-preserving application has to require lists of (name, value) pairs instead of the (much nicer, imo) kwarg syntax. This is especially unfortunate for custom types.
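
For reference, the pair-list workaround looks like this (a sketch using collections.OrderedDict, which is new in 2.7/3.1; plain dicts, and hence **kwargs, give no ordering guarantee here):

```python
from collections import OrderedDict

# Passing explicit (name, value) pairs preserves order,
# unlike keyword arguments collected into a plain dict.
d = OrderedDict([("first", 1), ("second", 2), ("third", 3)])
print(list(d))  # ['first', 'second', 'third']
```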

> > My use case - I'd like to preserve the ordering to reproduce exactly the 
> > order of arguments for the Mock module representation of the objects 
> > are used. Because the order of iterating over arguments isn't the same 
> > as they are passed in representations can't be compared in tests 
> > reliably across Python versions / implementations.
> 
> Sounds reasonable.

If only for printed feedback -- first of all to the developer themselves -- I think this is a sensible feature.

> Raymond

==================================
OT:
I take the opportunity to talk about something related. This is a data type that would be a kind of list/dict mix, maybe more, but basically ordered. Just discovered that Lua's table (http://lua-users.org/wiki/TablesTutorial) is this kind of thing; but iteration doesn't preserve order (background hash like for python dicts) for non-index keys.

Having very little CS knowledge, I wonder what kind of underlying data structure would easily allow this. As of now, I'm thinking of the following:
* Keys could be anything "stringifiable", serializable or representable as an array of ints.
* An issue is avoiding conflict between keys of != types, eg 1 vs '1'.
* This would allow using trie algorithms (http://en.wikipedia.org/wiki/Trie), especially for item insertion/deletion/update/lookup. Even better may be a Patricia or a de la Briandais tree. Iteration is not an issue precisely because order is kept: it's walking the tree leaves.
* Time efficiency is not my primary concern -- I just don't want it to be stupidly slow ;-)
* I'm also interested in alternatives to implement basically ordered dicts, without any standard python dict in the background. And in (name:value) ordered collections as for namespace registers, or kwarg lists (cf Michael's need above).

Anyone interested in cooperating?
[Aside from Python itself, my motivation is rather the fun of exploration, and a possible toy custom language I have in mind.]

Denis
------
la vita e estrany


From ncoghlan at gmail.com  Sat Apr 18 14:45:58 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 18 Apr 2009 22:45:58 +1000
Subject: [Python-ideas] Revised**10 PEP on Yield-From
In-Reply-To: <49E9C59E.4010306@improva.dk>
References: <49E58076.4060202@canterbury.ac.nz> <49E5BCC0.3060907@improva.dk>
	<49E9BF94.3050601@gmail.com> <49E9C59E.4010306@improva.dk>
Message-ID: <49E9CB86.8090303@gmail.com>

Jacob Holm wrote:
> So I now agree we can swallow the return value without raising an
> exception.

Yeah, I made the mistake of replying before I finished catching up on
the post-holiday email collection - I eventually got to those later
threads and saw you had already changed your mind.

Hopefully Greg can now get hold of Guido and Benjamin in time to get the
PEP over the line for 3.1b1 :)

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From aahz at pythoncraft.com  Sat Apr 18 14:46:37 2009
From: aahz at pythoncraft.com (Aahz)
Date: Sat, 18 Apr 2009 05:46:37 -0700
Subject: [Python-ideas] Collect **kw arguments as an ordered dictionary
In-Reply-To: <20090418144409.3efa13af@o>
References: <6f4025010904171400l69ae8799mad5f88f251d03e9e@mail.gmail.com>
	<8BC265BD80374CE38D09B3F018DAF68C@RaymondLaptop1>
	<20090418144409.3efa13af@o>
Message-ID: <20090418124637.GB8506@panix.com>

On Sat, Apr 18, 2009, spir wrote:
>
> ==================================
> OT:
> I take the opportunity to talk about something related. 

If you're going to change the subject, please make a new post starting a
new thread; python-ideas is archived for future reference, and burying
topics in other threads makes the archive less useful.
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"If you think it's expensive to hire a professional to do the job, wait
until you hire an amateur."  --Red Adair


From jh at improva.dk  Sat Apr 18 15:28:00 2009
From: jh at improva.dk (Jacob Holm)
Date: Sat, 18 Apr 2009 15:28:00 +0200
Subject: [Python-ideas] Heap data type
In-Reply-To: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
Message-ID: <49E9D560.7050308@improva.dk>

Facundo Batista wrote:
> Using the module heapq, I found that it's not easy to use. Or, at
> least, that it could be straightforward to use.
> 
> My main issue is that for correct usage:
> 
> - user should provide an external list, that shouldn't use for
> anything else to don't break the invariant
> 
> - to alter the heap queue, a bunch of functions must be used, passing
> always the external list
> 
> 
> I think that changing "external list" for "internal attribute", and
> "bunch of functions " for "methods", it will leave the module easier
> to use, safer, and more object oriented.
> 
> So, I basically coded it. What do you think about including this class
> in the heap module?
> 

Your "Heap" class implements (part of) an abstract data type usually 
known as a priority queue.

The fact that it uses the heapq module internally is an implementation 
detail that is not (and should not be IMHO) exposed to the users of the 
class.  So if this is added, I think it should rather go in the 
collections module.

Much better (faster for some operations) implementations of this ADT are 
possible, so by not tying the implementation to heapq we would be free 
to switch the implementation later if we want.


+1 to adding a collections.priority_queue, initially based on your class.

+0 to adding your class as heapq.Heap


In either case I think you should add at least the following additional 
method.

   def peek(self):
       return self._queue[0]


- Jacob


From ncoghlan at gmail.com  Sat Apr 18 15:52:49 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sat, 18 Apr 2009 23:52:49 +1000
Subject: [Python-ideas] Needing help to change the grammar
In-Reply-To: <49E2670E.3070705@g.nevcal.com>
References: <thiagoharry.1239415106.squirrel@tern.riseup.net>	<b8e622740904102320na23074fod4317de771d80254@mail.gmail.com>	<thiagoharry.1239563362.squirrel@swift.riseup.net>	<grtj13$ma0$1@ger.gmane.org>	<p04330107c6080d82eb60@[192.168.123.162]>
	<49E2670E.3070705@g.nevcal.com>
Message-ID: <49E9DB31.9020202@gmail.com>

(moving discussion to Python Ideas)

(Context for py-ideas: a teacher in Brazil is working on a Python
language variant that uses Portuguese rather than English-based
keywords. This is intended for use in teaching introductory programming
lessons, not as a professional development tool)

Glenn Linderman wrote:
> import pt_BR
> 
> An implementation along that line, except for things like reversing the
> order of "not" and "is", would allow the next national language
> customization to be done by just recoding the pt_BR module, renaming to
> pt_it or pt_fr or pt_no and translating a bunch of strings, no?
> 
> Probably it would be sufficient to allow for one language at a time, per
> module.

Making that work would actually require something like the file encoding
cookie that is detected at the parsing stage. Otherwise the parser and
compiler would choke on the unexpected keywords long before the
interpreter reached the stage of attempting to import anything.

Adjusting the parser to accept different keyword names would be even
more difficult though, since changing the details of the grammar
definition is a lot more invasive than just changing the encoding of the
file being read.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From stefan_ml at behnel.de  Sat Apr 18 16:01:25 2009
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Sat, 18 Apr 2009 16:01:25 +0200
Subject: [Python-ideas] Heap data type
In-Reply-To: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
Message-ID: <gscmfl$1kt$1@ger.gmane.org>

Facundo Batista wrote:
> Using the module heapq, I found that it's not easy to use. Or, at
> least, that it could be straightforward to use.
> [...]
> So, I basically coded it. What do you think about including this class
> in the heap module?

I did something similar a couple of years ago (and I doubt that I'm the
only one). You should be able to find it in the bug tracker (no longer open).

Stefan



From daniel at stutzbachenterprises.com  Sat Apr 18 16:49:07 2009
From: daniel at stutzbachenterprises.com (Daniel Stutzbach)
Date: Sat, 18 Apr 2009 09:49:07 -0500
Subject: [Python-ideas] Heap data type
In-Reply-To: <20090418124357.GA8506@panix.com>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
	<20090418124357.GA8506@panix.com>
Message-ID: <eae285400904180749t512b6cb0g144053b4004a2f@mail.gmail.com>

On Sat, Apr 18, 2009 at 7:43 AM, Aahz <aahz at pythoncraft.com> wrote:

> +1 -- I recently did a short presentation on Big O notation and Python
> containers, and heapq looked remarkably ugly.
>

Another issue with heapq is that it does not support modifying the priority
of an existing element (usually called the "decrease_key" operation in
textbooks).  The hard part of implementing decrease_key is that somehow the
object in the heap needs to know its current position; otherwise you have to
do an expensive linear search to find it.
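
For the curious, a rough sketch of how such position tracking can work: a
hand-rolled binary min-heap keeping an item -> index map, so decrease_key
runs in O(log n). (Illustrative only; it bypasses heapq entirely and
requires hashable, unique items.)

```python
class PriorityQueue:
    """Binary min-heap that tracks each item's position in self._pos,
    so decrease_key avoids a linear search."""

    def __init__(self):
        self._heap = []   # entries are [priority, item]
        self._pos = {}    # item -> index into self._heap

    def push(self, item, priority):
        self._heap.append([priority, item])
        self._pos[item] = len(self._heap) - 1
        self._sift_up(len(self._heap) - 1)

    def pop(self):
        last = self._heap.pop()
        if self._heap:
            priority, item = self._heap[0]
            self._heap[0] = last
            self._pos[last[1]] = 0
            self._sift_down(0)
        else:
            priority, item = last
        del self._pos[item]
        return item, priority

    def decrease_key(self, item, priority):
        # The position map makes this O(log n): jump straight to the entry.
        i = self._pos[item]
        assert priority <= self._heap[i][0]
        self._heap[i][0] = priority
        self._sift_up(i)

    def _swap(self, i, j):
        h = self._heap
        h[i], h[j] = h[j], h[i]
        self._pos[h[i][1]] = i
        self._pos[h[j][1]] = j

    def _sift_up(self, i):
        while i > 0 and self._heap[i][0] < self._heap[(i - 1) // 2][0]:
            self._swap(i, (i - 1) // 2)
            i = (i - 1) // 2

    def _sift_down(self, i):
        n = len(self._heap)
        while True:
            smallest = i
            for c in (2 * i + 1, 2 * i + 2):
                if c < n and self._heap[c][0] < self._heap[smallest][0]:
                    smallest = c
            if smallest == i:
                break
            self._swap(i, smallest)
            i = smallest

pq = PriorityQueue()
pq.push("a", 5)
pq.push("b", 3)
pq.push("c", 4)
pq.decrease_key("c", 1)
print(pq.pop())  # ('c', 1)
```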

I've implemented a couple of heaps for Python over the years:

1.
http://svn.python.org/view/sandbox/trunk/collections/pairing_heap.py?view=markup&pathrev=40887
(checked into the Sandbox... 5 years ago! I think I also have a C
reimplementation somewhere)

If you don't care about altering the priority, then the interface is more or
less like Facundo described.

To keep track of the positions, insert operations return a wrapper around
each object.  The user needs to keep track of the wrapper and pass it to the
adjust_key() method if they want to change the priority of the object
later.


2. HeapDict.  http://pypi.python.org/pypi/HeapDict

Looks, acts, and quacks like a dict/MutableMapping, except popitem() returns
the item with the lowest value instead of a random item.  Also, there's a
peekitem() method to examine that item without removing it.  Since the
interface is familiar, it's super-easy to use without having to look up the
names of the methods whenever you need to use a heap.

The downside is that you can't add the same element into the heap twice
(just like you can't have the same key in a dictionary twice).  Personally
I've only inserted the same element more than once when doing toy examples
with integers, so for me at least that's no great loss. ;-)

--
Daniel Stutzbach, Ph.D.
President, Stutzbach Enterprises, LLC <http://stutzbachenterprises.com>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090418/2eaa31ce/attachment.html>

From george.sakkis at gmail.com  Sat Apr 18 16:57:09 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Sat, 18 Apr 2009 10:57:09 -0400
Subject: [Python-ideas] Heap data type
In-Reply-To: <gscmfl$1kt$1@ger.gmane.org>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
	<gscmfl$1kt$1@ger.gmane.org>
Message-ID: <91ad5bf80904180757w77920f36h9734a16eea14ed10@mail.gmail.com>

On Sat, Apr 18, 2009 at 10:01 AM, Stefan Behnel <stefan_ml at behnel.de> wrote:

> Facundo Batista wrote:
>> Using the module heapq, I found that it's not easy to use. Or, at
>> least, that it could be straightforward to use.
>> [...]
>> So, I basically coded it. What do you think about including this class
>> in the heap module?
>
> I did something similar a couple of years ago (and I doubt that I'm the
> only one).

Indeed, you're not: http://code.activestate.com/recipes/440673/

George


From denis.spir at free.fr  Sat Apr 18 17:31:57 2009
From: denis.spir at free.fr (spir)
Date: Sat, 18 Apr 2009 17:31:57 +0200
Subject: [Python-ideas] Needing help to change the grammar
In-Reply-To: <49E9DB31.9020202@gmail.com>
References: <thiagoharry.1239415106.squirrel@tern.riseup.net>
	<b8e622740904102320na23074fod4317de771d80254@mail.gmail.com>
	<thiagoharry.1239563362.squirrel@swift.riseup.net>
	<grtj13$ma0$1@ger.gmane.org>
	<p04330107c6080d82eb60@[192.168.123.162]>
	<49E2670E.3070705@g.nevcal.com> <49E9DB31.9020202@gmail.com>
Message-ID: <20090418173157.570a6a2f@o>

Le Sat, 18 Apr 2009 23:52:49 +1000,
Nick Coghlan <ncoghlan at gmail.com> s'exprima ainsi:

> (moving discussion to Python Ideas)
> 
> (Context for py-ideas: a teacher in Brazil is working on a Python
> language variant that uses Portuguese rather than English-based
> keywords. This is intended for use in teaching introductory programming
> lessons, not as a professional development tool)
> 
> Glenn Linderman wrote:
> > import pt_BR
> > 
> > An implementation along that line, except for things like reversing the
> > order of "not" and "is", would allow the next national language
> > customization to be done by just recoding the pt_BR module, renaming to
> > pt_it or pt_fr or pt_no and translating a bunch of strings, no?
> > 
> > Probably it would be sufficient to allow for one language at a time, per
> > module.
> 
> Making that work would actually require something like the file encoding
> cookie that is detected at the parsing stage. Otherwise the parser and
> compiler would choke on the unexpected keywords long before the
> interpreter reached the stage of attempting to import anything.
> 
> Adjusting the parser to accept different keyword names would be even
> more difficult though, since changing the details of the grammar
> definition is a lot more invasive than just changing the encoding of the
> file being read.

> Cheers,
> Nick.
> 

Maybe I don't really understand the problem, or am overlooking obvious issues. If the question is only to have a national language variant of python, there are certainly numerous easier methods than tweaking the parser to make it flexible enough to be natural language-aware.

Why not simply have a preprocessing function that translates back to standard/English Python using a simple dict? For practical everyday work, this may be done by:
* assigning a special extension (eg .pybr) to the 'special' source code files,
* associating this extension with the preprocessing program...
* that would pass the back-translated .py source to python.

[A more general solution would be to introduce a customization layer/interface in a python-aware editor. Sources would always be stored in standard format. At load-time, they would be translated according to the currently active config, which, indeed, would only affect developer input-output (the principle is thus analogous to syntax-highlighting).
* Any developer can edit any source according to his/her own preferences.
* Python does not need to care about that.
* Customization can be lexical (keywords, builtins, signs) but also touch a certain amount of syntax.
The issue here is that the editor parser (for syntax highlighting and numerous nice features) has to be made flexible enough to cope with this customization.]

Denis
------
la vita e estrany


From rasky at develer.com  Sat Apr 18 18:05:13 2009
From: rasky at develer.com (Giovanni Bajo)
Date: Sat, 18 Apr 2009 16:05:13 +0000 (UTC)
Subject: [Python-ideas] Addition to I/O stack: send()
Message-ID: <gsctnp$uek$1@ger.gmane.org>

Hello,

currently, Python has no builtin way to "copy a file object into 
another", an operation commonly referred to as "send". I think this would 
be a good addition to the I/O stack.

To make things clearer, this is pseudo-code of what I meant:

   def send(self, out):
       while 1:
           data = self.read(4096)
           if not data:
               break
           out.write(data)


I'll notice that the implementation can be tweaked in different ways:

 * Some operating systems do support this operation natively (through 
syscalls like "sendfile"), which is faster because the data does not 
roundtrips to user space. This is useful eg. for webservers serving files 
to network.

 * In the user-space implementation, the buffer size could match whatever 
buffer already exists for a buffered file reader (eg: BufferedIOBase and 
derivates).

Because of these different details, I think it would be a good addition 
to the I/O stack. Though, I don't understand the stack fully enough yet 
to make a more concrete proposal as to where to put each possible 
implementation.
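
For illustration, a self-contained version of the user-space copy loop
above (a real implementation might first try a zero-copy OS path such as
sendfile() when both objects wrap real file descriptors):

```python
import io

def send(src, dst, bufsize=4096):
    """Copy everything from src to dst in bufsize-sized chunks."""
    while True:
        data = src.read(bufsize)
        if not data:
            break
        dst.write(data)

src = io.BytesIO(b"x" * 10000)
dst = io.BytesIO()
send(src, dst)
print(dst.getvalue() == b"x" * 10000)  # True
```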

Also, I'm unsure how this could couple with different kind of files 
(pipes, sockets, etc.). As far as I can tell, the new I/O stack is not 
used by the standard socket library yet, for instance.
-- 
Giovanni Bajo
Develer S.r.l.
http://www.develer.com



From tjreedy at udel.edu  Sat Apr 18 22:03:17 2009
From: tjreedy at udel.edu (Terry Reedy)
Date: Sat, 18 Apr 2009 16:03:17 -0400
Subject: [Python-ideas] Needing help to change the grammar
In-Reply-To: <20090418173157.570a6a2f@o>
References: <thiagoharry.1239415106.squirrel@tern.riseup.net>	<b8e622740904102320na23074fod4317de771d80254@mail.gmail.com>	<thiagoharry.1239563362.squirrel@swift.riseup.net>	<grtj13$ma0$1@ger.gmane.org>	<p04330107c6080d82eb60@[192.168.123.162]>	<49E2670E.3070705@g.nevcal.com>
	<49E9DB31.9020202@gmail.com> <20090418173157.570a6a2f@o>
Message-ID: <gsdbm4$p19$1@ger.gmane.org>

spir wrote:
> Le Sat, 18 Apr 2009 23:52:49 +1000,
> Nick Coghlan <ncoghlan at gmail.com> s'exprima ainsi:
> 
>> (moving discussion to Python Ideas)
>>
>> (Context for py-ideas: a teacher in Brazil is working on a Python
>> language variant that uses Portuguese rather than English-based
>> keywords. This is intended for use in teaching introductory programming
>> lessons, not as a professional development tool)
>>
>> Glenn Linderman wrote:
>>> import pt_BR
>>>
>>> An implementation along that line, except for things like reversing the
>>> order of "not" and "is", would allow the next national language
>>> customization to be done by just recoding the pt_BR module, renaming to
>>> pt_it or pt_fr or pt_no and translating a bunch of strings, no?
>>>
>>> Probably it would be sufficient to allow for one language at a time, per
>>> module.
>> Making that work would actually require something like the file encoding
>> cookie that is detected at the parsing stage. Otherwise the parser and
>> compiler would choke on the unexpected keywords long before the
>> interpreter reached the stage of attempting to import anything.

My original proposal in response to the OP was that language be encoded 
in the extension: pybr, for instance.  That would be noticed before 
reading the file.  Cached modules would still be standard .pyc, 
interoperable with .pyc compiled from normal Python.  I am presuming 
this would work on all systems.

>> Adjusting the parser to accept different keyword names would be even
>> more difficult though, since changing the details of the grammar
>> definition is a lot more invasive than just changing the encoding of the
>> file being read.
> 
>> Cheers,
>> Nick.
>>
> 
> Maybe I don't really understand the problem, or am overlooking obvious issues. If the question is only to have a national language variant of python, there are certainly numerous easier methods than tweaking the parser to make it flexible enough to be natural language-aware.
> 
> Why not simply have a preprocessing func that translates back to standard/english python using a simple dict? For practicle everyday work, this may done by:
> * assigning a special extension (eg .pybr) to the 'special' source code files,
> * associating this extension to the preprocessing program...
> * that would pass the back-translated .py source to python.

The OP was proposing to change 'is not' to the equivalent of 'not is'. 
I am not sure of how critical that would actually be.  For the purpose 
of easing transition to international Python, not messing with statement 
word order would be a plus.

>  [A more general solution would be to introduce a customization layer/interface in a python-aware editor. Sources would always been stored in standard format. At load-time, they would be translated according to a currently active config, that, indeed, would only affect developper input-output (the principle is thus analog to syntax-highlighting).
> * Any developper can edit any source according to his/her own preferences.
> * Python does not need to care about that.
> * Customization can be lexical (keywords, builtins, signs) but also touch a certain amount of syntax.
> The issue here is that the editor parser (for syntax highlighting and numerous nice features) has to be made flexible enough to cope with this customization.]

This might be easier than changing the interpreter.  The extension could 
just as well be read and written by an editor.  The problem is the 
multiplicity of editors.

The reason I suggested some support in the core for nationalization is 
that I think a) it is inevitable, in spite of the associated problem of 
ghettoization, while b) ghettoization should be discouraged and can be 
ameliorated with a bit of core support.  I am aware, of course, that 
such support, by removing one barrier to nationalization, will 
accelerate the development of such versions.

Terry Jan Reedy



From solipsis at pitrou.net  Sun Apr 19 00:56:55 2009
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sat, 18 Apr 2009 22:56:55 +0000 (UTC)
Subject: [Python-ideas] Addition to I/O stack: send()
References: <gsctnp$uek$1@ger.gmane.org>
Message-ID: <loom.20090418T224443-992@post.gmane.org>


Hi Giovanni,

> I'll notice that the implementation can be tweaked in different ways:
> 
>  * Some operating systems do support this operation natively (through 
> syscalls like "sendfile"), which is faster because the data does not 
> roundtrips to user space. This is useful eg. for webservers serving files 
> to network.
> 
>  * In the user-space implementation, the buffer size could match whatever 
> buffer already exists for a buffered file reader (eg: BufferedIOBase and 
> derivates).

I think it all sounds like premature optimization. If you have any workload 
where the cost of user/kernel switching or of copying the data measurably
affects the overall program speed (rather than e.g. the Python interpretation 
overhead), it would be interesting to hear about it.

(I'm not saying that copying stuff is free; actually, the 3.1 I/O stack somewhat
tries to minimize copying of data between the various layers, by using e.g.
memoryviews and readinto(). But I don't think it is worth going to extremes just
to avoid all memory copies. The cost of a single non-optimized 
method call is likely to be higher than memcpy'ing, say, 1024 bytes...)

Right now, it is true there is no way to do this kind of things. There are two
distinct things you are asking for really:
1/ direct access to the buffered data of a BufferedIOBase object without copying
things around
2/ automagical optimization of file descriptor-to-file descriptor copying
through something like sendfile(), when supported by the OS

Also, please notice that 1 and 2 are exclusive :-) In 1 you are doing buffered
I/O, while in 2 you are doing raw I/O.

If you really need über-fast copy between two file descriptors, the solution
is probably to whip up a trivial C extension that simply wraps around the
sendfile() system call. You could propose it for inclusion in the stdlib, in the
os module, by the way.

Regards

Antoine.




From python at rcn.com  Sun Apr 19 01:40:31 2009
From: python at rcn.com (Raymond Hettinger)
Date: Sat, 18 Apr 2009 16:40:31 -0700
Subject: [Python-ideas] Heap data type
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
	<20090418124357.GA8506@panix.com>
Message-ID: <3CDA63554E1546DEA84A696B56BB4876@RaymondLaptop1>

Facundo, I would like to work with you on this.
I've been the primary maintainer for heapq for a while
and had already started working on something like this
in response to repeated requests to support a key= function
(like that for sorted/min/max).


Raymond


>> I think that changing "external list" to "internal attribute", and
>> "bunch of functions" to "methods", will leave the module easier
>> to use, safer, and more object oriented.
> 
> +1 -- I recently did a short presentation on Big O notation and Python
> containers, and heapq looked remarkably ugly.  



From solipsis at pitrou.net  Sun Apr 19 02:35:00 2009
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sun, 19 Apr 2009 00:35:00 +0000 (UTC)
Subject: [Python-ideas] Addition to I/O stack: send()
References: <gsctnp$uek$1@ger.gmane.org>
	<loom.20090418T224443-992@post.gmane.org>
Message-ID: <loom.20090419T002854-788@post.gmane.org>


Hello again,

> Right now, it is true there is no way to do this kind of thing. There are two
> distinct things you are asking for really:
> 1/ direct access to the buffered data of a BufferedIOBase object without
copying
> things around

We could add two methods to BufferedIOBase:
- peek_inside(self[, n]): returns a view of up to n bytes on the internal buffer
- read_inside(self[, n]): same as peek_inside(), but also advances the file
position after the end of the returned view

The good thing is that default unoptimized versions of these methods could be
provided in the BufferedIOBase ABC, simply by calling peek() (resp. read()) and
wrapping the result in a memoryview.
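A rough pure-Python sketch of those unoptimized defaults (the method names are only the proposal above, not an existing API):

```python
import io

def peek_inside(buffered, n):
    """Unoptimized default: peek() returns a copy, which is wrapped
    in a memoryview; the file position is not advanced."""
    # peek(n) may return more (or fewer) bytes than requested.
    return memoryview(buffered.peek(n)[:n])

def read_inside(buffered, n):
    """Same idea, but advances the file position past the view."""
    return memoryview(buffered.read(n))

reader = io.BufferedReader(io.BytesIO(b"hello world"))
assert bytes(peek_inside(reader, 5)) == b"hello"   # position unchanged
assert bytes(read_inside(reader, 5)) == b"hello"   # position advanced
assert reader.read() == b" world"
```

An optimized BufferedReader could instead return a view directly onto its internal buffer, which is the whole point of the proposal.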

Regards

Antoine.




From greg.ewing at canterbury.ac.nz  Sun Apr 19 08:30:17 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 19 Apr 2009 18:30:17 +1200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
Message-ID: <49EAC4F9.90107@canterbury.ac.nz>

Draft 13 of the PEP.

Adjusted the expansion so as not to suggest that
a throw or close attribute with the value None
should be treated as a missing method.

-- 
Greg
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: yield-from-rev13.txt
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090419/c7d72ba8/attachment.txt>

From stephen at xemacs.org  Sun Apr 19 10:56:31 2009
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Sun, 19 Apr 2009 17:56:31 +0900
Subject: [Python-ideas] Needing help to change the grammar
In-Reply-To: <gsdbm4$p19$1@ger.gmane.org>
References: <thiagoharry.1239415106.squirrel@tern.riseup.net>
	<b8e622740904102320na23074fod4317de771d80254@mail.gmail.com>
	<thiagoharry.1239563362.squirrel@swift.riseup.net>
	<grtj13$ma0$1@ger.gmane.org>
	<p04330107c6080d82eb60@[192.168.123.162]>
	<49E2670E.3070705@g.nevcal.com> <49E9DB31.9020202@gmail.com>
	<20090418173157.570a6a2f@o> <gsdbm4$p19$1@ger.gmane.org>
Message-ID: <87prf92f9c.fsf@xemacs.org>

Terry Reedy writes:
 > spir wrote:
 > > Le Sat, 18 Apr 2009 23:52:49 +1000,
 > > Nick Coghlan <ncoghlan at gmail.com> s'exprima ainsi:

 > >> Making that work would actually require something like the file
 > >> encoding cookie that is detected at the parsing stage. Otherwise
 > >> the parser and compiler would choke on the unexpected keywords
 > >> long before the interpreter reached the stage of attempting to
 > >> import anything.

I think this is the right way to go.  We currently need, and will need
for the medium term, coding cookies for legacy encoding support.  I
don't see why this shouldn't work the same way.

 > My original proposal in response to the OP was that language be encoded 
 > in the extension: pybr, for instance.

But there are a lot of languages.  Once the ice is broken, I think a
lot of translations will appear.  So I think the variant extension
approach is likely to get pretty ugly.

 > >> Adjusting the parser to accept different keyword names would be even
 > >> more difficult though, since changing the details of the grammar
 > >> definition is a lot more invasive than just changing the encoding of the
 > >> file being read.

But the grammar is not being changed in the details; it's actually not
being changed at all (with the one exception).  If it's a one-to-one
map at the keyword level, I don't see why there would be a problem.
Of course there will be the occasional word order issue, as here with
"is not", and that does involve changing the grammar.

 > > Why not simply have a preprocessing func that translates back to
 > > standard/english python using a simple dict?

Because it's just not that simple, of course.  You need to parse far
enough to recognize strings, for example, and leave them alone.  Since
the parser doesn't detect unbalanced quotation marks in comments, you
need to parse those too.  You must parse import statements, because
the file name might happen to be the equivalent of a keyword, and
*not* translate those.  There may be other issues, as well.

 > The reason I suggested some support in the core for nationalization is 
 > that I think a) it is inevitable, in spite of the associated problem of 
 > ghettoization, while b) ghettoization should be discouraged and can be 
 > ameliorated with a bit of core support.  I am aware, of course, that 
 > such support, by removing one barrier to nationalization, will 
 > accelerate the development of such versions.

I don't think that ghettoization is that much more encouraged by this
development than by PEP 263.  It's always been possible to use
non-English identifiers, even with languages normally not written in
ASCII (there are several C identifiers in XEmacs that I'm pretty sure
are obscenities in Latin and Portuguese, I wouldn't be surprised if a
similar device isn't occasionally used in Python programs<wink>), and
of course comments have long been written in practically any
ASCII-compatible coding you can name.  I think it was Alex Martelli
who contributed a couple of rather (un)amusing stories about
multinational teams where all of one nationality up and quit one day,
leaving the rest of the team with copiously but unintelligibly
documented code, to the PEP 263 discussion.

In fact, AFAICS the fact that it's parsable as Python means that
translated keywords aren't a problem at all, since that same parser
can be adapted to substitute the English versions for you.  That still
leaves you with meaningless identifiers and comments, but as I say we
already had those.



From ncoghlan at gmail.com  Sun Apr 19 15:29:39 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Sun, 19 Apr 2009 23:29:39 +1000
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EAC4F9.90107@canterbury.ac.nz>
References: <49EAC4F9.90107@canterbury.ac.nz>
Message-ID: <49EB2743.4010507@gmail.com>

Greg Ewing wrote:
>     * ``return expr`` in a generator causes ``StopIteration(expr)`` to
>       be raised.

One minor nit here - this bullet point is somewhat ambiguous as to where
the raised exception is visible. It is probably worth mentioning
explicitly that as with existing "return" statements in generators, the
StopIteration exception won't be seen in the generator's own frame.
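(With the feature as it eventually landed in Python 3.3, this is easy to check:)

```python
def g():
    try:
        return 42          # raises StopIteration(42), but not in this frame
    except StopIteration:
        raise AssertionError("unreachable: not visible in g's own frame")
    yield                  # present only so that g is a generator

it = g()
try:
    next(it)
except StopIteration as e:
    assert e.value == 42   # the caller is who sees StopIteration(42)
```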

As someone else mentioned on another recent draft, it may be worth
including some of the toy examples we were playing with in some of the
discussion threads. Sure, they weren't all that practical, but they
seemed to do a good job of getting the data flow concepts across.

Otherwise looks good to me.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Sun Apr 19 15:46:28 2009
From: jh at improva.dk (Jacob Holm)
Date: Sun, 19 Apr 2009 15:46:28 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EAC4F9.90107@canterbury.ac.nz>
References: <49EAC4F9.90107@canterbury.ac.nz>
Message-ID: <49EB2B34.9080607@improva.dk>

Greg Ewing wrote:
> Draft 13 of the PEP.
> 
> Adjusted the expansion so as not to suggest that
> a throw or close attribute with the value None
> should be treated as a missing method.
> 

A few more details about the expansion.


First a minor nit.  There is no need to use getattr if the name is 
constant and you are going to catch AttributeError anyway.  Just use 
_i.close and _i.throw.
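For a constant name the two spellings are equivalent, e.g.:

```python
class NoThrow:
    pass

obj = NoThrow()

# getattr with a string name and a default...
m1 = getattr(obj, 'throw', None)

# ...versus plain attribute access guarded by AttributeError:
try:
    m2 = obj.throw
except AttributeError:
    m2 = None

assert m1 is None and m2 is None
```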


Next a more serious issue.  The current use of *sys.exc_info() in the 
throw handling is actually wrong.  Getting the "throw" attribute may run 
arbitrary python code which could easily replace the exception.

We should probably add

   _x = sys.exc_info()

as the first line in the "except BaseException as _e" block, and change

   _y = _m(*sys.exc_info())

to:

   _y = _m(*_x)

Alternatively, we could get the signature of throw() fixed so that it 
matches the "raise" statement again.  Then we can drop the use of 
sys.exc_info(), and just use:

   _y = _m(_e)

But that is probably out of scope for this PEP.


Other than that, I think the expansion matches what we have agreed on.


FWIW I still consider an expansion using functools.partial to be more 
readable because it centralizes the StopIteration handling and reduces 
the code nesting.  Here is an updated version of such an expansion that 
is semantically equivalent to the PEP rev 13 + my suggested fix for the 
sys.exc_info() issue:

     _i = iter(EXPR)
     _p = functools.partial(next, _i)
     while 1:
         try:
             _y = _p()
         except StopIteration as _e:
             _r = _e.value
             break
         try:
             _s = yield _y
         except GeneratorExit as _e:
             try:
                 _m = _i.close
             except AttributeError:
                 pass
             else:
                 _m()
             raise _e
         except BaseException as _e:
             _x = sys.exc_info()
             try:
                 _m = _i.throw
             except AttributeError:
                 raise _e
             _p = functools.partial(_m, *_x)
         else:
             if _s is None:
                 _p = functools.partial(next, _i)
             else:
                 _p = functools.partial(_i.send, _s)
     RESULT = _r

If you don't like it, fine.  Since it is only a presentation detail, I 
won't push for it.  I *would* like to hear your reasoning, but will 
accept whatever conclusion you come to, I think :)


Anyway, it looks to me like we are almost done.  What are the chances of 
getting this into 3.1 and 2.7?


Hopefully-soon-using-yield-from-for-real-ly yours
- Jacob



From jh at improva.dk  Sun Apr 19 15:58:39 2009
From: jh at improva.dk (Jacob Holm)
Date: Sun, 19 Apr 2009 15:58:39 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EB2743.4010507@gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz> <49EB2743.4010507@gmail.com>
Message-ID: <49EB2E0F.6050908@improva.dk>

Nick Coghlan wrote:
> Greg Ewing wrote:
>>     * ``return expr`` in a generator causes ``StopIteration(expr)`` to
>>       be raised.
> 
> One minor nit here - this bullet point is somewhat ambiguous as to where
> the raised exception is visible. It is probably worth mentioning
> explicitly that as with existing "return" statements in generators, the
> StopIteration exception won't be seen in the generator's own frame.
> 


It is mentioned later, in the formal semantics,

> ...
> except that, as currently, the exception cannot be caught by ``except``
> clauses within the returning generator.

but I agree that it probably wouldn't hurt to mention it earlier as well.


> As someone else mentioned on another recent draft, it may be worth
> including some of the toy examples we were playing with in some of the
> discussion threads. Sure, they weren't all that practical, but they
> seemed to do a good job of getting the data flow concepts across.
> 


It may be good enough just to include links to the relevant messages in 
the PEP.  Getting some small examples in the actual docs is probably 
more important.


> Otherwise looks good to me.
> 


There is a minor issue with the handling of throw in the expansion, but 
otherwise I agree.

Cheers
- Jacob


From ncoghlan at gmail.com  Sun Apr 19 16:04:15 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 20 Apr 2009 00:04:15 +1000
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EB2B34.9080607@improva.dk>
References: <49EAC4F9.90107@canterbury.ac.nz> <49EB2B34.9080607@improva.dk>
Message-ID: <49EB2F5F.2080709@gmail.com>

Jacob Holm wrote:
> First a minor nit.  There is no need to use getattr if the name is
> constant and you are going to catch AttributeError anyway.  Just use
> _i.close and _i.throw.

Yep, good idea.

> Next a more serious issue.  The current use of *sys.exc_info() in the
> throw handling is actually wrong.  Getting the "throw" attribute may run
> arbitrary python code which could easily replace the exception.

Good catch (and I agree with your first suggested fix of grabbing the
exception details before retrieving the method - the with statement
expansion in PEP 343 had to do something similar for similar reasons).

> Alternatively, we could get the signature of throw() fixed so that it
> matches the "raise" statement again.  Then we can drop the use of
> sys.exc_info(), and just use:
> 
>   _y = _m(_e)
> 
> But that is probably out of scope for this PEP.

Yep (using sys.exc_info() also translates more cleanly back to Python 2.x)

> FWIW I still consider an expansion using functools.partial to be more
> readable because it centralizes the StopIteration handling and reduces
> the code nesting.  Here is an updated version of such an expansion that
> is semantically equivalent to the PEP rev 13 + my suggested fix for the
> sys.exc_info() issue:

Nobody would ever implement it that way though - using partial like that
in the formal semantic definition implies a whole heap of temporary
objects that just won't exist in practice.

Better to use the more verbose expansion that is much closer in spirit
to the way it would actually be implemented.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From jh at improva.dk  Sun Apr 19 16:34:02 2009
From: jh at improva.dk (Jacob Holm)
Date: Sun, 19 Apr 2009 16:34:02 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EB2F5F.2080709@gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz> <49EB2B34.9080607@improva.dk>
	<49EB2F5F.2080709@gmail.com>
Message-ID: <49EB365A.8010000@improva.dk>

Nick Coghlan wrote:
> Jacob Holm wrote:
>> FWIW I still consider an expansion using functools.partial to be more
>> readable because it centralizes the StopIteration handling and reduces
>> the code nesting.  Here is an updated version of such an expansion that
>> is semantically equivalent to the PEP rev 13 + my suggested fix for the
>> sys.exc_info() issue:
> 
> Nobody would ever implement it that way though - using partial like that
> in the formal semantic definition implies a whole heap of temporary
> objects that just won't exist in practice.
> 
> Better to use the more verbose expansion that is much closer in spirit
> to the way it would actually be implemented.

The temporary objects don't bother me.  That is really a deep 
implementation detail.  As for being closer in spirit, the real 
implementation is already going to be very different so I don't see that 
as a problem either.

FWIW, it is quite easy to write same style expansion without using 
functools.partial, like this:

     _i = iter(EXPR)
     _m, _a = next, (_i,)
     while 1:
         try:
             _y = _m(*_a)
         except StopIteration as _e:
             _r = _e.value
             break
         try:
             _s = yield _y
         except GeneratorExit as _e:
             try:
                 _m = _i.close
             except AttributeError:
                 pass
             else:
                 _m()
             raise _e
         except BaseException as _e:
             _a = sys.exc_info()
             try:
                 _m = _i.throw
             except AttributeError:
                 raise _e
         else:
             if _s is None:
                 _m, _a = next, (_i,)
             else:
                 _m, _a = _i.send, (_s,)
     RESULT = _r

This is even a line shorter than the version using functools.partial, 
and the temporary _a tuples used actually match what happens in a normal 
function call anyway (I think)...

Anyway, as I said it is not all that important.  It is a presentation 
detail, and as such very subjective.  I can agree to disagree about what 
version is clearer.

Cheers
- Jacob


From Scott.Daniels at Acm.Org  Sun Apr 19 17:52:21 2009
From: Scott.Daniels at Acm.Org (Scott David Daniels)
Date: Sun, 19 Apr 2009 08:52:21 -0700
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EB365A.8010000@improva.dk>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<49EB2B34.9080607@improva.dk>	<49EB2F5F.2080709@gmail.com>
	<49EB365A.8010000@improva.dk>
Message-ID: <gsfh3c$cc6$1@ger.gmane.org>

Jacob Holm wrote:
> ...
>     _i = iter(EXPR)
>     _m, _a = next, (_i,)
>     while 1:
>         try:
>             _y = _m(*_a)
>         except StopIteration as _e:
>             _r = _e.value
>             break
>         try:
>             _s = yield _y
>         except GeneratorExit as _e:
>             try:
>                 _m = _i.close
>             except AttributeError:
>                 pass
>             else:
>                 _m()
>             raise _e
>         except BaseException as _e:
>             _a = sys.exc_info()
>             try:
>                 _m = _i.throw
>             except AttributeError:
>                 raise _e
>         else:
>             if _s is None:
>                 _m, _a = next, (_i,)
>             else:
>                 _m, _a = _i.send, (_s,)
>     RESULT = _r
> 

Now that we passed the magic three or four threshold, is
it not easier to read if we pick some better names?

       _iter = iter(EXPR)
       _call, _arg = next, _iter
       while 1:
           try:
               _out = _call(_arg)
           except StopIteration as _except:
               _result = _except.value
               break
           try:
               _in = yield _out
           except GeneratorExit as _except:
               try:
                   _close = _iter.close
               except AttributeError:
                   pass
               else:
                   _close()
               raise _except
           except BaseException as _except:
               _a = sys.exc_info()
               try:
                   _call = _iter.throw
               except AttributeError:
                   raise _except
           else:
               if _in is None:
                   _call, _arg = next, _iter
               else:
                   _call, _arg = _iter.send, _in
       RESULT = _result

--Scott David Daniels
Scott.Daniels at Acm.Org



From jh at improva.dk  Sun Apr 19 18:26:35 2009
From: jh at improva.dk (Jacob Holm)
Date: Sun, 19 Apr 2009 18:26:35 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <gsfh3c$cc6$1@ger.gmane.org>
References: <49EAC4F9.90107@canterbury.ac.nz>	<49EB2B34.9080607@improva.dk>	<49EB2F5F.2080709@gmail.com>	<49EB365A.8010000@improva.dk>
	<gsfh3c$cc6$1@ger.gmane.org>
Message-ID: <49EB50BB.5010408@improva.dk>

Scott David Daniels wrote:
> Now that we passed the magic three or four threshold, is
> it not easier to read if we pick some better names?
> 
>       _iter = iter(EXPR)
>       _call, _arg = next, _iter

                             ^^^^^ should be (_iter,)

>       while 1:
>           try:
>               _out = _call(_arg)

                              ^^^^ should be *_arg

>           except StopIteration as _except:
>               _result = _except.value
>               break
>           try:
>               _in = yield _out
>           except GeneratorExit as _except:
>               try:
>                   _close = _iter.close
>               except AttributeError:
>                   pass
>               else:
>                   _close()
>               raise _except
>           except BaseException as _except:
>               _a = sys.exc_info()

                 ^^ should be _arg, forcing the other changes.

>               try:
>                   _call = _iter.throw
>               except AttributeError:
>                   raise _except
>           else:
>               if _in is None:
>                   _call, _arg = next, _iter

                                         ^^^^^ should be (_iter,)

>               else:
>                   _call, _arg = _iter.send, _in

                                               ^^^ should be (_in,)

>       RESULT = _result


As noted inline, you missed one of the _a's and so falsely assumed there 
would always be only one argument to _call.  In the case of throw there 
are three.

I don't really care what the variables are called.  That bikeshed 
discussion is not one I want to participate in.  Your names are as good 
as any I guess.

Anyway, this is moot unless Greg agrees that this style of expansion is 
a good idea in the first place.

Not-bikeshedding-ly yours
- Jacob


From grosser.meister.morti at gmx.net  Sun Apr 19 19:36:03 2009
From: grosser.meister.morti at gmx.net (Mathias Panzenböck)
Date: Sun, 19 Apr 2009 19:36:03 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EB50BB.5010408@improva.dk>
References: <49EAC4F9.90107@canterbury.ac.nz>	<49EB2B34.9080607@improva.dk>	<49EB2F5F.2080709@gmail.com>	<49EB365A.8010000@improva.dk>	<gsfh3c$cc6$1@ger.gmane.org>
	<49EB50BB.5010408@improva.dk>
Message-ID: <49EB6103.2080406@gmx.net>

Jacob Holm wrote:
 > Scott David Daniels wrote:
 >> Now that we passed the magic three or four threshold, is
 >> it not easier to read if we pick some better names?
 >>
 >>       _iter = iter(EXPR)
 >>       _call, _arg = next, _iter
 >

And I guess there is a missing "next = _iter.next" before this line?



From jh at improva.dk  Sun Apr 19 19:44:50 2009
From: jh at improva.dk (Jacob Holm)
Date: Sun, 19 Apr 2009 19:44:50 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EB6103.2080406@gmx.net>
References: <49EAC4F9.90107@canterbury.ac.nz>	<49EB2B34.9080607@improva.dk>	<49EB2F5F.2080709@gmail.com>	<49EB365A.8010000@improva.dk>	<gsfh3c$cc6$1@ger.gmane.org>	<49EB50BB.5010408@improva.dk>
	<49EB6103.2080406@gmx.net>
Message-ID: <49EB6312.1000404@improva.dk>

Mathias Panzenböck wrote:
> Jacob Holm wrote:
>  > Scott David Daniels wrote:
>  >> Now that we passed the magic three or four threshold, is
>  >> it not easier to read if we pick some better names?
>  >>
>  >>       _iter = iter(EXPR)
>  >>       _call, _arg = next, _iter
>  >
> 
> And I guess there is a missing "next = _iter.next" before this line?
> 


Nope.  In 2.6 and 3.0, next is a global function:

"""
next(iterator[, default])

Return the next item from the iterator. If default is given and the 
iterator is exhausted, it is returned instead of raising StopIteration.
"""

- Jacob


From Scott.Daniels at Acm.Org  Sun Apr 19 23:28:53 2009
From: Scott.Daniels at Acm.Org (Scott David Daniels)
Date: Sun, 19 Apr 2009 14:28:53 -0700
Subject: [Python-ideas] Revised**11 PEP on Yield-From
In-Reply-To: <49E80BF8.4090803@canterbury.ac.nz>
References: <49E80BF8.4090803@canterbury.ac.nz>
Message-ID: <gsg4qb$t1$1@ger.gmane.org>

Greg Ewing wrote:
> Draft 12 of the PEP....

In the ensuing discussion, I replied to one of Jacob Holm's points
by making a comment that was really meant to apply to the PEP itself,
rather than his suggestion.  I replied at the place where it became
obvious to me how I felt the PEP could become clearer.  In an off-line
exchange, I found he thought I was talking simply about his suggested
change, so I'll restate my case here.  I do so not for emphasis, but
rather to attach it to the proper context.  This is, of course,
Greg's decision for readability.  I share Jacob's fear of bike-shed
discussions on variable names (I am not necessarily in love with
the names I've chosen here).

Now that we passed the magic three or four threshold, is
it not easier to read if we pick some better names?
Instead of:
 >    _i = iter(EXPR)
 >    try:
 >        _y = next(_i)
 >    except StopIteration as _e:
 >        _r = _e.value
 >    else:
 >        while 1:
 >            try:
 >                _s = yield _y
 >            except GeneratorExit:
 >                _m = getattr(_i, 'close', None)
 >                if _m is not None:
 >                    _m()
 >                raise
 >            except:
 >                _m = getattr(_i, 'throw', None)
 >                if _m is not None:
 >                    try:
 >                        _y = _m(*sys.exc_info())
 >                    except StopIteration as _e:
 >                        _r = _e.value
 >                        break
 >                else:
 >                    raise
 >            else:
 >                try:
 >                    if _s is None:
 >                        _y = next(_i)
 >                    else:
 >                        _y = _i.send(_s)
 >                except StopIteration as _e:
 >                    _r = _e.value
 >                    break
 >    RESULT = _r

we could use:
     _iterator = iter(EXPR)
     try:
         _out = next(_iterator)
     except StopIteration as _error:
         _result = _error.value
     else:
         while 1:
             try:
                 _inner = yield _out
             except GeneratorExit:
                _close = getattr(_iterator, 'close', None)
                 if _close is not None:
                     _close()
                 raise
             except:
                 _throw = getattr(_iterator, 'throw', None)
                 if _throw is not None:
                     try:
                         _out = _throw(*sys.exc_info())
                     except StopIteration as _error:
                         _result = _error.value
                         break
                 else:
                     raise
             else:
                 try:
                     if _inner is None:
                         _out = next(_iterator)
                     else:
                        _out = _iterator.send(_inner)
                 except StopIteration as _error:
                     _result = _error.value
                     break
     RESULT = _result


--Scott David Daniels
Scott.Daniels at Acm.Org



From greg.ewing at canterbury.ac.nz  Mon Apr 20 03:00:36 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 20 Apr 2009 13:00:36 +1200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EB2743.4010507@gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz> <49EB2743.4010507@gmail.com>
Message-ID: <49EBC934.6010802@canterbury.ac.nz>

Nick Coghlan wrote:

> As someone else mentioned on another recent draft, it may be worth
> including some of the toy examples we were playing with in some of the
> discussion threads.

If you can nominate which ones you think should be
included, I'll take a look.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Mon Apr 20 03:01:59 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 20 Apr 2009 13:01:59 +1200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EB2743.4010507@gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz> <49EB2743.4010507@gmail.com>
Message-ID: <49EBC987.1040202@canterbury.ac.nz>

Nick Coghlan wrote:

> It is probably worth mentioning
> explicitly that as with existing "return" statements in generators, the
> StopIteration exception won't be seen in the generator's own frame.

I do mention that in a later section. Do you think it
needs to be in both places?

-- 
Greg


From greg.ewing at canterbury.ac.nz  Mon Apr 20 03:22:35 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Mon, 20 Apr 2009 13:22:35 +1200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EB2B34.9080607@improva.dk>
References: <49EAC4F9.90107@canterbury.ac.nz> <49EB2B34.9080607@improva.dk>
Message-ID: <49EBCE5B.6040704@canterbury.ac.nz>

Jacob Holm wrote:

> First a minor nit.  There is no need to use getattr if the name is 
> constant and you are going to catch AttributeError anyway.  Just use 
> _i.close and _i.throw.

Quite right. I can't have been thinking very straight
when I did that!

> We should probably add
> 
>   _x = sys.exc_info()
> 
> as the first line in the "except BaseException as _e" block, and change
> 
>   _y = _m(*sys.exc_info())
> 
> to:
> 
>   _y = _m(*_x)

I'm not sure if this is really necessary, since function
calls save/restore the exception being handled, if I
understand correctly. But I suppose it can't hurt to
clarify this.

> FWIW I still consider an expansion using functools.partial to be more 
> readable

That's a matter of opinion -- I find it harder to follow
because it separates the logic for deciding which method
to call from the place where it's called. Also I would
rather express the expansion in terms of core language
features as far as possible rather than relying on
something imported from a library.

> Anyway, it looks to me like we are almost done.  What are the chances of 
> getting this into 3.1 and 2.7?

You'd have to ask Guido.

-- 
Greg


From jh at improva.dk  Mon Apr 20 11:05:45 2009
From: jh at improva.dk (Jacob Holm)
Date: Mon, 20 Apr 2009 11:05:45 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EBCE5B.6040704@canterbury.ac.nz>
References: <49EAC4F9.90107@canterbury.ac.nz> <49EB2B34.9080607@improva.dk>
	<49EBCE5B.6040704@canterbury.ac.nz>
Message-ID: <49EC3AE9.7020403@improva.dk>

Greg Ewing wrote:
> Jacob Holm wrote:
> 
>> We should probably add
>>
>>   _x = sys.exc_info()
>>
>> as the first line in the "except BaseException as _e" block, and change
>>
>>   _y = _m(*sys.exc_info())
>>
>> to:
>>
>>   _y = _m(*_x)
> 
> I'm not sure if this is really necessary, since function
> calls save/restore the exception being handled, if I
> understand correctly. But I suppose it can't hurt to
> clarify this.
> 


For some reason I thought that the save/restore only applied to what 
would be raised by a bare raise and not to sys.exc_info().  I have just 
tested it with a small script and it appears I was wrong.  Sorry for the 
noise.


>> FWIW I still consider an expansion using functools.partial to be more 
>> readable
> 
> That's a matter of opinion -- I find it harder to follow
> because it separates the logic for deciding which method
> to call from the place where it's called. 


It may be slightly harder to follow, but it is easier to see that 
StopIteration is treated the same for the three operations, and it is a 
bit shorter and less deeply nested.

Anyway, that is only my opinion, and in this case it is yours that counts.


> Also I would
> rather express the expansion in terms of core language
> features as far as possible rather than relying on
> something imported from a library.


As shown in my response to Nick, the use of functools.partial is not 
actually needed for this style of expansion.  It is just as easy to 
collect the function and arguments in different variables and use 
_m(*_a) in the call.

I'll shut up about it now.  If you still don't like it that's fine.


Cheers
- Jacob


From ncoghlan at gmail.com  Mon Apr 20 14:34:49 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Mon, 20 Apr 2009 22:34:49 +1000
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EBC987.1040202@canterbury.ac.nz>
References: <49EAC4F9.90107@canterbury.ac.nz> <49EB2743.4010507@gmail.com>
	<49EBC987.1040202@canterbury.ac.nz>
Message-ID: <49EC6BE9.1050705@gmail.com>

Greg Ewing wrote:
> Nick Coghlan wrote:
> 
>> It is probably worth mentioning
>> explicitly that as with existing "return" statements in generators, the
>> StopIteration exception won't be seen in the generator's own frame.
> 
> I do mention that in a later section. Do you think it
> needs to be in both places?
> 

I must have just missed it in the latter section - no need to say it twice.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From gerald.britton at gmail.com  Mon Apr 20 15:01:21 2009
From: gerald.britton at gmail.com (Gerald Britton)
Date: Mon, 20 Apr 2009 09:01:21 -0400
Subject: [Python-ideas] Heap data type
In-Reply-To: <3CDA63554E1546DEA84A696B56BB4876@RaymondLaptop1>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com> 
	<20090418124357.GA8506@panix.com>
	<3CDA63554E1546DEA84A696B56BB4876@RaymondLaptop1>
Message-ID: <5d1a32000904200601q4e4cb41ewe89ea781d1bfabd2@mail.gmail.com>

Raymond, Facundo,

May I suggest that you implement an interface similar to the sort()
method for list objects:
=========================
sort([cmp[, key[, reverse]]])

"cmp specifies a custom comparison function of two arguments (list
items) which should return a negative, zero or positive number
depending on whether the first argument is considered smaller than,
equal to, or larger than the second argument: cmp=lambda x,y:
cmp(x.lower(), y.lower()). The default value is None.

"key specifies a function of one argument that is used to extract a
comparison key from each list element: key=str.lower. The default
value is None.

"reverse is a boolean value. If set to True, then the list elements
are sorted as if each comparison were reversed."
========================

You could add these option arguments to the Heap class constructor,
which would then be used to maintain the heap.  The _reverse_ argument
would cover the min/max heap question.  The other arguments would
allow the user to use the class for a priority queue of more complex
objects.  For example, the Heap could consist of object references
that would be compared using the function supplied in _cmp_ or _key_
arguments.
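
As a rough sketch of what such a constructor could look like on top of the
existing heapq functions (the Heap class and its interface here are
hypothetical, following the description above; reverse= is implemented by
negating the key, so this sketch assumes numeric key values):

```python
import heapq
import itertools

class Heap:
    """Hypothetical min-heap with sort()-style key= and reverse= options.

    reverse=True is handled by negating the computed key, so this sketch
    assumes keys are numeric.  A tie-breaking counter is stored alongside
    each item so the items themselves never need to be comparable.
    """
    def __init__(self, items=(), key=None, reverse=False):
        self._key = key if key is not None else (lambda x: x)
        self._sign = -1 if reverse else 1
        self._count = itertools.count()
        self._heap = []
        for item in items:
            self.push(item)

    def push(self, item):
        # Store (adjusted key, insertion order, item); heapq compares
        # the tuples left to right.
        entry = (self._sign * self._key(item), next(self._count), item)
        heapq.heappush(self._heap, entry)

    def pop(self):
        return heapq.heappop(self._heap)[2]

    def __len__(self):
        return len(self._heap)
```

For example, Heap(tasks, key=lambda t: t.priority) would give a priority
queue of arbitrary task objects without requiring them to define comparisons.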

On Sat, Apr 18, 2009 at 7:40 PM, Raymond Hettinger <python at rcn.com> wrote:
> Facundo, I would like to work with you on this.
> I've been the primary maintainer for heapq for a while
> and had already started working on something like this
> in response to repeated requested to support a key= function
> (like that for sorted/min/max).
>
>
> Raymond
>
>
>>> I think that changing "external list" for "internal attribute", and
>>> "bunch of functions " for "methods", it will leave the module easier
>>> to use, safer, and more object oriented.
>>
>> +1 -- I recently did a short presentation on Big O notation and Python
>> containers, and heapq looked remarkably ugly.
>
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
Gerald Britton


From tjreedy at udel.edu  Mon Apr 20 19:41:54 2009
From: tjreedy at udel.edu (Terry Reedy)
Date: Mon, 20 Apr 2009 13:41:54 -0400
Subject: [Python-ideas] Heap data type
In-Reply-To: <5d1a32000904200601q4e4cb41ewe89ea781d1bfabd2@mail.gmail.com>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
	<20090418124357.GA8506@panix.com>	<3CDA63554E1546DEA84A696B56BB4876@RaymondLaptop1>
	<5d1a32000904200601q4e4cb41ewe89ea781d1bfabd2@mail.gmail.com>
Message-ID: <gsic4v$sa8$1@ger.gmane.org>

Gerald Britton wrote:
> Raymond, Facundo,
> 
> May I suggest that you implement an interface similar to the sort()
> method for list objects:
> =========================
> sort([cmp[, key[, reverse]]])
> 
> "cmp specifies a custom comparison function of two arguments (list
> items) which should return a negative, zero or positive number
> depending on whether the first argument is considered smaller than,
> equal to, or larger than the second argument: cmp=lambda x,y:
> cmp(x.lower(), y.lower()). The default value is None.

Cmp(), .__cmp__, cmp=xxx, etc, are gone in Py3.

+something for the other two.




From gerald.britton at gmail.com  Mon Apr 20 21:16:58 2009
From: gerald.britton at gmail.com (Gerald Britton)
Date: Mon, 20 Apr 2009 15:16:58 -0400
Subject: [Python-ideas] Heap data type
In-Reply-To: <gsic4v$sa8$1@ger.gmane.org>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com> 
	<20090418124357.GA8506@panix.com>
	<3CDA63554E1546DEA84A696B56BB4876@RaymondLaptop1> 
	<5d1a32000904200601q4e4cb41ewe89ea781d1bfabd2@mail.gmail.com> 
	<gsic4v$sa8$1@ger.gmane.org>
Message-ID: <5d1a32000904201216h6fa71153keb84036ad68d5bf3@mail.gmail.com>

Is this only targeting 3.x?  I think it would be useful in 2.x as well.

On Mon, Apr 20, 2009 at 1:41 PM, Terry Reedy <tjreedy at udel.edu> wrote:
> Gerald Britton wrote:
>>
>> Raymond, Facundo,
>>
>> May I suggest that you implement an interface similar to the sort()
>> method for list objects:
>> =========================
>> sort([cmp[, key[, reverse]]])
>>
>> "cmp specifies a custom comparison function of two arguments (list
>> items) which should return a negative, zero or positive number
>> depending on whether the first argument is considered smaller than,
>> equal to, or larger than the second argument: cmp=lambda x,y:
>> cmp(x.lower(), y.lower()). The default value is None.
>
> Cmp(), .__cmp__, cmp=xxx, etc, are gone in Py3.
>
> +something for the other two.
>
>
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
Gerald Britton


From alexandre at peadrop.com  Tue Apr 21 04:45:27 2009
From: alexandre at peadrop.com (Alexandre Vassalotti)
Date: Mon, 20 Apr 2009 22:45:27 -0400
Subject: [Python-ideas] Add setpriority / getpriority to os module.
In-Reply-To: <grgtlk$pdt$1@ger.gmane.org>
References: <grgtlk$pdt$1@ger.gmane.org>
Message-ID: <acd65fa20904201945q145a51b3p7c4adcd3668e6215@mail.gmail.com>

On Tue, Apr 7, 2009 at 9:12 PM, Christian Heimes <lists at cheimes.de> wrote:
> Hello,
>
> I would like to add straight forward wrapper for the setpriority and
> getpriority functions to posixmodule.c for the os module.

Wouldn't it be better to add these functions to the 'resource' module?

Cheers,
-- Alexandre


From cmjohnson.mailinglist at gmail.com  Tue Apr 21 05:20:04 2009
From: cmjohnson.mailinglist at gmail.com (Carl Johnson)
Date: Mon, 20 Apr 2009 17:20:04 -1000
Subject: [Python-ideas] Needing help to change the grammar
In-Reply-To: <87prf92f9c.fsf@xemacs.org>
References: <thiagoharry.1239415106.squirrel@tern.riseup.net>
	<b8e622740904102320na23074fod4317de771d80254@mail.gmail.com>
	<thiagoharry.1239563362.squirrel@swift.riseup.net>
	<grtj13$ma0$1@ger.gmane.org> <p04330107c6080d82eb60@192.168.123.162>
	<49E2670E.3070705@g.nevcal.com> <49E9DB31.9020202@gmail.com>
	<20090418173157.570a6a2f@o> <gsdbm4$p19$1@ger.gmane.org>
	<87prf92f9c.fsf@xemacs.org>
Message-ID: <3bdda690904202020o37944d00mefac62fd281e7b22@mail.gmail.com>

Stephen J. Turnbull wrote:
> Terry Reedy writes:
>  > spir wrote:
>  > > Why not simply have a preprocessing func that translates back to
>  > > standard/english python using a simple dict?
>
> Because it's just not that simple, of course.  You need to parse far
> enough to recognize strings, for example, and leave them alone.  Since
> the parser doesn't detect unbalanced quotation marks in comments, you
> need to parse those too.  You must parse import statements, because
> the file name might happen to be the equivalent of a keyword, and
> *not* translate those.  There may be other issues, as well.

Would it be possible to use 2to3 for this? It wouldn't be perfect but
it might be easier to scale a preprocessor to dozens of languages
without freezing those users out of the ability to use standard
English Python modules.

Also, does anyone know if ChinesePython [1] ever caught on? (Hey,
there's one case where you do NOT need to worry about keyword
conflicts!) Looking at the homepage, it appears stuck at Python 2.1.
But I don't know much Chinese, so I could be wrong.

[1]: http://www.chinesepython.org/cgi_bin/cgb.cgi/english/english.html

internationally-yrs,

-- Carl


From stefan_ml at behnel.de  Tue Apr 21 07:13:02 2009
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Tue, 21 Apr 2009 07:13:02 +0200
Subject: [Python-ideas] Heap data type
In-Reply-To: <5d1a32000904201216h6fa71153keb84036ad68d5bf3@mail.gmail.com>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
	<20090418124357.GA8506@panix.com>	<3CDA63554E1546DEA84A696B56BB4876@RaymondLaptop1>
	<5d1a32000904200601q4e4cb41ewe89ea781d1bfabd2@mail.gmail.com>
	<gsic4v$sa8$1@ger.gmane.org>
	<5d1a32000904201216h6fa71153keb84036ad68d5bf3@mail.gmail.com>
Message-ID: <gsjkkv$2t0$1@ger.gmane.org>

Gerald Britton wrote:
> 
> On Mon, Apr 20, 2009 at 1:41 PM, Terry Reedy wrote:
>> Gerald Britton wrote:
>>> May I suggest that you implement an interface similar to the sort()
>>> method for list objects:
>>> =========================
>>> sort([cmp[, key[, reverse]]])
>>>
>> Cmp(), .__cmp__, cmp=xxx, etc, are gone in Py3.
>>
> Is this only targeting 3.x?  I think it would be useful in 2.x as well.

1) I think the usual approach is to implement it for Py3, then backport it.

2) Since we already know that code that uses cmp=xyz is not Py3 compatible,
there is little point in providing this parameter for a new data type, only
for that new code to end up broken in the future.

Stefan



From denis.spir at free.fr  Tue Apr 21 11:58:29 2009
From: denis.spir at free.fr (spir)
Date: Tue, 21 Apr 2009 11:58:29 +0200
Subject: [Python-ideas] from __future__ import range
Message-ID: <20090421115829.48bba8ca@o>

Hello,

Below is a suggestion by Paul MacGuire on the python-tutor list. I found it so sensible that I asked him to propose it on python-ideas. His answer was: "Please feel free to post this on python-ideas as my proxy. I'm barely keeping up with my other pythonic duties as it is." So here it is.

====================================================
So I'll propose some usages for those who use range:

1. For Py2-Py3 range-xrange compatibility, add this code to the top of your
Python scripts:

try:
    range = xrange
except NameError:
    pass

In this way, your code under Py2 will use xrange whenever you call range,
and you will adopt the future-compatible range behavior.  (Hmm, maybe this
would have been a good option to add to the "future" module...)

2. In all cases where you really must use the result returned from range as
a list, ALWAYS write this as "list(range(n))", as in:

nonnegative_numbers_up_to_but_not_including_10 = list(range(10))
even_numbers_up_to_but_not_including_20 = list(range(0,20,2))

Using the "list(range(n))" form when you actually need a list will prepare
you for the change in idiom when you upgrade to Py3, and is fully Py2
compatible (even if you don't first bind xrange to range as in suggestion 1
- it simply copies the original list).  It also makes it very explicit that
you REALLY WANT THE RANGE AS A LIST, and not just as something for counting
up to n.
====================================================

I think that
   from __future__ import range
used together with
   list(range(...))
would:
-1- Be a good programming practice,
-2- Help & remove one of the 2to3 issues that cannot be solved automatically.

Denis
------
la vita e estrany


From ncoghlan at gmail.com  Tue Apr 21 14:13:40 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Tue, 21 Apr 2009 22:13:40 +1000
Subject: [Python-ideas] from __future__ import range
In-Reply-To: <20090421115829.48bba8ca@o>
References: <20090421115829.48bba8ca@o>
Message-ID: <49EDB874.9070707@gmail.com>

spir wrote:
> I think that
>    from __future__ import range
> used together with
>    list(range(...))
> would:
> -1- Be a good programming practice,
> -2- Help & remove one of the 2to3 issues that cannot be solved automatically.

__future__ is really only for things that the compiler needs to know about.

However, adding a "range = xrange" line to Lib\future_builtins.py in 2.7
probably wouldn't be a bad idea.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From gerald.britton at gmail.com  Tue Apr 21 15:44:22 2009
From: gerald.britton at gmail.com (Gerald Britton)
Date: Tue, 21 Apr 2009 09:44:22 -0400
Subject: [Python-ideas] Heap data type
In-Reply-To: <gsjkkv$2t0$1@ger.gmane.org>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com> 
	<20090418124357.GA8506@panix.com>
	<3CDA63554E1546DEA84A696B56BB4876@RaymondLaptop1> 
	<5d1a32000904200601q4e4cb41ewe89ea781d1bfabd2@mail.gmail.com> 
	<gsic4v$sa8$1@ger.gmane.org>
	<5d1a32000904201216h6fa71153keb84036ad68d5bf3@mail.gmail.com> 
	<gsjkkv$2t0$1@ger.gmane.org>
Message-ID: <5d1a32000904210644g2d41581dvcbdc3f6fc29646bf@mail.gmail.com>

Ok thanks.  Any idea what the uptake is on Py3?  I know that the
projects I work on are not even talking about moving in that direction
due to the significant migration effort.  Also, I hear that
performance is an issue, though that might improve over the longer
term.

On Tue, Apr 21, 2009 at 1:13 AM, Stefan Behnel <stefan_ml at behnel.de> wrote:
> Gerald Britton wrote:
>>
>> On Mon, Apr 20, 2009 at 1:41 PM, Terry Reedy wrote:
>>> Gerald Britton wrote:
>>>> May I suggest that you implement an interface similar to the sort()
>>>> method for list objects:
>>>> =========================
>>>> sort([cmp[, key[, reverse]]])
>>>>
>>> Cmp(), .__cmp__, cmp=xxx, etc, are gone in Py3.
>>>
>> Is this only targeting 3.x?  I think it would be useful in 2.x as well.
>
> 1) I think the usual approach is to implement it for Py3, then backport it.
>
> 2) Since we already know that code that uses cmp=xyz is not Py3 compatible,
> there is little use in providing this parameter for a new data type, so
> that new code ends up being broken in the future.
>
> Stefan
>
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
Gerald Britton


From josiah.carlson at gmail.com  Tue Apr 21 18:08:53 2009
From: josiah.carlson at gmail.com (Josiah Carlson)
Date: Tue, 21 Apr 2009 09:08:53 -0700
Subject: [Python-ideas] Heap data type
In-Reply-To: <eae285400904180749t512b6cb0g144053b4004a2f@mail.gmail.com>
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
	<20090418124357.GA8506@panix.com>
	<eae285400904180749t512b6cb0g144053b4004a2f@mail.gmail.com>
Message-ID: <e6511dbf0904210908g1c999e75p14fb47d5ed985b83@mail.gmail.com>

On Sat, Apr 18, 2009 at 7:49 AM, Daniel Stutzbach
<daniel at stutzbachenterprises.com> wrote:
> On Sat, Apr 18, 2009 at 7:43 AM, Aahz <aahz at pythoncraft.com> wrote:
>>
>> +1 -- I recently did a short presentation on Big O notation and Python
>> containers, and heapq looked remarkably ugly.
>
> Another issue with heapq is that it does not support modifying the priority
> of an existing element (usually called the "decrease_key" operation in
> textbooks).  The hard part of implementing decrease_key is that somehow the
> object in the heap needs to know its current position, else you have to do
> an expensive linear search to find the object's current position.
>
> I've implemented a couple of heaps for Python over the years:
>
> 1.
> http://svn.python.org/view/sandbox/trunk/collections/pairing_heap.py?view=markup&pathrev=40887
> (checked into the Sandbox... 5 years ago! I think I also have a C
> reimplementation somewhere)
>
> If you don't care about altering the priority, then the interface is more or
> less like Facundo described.
>
> To keep track of the positions, insert operations return a wrapper around
> each object.  The user needs to keep track of the wrapper and pass it to the
> adjust_key() method if they want to change the priority of the object
> later.
>
>
> 2. HeapDict.  http://pypi.python.org/pypi/HeapDict
>
> Looks, acts, and quacks like a dict/MutableMapping, except popitem() returns
> the item with the lowest value instead of a random item.  Also, there's a
> peekitem() method to examine that item without removing it.  Since the
> interface is familiar, it's super-easy to use without having to look up the
> names of the methods whenever you need to use a heap.
>
> The downside is that you can't add the same element into the heap twice
> (just like you can't have the same key in a dictionary twice).  Personally
> I've only inserted the same element more than once when doing toy examples
> with integers, so for me at least that's no great loss. ;-)

I've also got a pair heap implementation:
http://mail.python.org/pipermail/python-dev/2006-November/069845.html

There's also a proposed update of sched.py in the bugtracker (as part
of some updates to asyncore to add a scheduler) that offers a version
of decreasekey, as well as "remove" functionality.  I describe some of
the trade-offs in implementing a heap with such features (memory use,
O-notation, etc.) in a blog post last summer:
http://chouyu-31.livejournal.com/316112.html .  One of the ways we get
around the "key must be unique" limitation is that we don't use a
dictionary to reference the keys; when you put an item into the heap,
you get an opaque reference, which you can then cancel, "reschedule",
etc.
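
The opaque-reference approach described above can be sketched with lazy
deletion on top of heapq.  The class and method names here are purely
illustrative (they are not the actual sched.py patch): push() hands back the
heap entry itself as the handle, cancel() marks it dead in place, and pop()
skips dead entries.

```python
import heapq
import itertools

class PriorityQueue:
    """Sketch of a heap with remove/reschedule via opaque handles."""
    _REMOVED = object()          # sentinel marking a cancelled entry

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()   # tie-breaker: FIFO for equal priorities

    def push(self, priority, item):
        entry = [priority, next(self._counter), item]
        heapq.heappush(self._heap, entry)
        return entry             # opaque handle for cancel()/reschedule()

    def cancel(self, entry):
        entry[2] = self._REMOVED  # lazy deletion: entry is skipped on pop

    def reschedule(self, entry, priority):
        # "decrease_key" without knowing the position: cancel and re-push.
        item = entry[2]
        self.cancel(entry)
        return self.push(priority, item)

    def pop(self):
        while self._heap:
            _, _, item = heapq.heappop(self._heap)
            if item is not self._REMOVED:
                return item
        raise KeyError('pop from empty queue')
```

The trade-off is the one described in the blog post: cancelled entries cost
memory until they bubble out of the heap, in exchange for O(log n) updates
with no auxiliary index.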

The sched module is heavily biased towards event scheduling, so adding
a collections.priority_queue or whatever is just fine with me.

 - Josiah


From larry at hastings.org  Tue Apr 21 23:39:37 2009
From: larry at hastings.org (Larry Hastings)
Date: Tue, 21 Apr 2009 14:39:37 -0700
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
Message-ID: <49EE3D19.7000603@hastings.org>


I was excited to see PYTHONUSERBASE added in 2.6/3.0.  But I became disappointed when I learned it only allows you to specify one directory.  I have use cases where I'd want more than one.

Also, PYTHONUSERBASE doesn't support all the directories that site.addsitepackages does.  PYTHONUSERBASE examines exactly one directory; on non-Windows platforms, it's:

    {prefix}/lib/pythonX.X/site-packages

However, for each "prefix" directory Python knows about, site.addsitepackages examines these two on Linux:

    {prefix}/lib/pythonX.X/site-packages
    {prefix}/lib/site-python

(The list is even longer on OS X.)


I therefore propose a new environment variable called PYTHONUSERBASES.  PYTHONUSERBASES is a list of directories separated by whatever directory separator characters PYTHONPATH permits on the local platform.  At startup, after processing PYTHONUSERBASE but before calling site.addsitepackages, Python iterates down the list of directories listed in PYTHONUSERBASES and causes them to be processed by site.addsitepackages before the default prefix directories.  (Probably by adding them to sys.prefixes.)  PYTHONUSERBASES obeys -s, PYTHONNOUSERSITE, and the rules about effective user/group IDs, just as PYTHONUSERBASE does.  The default value of PYTHONUSERBASES is the empty string; this would add no new directories.

It seems to me that PYTHONUSERBASES would turn "virtualenv" into a twenty-line shell script.  Add the appropriate directories to LD_LIBRARY_PATH and PYTHONUSERBASES and you'd be off to the races.  You'd have to be careful to manually specify --prefix when installing new software, which I'm guessing virtualenv obviates.  But apart from that I think it would work fine.  Or am I missing something important?

I would be happy--thrilled, even!--to write this up as a patch for Python 3.1 before the feature-freeze.


/larry/



From python at rcn.com  Wed Apr 22 00:02:20 2009
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 21 Apr 2009 15:02:20 -0700
Subject: [Python-ideas] An idea for a new pickling tool
Message-ID: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>

Motivation
----------

Python's pickles use a custom format that has evolved over time
but they have five significant disadvantages:

    * it has lost its human readability and editability
    * it doesn't compress well
    * it isn't interoperable with other languages
    * it doesn't have the ability to enforce a schema
    * it is a major security risk for untrusted inputs
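
(The last point is concrete: a pickle stream is effectively a small program
for the pickle virtual machine, and a handcrafted stream can invoke arbitrary
callables at load time.  A minimal, harmless demonstration, using a protocol-0
stream written by hand rather than produced by pickle.dumps():)

```python
import pickle

# Opcodes: import builtins.eval, push the string '1+1', build a 1-tuple,
# and REDUCE -- i.e. call eval('1+1') -- before STOP.  Loading the pickle
# executes code chosen by whoever wrote the bytes.
payload = b"cbuiltins\neval\n(S'1+1'\ntR."
print(pickle.loads(payload))   # 2 -- eval() ran during load
```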


New idea
--------

Develop a solution using a mix of PyYAML, a python coded version of
Kwalify, optional compression using bz2, gzip, or zlib, and pretty
printing using pygments.

YAML ( http://yaml.org/spec/1.2/ ) is a language independent standard
for data serialization.

PyYAML ( http://pyyaml.org/wiki/PyYAML ) is a full implementation of
the YAML standard.  It uses the YAML's application-specific tags and
Python's own copy/reduce logic to provide the same power as pickle itself.

Kwalify ( http://www.kuwata-lab.com/kwalify/ruby/users-guide.01.html )
is a schema validator written in Ruby and Java.  It defines a
YAML/JSON based schema definition for enforcing tight constraints
on incoming data.

The bz2, gzip, and zlib compression libraries are already built into
the language.

Pygments ( http://pygments.org/ ) is a Python-based syntax highlighter
with builtin support for YAML.


Advantages
----------

* The format is simple enough to hand edit or to have lightweight
  applications emit valid pickles.  For example:

      print('Todo: [go to bank, pick up food, write code]')   # valid pickle

* To date, efforts to make pickles smaller have focused on creating new
  codes for every data type.  Instead, we can use the simple text formatting
  of YAML and let general purpose data compression utilities do their job
  (letting the user control the trade-offs between speed, space, and human
  readability):

      yaml.dump(data, compressor=None)  # fast, human readable, no compression
      yaml.dump(data, compressor=bz2)   # slowest, but best compression
      yaml.dump(data, compressor=zlib)  # medium speed and medium compression

* The current pickle tools make it easy to exchange object trees between
  two Python processes.  The new tool would make it equally easy 
  to exchange object trees between processes running any of Python, Ruby, 
  Java, C/C++, Perl, C#, PHP, OCaml, Javascript, ActionScript, and Haskell.

* Ability to use a schema for enforcing a given object model and allowing
  full security.  Which would you rather run on untrusted data:

      data = yaml.load(myfile, schema=ListOfStrings)

  or

      data = pickle.load(myfile)

* Specification of a schema using YAML itself

  ListOfStrings (a schema written in yaml)
  ........................................
  type:   seq
  sequence:
    - type:   str

  Sample of valid input
  .....................
  - foo
  - bar
  - baz

  Note, schemas can be defined for very complex, nested object models and
  allow many kinds of constraints (unique items, enumerated list of allowable
  values, min/max allowable ranges for values, data type, maximum length,
  and names of regular Python classes that can be constructed).

* YAML is a superset of JSON, so the schema validation also works equally
  well with JSON encoded data.

What needs to be done
---------------------

* Combine the tools for a single, clean interface to C speed parsing
  of a data serialization standard, with optional compression, schema
  validation, and pretty printing.


From jnoller at gmail.com  Wed Apr 22 00:10:24 2009
From: jnoller at gmail.com (Jesse Noller)
Date: Tue, 21 Apr 2009 18:10:24 -0400
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
Message-ID: <B565F7F4-3EA1-48E2-92C1-F1DB99CF2A9E@gmail.com>



On Apr 21, 2009, at 6:02 PM, "Raymond Hettinger" <python at rcn.com> wrote:

> Motivation
> ----------
>
> Python's pickles use a custom format that has evolved over time
> but they have five significant disadvantages:
>
>   * it has lost its human readability and editability
>   * it doesn't compress well
>   * it isn't interoperable with other languages
>   * it doesn't have the ability to enforce a schema
>   * it is a major security risk for untrusted inputs
>
>
> New idea
> --------
>
> Develop a solution using a mix of PyYAML, a python coded version of
> Kwalify, optional compression using bz2, gzip, or zlib, and pretty
> printing using pygments.
>
> YAML ( http://yaml.org/spec/1.2/ ) is a language independent standard
> for data serialization.
>
> PyYAML ( http://pyyaml.org/wiki/PyYAML ) is a full implementation of
> the YAML standard.  It uses the YAML's application-specific tags and
> Python's own copy/reduce logic to provide the same power as pickle itself.
>
> Kwalify ( http://www.kuwata-lab.com/kwalify/ruby/users-guide.01.html )
> is a schema validator written in Ruby and Java.  It defines a
> YAML/JSON based schema definition for enforcing tight constraints
> on incoming data.
>
> The bz2, gzip, and zlib compression libraries are already built into
> the language.
>
> Pygments ( http://pygments.org/ ) is a Python-based syntax highlighter
> with builtin support for YAML.
>
>
> Advantages
> ----------
>
> * The format is simple enough to hand edit or to have lightweight
> applications emit valid pickles.  For example:
>
>     print('Todo: [go to bank, pick up food, write code]')   # valid pickle
>
> * To date, efforts to make pickles smaller have focused on creating new
> codes for every data type.  Instead, we can use the simple text formatting
> of YAML and let general purpose data compression utilities do their job
> (letting the user control the trade-offs between speed, space, and human
> readability):
>
>     yaml.dump(data, compressor=None)  # fast, human readable, no compression
>     yaml.dump(data, compressor=bz2)   # slowest, but best compression
>     yaml.dump(data, compressor=zlib)  # medium speed and medium compression
>
> * The current pickle tools make it easy to exchange object trees between
> two Python processes.  The new tool would make it equally easy to
> exchange object trees between processes running any of Python, Ruby,
> Java, C/C++, Perl, C#, PHP, OCaml, Javascript, ActionScript, and Haskell.
>
> * Ability to use a schema for enforcing a given object model and allowing
> full security.  Which would you rather run on untrusted data:
>
>     data = yaml.load(myfile, schema=ListOfStrings)
>
> or
>
>     data = pickle.load(myfile)
>
> * Specification of a schema using YAML itself
>
> ListOfStrings (a schema written in yaml)
> ........................................
> type:   seq
> sequence:
>   - type:   str
>
> Sample of valid input
> .....................
> - foo
> - bar
> - baz
>
> Note, schemas can be defined for very complex, nested object models and
> allow many kinds of constraints (unique items, enumerated list of allowable
> values, min/max allowable ranges for values, data type, maximum length,
> and names of regular Python classes that can be constructed).
>
> * YAML is a superset of JSON, so the schema validation also works equally
> well with JSON encoded data.
>
> What needs to be done
> ---------------------
>
> * Combine the tools for a single, clean interface to C speed parsing
> of a data serialization standard, with optional compression, schema
> validation, and pretty printing.
>

A huge +1 from me, I've used YAML quite a bit, and as a cross language  
communications format it's quite nice.

Jesse


From guido at python.org  Wed Apr 22 00:26:24 2009
From: guido at python.org (Guido van Rossum)
Date: Tue, 21 Apr 2009 15:26:24 -0700
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
Message-ID: <ca471dc20904211526s5174f6aep61403172081f0dfc@mail.gmail.com>

On Tue, Apr 21, 2009 at 3:02 PM, Raymond Hettinger <python at rcn.com> wrote:
> Motivation
> ----------
>
> Python's pickles use a custom format that has evolved over time
> but they have five significant disadvantages:
>
>   * it has lost its human readability and editability
>   * it doesn't compress well

Really? Or do you mean "it doesn't have built-in compression support"
? I don't expect that running bzip2 over a pickle would produce
unsatisfactory results, and the API supports reading from and writing
to streams.

Or do you just mean "the representation is too repetitive and bulky" ?
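
(That expectation is easy to check with nothing but the stdlib; this is just
an illustration of layering compression over the existing API, with made-up
sample data:)

```python
import bz2
import pickle

# Compressing a pickle after the fact needs no support from pickle itself.
data = {"todo": ["go to bank", "pick up food", "write code"] * 1000}
raw = pickle.dumps(data)
packed = bz2.compress(raw)

assert pickle.loads(bz2.decompress(packed)) == data
assert len(packed) < len(raw)   # repetitive structure compresses well
```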

>   * it isn't interoperable with other languages
>   * it doesn't have the ability to enforce a schema
>   * it is a major security risk for untrusted inputs

I agree that pickle doesn't satisfy these. But then again, #1, #3 and
#4 were never part of its design goals. #5 is indeed a problem.

But I think there are existing solutions already. For example, I'd say
that XML+bzip2 satisfies all these already. If you want something a
little less verbose, I recommend looking at Google Protocol Buffers
(http://code.google.com/apis/protocolbuffers/), which have both a
compact binary format (though not compressed -- but you can easily
layer that) and a verbose text format. There's a nice Python-specific
tutorial (http://code.google.com/apis/protocolbuffers/docs/pythontutorial.html)
that also explains why you would use this.

--Guido

> New idea
> --------
>
> Develop a solution using a mix of PyYAML, a python coded version of
> Kwalify, optional compression using bz2, gzip, or zlib, and pretty
> printing using pygments.
>
> YAML ( http://yaml.org/spec/1.2/ ) is a language independent standard
> for data serialization.
>
> PyYAML ( http://pyyaml.org/wiki/PyYAML ) is a full implementation of
> the YAML standard.  It uses the YAML's application-specific tags and
> Python's own copy/reduce logic to provide the same power as pickle itself.
>
> Kwalify ( http://www.kuwata-lab.com/kwalify/ruby/users-guide.01.html )
> is a schema validator written in Ruby and Java.  It defines a
> YAML/JSON based schema definition for enforcing tight constraints
> on incoming data.
>
> The bz2, gzip, and zlib compression libraries are already built into
> the language.
>
> Pygments ( http://pygments.org/ ) is a Python-based syntax highlighter
> with builtin support for YAML.
>
>
> Advantages
> ----------
>
> * The format is simple enough to hand edit or to have lightweight
>   applications emit valid pickles.  For example:
>
>      print('Todo: [go to bank, pick up food, write code]')   # valid pickle
>
> * To date, efforts to make pickles smaller have focused on creating new
>   codes for every data type.  Instead, we can use the simple text formatting
>   of YAML and let general purpose data compression utilities do their job
>   (letting the user control the trade-offs between speed, space, and human
>   readability):
>
>      yaml.dump(data, compressor=None)  # fast, human readable, no compression
>      yaml.dump(data, compressor=bz2)   # slowest, but best compression
>      yaml.dump(data, compressor=zlib)  # medium speed and medium compression
>
> * The current pickle tools make it easy to exchange object trees between
>   two Python processes.  The new tool would make it equally easy to exchange
>   object trees between processes running any of Python, Ruby, Java, C/C++,
>   Perl, C#, PHP, OCaml, Javascript, ActionScript, and Haskell.
>
> * Ability to use a schema for enforcing a given object model and allowing
>   full security.  Which would you rather run on untrusted data:
>
>      data = yaml.load(myfile, schema=ListOfStrings)
>
>   or
>
>      data = pickle.load(myfile)
>
> * Specification of a schema using YAML itself
>
>   ListOfStrings (a schema written in yaml)
>   ........................................
>   type:   seq
>   sequence:
>     - type:   str
>
>   Sample of valid input
>   .....................
>   - foo
>   - bar
>   - baz
>
>   Note, schemas can be defined for very complex, nested object models and
>   allow many kinds of constraints (unique items, enumerated list of allowable
>   values, min/max allowable ranges for values, data type, maximum length,
>   and names of regular Python classes that can be constructed).
>
> * YAML is a superset of JSON, so the schema validation also works equally
>   well with JSON encoded data.
>
> What needs to be done
> ---------------------
>
> * Combine the tools for a single, clean interface to C speed parsing
>   of a data serialization standard, with optional compression, schema
>   validation, and pretty printing.
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From jnoller at gmail.com  Wed Apr 22 02:41:21 2009
From: jnoller at gmail.com (Jesse Noller)
Date: Tue, 21 Apr 2009 20:41:21 -0400
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
Message-ID: <4222a8490904211741s5f31c733n55f2fda209f294e7@mail.gmail.com>

On Tue, Apr 21, 2009 at 6:02 PM, Raymond Hettinger <python at rcn.com> wrote:
> Motivation
> ----------
>
> Python's pickles use a custom format that has evolved over time
> but they have five significant disadvantages:
>
>   * it has lost its human readability and editability
>   * it doesn't compress well
>   * it isn't interoperable with other languages
>   * it doesn't have the ability to enforce a schema
>   * it is a major security risk for untrusted inputs
>
>
> New idea
> --------
>
> Develop a solution using a mix of PyYAML, a python coded version of
> Kwalify, optional compression using bz2, gzip, or zlib, and pretty
> printing using pygments.
>
> YAML ( http://yaml.org/spec/1.2/ ) is a language independent standard
> for data serialization.
>
> PyYAML ( http://pyyaml.org/wiki/PyYAML ) is a full implementation of
> the YAML standard.  It uses YAML's application-specific tags and
> Python's own copy/reduce logic to provide the same power as pickle itself.
>
> Kwalify ( http://www.kuwata-lab.com/kwalify/ruby/users-guide.01.html )
> is a schema validator written in Ruby and Java.  It defines a
> YAML/JSON based schema definition for enforcing tight constraints
> on incoming data.
>
> The bz2, gzip, and zlib compression libraries are already built into
> the language.
>
> Pygments ( http://pygments.org/ ) is a Python-based syntax highlighter
> with built-in support for YAML.
>
>
> Advantages
> ----------
>
> * The format is simple enough to hand edit or to have lightweight
>   applications emit valid pickles.  For example:
>
>     print('Todo: [go to bank, pick up food, write code]')   # valid pickle
>
> * To date, efforts to make pickles smaller have focused on creating new
>   codes for every data type.  Instead, we can use the simple text formatting
>   of YAML and let general purpose data compression utilities do their job
>   (letting the user control the trade-offs between speed, space, and human
>   readability):
>
>     yaml.dump(data, compressor=None)  # fast, human readable, no compression
>     yaml.dump(data, compressor=bz2)   # slowest, but best compression
>     yaml.dump(data, compressor=zlib)  # medium speed and medium compression
>
> * The current pickle tools make it easy to exchange object trees between
>   two Python processes.  The new tool would make it equally easy to exchange
>   object trees between processes running any of Python, Ruby, Java, C/C++,
>   Perl, C#, PHP, OCaml, Javascript, ActionScript, and Haskell.
>
> * Ability to use a schema for enforcing a given object model and allowing
>   full security.  Which would you rather run on untrusted data:
>
>     data = yaml.load(myfile, schema=ListOfStrings)
>
>   or
>
>     data = pickle.load(myfile)
>
> * Specification of a schema using YAML itself
>
>   ListOfStrings (a schema written in yaml)
>   ........................................
>   type:   seq
>   sequence:
>     - type:   str
>
>   Sample of valid input
>   .....................
>   - foo
>   - bar
>   - baz
>
>   Note, schemas can be defined for very complex, nested object models and
>   allow many kinds of constraints (unique items, enumerated list of allowable
>   values, min/max allowable ranges for values, data type, maximum length,
>   and names of regular Python classes that can be constructed).
>
> * YAML is a superset of JSON, so the schema validation also works equally
>   well with JSON encoded data.
>
> What needs to be done
> ---------------------
>
> * Combine the tools for a single, clean interface to C speed parsing
>   of a data serialization standard, with optional compression, schema
>   validation, and pretty printing.

Just to add to this, I remembered someone recently did a simple
benchmark of thrift/JSON/YAML/Protocol Buffers, here are the links:

http://www.bouncybouncy.net/ramblings/posts/thrift_and_protocol_buffers/
http://www.bouncybouncy.net/ramblings/posts/more_on_json_vs_thrift_and_protocol_buffers/
http://www.bouncybouncy.net/ramblings/posts/json_vs_thrift_and_protocol_buffers_round_2/

Without digging into the numbers too much, it's worth noting that
PyYAML is written in pure python but also has Libyaml
(http://pyyaml.org/wiki/LibYAML) bindings for speed. When I get a
chance, I can run the same test(s) with both the pure-python
implementation and the libyaml one as well as see how much the speedup
is.

We would definitely need a c-based parser/emitter for something like
this to really fly.

jesse



From python at rcn.com  Wed Apr 22 02:56:25 2009
From: python at rcn.com (Raymond Hettinger)
Date: Tue, 21 Apr 2009 17:56:25 -0700
Subject: [Python-ideas] An idea for a new pickling tool
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<ca471dc20904211526s5174f6aep61403172081f0dfc@mail.gmail.com>
Message-ID: <635AD1835B614CD1A77DD758B9B7BBBB@RaymondLaptop1>


>> Python's pickles use a custom format that has evolved over time
>> but they have five significant disadvantages:
>>
>> * it has lost its human readability and editability
>> * it doesn't compress well
>
> Really? Or do you mean "it doesn't have built-in compression support"
> ? I don't expect that running bzip2 over a pickle would produce
> unsatisfactory results, and the API supports reading from and writing
> to streams.
>
> Or do you just mean "the representation is too repetitive and bulky" ?
>
>> * it isn't interoperable with other languages
>> * it doesn't have the ability to enforce a schema
>> * it is a major security risk for untrusted inputs
>
> I agree that pickle doesn't satisfy these. But then again, #1, #3 and
> #4 were never part of its design goals. #5 is indeed a problem.

Pickle does well with its original design goal.

It would be really nice if we also provided a built-in solution that incorporated
the other design goals listed above and adopted a format based on a published
standard.
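
As a side note, the compression layering Guido describes above really is
trivial to do with the standard library today (a minimal sketch, using some
repetitive throwaway data):

```python
import bz2
import pickle

# Layer general-purpose compression over pickle.  Repetitive data
# compresses well even though pickle's own encoding is verbose.
data = [{"name": "donor-%d" % i, "blood": "AB"} for i in range(1000)]

raw = pickle.dumps(data)
packed = bz2.compress(raw)
restored = pickle.loads(bz2.decompress(packed))

assert restored == data
assert len(packed) < len(raw)  # compression pays off on repetitive data
```

The open questions in this thread are less about whether compression can be
layered on and more about readability, interoperability, and safety.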


>But I think there are existing solutions already. For example, I'd say
> that XML+bzip2 satisfies all these already.

No doubt that would work.  There is, however, a pretty high barrier to bringing
together all the right tools (an XML pickler/unpickler providing the equivalent of
pickle.dumps/pickle.loads, a fast XML parser, an XML schema validator, an XML
pretty printer, and data compression).  Even with the right tools brought
together under a single convenient API, it wouldn't be any fun to write the
DTDs for the validator.  I think the barrier is so high that in practice these tools
will rarely be brought together for this purpose; instead, efforts are focused on
ad hoc approaches geared to a particular application.


> If you want something a
> little less verbose, I recommend looking at Google Protocol Buffers
> (http://code.google.com/apis/protocolbuffers/), 

That is a nice package.  It seems to have shared several of the goals
listed above (interlanguage data exchange, use of schemas, and security).

I scanned through all of the docs but didn't see a pickle.dumps() style API;
instead, it seems to be focused on making the user build up parts of a non-subclassable
custom object that knows how to serialize itself.  In contrast, PyYAML rides on our
existing __reduce__ logic to fully emulate what pickle can do (meaning that
most apps can add serialization with just a single line).

It doesn't look like the actual data formatting is based on a published standard
so it requires the Google tool on each end (with support offered for Python, Java, and C++).

Hope you're not negative on the idea of a compressing, validating, pretty printing, yaml pickler.
Without your support, the idea is dead before it can get started.

FWIW, I found some of the Kwalify examples to be compelling.  Am attaching one
for you guys to look at.  I don't think an equivalent XML solution would come
together as effortlessly or as beautifully.  From a Python point of view, the example boils
down to:  yaml.dump(donors, file) and donors = yaml.load(file, schema=donor_schema).
No extra work is required.  Human readability/editability comes for free, inter-language
operability comes for free, and so do the security guarantees.

I think it would be great if we took a batteries included approach and offered
something like this as part of the standard library.


Raymond


----------- donor_schema ----------
type:      seq
sequence:
  -
    type:      map
    mapping:
     "name":
        type:       str
        required:   yes
     "email":
        type:       str
        required:   yes
        pattern:    /@/
     "password":
        type:       text
        length:     { max: 16, min: 8 }
     "age":
        type:       int
        range:      { max: 30, min: 18 }
        # or assert: 18 <= val && val <= 30
     "blood":
        type:       str
        enum:       [A, B, O, AB]
     "birth":
        type:       date
     "deleted":
        type:       bool
        default:    false
----------- valid document ------------
- name:     foo
  email:    foo at mail.com
  password: xxx123456
  age:      20
  blood:    A
  birth:    1985-01-01
- name:     bar
  email:    bar at mail.net
  age:      25
  blood:    AB
  birth:    1980-01-01



From santagada at gmail.com  Wed Apr 22 03:02:41 2009
From: santagada at gmail.com (Leonardo Santagada)
Date: Tue, 21 Apr 2009 22:02:41 -0300
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
Message-ID: <3D3ED33F-3BC1-4162-A0AA-81FB934F7061@gmail.com>


For starters I want to say that this is a great idea overall but why  
not:

On Apr 21, 2009, at 7:02 PM, Raymond Hettinger wrote:

> * YAML is a superset of JSON, so the schema validation also works  
> equally
> well with JSON encoded data.


With the already existing fast json serializers (simplejson) why not  
use JSON instead of YAML? I like the YAML language, but JSON  
serializers exists for many more languages and are usually much more  
used than YAML ones.


--
Leonardo Santagada
santagada at gmail.com





From ben+python at benfinney.id.au  Wed Apr 22 03:11:03 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Wed, 22 Apr 2009 11:11:03 +1000
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
References: <49EE3D19.7000603@hastings.org>
Message-ID: <8763gxijbs.fsf@benfinney.id.au>

Larry Hastings <larry at hastings.org> writes:

> I was excited to see PYTHONUSERBASE added in 2.6/3.0. But I became
> disappointed when I learned it only allows you to specify one
> directory. I have use cases where I'd want more than one.
[…]

> I therefore propose a new environment variable called PYTHONUSERBASES.
> PYTHONUSERBASES is a list of directories separated by whatever
> directory separator characters PYTHONPATH permits on the local
> platform.
[…]

I like the described feature.

Adding a new variable whose meaning entirely obsoletes the existing one
with confusingly similar names, especially since the existing one is
itself so new, seems sub-optimal.

Instead of adding another variable, I would prefer the meaning of the
existing "PYTHONUSERBASE" be altered to do this.

-- 
 \     "I know you believe you understood what you think I said, but I |
  `\         am not sure you realize that what you heard is not what I |
_o__)                                    meant." --Robert J. McCloskey |
Ben Finney



From dangyogi at gmail.com  Wed Apr 22 03:13:20 2009
From: dangyogi at gmail.com (Bruce Frederiksen)
Date: Tue, 21 Apr 2009 21:13:20 -0400
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <ca471dc20904211526s5174f6aep61403172081f0dfc@mail.gmail.com>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<ca471dc20904211526s5174f6aep61403172081f0dfc@mail.gmail.com>
Message-ID: <49EE6F30.9070801@gmail.com>

Guido van Rossum wrote:
> I recommend looking at Google Protocol Buffers
> (http://code.google.com/apis/protocolbuffers/), which have both a
> compact binary format (though not compressed -- but you can easily
> layer that) and a verbose text format. There's a nice Python-specific
> tutorial (http://code.google.com/apis/protocolbuffers/docs/pythontutorial.html)
> that also explains why you would use this.
>   
I was under the impression that Google Protocol Buffers allows neither
dynamic typing, for example [1, "hi mom", []], nor multiple
references to the same object, for example: x = [1,2,3]; (x, x).

Am I missing something?
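
The shared-references half of this point can be checked with the standard
library: pickle's memo table preserves object identity, while a JSON
round-trip (standing in here for any pure-tree format) silently duplicates
the shared list. This is only a sketch of the behaviour Bruce describes,
not a claim about Protocol Buffers itself:

```python
import json
import pickle

x = [1, 2, 3]
pair = (x, x)  # two references to the same list

# pickle's memo table preserves the sharing...
px = pickle.loads(pickle.dumps(pair))
assert px[0] is px[1]

# ...while a JSON round-trip yields two independent copies.
jx = json.loads(json.dumps(pair))
assert jx[0] is not jx[1]
```

After the JSON round-trip, mutating `jx[0]` no longer affects `jx[1]`,
which can silently change program behaviour.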

-bruce frederiksen


From rdmurray at bitdance.com  Wed Apr 22 03:39:50 2009
From: rdmurray at bitdance.com (R. David Murray)
Date: Wed, 22 Apr 2009 01:39:50 +0000 (UTC)
Subject: [Python-ideas] Add setpriority / getpriority to os module.
References: <grgtlk$pdt$1@ger.gmane.org>
	<acd65fa20904201945q145a51b3p7c4adcd3668e6215@mail.gmail.com>
Message-ID: <gslsh5$678$1@ger.gmane.org>

Alexandre Vassalotti <alexandre at peadrop.com> wrote:
> On Tue, Apr 7, 2009 at 9:12 PM, Christian Heimes <lists at cheimes.de> wrote:
> > Hello,
> >
> > I would like to add straight forward wrapper for the setpriority and
> > getpriority functions to posixmodule.c for the os module.
> 
> Wouldn't it be better to add these functions to the 'resource' module?

'resource' is Unix specific, but Josiah indicates Windows equivalents
to set/getpriority are possible.  In that case it seems like it belongs
in posix/os since it can eventually be made cross-platform.

--
R. David Murray             http://www.bitdance.com



From guido at python.org  Wed Apr 22 05:08:18 2009
From: guido at python.org (Guido van Rossum)
Date: Tue, 21 Apr 2009 20:08:18 -0700
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <635AD1835B614CD1A77DD758B9B7BBBB@RaymondLaptop1>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1> 
	<ca471dc20904211526s5174f6aep61403172081f0dfc@mail.gmail.com> 
	<635AD1835B614CD1A77DD758B9B7BBBB@RaymondLaptop1>
Message-ID: <ca471dc20904212008s4cb605cfy684484a82f796b0f@mail.gmail.com>

On Tue, Apr 21, 2009 at 5:56 PM, Raymond Hettinger <python at rcn.com> wrote:
>
>>> Python's pickles use a custom format that has evolved over time
>>> but they have five significant disadvantages:
>>>
>>> * it has lost its human readability and editability
>>> * it doesn't compress well
>>
>> Really? Or do you mean "it doesn't have built-in compression support"
>> ? I don't expect that running bzip2 over a pickle would produce
>> unsatisfactory results, and the API supports reading from and writing
>> to streams.
>>
>> Or do you just mean "the representation is too repetitive and bulky" ?
>>
>>> * it isn't interoperable with other languages
>>> * it doesn't have the ability to enforce a schema
>>> * it is a major security risk for untrusted inputs
>>
>> I agree that pickle doesn't satisfy these. But then again, #1, #3 and
>> #4 were never part of its design goals. #5 is indeed a problem.
>
> Pickle does well with its original design goal.
>
> It would be really nice if we also provided a builtin solution that
> incorportated the other design goals listed above and adopted a format based
> on a published
> standard.
>
>
>> But I think there are existing solutions already. For example, I'd say
>> that XML+bzip2 satisfies all these already.
>
> No doubt that would work.  There is however a pretty high barrier to
> bringing together all the right tools (an xml pickler/unpickler providing
> the equivalent of pickle.dumps/pickle.loads, a fast xml parser, a xml
> schema validator, an xml pretty printer, and data compression).  Even with
> the right tools brought together under a single convenient API, it wouldn't
> be any fun to write the DTDs for the validator.  I think the barrier is so
> high, that in practice these tools will rarely be brought together for this
> purpose and instead are focused on ad hoc approaches geared to a particular
> application.
>
>
>> If you want something a
>> little less verbose, I recommend looking at Google Protocol Buffers
>> (http://code.google.com/apis/protocolbuffers/),
>
> That is a nice package.  It seems to have shared several of the goals
> listed above (interlanguage data exchange, use of schemas, and security).
>
> I scanned through all of the docs but didn't see a pickle.dumps() style API;
> instead, it seems to be focused on making the user build up parts of a
> non-subclassable custom object that knows how to serialize itself.  In
> contrast, pyyaml rides on our existing __reduce__ logic to fully emulate
> what pickle can do (meaning that most apps can add serialization with just
> a single line).

Right. That is not one of the design goals. (It also generally is
incompatible with several other design goals, like cross-language
support and schema enforcement -- though I now realize you mean the
latter to be optional.)

> It doesn't look like the actual data formatting is based on a published
> standard
> so it requires the Google tool on each end (with support offered for Python,
> Java, and C++).

I'm not too worried about the "published standard" thing. Python
itself doesn't have anything like it either. :-) If you want real
enterprise-level standards compliance, I doubt that anything short of
XML will satisfy those die-hard conservatives. (And they probably
haven't even heard of bzip2.) I don't actually think there's such a
thing as a YAML standard either.

> Hope you're not negative on the idea of a compressing, validating, pretty
> printing, yaml pickler.
> Without your support, the idea is dead before it can get started.

You have my full support -- just very few of my cycles in getting
something working. I think there are enough people around here to
help. I'm skeptical that trying to create something new, whether
standards-based or not, is going to be worth it -- you have to be
careful to define your target audience and the design goals and see if
your solution would actually be enticing for that audience compared to
what they can do today. (Hence my plug for Protocol Buffers -- but
I'll stop now.)

> FWIW, I found some of the Kwalify examples to be compelling.  Am attaching
> one for you guys to look at.  I don't think an equivalent XML solution
> would come together as effortlessly or as beautifully.  From a python point
> of view, the example boils down to:  yaml.dump(donors, file) and donors =
> yaml.load(file, schema=donor_schema).
> No extra work is required.

How easy is it to define a schema though? What about schema migration?
(An explicit goal of Protocol Buffers BTW, and in my experience very
important.)

> Human readability/editability comes for free,

How important is that though?

> inter-language operability comes for free, and so do the security
> guarantees.
>
> I think it would be great if we took a batteries included approach and
> offered
> something like this as part of the standard library.

First you have to have working code as a 3rd party package with a lot
of happy users.

> Raymond
>
>
> ----------- donor_schema ----------
> type:      seq
> sequence:
>   -
>     type:      map
>     mapping:
>      "name":
>         type:       str
>         required:   yes
>      "email":
>         type:       str
>         required:   yes
>         pattern:    /@/
>      "password":
>         type:       text
>         length:     { max: 16, min: 8 }
>      "age":
>         type:       int
>         range:      { max: 30, min: 18 }
>         # or assert: 18 <= val && val <= 30
>      "blood":
>         type:       str
>         enum:       [A, B, O, AB]
>      "birth":
>         type:       date
>      "deleted":
>         type:       bool
>         default:    false
> ----------- valid document ------------
> - name:     foo
>   email:    foo at mail.com
>   password: xxx123456
>   age:      20
>   blood:    A
>   birth:    1985-01-01
> - name:     bar
>   email:    bar at mail.net
>   age:      25
>   blood:    AB
>   birth:    1980-01-01
>
>



-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)


From josiah.carlson at gmail.com  Wed Apr 22 05:31:29 2009
From: josiah.carlson at gmail.com (Josiah Carlson)
Date: Tue, 21 Apr 2009 20:31:29 -0700
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <3D3ED33F-3BC1-4162-A0AA-81FB934F7061@gmail.com>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<3D3ED33F-3BC1-4162-A0AA-81FB934F7061@gmail.com>
Message-ID: <e6511dbf0904212031n1e2402f6oaba7f341d0bc2284@mail.gmail.com>

On Tue, Apr 21, 2009 at 6:02 PM, Leonardo Santagada <santagada at gmail.com> wrote:
>
> For starters I want to say that this is a great idea overall but why not:
>
> On Apr 21, 2009, at 7:02 PM, Raymond Hettinger wrote:
>
>> * YAML is a superset of JSON, so the schema validation also works equally
>> well with JSON encoded data.
>
>
> With the already existing fast json serializers (simplejson) why not use
> JSON instead of YAML? I like the YAML language, but JSON serializers exists
> for many more languages and are usually much more used than YAML ones.

+1 for json.  I've had bad luck with yaml in the past, but only have
good things to say about json.

 - Josiah


From larry at hastings.org  Wed Apr 22 06:58:51 2009
From: larry at hastings.org (Larry Hastings)
Date: Tue, 21 Apr 2009 21:58:51 -0700
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
In-Reply-To: <8763gxijbs.fsf@benfinney.id.au>
References: <49EE3D19.7000603@hastings.org> <8763gxijbs.fsf@benfinney.id.au>
Message-ID: <49EEA40B.5060007@hastings.org>


Ben Finney wrote:
> I like the described feature.
>   

You vote +1 then?

> Adding a new variable whose meaning entirely obsoletes the existing one
> with confusingly similar names, especially since the existing one is
> itself so new, seems sub-optimal.
>
> Instead of adding another variable, I would prefer the meaning of the
> existing ?PYTHONUSERBASE? altered to do this.

I don't think we should break PYTHONUSERBASE.  Also, PYTHONUSERBASES has 
slightly different semantics; PYTHONUSERBASE has a default value 
(~/.local on UNIX for instance), whereas my proposed variable has a no-op 
default.

I'm open to suggestions for an alternate name.  However, the flipside to 
"confusingly similar" would be "confusingly different".  PYTHONUSERBASES 
is very similar to PYTHONUSERBASE except that it supports more than one, 
so giving it a different name seemed natural to me.


/larry/


From ben+python at benfinney.id.au  Wed Apr 22 08:03:13 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Wed, 22 Apr 2009 16:03:13 +1000
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
References: <49EE3D19.7000603@hastings.org> <8763gxijbs.fsf@benfinney.id.au>
	<49EEA40B.5060007@hastings.org>
Message-ID: <87eivlgr8e.fsf@benfinney.id.au>

Larry Hastings <larry at hastings.org> writes:

> Ben Finney wrote:
> > I like the described feature.
> 
> You vote +1 then?

With the caveat that I think it should be "PYTHONUSERBASE" that causes
this behaviour, not a new variable.

> I'm open to suggestions for an alternate name. However, the flipside
> to "confusingly similar" would be "confusingly different".
> PYTHONUSERBASES is very similar to PYTHONUSERBASE except that it
> supports more than one, so giving it a different name seemed natural
> to me.

As I understand it, a valid value for "PYTHONUSERBASE" as-is (i.e. a
single directory path) is also a valid value for this new behaviour. All
we're doing is adding the capability to *also* supply multiple directory
paths in the value, separated by ':'. It seems logical to me that this
should be the same variable.

-- 
 \     "Some mornings, it's just not worth chewing through the leather |
  `\                                           straps." --Emo Philips |
_o__)                                                                  |
Ben Finney



From dirkjan at ochtman.nl  Wed Apr 22 10:12:38 2009
From: dirkjan at ochtman.nl (Dirkjan Ochtman)
Date: Wed, 22 Apr 2009 10:12:38 +0200
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <3D3ED33F-3BC1-4162-A0AA-81FB934F7061@gmail.com>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<3D3ED33F-3BC1-4162-A0AA-81FB934F7061@gmail.com>
Message-ID: <49EED176.5000103@ochtman.nl>

On 22/04/2009 03:02, Leonardo Santagada wrote:
> With the already existing fast json serializers (simplejson) why not use
> JSON instead of YAML? I like the YAML language, but JSON serializers
> exists for many more languages and are usually much more used than YAML
> ones.

+1 from me as well. YAML is still evolving, and MUCH more complex than 
JSON. JSON is simpler and has an IETF RFC to back it (much shorter than 
the YAML spec -- and the flow diagrams on json.org are really all you 
need). JSON Schema could be used for the schema part [1].

Some kind of versioning would also be very useful [2,3]. I've had a lot 
of trouble with unpickling pickles for which the classes had changed; 
you often don't really get a useful error message.
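
One common workaround for the classes-had-changed problem is to embed a
version stamp in the pickled state and migrate old layouts on load. A
sketch (the `Donor` class, its fields, and the v1-to-v2 migration are all
hypothetical):

```python
import pickle

class Donor(object):
    """Hypothetical class whose pickled layout changed between releases."""
    STATE_VERSION = 2

    def __init__(self, name, email):
        self.name = name
        self.email = email

    def __getstate__(self):
        # Stamp every pickle with the schema version that wrote it.
        state = self.__dict__.copy()
        state["_version"] = self.STATE_VERSION
        return state

    def __setstate__(self, state):
        # Migrate old layouts instead of failing with an obscure error.
        version = state.pop("_version", 1)
        if version < 2:
            # In this sketch, version 1 stored the address in "contact".
            state.setdefault("email", state.pop("contact", ""))
        self.__dict__.update(state)

# Current pickles round-trip unchanged.
d = pickle.loads(pickle.dumps(Donor("foo", "foo@mail.com")))
assert d.email == "foo@mail.com"

# A simulated version-1 state is upgraded on load.
old = Donor.__new__(Donor)
old.__setstate__({"name": "bar", "contact": "bar@mail.net", "_version": 1})
assert old.email == "bar@mail.net"
```

Nothing enforces this discipline, though, which is part of the argument
for a format with explicit schema support.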

Cheers,

Dirkjan

[1] http://www.json.com/json-schema-proposal/
[2] http://utcc.utoronto.ca/~cks/space/blog/python/PickleNotForSaving
[3] http://utcc.utoronto.ca/~cks/space/blog/python/VersioningPickle



From jeremiah.dodds at gmail.com  Wed Apr 22 10:33:39 2009
From: jeremiah.dodds at gmail.com (Jeremiah Dodds)
Date: Wed, 22 Apr 2009 09:33:39 +0100
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <e6511dbf0904212031n1e2402f6oaba7f341d0bc2284@mail.gmail.com>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<3D3ED33F-3BC1-4162-A0AA-81FB934F7061@gmail.com>
	<e6511dbf0904212031n1e2402f6oaba7f341d0bc2284@mail.gmail.com>
Message-ID: <12cbbbfc0904220133o18fad73av1f3242a5a632302a@mail.gmail.com>

On Wed, Apr 22, 2009 at 4:31 AM, Josiah Carlson <josiah.carlson at gmail.com>wrote:

> On Tue, Apr 21, 2009 at 6:02 PM, Leonardo Santagada <santagada at gmail.com>
> wrote:
> >
> > For starters I want to say that this is a great idea overall but why not:
> >
> > On Apr 21, 2009, at 7:02 PM, Raymond Hettinger wrote:
> >
> >> * YAML is a superset of JSON, so the schema validation also works
> equally
> >> well with JSON encoded data.
> >
> >
> > With the already existing fast json serializers (simplejson) why not use
> > JSON instead of YAML? I like the YAML language, but JSON serializers
> exists
> > for many more languages and are usually much more used than YAML ones.
>
> +1 for json.  I've had bad luck with yaml in the past, but only have
> good things to say about json.
>
>
Another +1 for json here. I use it to communicate with python and perl on a
pretty regular basis, and never have a problem with it. It's a dead simple
serialization protocol, very hard to get wrong, and very interoperable.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090422/ed70f90d/attachment.html>

From daniel at stutzbachenterprises.com  Wed Apr 22 11:39:09 2009
From: daniel at stutzbachenterprises.com (Daniel Stutzbach)
Date: Wed, 22 Apr 2009 04:39:09 -0500
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <3D3ED33F-3BC1-4162-A0AA-81FB934F7061@gmail.com>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<3D3ED33F-3BC1-4162-A0AA-81FB934F7061@gmail.com>
Message-ID: <eae285400904220239k36c24db6m932b2196f1a5fff9@mail.gmail.com>

On Tue, Apr 21, 2009 at 8:02 PM, Leonardo Santagada <santagada at gmail.com>wrote:

> With the already existing fast json serializers (simplejson) why not use
> JSON instead of YAML? I like the YAML language, but JSON serializers exists
> for many more languages and are usually much more used than YAML ones.
>

JSON's appeal is in its simplicity, but it's TOO simple to serve as a
replacement for pickle.  For example, it can't encode recursive objects.

Since YAML is a superset of JSON, it's a very natural choice for those
already familiar with JSON who need a little more power.
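
The recursive-object limitation is easy to demonstrate with the standard
library alone:

```python
import json
import pickle

x = [1, 2, 3]
x.append(x)  # a recursive (self-referencing) list

# pickle handles the cycle via its memo table...
y = pickle.loads(pickle.dumps(x))
assert y[3] is y

# ...while json has no way to express the cycle and raises instead.
try:
    json.dumps(x)
except ValueError as exc:
    print("json refused:", exc)
```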

--
Daniel Stutzbach, Ph.D.
President, Stutzbach Enterprises, LLC <http://stutzbachenterprises.com>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090422/9021feae/attachment.html>

From daniel at stutzbachenterprises.com  Wed Apr 22 11:53:32 2009
From: daniel at stutzbachenterprises.com (Daniel Stutzbach)
Date: Wed, 22 Apr 2009 04:53:32 -0500
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
Message-ID: <eae285400904220253od4b9399q609f3a65657e41e9@mail.gmail.com>

On Tue, Apr 21, 2009 at 5:02 PM, Raymond Hettinger <python at rcn.com> wrote:

> PyYAML ( http://pyyaml.org/wiki/PyYAML ) is a full implementation of
> the YAML standard.  It uses the YAML's application-specific tags and
> Python's own copy/reduce logic to provide the same power as pickle itself.
>
> Kwalify ( http://www.kuwata-lab.com/kwalify/ruby/users-guide.01.html )
> is a schema validator written in Ruby and Java.  It defines a
> YAML/JSON based schema definition for enforcing tight constraints
> on incoming data.
>

+1 on the general idea.  I abandoned pickle for JSON/YAML long ago.

To be useful, wouldn't a schema validator have to be built-in to the YAML
parser?
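
Not necessarily: for the plain-data subset, a Kwalify-style schema can be
enforced as a separate pass over the already-parsed tree. The sketch below
uses json as a stand-in for the parser; the `validate()` helper and the
dict-based schema are hypothetical, not any real Kwalify or PyYAML API:

```python
import json

# Mirrors the ListOfStrings schema from Raymond's example.
ListOfStrings = {"type": "seq", "sequence": [{"type": "str"}]}

TYPES = {"seq": list, "str": str}

def validate(data, schema):
    """Recursively check a parsed tree against a schema dict."""
    if not isinstance(data, TYPES[schema["type"]]):
        raise ValueError("expected %s, got %r" % (schema["type"], data))
    if schema["type"] == "seq":
        for item in data:
            validate(item, schema["sequence"][0])
    return data

data = validate(json.loads('["foo", "bar", "baz"]'), ListOfStrings)
print(data)  # ['foo', 'bar', 'baz']
```

The caveat is the point Daniel raises: post-parse validation is only safe
when parsing itself is side-effect free. That holds for JSON and for
YAML's safe subset, but tags that construct arbitrary Python objects would
need to be checked before instantiation, i.e. inside the loader.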

--
Daniel Stutzbach, Ph.D.
President, Stutzbach Enterprises, LLC <http://stutzbachenterprises.com>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090422/43769d06/attachment.html>

From ncoghlan at gmail.com  Wed Apr 22 14:14:35 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Wed, 22 Apr 2009 22:14:35 +1000
Subject: [Python-ideas] from __future__ import range
In-Reply-To: <20090421161807.62e0c1d5@o>
References: <20090421115829.48bba8ca@o>	<49EDB874.9070707@gmail.com>
	<20090421161807.62e0c1d5@o>
Message-ID: <49EF0A2B.1030601@gmail.com>

spir wrote:
> Well, if (as intended) "from __future__ import range" means "use
> future (py3) semantics for 'range'", then the compiler need to know
> it ;-)

Nope, the compiler hasn't got a clue what different functions do - its
job is done and dusted long before the bytecode is actually executed. So
it doesn't care what the semantics of range actually are - it just cares
that it is the name of something.

The reason "__future__" is special is because the compiler *does* know
about it, so such imports can actually change the behaviour of the
parser and code generator, thus making statements and expressions
actually mean different things.

It does exist as a real module as well, which provides information about
the various features that it can be used to enable:

>>> for name in __future__.all_feature_names:
...   feature = getattr(__future__, name)
...   print("{0:18}{1.optional:22}{1.mandatory}".format(name, feature))
...
nested_scopes     (2, 1, 0, 'beta', 1)  (2, 2, 0, 'alpha', 0)
generators        (2, 2, 0, 'alpha', 1) (2, 3, 0, 'final', 0)
division          (2, 2, 0, 'alpha', 2) (3, 0, 0, 'alpha', 0)
absolute_import   (2, 5, 0, 'alpha', 1) (2, 7, 0, 'alpha', 0)
with_statement    (2, 5, 0, 'alpha', 1) (2, 6, 0, 'alpha', 0)
print_function    (2, 6, 0, 'alpha', 2) (3, 0, 0, 'alpha', 0)
unicode_literals  (2, 6, 0, 'alpha', 2) (3, 0, 0, 'alpha', 0)
barry_as_FLUFL    (3, 1, 0, 'alpha', 2) (3, 9, 0, 'alpha', 0)

All of those are changes which affect the behaviour of the compiler. The
nested_scopes, division, absolute_import and unicode_literals features
don't add new syntax, but they change the meaning of existing syntax
(i.e. the compiler has to generate different bytecode when they are in
effect).

The generators and with_statement features actually added new keywords
for the parser to recognise (yield for generators, with and as for the
with statement).

The print_function feature goes the other way: it tells the parser
*not* to treat print as a keyword.
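Under Python 3 the import is a harmless no-op, which also makes it handy for code that must run on both 2.x and 3.x; a minimal illustration:

```python
from __future__ import print_function

# With print demoted from keyword to ordinary name, it behaves like
# any other function object: it can be aliased, passed around, etc.
show = print
show("hello", "world", sep=", ")
```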

All the details about this mechanism and how it works can be found in
the documentation ([1], [2]) and in the PEP that added it [3].

That's all tangential to the current point though, since the change to
range's semantics *isn't* something the compiler cares about. range() is
still a function in Py3k, it just behaves differently from the way it
behaves in 2.x.

That's where future_builtins comes in: as a location for new builtins
where the Py3k semantics are sufficiently different that they can't
readily be introduced into the 2.x series, but don't involve any changes
to syntax.

Adding the Py3k range type to that module would be a good idea, since it
actually isn't exactly the same as xrange in 2.x. Unlike xrange, Py3k's
range object can handle arbitrary ranges instead of being restricted by
the size of a C integer.
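For instance, under Py3k (where the type already exists) ranges whose bounds exceed any C integer type just work:

```python
# Bounds far beyond a 32- or 64-bit C integer; Python 3's range
# (unlike 2.x xrange) handles these without overflow:
r = range(10**18, 10**18 + 5)
print(len(r))            # 5
print(r[2])              # 1000000000000000002
print(10**18 + 3 in r)   # True
```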

I suggest making this idea (add Py3k range implementation to 2.x
future_builtins) a feature request on the bug tracker. The benefits are
consistency with other builtins that are changing in Py3k and also
providing an arbitrary length range implementation in the 2.x series.

Cheers,
Nick.

[1] http://docs.python.org/library/__future__
[2] http://docs.python.org/reference/simple_stmts.html#future-statements
[3] http://www.python.org/dev/peps/pep-0236/

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From erik at cq2.org  Wed Apr 22 14:53:17 2009
From: erik at cq2.org (Erik Groeneveld)
Date: Wed, 22 Apr 2009 14:53:17 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EAC4F9.90107@canterbury.ac.nz>
References: <49EAC4F9.90107@canterbury.ac.nz>
Message-ID: <aaec99390904220553u2a4f937sff6de2aedb73537f@mail.gmail.com>

Greg,

I am still busy understanding what your PEP means for the framework
that I have been building.  I believe that, for practical usage, there
is still something missing, or at least not clear.

Suppose we would like to write an HTTP protocol stack using generators,
and we have something like (actual working code; read 'yield from' for
every 'yield'):

    try:
        reqArgs = yield readRe(REGEXP.REQUEST, MAXREQUESTSIZE)
    except OverflowError:
        yield requestEntityTooLarge()
        yield HTTP.CRLF
        return
    headers = parseHeaders(reqArgs)
    yield processBody(headers)

The function 'readRe' will read and buffer a part of the request as
defined by the regular expression REGEXP.REQUEST, and it will raise
OverflowError when it keeps reading while never making a match.

The point is that readRe accepts chunks of data that are not aligned
to protocol boundaries.  This is a typical boundary clash as Jackson
calls it (I tend to think of this stuff as JSP pipelines) and JSP
describes how to solve it.

But to be able to solve it, the readRe generator must be able to
indicate that it has superfluous data, and this data must be processed
by other generators.  In the case of the example, 'readRe' might have
been reading parts of the body (assuming a POST request).

After I created 'compose' I started implementing practical stuff like
this, and it soon turned out that 'compose' must support boundary
clashes or all but toy problems would still be unsolvable with
generators.

Therefore 'compose' now has a feature to return a value together with
remaining data:

   raise StopIteration(retval, remainingChunk1, remainingChunk2, ...)

This will push retval to the delegating generator as the return value
of yield, and then feed the remainingChunks to whatever generators
come next.  In your PEP, this would be a return statement, of course.
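A rough sketch of the push-back idea in present-day Python 3 (read_until and drive here are made-up stand-ins for readRe and compose, and a plain return takes the place of the explicit StopIteration):

```python
def read_until(marker):
    """Toy stand-in for readRe: buffer incoming chunks until `marker`
    appears, then return (matched, leftover) so the driver can push
    the leftover back to whatever generator runs next."""
    buf = ''
    while marker not in buf:
        buf += yield
    matched, _, leftover = buf.partition(marker)
    return matched + marker, leftover

def drive(gen, chunks):
    """Minimal stand-in for compose: feed chunks, catch the result."""
    next(gen)                      # prime the generator
    for chunk in chunks:
        try:
            gen.send(chunk)
        except StopIteration as e:
            return e.value         # the (matched, leftover) pair
    raise EOFError('ran out of data before the marker was seen')

# Chunks are not aligned to the protocol boundary: the second chunk
# contains the end of the headers *and* the start of the body.
header, rest = drive(read_until('\r\n\r\n'),
                     ['GET / HTTP/1.1\r\n', 'Host: x\r\n\r\nBODYBYTES'])
print(repr(header))
print(repr(rest))
```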

Have you thought about this?  How would you solve it?


Best regards,
Erik

E.J. Groeneveld
Seek You Too


From jnoller at gmail.com  Wed Apr 22 15:10:31 2009
From: jnoller at gmail.com (Jesse Noller)
Date: Wed, 22 Apr 2009 09:10:31 -0400
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <4222a8490904211741s5f31c733n55f2fda209f294e7@mail.gmail.com>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<4222a8490904211741s5f31c733n55f2fda209f294e7@mail.gmail.com>
Message-ID: <4222a8490904220610i30ea7e27xc23181c7530c59e2@mail.gmail.com>

On Tue, Apr 21, 2009 at 8:41 PM, Jesse Noller <jnoller at gmail.com> wrote:
> On Tue, Apr 21, 2009 at 6:02 PM, Raymond Hettinger <python at rcn.com> wrote:
>> Motivation
>> ----------
>>
>> Python's pickles use a custom format that has evolved over time
>> but they have five significant disadvantages:
>>
>>   * it has lost its human readability and editability
>>   * it doesn't compress well
>>   * it isn't interoperable with other languages
>>   * it doesn't have the ability to enforce a schema
>>   * it is a major security risk for untrusted inputs
>>
>>
>> New idea
>> --------
>>
>> Develop a solution using a mix of PyYAML, a python coded version of
>> Kwalify, optional compression using bz2, gzip, or zlib, and pretty
>> printing using pygments.
>>
>> YAML ( http://yaml.org/spec/1.2/ ) is a language independent standard
>> for data serialization.
>>
>> PyYAML ( http://pyyaml.org/wiki/PyYAML ) is a full implementation of
>> the YAML standard.  It uses the YAML's application-specific tags and
>> Python's own copy/reduce logic to provide the same power as pickle itself.
>>
>> Kwalify ( http://www.kuwata-lab.com/kwalify/ruby/users-guide.01.html )
>> is a schema validator written in Ruby and Java.  It defines a
>> YAML/JSON based schema definition for enforcing tight constraints
>> on incoming data.
>>
>> The bz2, gzip, and zlib compression libraries are already built into
>> the language.
>>
>> Pygments ( http://pygments.org/ ) is a Python-based syntax highlighter
>> with builtin support for YAML.
>>
>>
>> Advantages
>> ----------
>>
>> * The format is simple enough to hand edit or to have lightweight
>>   applications emit valid pickles.  For example:
>>
>>      print('Todo: [go to bank, pick up food, write code]')   # valid pickle
>>
>> * To date, efforts to make pickles smaller have focused on creating new
>>   codes for every data type.  Instead, we can use the simple text formatting
>>   of YAML and let general purpose data compression utilities do their job
>>   (letting the user control the trade-offs between speed, space, and human
>>   readability):
>>
>>      yaml.dump(data, compressor=None)  # fast, human readable, no compression
>>      yaml.dump(data, compressor=bz2)   # slowest, but best compression
>>      yaml.dump(data, compressor=zlib)  # medium speed and medium compression
>>
>> * The current pickle tools make it easy to exchange object trees between
>>   two Python processes.  The new tool would make it equally easy to exchange
>>   object trees between processes running any of Python, Ruby, Java, C/C++,
>>   Perl, C#, PHP, OCaml, Javascript, ActionScript, and Haskell.
>>
>> * Ability to use a schema for enforcing a given object model and allowing
>>   full security.  Which would you rather run on untrusted data:
>>
>>      data = yaml.load(myfile, schema=ListOfStrings)
>>
>>   or
>>
>>      data = pickle.load(myfile)
>>
>> * Specification of a schema using YAML itself
>>
>>   ListOfStrings (a schema written in yaml)
>>   ........................................
>>   type:     seq
>>   sequence:
>>     - type:     str
>>
>>   Sample of valid input
>>   .....................
>>   - foo
>>   - bar
>>   - baz
>>
>>   Note, schemas can be defined for very complex, nested object models and
>>   allow many kinds of constraints (unique items, enumerated list of allowable
>>   values, min/max allowable ranges for values, data type, maximum length,
>>   and names of regular Python classes that can be constructed).
>>
>> * YAML is a superset of JSON, so the schema validation also works equally
>>   well with JSON encoded data.
>>
>> What needs to be done
>> ---------------------
>>
>> * Combine the tools for a single, clean interface to C speed parsing
>>   of a data serialization standard, with optional compression, schema
>>   validation, and pretty printing.
>
> Just to add to this, I remembered someone recently did a simple
> benchmark of thift/JSON/YAML/Protocol Buffers, here are the links:
>
> http://www.bouncybouncy.net/ramblings/posts/thrift_and_protocol_buffers/
> http://www.bouncybouncy.net/ramblings/posts/more_on_json_vs_thrift_and_protocol_buffers/
> http://www.bouncybouncy.net/ramblings/posts/json_vs_thrift_and_protocol_buffers_round_2/
>
> Without digging into the numbers too much, it's worth noting that
> PyYAML is written in pure python but also has Libyaml
> (http://pyyaml.org/wiki/LibYAML) bindings for speed. When I get a
> chance, I can run the same test(s) with both the pure-python
> implementation and the libyaml one as well as see how much the speedup
> is.
>

Speaking of benchmarks, last night I took the first benchmark cited in
the links above, and with some work I ran the same benchmark with
PyYAML (pure python) and PyYAML with libyaml (the C version). The
PyYAML -> libyaml bindings require Cython right now, but here are the
numbers.

I removed thrift and proto buffers, as I wanted to focus on YAML/JSON right now:

5000 total records (0.510s)

ser_json             (0.030s) 718147 bytes
ser_cjson            (0.030s) 718147 bytes
ser_yaml             (6.230s) 623147 bytes
ser_cyaml            (2.040s) 623147 bytes

ser_json_compressed    (0.100s) 292987 bytes
ser_cjson_compressed   (0.110s) 292987 bytes
ser_yaml_compressed    (6.310s) 291018 bytes
ser_cyaml_compressed   (2.140s) 291018 bytes

serde_json           (0.050s)
serde_cjson          (0.050s)
serde_yaml           (19.020s)
serde_cyaml          (4.460s)

Running the second benchmark (the integer one) I see:

10000 total records (0.130s)

ser_json             (0.040s) 680749 bytes
ser_cjson            (0.030s) 680749 bytes
ser_yaml             (8.250s) 610749 bytes
ser_cyaml            (3.040s) 610749 bytes

ser_json_compressed    (0.100s) 124924 bytes
ser_cjson_compressed   (0.090s) 124924 bytes
ser_yaml_compressed    (8.320s) 121090 bytes
ser_cyaml_compressed   (3.110s) 121090 bytes

serde_json           (0.060s)
serde_cjson          (0.070s)
serde_yaml           (24.190s)
serde_cyaml          (6.690s)


So yes, the pure python numbers for yaml (_yaml) are pretty bad; the
libyaml (_cyaml) numbers are significantly improved, but not as fast
as JSON/CJSON.

One thing to note in this discussion, as others have pointed out: while
JSON itself is awfully fast/nice, it lacks some of the capabilities of
YAML; for example, certain objects cannot be represented in JSON.
Additionally, if we want to simply state "objects which you desire to
be compatible with JSON have the following restrictions" we can - this
means we can also leverage things within PyYAML which are also
nice-to-haves, for example the !!python additions.

Picking YAML in this case means we get all of the YAML syntax,
objects, etc - and if consumers want to stick with JSON compatibility,
we could add a dump(canonical=True, compatibility=JSON) or somesuch
flag.

jesse


From erik at cq2.org  Wed Apr 22 16:51:17 2009
From: erik at cq2.org (Erik Groeneveld)
Date: Wed, 22 Apr 2009 16:51:17 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EAC4F9.90107@canterbury.ac.nz>
References: <49EAC4F9.90107@canterbury.ac.nz>
Message-ID: <aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>

Dear Greg,

2009/4/19 Greg Ewing <greg.ewing at canterbury.ac.nz>:
> Draft 13 of the PEP.

>    * Exceptions other than GeneratorExit thrown into the delegating
>      generator are passed to the ``throw()`` method of the iterator.
>      If the call raises StopIteration, the delegating generator is resumed.
>      Any other exception is propagated to the delegating generator.

I have implemented this in Weightless, and changed my implementation
so as to work with BaseException instead of Exception.  This works well.
However, I was not able to make an exception for GeneratorExit; see the
next point.

>    * If a GeneratorExit exception is thrown into the delegating generator,
>      or the ``close()`` method of the delegating generator is called, then
>      the ``close()`` method of the iterator is called if it has one. If this
>      call results in an exception, it is propagated to the delegating generator.
>      Otherwise, GeneratorExit is raised in the delegating generator.

I tried to implement this, but I failed.  The reason is that a
generator's close() checks for an exception being raised by its
generator.  When no exception has been raised, it will raise
RuntimeError('generator ignored GeneratorExit').  And when an
exception has been raised (regardless of its type), it will exit just
normally.  So the phrase

> If this
>      call results in an exception, it is propagated to the delegating generator.

applies only to the RuntimeError close() might throw.  And the remaining phrase

>      Otherwise, GeneratorExit is raised in the delegating generator.

makes it behave no differently than for all other types of exceptions.

Was this intended?  If yes, I suggest making the text clearer and
more specific about it.  If no, then what is the correct expansion?

Best regards
Erik Groeneveld


From jh at improva.dk  Wed Apr 22 17:18:18 2009
From: jh at improva.dk (Jacob Holm)
Date: Wed, 22 Apr 2009 17:18:18 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
Message-ID: <49EF353A.1060204@improva.dk>

Hi Erik

Erik Groeneveld wrote:
> 
>>    * If a GeneratorExit exception is thrown into the delegating generator,
>>      or the ``close()`` method of the delegating generator is called, then
>>      the ``close()`` method of the iterator is called if it has one. If this
>>      call results in an exception, it is propagated to the delegating generator.
>>      Otherwise, GeneratorExit is raised in the delegating generator.
> 
> I tried to implement this, but I failed.  The reason is that a
> generator's close() checks for an exception being raised by its
> generator.  When no exception has been raised, it will raise the
> RuntimeError('generator ignored GeneratorExit').  And when an
> exception has been raised (regardless what type), it will exit just
> normally.


In other words, the ValueError in the following example is swallowed by 
close():


>>> def foo():
...     yield 'bar'
...     raise ValueError('baz')
>>> g = foo()
>>> g.next()
'bar'
>>> g.close()


This looks like close() doesn't actually behave as it should according 
to PEP342.  Interesting...

I would call that a bug.


>  So the phrase
> 
>> If this
>>      call results in an exception, it is propagated to the delegating generator.
> 
> applies only to the RuntimeError close() might throw.  And the remaining phrase
> 
>>      Otherwise, GeneratorExit is raised in the delegating generator.
> 
> makes it behave no different than for all other types of exceptions.
> 
> Was this intended?  If Yes, I suggest to make the text clearer and
> more specific about it.  If No, then what is the correct expansion?


The assumption was that close() would behave as described in PEP342, 
which would make the description in the yield-from PEP380 correct.

I can't believe that none of us actually tested that...

Cheers
- Jacob



From dstanek at dstanek.com  Wed Apr 22 17:51:42 2009
From: dstanek at dstanek.com (David Stanek)
Date: Wed, 22 Apr 2009 11:51:42 -0400
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <4222a8490904220610i30ea7e27xc23181c7530c59e2@mail.gmail.com>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<4222a8490904211741s5f31c733n55f2fda209f294e7@mail.gmail.com>
	<4222a8490904220610i30ea7e27xc23181c7530c59e2@mail.gmail.com>
Message-ID: <de32cc030904220851i65859559v33005f1505a07c38@mail.gmail.com>

On Wed, Apr 22, 2009 at 9:10 AM, Jesse Noller <jnoller at gmail.com> wrote:
>
> Picking YAML in this case means we get all of the YAML syntax,
> objects, etc - and if consumers want to stick with JSON compatibility,
> we could add a dump(canonical=True, compatibility=JSON) or somesuch
> flag.
>

I think that someone should create a new pickle module and put it up
on the Cheeseshop. I would prefer explicit dumpJSON and dumpYAML
functions, but I realize that breaks the current interface. If people
bite then start talking about using it as pickle's implementation.

-- 
David
blog: http://www.traceback.org
twitter: http://twitter.com/dstanek


From erik at cq2.org  Wed Apr 22 18:05:11 2009
From: erik at cq2.org (Erik Groeneveld)
Date: Wed, 22 Apr 2009 18:05:11 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EF353A.1060204@improva.dk>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
	<49EF353A.1060204@improva.dk>
Message-ID: <aaec99390904220905h31d139f7sd9ae98c62aee58b5@mail.gmail.com>

Hi Jacob,

2009/4/22 Jacob Holm <jh at improva.dk>:
> In other words, the ValueError in the following example is swallowed by close():
>
>
>>>> def foo():
> ...     yield 'bar'
> ...     raise ValueError('baz')
>>>> g = foo()
>>>> g.next()
> 'bar'
>>>> g.close()
>

I have the same code here; indeed, it does not raise an exception.

> This looks like close() doesn't actually behave as it should according to PEP342.  Interesting...

From PEP342 (thanks, Jacob, for the reference):

4. Add a close() method for generator-iterators, which raises
       GeneratorExit at the point where the generator was paused.  If
       the generator then raises StopIteration (by exiting normally, or
       due to already being closed) or GeneratorExit (by not catching
       the exception), close() returns to its caller.  If the generator
       yields a value, a RuntimeError is raised.  If the generator
       raises any other exception, it is propagated to the caller.
       close() does nothing if the generator has already exited due to
       an exception or normal exit.
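Note that the "any other exception ... is propagated to the caller" clause is observable when the generator answers GeneratorExit with a *different* exception; a quick demonstration in Python 3 syntax (gen is a made-up example):

```python
def gen():
    try:
        yield 1
    except GeneratorExit:
        # Responding to GeneratorExit with another exception: per
        # PEP 342, close() must propagate this one to the caller.
        raise ValueError('cleanup failed')

g = gen()
next(g)
caught = None
try:
    g.close()
except ValueError as e:
    caught = e
print('close() propagated:', caught)
```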


> I would call that a bug.

I agree with that.

> I can't believe that none of us actually tested that...

Someone did!

Erik


From solipsis at pitrou.net  Wed Apr 22 20:38:30 2009
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Wed, 22 Apr 2009 18:38:30 +0000 (UTC)
Subject: [Python-ideas] An idea for a new pickling tool
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
Message-ID: <loom.20090422T182950-171@post.gmane.org>

Raymond Hettinger <python at ...> writes:
> 
> Python's pickles use a custom format that has evolved over time
> but they have five significant disadvantages:
> 
>     * it doesn't compress well

Do you mean the binary representation is already memory efficient enough? It
doesn't sound like a disadvantage.

>     * it is a major security risk for untrusted inputs

Any untrusted input is a security risk. I don't see how enforcing that the
values received are strings or numbers is enough to guarantee security. It all
depends on the context. For example, if the strings are meant to be interpreted
as filenames, you'd better check that the user doesn't try to mess with system
files.

Regards

Antoine.




From robert.kern at gmail.com  Wed Apr 22 22:38:00 2009
From: robert.kern at gmail.com (Robert Kern)
Date: Wed, 22 Apr 2009 15:38:00 -0500
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <ca471dc20904212008s4cb605cfy684484a82f796b0f@mail.gmail.com>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<ca471dc20904211526s5174f6aep61403172081f0dfc@mail.gmail.com>
	<635AD1835B614CD1A77DD758B9B7BBBB@RaymondLaptop1>
	<ca471dc20904212008s4cb605cfy684484a82f796b0f@mail.gmail.com>
Message-ID: <gsnv79$l4t$1@ger.gmane.org>

On 2009-04-21 22:08, Guido van Rossum wrote:
> On Tue, Apr 21, 2009 at 5:56 PM, Raymond Hettinger<python at rcn.com>  wrote:

>> Human readability/editability comes for free,
>
> How important is that though?

I've had to debug a number of pickle problems where answering "What's pulling in 
*that* object?" by a quick grep would have been really handy. pickletools.dis() 
goes part of the way, but a hierarchical text representation rather than 
bytecode would have been better.

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
  that is made terrible by our own mad attempt to interpret it as though it had
  an underlying truth."
   -- Umberto Eco



From jess.austin at gmail.com  Wed Apr 22 22:59:55 2009
From: jess.austin at gmail.com (Jess Austin)
Date: Wed, 22 Apr 2009 15:59:55 -0500
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
In-Reply-To: <b8ad139e0904161632v386610f5w3078a48d293169e7@mail.gmail.com>
References: <b8ad139e0904161441g9782b75r1bd5811157eca079@mail.gmail.com>
	<49E7A823.3060702@trueblade.com>
	<b8ad139e0904161535y3257a322wb1a31c93927ea3c7@mail.gmail.com>
	<50697b2c0904161541k613f8ad8jba3e3f553853dfb@mail.gmail.com>
	<b8ad139e0904161632v386610f5w3078a48d293169e7@mail.gmail.com>
Message-ID: <b8ad139e0904221359k1107ad46h7e78034a8c89c9ff@mail.gmail.com>

On Thu, Apr 16, 2009 at 6:32 PM, Jess Austin <jess.austin at gmail.com> wrote:
> My take on that is that if you want an exception for invalid dates,
> use date.replace(). ?If you want an exact number of days offset, use
> timedelta. ?If you want the same date, some number of months offset,
> while month-end issues are silently handled, you can use the
> monthdelta patch I have at http://bugs.python.org/issue5434 and
> introduce at http://mail.python.org/pipermail/python-dev/2009-April/088794.html

I've uploaded the backported python version source distribution to
PyPI, http://pypi.python.org/pypi?name=MonthDelta&:action=display with
better-formatted documentation at
http://packages.python.org/MonthDelta/

"easy_install MonthDelta" works too.

cheers,
Jess


From alexandre at peadrop.com  Wed Apr 22 23:14:58 2009
From: alexandre at peadrop.com (Alexandre Vassalotti)
Date: Wed, 22 Apr 2009 17:14:58 -0400
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
Message-ID: <acd65fa20904221414u78bf33efse67c71efeff9c5ce@mail.gmail.com>

On Tue, Apr 21, 2009 at 6:02 PM, Raymond Hettinger <python at rcn.com> wrote:
> Motivation
> ----------
>
> Python's pickles use a custom format that has evolved over time
> but they have five significant disadvantages:
>
>   * it has lost its human readability and editability
>

This is not part of pickle's design goals. Also, I don't think the
pickle protocol has ever been a human-friendly format. Even if protocol 0
is ASCII-based, it doesn't mean one would like to edit it by hand.

>   * it doesn't compress well

Do you have numbers to support this? The last time I tested
compression on pickle data, it worked fairly well. In fact, I get a
2.70 compression ratio for some pickles using gzip.

From my experience with pickle, I doubt you can significantly improve
the size of pickled data without using static schemata (like Google
Protocol Buffers and Thrift). The only inefficient thing in pickle I
am aware of is the handling of PUT and GET opcodes.
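An illustrative measurement (the exact ratio will vary with the data; zlib is used here, gzip behaves similarly):

```python
import pickle
import zlib

# Pickle a list with plenty of repeated structure, then compress it.
data = pickle.dumps([{"name": "item%d" % i, "value": i}
                     for i in range(1000)])
packed = zlib.compress(data)
print(len(data), len(packed), round(len(data) / len(packed), 2))
```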

>   * it isn't interoperable with other languages
>   * it doesn't have the ability to enforce a schema

Again, these are not part of pickle's design goals.

>   * it is a major security risk for untrusted inputs
>

There are ways to fix this without replacing pickle. See the recipe in
the pickle documentation:

http://docs.python.org/3.0/library/pickle.html#restricting-globals
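That recipe boils down to overriding Unpickler.find_class with a whitelist; a condensed sketch (the particular whitelist here is just an example):

```python
import builtins
import io
import pickle

SAFE_NAMES = {'set', 'frozenset', 'complex', 'range'}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Permit only a small whitelist of harmless builtins; anything
        # else (arbitrary classes, os.system, ...) is rejected.
        if module == 'builtins' and name in SAFE_NAMES:
            return getattr(builtins, name)
        raise pickle.UnpicklingError(
            "global '%s.%s' is forbidden" % (module, name))

def restricted_loads(data):
    return RestrictedUnpickler(io.BytesIO(data)).load()

print(restricted_loads(pickle.dumps({1, 2})))         # sets are allowed
blocked = None
try:
    restricted_loads(pickle.dumps(pickle.Unpickler))  # a class: forbidden
except pickle.UnpicklingError as e:
    blocked = e
print('blocked:', blocked)
```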

> New idea
> --------
>
> Develop a solution using a mix of PyYAML, a python coded version of
> Kwalify, optional compression using bz2, gzip, or zlib, and pretty
> printing using pygments.
>
> YAML ( http://yaml.org/spec/1.2/ ) is a language independent standard
> for data serialization.
>
> PyYAML ( http://pyyaml.org/wiki/PyYAML ) is a full implementation of
>> the YAML standard.  It uses the YAML's application-specific tags and
> Python's own copy/reduce logic to provide the same power as pickle itself.
>

But how are you going to handle serialization of class instances in a
language independent manner?

Regards,
-- Alexandre


From python at rcn.com  Wed Apr 22 23:27:56 2009
From: python at rcn.com (Raymond Hettinger)
Date: Wed, 22 Apr 2009 14:27:56 -0700
Subject: [Python-ideas] An idea for a new pickling tool
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<acd65fa20904221414u78bf33efse67c71efeff9c5ce@mail.gmail.com>
Message-ID: <206AC57CF0A341128B419B1987FF14C1@RaymondLaptop1>

>> * it has lost its human readability and editability

> This is not part of pickle design goals.

However, it's one of my design goals for something
better than the pickle we have now.

One benefit is that it eliminates the need for a pickle
disassembler.

Another benefit is that valid pickles can be created easily
by something other than a pickler (experiences with json
have shown this to be useful).  It's nice for a random
javascript fragment or iphone app to just be able to print 
a valid pickle for a particular use.  It's all about loose
coupling.

Also, human readability goes hand in hand with the new
design goal for language independence.  It's a lot easier
to design and test two-way communication with Java, C++,
and others if you can easily see what is in the pickle.


Raymond



From alexandre at peadrop.com  Thu Apr 23 01:05:14 2009
From: alexandre at peadrop.com (Alexandre Vassalotti)
Date: Wed, 22 Apr 2009 19:05:14 -0400
Subject: [Python-ideas] [Private Note] An idea for a new pickling tool
In-Reply-To: <4B3E63C2BDD544C78232C08E081158EA@RaymondLaptop1>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1> 
	<acd65fa20904221414u78bf33efse67c71efeff9c5ce@mail.gmail.com> 
	<4B3E63C2BDD544C78232C08E081158EA@RaymondLaptop1>
Message-ID: <acd65fa20904221605q2a0af099q6bc2bcc58c14d7c9@mail.gmail.com>

On Wed, Apr 22, 2009 at 5:37 PM, Raymond Hettinger <python at rcn.com> wrote:
>>> * it is a major security risk for untrusted inputs
>
>> There are way to fix this without replacing pickle. See the recipe in
>> pickle documentation:
>>
>> http://docs.python.org/3.0/library/pickle.html#restricting-globals
>
> If you think untrusted pickles can easily be made secure, then you've
> missed the last ten years of discussions on the subject.  There's a
> reason we put the big red warnings in the docs.
>

Could you elaborate on this, or point me to the specific discussions?
And how do you plan to make your alternative secure?

>
>> But how are you going to handle serialization of class instances in a
>> language independent manner?
>
> The same way RPC works, you need to have similar structures on
> each end.  Take a look at JSON-RPC to get an idea of how this
> works.

That makes sense, thanks.

> Overall, I don't see what you're getting at.  I'm not looking to
> eliminate the current pickles.

Ah then I have nothing against your proposal. It is the way you
presented your idea against pickle that confused me; I actually
thought you wanted to replace pickle.

In that case, you probably want to take a look at the twisted.jelly
module and PySyck. They each share some of the goals you are aiming for.

Cheers,
-- Alexandre


From tleeuwenburg at gmail.com  Thu Apr 23 01:06:00 2009
From: tleeuwenburg at gmail.com (Tennessee Leeuwenburg)
Date: Thu, 23 Apr 2009 09:06:00 +1000
Subject: [Python-ideas] [Python-Dev] Issue5434: datetime.monthdelta
In-Reply-To: <b8ad139e0904221359k1107ad46h7e78034a8c89c9ff@mail.gmail.com>
References: <b8ad139e0904161441g9782b75r1bd5811157eca079@mail.gmail.com>
	<49E7A823.3060702@trueblade.com>
	<b8ad139e0904161535y3257a322wb1a31c93927ea3c7@mail.gmail.com>
	<50697b2c0904161541k613f8ad8jba3e3f553853dfb@mail.gmail.com>
	<b8ad139e0904161632v386610f5w3078a48d293169e7@mail.gmail.com>
	<b8ad139e0904221359k1107ad46h7e78034a8c89c9ff@mail.gmail.com>
Message-ID: <43c8685c0904221606u2686bdccgbaeafb11b29de30b@mail.gmail.com>

Good doco!

-T

On Thu, Apr 23, 2009 at 6:59 AM, Jess Austin <jess.austin at gmail.com> wrote:

> On Thu, Apr 16, 2009 at 6:32 PM, Jess Austin <jess.austin at gmail.com>
> wrote:
> > My take on that is that if you want an exception for invalid dates,
> > use date.replace().  If you want an exact number of days offset, use
> > timedelta.  If you want the same date, some number of months offset,
> > while month-end issues are silently handled, you can use the
> > monthdelta patch I have at http://bugs.python.org/issue5434 and
> > introduce at
> http://mail.python.org/pipermail/python-dev/2009-April/088794.html
>
> I've uploaded the backported python version source distribution to
> PyPI, http://pypi.python.org/pypi?name=MonthDelta&:action=display with
> better-formatted documentation at
> http://packages.python.org/MonthDelta/
>
> "easy_install MonthDelta" works too.
>
> cheers,
> Jess
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>



-- 
--------------------------------------------------
Tennessee Leeuwenburg
http://myownhat.blogspot.com/
"Don't believe everything you think"
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090423/ac345188/attachment.html>

From Scott.Daniels at Acm.Org  Thu Apr 23 01:16:49 2009
From: Scott.Daniels at Acm.Org (Scott David Daniels)
Date: Wed, 22 Apr 2009 16:16:49 -0700
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
In-Reply-To: <49EE3D19.7000603@hastings.org>
References: <49EE3D19.7000603@hastings.org>
Message-ID: <gso88g$hnv$1@ger.gmane.org>

Larry Hastings wrote:
> 
> I was excited to see PYTHONUSERBASE added in 2.6/3.0.  But I became 
> disappointed when I learned it only allows you to specify one 
> directory.  I have use cases where I'd want more than one.

But you can put a .pth file in that directory specifying a number of
directories to add to the path.  The current mechanism is sufficient.
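For example (paths hypothetical), a file such as extras.pth dropped into the single PYTHONUSERBASE site-packages directory can list any number of additional directories, one per line:

```
/home/larry/python-extras/lib
/opt/shared/python-libs
```

At startup, the site module appends each existing directory listed in a .pth file to sys.path.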

--Scott David Daniels
Scott.Daniels at Acm.Org



From lists+python-ideas at jimpryor.net  Thu Apr 23 02:09:33 2009
From: lists+python-ideas at jimpryor.net (Jim Pryor)
Date: Wed, 22 Apr 2009 20:09:33 -0400
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <aaec99390904220905h31d139f7sd9ae98c62aee58b5@mail.gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
	<49EF353A.1060204@improva.dk>
	<aaec99390904220905h31d139f7sd9ae98c62aee58b5@mail.gmail.com>
Message-ID: <20090423000933.GB11800@vaio.nyu.edu>

On Wed, Apr 22, 2009 at 06:05:11PM +0200, Erik Groeneveld wrote:
> 2009/4/22 Jacob Holm <jh at improva.dk>:
> > In other words, the ValueError in the following example is swallowed
> > by close():
> >
> >
> >>>> def foo():
> > ...     yield 'bar'
> > ...     raise ValueError('baz')
> >>>> g = foo()
> >>>> g.next()
> > 'bar'
> >>>> g.close()
> >
>
> I have the same code here, it does not raise an exception indeed.
>
> > This looks like close() doesn't actually behave as it should
> > according to PEP342.  Interesting...
>
> From PEP342 (Thanks Jacob, for the reference):
>
> 4. Add a close() method for generator-iterators, which raises
>        GeneratorExit at the point where the generator was paused.  If
>        the generator then raises StopIteration (by exiting normally,
>        or
>        due to already being closed) or GeneratorExit (by not catching
>        the exception), close() returns to its caller.  If the
>        generator
>        yields a value, a RuntimeError is raised.  If the generator
>        raises any other exception, it is propagated to the caller.
>        close() does nothing if the generator has already exited due to
>        an exception or normal exit.
>
>
> > I would call that a bug.
>
> I agree with that.

I don't understand why you're both counting this as a bug. It looks like
exactly the behavior specified in PEP 342. When g.close() is evaluated,
a GeneratorExit is thrown to the suspended 'yield' expression in foo.
That exception is not caught, so g terminates without executing the
rest of its code. The 'raise ValueError' line is never executed.
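
Jim's reading matches what the interpreter actually does; a quick check
(next(g) works in both Python 2.6 and 3.0; Python 2 could also spell it
g.next()):

```python
def foo():
    yield 'bar'
    raise ValueError('baz')   # never reached once close() is called

g = foo()
print(next(g))   # -> bar
g.close()        # GeneratorExit is thrown in at the yield, not caught,
                 # so the generator simply terminates; no ValueError
print('close() returned quietly')
```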

-- 
Jim Pryor
jim at jimpryor.net


From jh at improva.dk  Thu Apr 23 02:20:20 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 23 Apr 2009 02:20:20 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <20090423000933.GB11800@vaio.nyu.edu>
References: <49EAC4F9.90107@canterbury.ac.nz>	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>	<49EF353A.1060204@improva.dk>	<aaec99390904220905h31d139f7sd9ae98c62aee58b5@mail.gmail.com>
	<20090423000933.GB11800@vaio.nyu.edu>
Message-ID: <49EFB444.9080808@improva.dk>

Hi Jim

Jim Pryor wrote:
>>> I would call that a bug.
>> I agree with that.
> 
> I don't understand why you're both counting this as a bug. It looks like
> exactly the behavior specified in PEP 342. When g.close() is evaluated,
> a GeneratorExit is thrown to the suspended 'yield' expression in foo.
> That exception is not caught, so g terminates without executing the
> rest of its code. The 'raise ValueError' line is never executed.
> 


You are of course completely right.  What I *should* have tried was the 
following:

 >>> def foo():
...     try:
...         yield 'bar'
...     except:
...         raise ValueError('baz')
...
 >>> g = foo()
 >>> g.next()
'bar'
 >>> g.close()
Traceback (most recent call last):
   File "<stdin>", line 1, in <module>
   File "<stdin>", line 5, in foo
ValueError: baz


Which works as expected.  No bug in sight.  In that case I am not sure 
what Erik's original problem was.

Cheers
- Jacob


From greg.ewing at canterbury.ac.nz  Thu Apr 23 02:53:02 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 23 Apr 2009 12:53:02 +1200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <aaec99390904220551g325c6905redd038324ccce648@mail.gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220551g325c6905redd038324ccce648@mail.gmail.com>
Message-ID: <49EFBBEE.20401@canterbury.ac.nz>

Erik Groeneveld wrote:

> the readRe generator must be able to
> indicate that it has superfluous data, and this data must be processed
> by other generators.
> 
> Have you though about this?  How would you solve it?

I think you're expecting a bit much from yield-from.

In a situation like this, I wouldn't use yield to
receive the values. I'd read them from some kind of
buffering object that allows peeking ahead however
far is needed.
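
A minimal sketch of such a buffering object (class and method names are
invented here, just to illustrate the peek-ahead idea):

```python
class PeekBuffer(object):
    """Accumulates incoming chunks; consumers may peek before reading."""

    def __init__(self):
        self._data = ''

    def feed(self, chunk):
        # Append newly arrived data.
        self._data += chunk

    def peek(self, n):
        # Look at up to n characters without consuming them.
        return self._data[:n]

    def read(self, n):
        # Consume and return up to n characters.
        result, self._data = self._data[:n], self._data[n:]
        return result

buf = PeekBuffer()
buf.feed('GET / HTTP/1.1\r\n')
print(buf.peek(3))   # -> GET  (still unconsumed)
print(buf.read(3))   # -> GET
```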

-- 
Greg



From greg.ewing at canterbury.ac.nz  Thu Apr 23 03:20:22 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 23 Apr 2009 13:20:22 +1200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
Message-ID: <49EFC256.3000208@canterbury.ac.nz>

Erik Groeneveld wrote:

> When no exception has been raised, it will raise the
> RuntimeError('generator ignored GeneratorExit').  And when an
> exception has been raised (regardless what type), it will exit just
> normally.

Eh? That's not what happens according to the following
experiment:

   def g():
     try:
       yield
     except GeneratorExit:
       raise ValueError("Blarg!")

   gi = g()
   gi.next()
   gi.close()

which produces

Traceback (most recent call last):
   File "g.py", line 9, in <module>
     gi.close()
   File "g.py", line 5, in g
     raise ValueError("Blarg!")
ValueError: Blarg!

-- 
Greg


From greg.ewing at canterbury.ac.nz  Thu Apr 23 03:23:58 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Thu, 23 Apr 2009 13:23:58 +1200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EF353A.1060204@improva.dk>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
	<49EF353A.1060204@improva.dk>
Message-ID: <49EFC32E.3080406@canterbury.ac.nz>

Jacob Holm wrote:

> In other words, the ValueError in the following example is swallowed by 
> close():
> 
>  >>> def foo():
> ...     yield 'bar'
> ...     raise ValueError('baz')
>  >>> g = foo()
>  >>> g.next()
> 'bar'
>  >>> g.close()

No, g never raises a ValueError here, because the raise
statement is not reached.

You need to wrap the yield in a try-except to catch the
GeneratorExit.

-- 
Greg


From larry at hastings.org  Thu Apr 23 04:33:44 2009
From: larry at hastings.org (Larry Hastings)
Date: Wed, 22 Apr 2009 19:33:44 -0700
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
In-Reply-To: <gso88g$hnv$1@ger.gmane.org>
References: <49EE3D19.7000603@hastings.org> <gso88g$hnv$1@ger.gmane.org>
Message-ID: <49EFD388.7080002@hastings.org>

Scott David Daniels wrote:
> Larry Hastings wrote:
>>
>> I was excited to see PYTHONUSERBASE added in 2.6/3.0.  But I became 
>> disappointed when I learned it only allows you to specify one 
>> directory.  I have use cases where I'd want more than one.
>
> But you can put a .pth file in that directory specifying a number of
> directories to add to the path.  The current mechanism is sufficient.

So when I switch between two virtualized environments, you suggest that 
I remove the .pth files from the old one and add the .pth files from the 
new one?  This strikes you as a good solution?

If I have two virtualized environments, and I want to work with them 
concurrently, and they each require different versions of the same 
library, how do I achieve that with the current mechanism?


/larry/


From ncoghlan at gmail.com  Thu Apr 23 04:46:12 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 23 Apr 2009 12:46:12 +1000
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
In-Reply-To: <49EFD388.7080002@hastings.org>
References: <49EE3D19.7000603@hastings.org> <gso88g$hnv$1@ger.gmane.org>
	<49EFD388.7080002@hastings.org>
Message-ID: <49EFD674.7090401@gmail.com>

Larry Hastings wrote:
> So when I switch between two virtualized environments, you suggest that
> I remove the .pth files from the old one and add the .pth files from the
> new one?  This strikes you as a good solution?

You would have two different directories each with a different .pth file
in it and point PYTHONUSERBASE at the correct one for the virtual
environment you want to use.

PYTHONUSERBASE would still be the switching point - including a .pth
file just lets you extend a given value for PYTHONUSERBASE to mean more
than one directory.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From tjreedy at udel.edu  Thu Apr 23 05:14:33 2009
From: tjreedy at udel.edu (Terry Reedy)
Date: Wed, 22 Apr 2009 23:14:33 -0400
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <acd65fa20904221414u78bf33efse67c71efeff9c5ce@mail.gmail.com>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<acd65fa20904221414u78bf33efse67c71efeff9c5ce@mail.gmail.com>
Message-ID: <gsomep$hai$1@ger.gmane.org>

Alexandre Vassalotti wrote:

>>   * it is a major security risk for untrusted inputs
>>
> 
> There are way to fix this without replacing pickle. See the recipe in
> pickle documentation:
> 
> http://docs.python.org/3.0/library/pickle.html#restricting-globals

On reading that, I notice that it ends with "As our examples shows, you 
have to be careful with what you allow to be unpickled. Therefore if 
security is a concern, you may want to consider alternatives such as the 
marshalling API in xmlrpc.client or third-party solutions."  Raymond's 
proposal is to integrate some third-party solutions with an eye to the 
product becoming a first-party solution.



From larry at hastings.org  Thu Apr 23 05:41:26 2009
From: larry at hastings.org (Larry Hastings)
Date: Wed, 22 Apr 2009 20:41:26 -0700
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
In-Reply-To: <49EFD674.7090401@gmail.com>
References: <49EE3D19.7000603@hastings.org> <gso88g$hnv$1@ger.gmane.org>
	<49EFD388.7080002@hastings.org> <49EFD674.7090401@gmail.com>
Message-ID: <49EFE366.9020109@hastings.org>


Nick Coghlan wrote:
> You would have two different directories each with a different .pth file
> in it and point PYTHONUSERBASE at the correct one for the virtual
> environment you want to use.

This would preclude finding packages in my default PYTHONUSERBASE.  Or 
combining any two virtualized environments together.

I suppose that's solvable if we go back to copying files around, like a 
cheap "virtualenv".  I could make a temporary directory, copy in the 
lowest-priority packages and .pth files, then copy in the next-highest, 
and so on for all "user bases" I want to compose together.  Then point 
PYTHONUSERBASE at that.

But I'd really rather have something like PYTHONUSERBASES.  For the 
record, would you be for or against PYTHONUSERBASES?  I think I'm going 
to knock together a patch tonight; I hope you'll support it.


/larry/

p.s.  Having ruminated about it some more, I'm going to change the name 
to PYTHONPREFIXPATH.  Like "PYTHONPATH", but for "prefix" directories 
(as in "site.PREFIXES").  Still open to suggestions though.


From ben+python at benfinney.id.au  Thu Apr 23 06:23:55 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Thu, 23 Apr 2009 14:23:55 +1000
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
References: <49EE3D19.7000603@hastings.org> <gso88g$hnv$1@ger.gmane.org>
	<49EFD388.7080002@hastings.org> <49EFD674.7090401@gmail.com>
	<49EFE366.9020109@hastings.org>
Message-ID: <87bpqo576s.fsf@benfinney.id.au>

Larry Hastings <larry at hastings.org> writes:

> But I'd really rather have something like PYTHONUSERBASES.  For the
> record, would you be for or against PYTHONUSERBASES?

I'm for the feature as an extension of what we have, and against the
implementation of a new variable for that feature. It belongs as an
extended interpretation of PYTHONUSERBASE, to my understanding.

-- 
 \                              “Holy polar ice sheet, Batman!” —Robin |
  `\                                                                   |
_o__)                                                                  |
Ben Finney



From ncoghlan at gmail.com  Thu Apr 23 07:54:23 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 23 Apr 2009 15:54:23 +1000
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
In-Reply-To: <49EFE366.9020109@hastings.org>
References: <49EE3D19.7000603@hastings.org> <gso88g$hnv$1@ger.gmane.org>
	<49EFD388.7080002@hastings.org> <49EFD674.7090401@gmail.com>
	<49EFE366.9020109@hastings.org>
Message-ID: <49F0028F.3020504@gmail.com>

Larry Hastings wrote:
> 
> Nick Coghlan wrote:
>> You would have two different directories each with a different .pth file
>> in it and point PYTHONUSERBASE at the correct one for the virtual
>> environment you want to use.
> 
> This would preclude finding packages in my default PYTHONUSERBASE.  Or
> combining any two virtualized environments together.

Put "my_default_dir.pth" in each of your virtual environments
(preferably via a symlink if you aren't on Windows). If you want to mix
two environments together, put both of their .pth files in a single
directory (again via symlink if you can).
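
Concretely, the arrangement Nick describes might look something like
this (paths are illustrative only; the site-packages location under a
user base is version-specific, shown here for 2.6):

```shell
# Each "user base" gets its own site-packages with its own .pth files.
SP=lib/python2.6/site-packages
mkdir -p env_a/$SP env_b/$SP mixed/$SP

echo "$HOME/shared-packages" > env_a/$SP/shared.pth
echo "$HOME/env-b-packages"  > env_b/$SP/b.pth

# Mixing two environments: collect their .pth files in one directory
# (symlinks, where available, let updates propagate automatically).
cp env_a/$SP/shared.pth mixed/$SP/
cp env_b/$SP/b.pth      mixed/$SP/

# Switching environments then means changing a single variable:
export PYTHONUSERBASE="$PWD/mixed"
```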

A collection of .pth files in a directory is easy to manage. Messing
with environment variables is a PITA - you need an external solution to
keep track of what the environment variable should contain for each
environment.

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From larry at hastings.org  Thu Apr 23 09:30:26 2009
From: larry at hastings.org (Larry Hastings)
Date: Thu, 23 Apr 2009 00:30:26 -0700
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
In-Reply-To: <49F0028F.3020504@gmail.com>
References: <49EE3D19.7000603@hastings.org> <gso88g$hnv$1@ger.gmane.org>
	<49EFD388.7080002@hastings.org> <49EFD674.7090401@gmail.com>
	<49EFE366.9020109@hastings.org> <49F0028F.3020504@gmail.com>
Message-ID: <49F01912.2070708@hastings.org>


Nick Coghlan wrote:
> A collection of .pth files in a directory is easy to manage. Messing
> with environment variables is a PITA - you need an external solution to
> keep track of what the environment variable should contain for each
> environment.

But to do as you suggest, you also need to "mess with environment 
variables": PYTHONUSERBASE.  Unless you propose changing ~/.local every 
time you switch environments, you need to change that variable every 
time you switch environments.  Therefore your proposed solution must 
modify exactly the same number of environment variables as mine.

Your proposal also adds overhead maintaining those collections of .pth 
files.  For example, if you add a new package to one of your virtualized 
environments, you'd need to update every .pth collection directory which 
references that environment.

Even ignoring those details I still don't agree.  I think adding to a 
single environment variable is far cleaner than copying/linking files 
around on disks.  It can be done easily on a command-by-command basis, 
or stowed in a shell script or an alias, not to mention easily managed 
by a virtualized environment system like virtualenv.

Shall I put you down as a -1?  (I'm not having a lot of luck getting 
numeric votes outta folks.)


/larry/


From ben+python at benfinney.id.au  Thu Apr 23 09:39:24 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Thu, 23 Apr 2009 17:39:24 +1000
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
References: <49EE3D19.7000603@hastings.org> <gso88g$hnv$1@ger.gmane.org>
	<49EFD388.7080002@hastings.org> <49EFD674.7090401@gmail.com>
	<49EFE366.9020109@hastings.org> <49F0028F.3020504@gmail.com>
	<49F01912.2070708@hastings.org>
Message-ID: <87y6tr4y4z.fsf@benfinney.id.au>

Larry Hastings <larry at hastings.org> writes:

> Shall I put you down as a -1?  (I'm not having a lot of luck getting
> numeric votes outta folks.)

Perhaps that should be interpreted as a preference for discussing ideas
and finding the good *and* bad in them, rather than artificially
distilling the discussion to a linear vote.

-- 
 \          “Our products just aren't engineered for security.” —Brian |
  `\             Valentine, senior vice-president of Microsoft Windows |
_o__)                                                development, 2002 |
Ben Finney



From denis.spir at free.fr  Thu Apr 23 09:48:35 2009
From: denis.spir at free.fr (spir)
Date: Thu, 23 Apr 2009 09:48:35 +0200
Subject: [Python-ideas] keywording prohibited
Message-ID: <20090423094835.19fdf337@o>

Hello,

I have noticed a rather strong and nearly systematic opposition to (new) keywords, and cannot really figure out why. Would someone clearly expose the reasoning behind keeping the keyword set as small as possible? (My guess was that it avoids preventing users from using the same words as names, but this reason does not seem to hold. On the contrary: non-keyword builtins bite painfully!)
Could not find references on the topic -- probably because a search containing "keyword" returns loads of unrelated stuff.

Denis
------
la vita e estrany


From ben+python at benfinney.id.au  Thu Apr 23 10:11:48 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Thu, 23 Apr 2009 18:11:48 +1000
Subject: [Python-ideas] keywording prohibited
References: <20090423094835.19fdf337@o>
Message-ID: <87tz4f4wmz.fsf@benfinney.id.au>

spir <denis.spir at free.fr> writes:

> I have noticed a rather strong and nearly systematic opposition to
> (new) keywords. Cannot really figure out why. Would someone clearly
> expose the reasoning behind keeping the keyword set as small as
> possible?

Because a programming language whose syntax and keyword set can be
easily contained all in one's head, concurrently with all the other
things that one must keep thinking about while programming, is superior
to one for which that's more difficult. Increasing the size of the
language (its keywords or other syntax) moves it in that dimension from
easy to difficult.

Sometimes that's worth the cost; but the cost is significant, so the
resistance must be strong.
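
The keyword set in question is small enough to inspect directly; the
standard keyword module lists it, and a shadowed builtin shows the
contrast Denis mentioned:

```python
import keyword

# The full keyword set fits comfortably in one head.
print(len(keyword.kwlist))        # a few dozen keywords in the 2.6/3.0 era
print('yield' in keyword.kwlist)  # -> True

# Builtins, by contrast, are ordinary names and can be shadowed silently:
list = [1, 2, 3]   # rebinds the builtin 'list' in this scope -- no error
```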

-- 
 \         “Broken promises don't upset me. I just think, why did they |
  `\                                         believe me?” —Jack Handey |
_o__)                                                                  |
Ben Finney



From larry at hastings.org  Thu Apr 23 10:34:56 2009
From: larry at hastings.org (Larry Hastings)
Date: Thu, 23 Apr 2009 01:34:56 -0700
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
In-Reply-To: <87y6tr4y4z.fsf@benfinney.id.au>
References: <49EE3D19.7000603@hastings.org>
	<gso88g$hnv$1@ger.gmane.org>	<49EFD388.7080002@hastings.org>
	<49EFD674.7090401@gmail.com>	<49EFE366.9020109@hastings.org>
	<49F0028F.3020504@gmail.com>	<49F01912.2070708@hastings.org>
	<87y6tr4y4z.fsf@benfinney.id.au>
Message-ID: <49F02830.20104@hastings.org>


Ben Finney wrote:
> Larry Hastings <larry at hastings.org> writes:
>   
>> Shall I put you down as a -1?  (I'm not having a lot of luck getting
>> numeric votes outta folks.)
>>     
> Perhaps that should be interpreted as a preference for discussing ideas
> and finding the good *and* bad in them, rather than artificially
> distilling the discussion to a linear vote.

Surely numeric voting in Python newsgroups has a long and rich 
tradition.  And I don't think "-1" is a misrepresentation of Nick's 
reaction to my proposal.  But I do await his reply.

While we're on the subject, I've just posted a patch, here:

    http://bugs.python.org/issue5819

Note that I had a change of heart and renamed the environment variable, 
as someone suggested.  The current name is PYTHONPREFIXES.

I'm going to nip off and post about it to python-dev,


/larry/


From ben+python at benfinney.id.au  Thu Apr 23 10:51:34 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Thu, 23 Apr 2009 18:51:34 +1000
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
References: <49EE3D19.7000603@hastings.org> <gso88g$hnv$1@ger.gmane.org>
	<49EFD388.7080002@hastings.org> <49EFD674.7090401@gmail.com>
	<49EFE366.9020109@hastings.org> <49F0028F.3020504@gmail.com>
	<49F01912.2070708@hastings.org>
	<87y6tr4y4z.fsf@benfinney.id.au> <49F02830.20104@hastings.org>
Message-ID: <87ljpr4usp.fsf@benfinney.id.au>

Larry Hastings <larry at hastings.org> writes:

> Ben Finney wrote:
> > Larry Hastings <larry at hastings.org> writes:
> >   
> >> Shall I put you down as a -1? (I'm not having a lot of luck getting
> >> numeric votes outta folks.)
> >>     
> > Perhaps that should be interpreted as a preference for discussing
> > ideas and finding the good *and* bad in them, rather than
> > artificially distilling the discussion to a linear vote.
> 
> Surely numeric voting in Python newsgroups has a long and rich
> tradition.

Yes, and with good reason: it's a useful short cut *if* the person
merely wants to say they unilaterally support or oppose a proposal.

I'm suggesting that, on a forum dedicated to *discussing ideas* which
are hopefully interesting enough to be worth thinking about and
discussing, you're perhaps oversimplifying if you hope the responses to
an idea can be usefully distilled to one of those two.

> While we're on the subject, I've just posted a patch, here:

Thanks for proceeding with this work without waiting for votes :-)

-- 
 \           “[W]hoever is able to make you absurd is able to make you |
  `\                                                unjust.” —Voltaire |
_o__)                                                                  |
Ben Finney



From erik at cq2.org  Thu Apr 23 11:31:03 2009
From: erik at cq2.org (Erik Groeneveld)
Date: Thu, 23 Apr 2009 11:31:03 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EFC32E.3080406@canterbury.ac.nz>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
	<49EF353A.1060204@improva.dk> <49EFC32E.3080406@canterbury.ac.nz>
Message-ID: <aaec99390904230231t35772cf1w3c95090d0d903df1@mail.gmail.com>

Greg, Jacob, Jim,

> No, g never raises a ValueError here, because the raise
> statement is not reached.
>
> You need to wrap the yield in a try-except to catch the
> GeneratorExit.

Thanks for signaling this.  The example was wrong.  I was confused by
my code implicitly turning a GeneratorExit into StopIteration. In my
unittests I have:

        def f2():
            try:
                yield
            except GeneratorExit:
                pass
            # implicit raise StopIteration here

and I added a duplicate to test the problem more explicitly:

        def f3():
            try:
                yield
            except GeneratorExit:
                pass
            yield  # does not raise an exception but yields None

Thanks a lot, I was now able to complete all unittests and code
according to the new PEP.

There is one problem left however.  The code dealing with
GeneratorExit has to undo the work of close() a bit.  To account for:

      "If this call results in an exception, it is propagated to the delegating
      generator. Otherwise, GeneratorExit is raised in the delegating
generator."

and knowing that close() raises a RuntimeError when a generator
ignores GeneratorExit, I have:

                    try:
                        generator.close()
                    except RuntimeError:
                        pass
                    raise GeneratorExit

But this code cannot tell if the generator intended to raise a
RuntimeError.  Indeed, I can't make this test work with RuntimeError
(see commented lines):

        msg = []
        def f8():
            try:
                yield f9()
            #except RuntimeError, e:
            except ValueError, e:
                msg.append(str(e))
                raise StopIteration()
        def f9():
            try:
                yield
            except GeneratorExit:
                msg.append('GeneratorExit turned into ValueError')
                #raise RuntimeError('stop here')
                raise ValueError('stop here')
            yield

        g8 = compose(f8())
        g8.next()
        try:
            g8.throw(GeneratorExit())
            self.fail('must raise StopIteration')
        except StopIteration:
            pass
        self.assertEquals(['GeneratorExit turned into ValueError',
'stop here'], msg)

I wonder what you think about this and how to get this right.

Erik


From jh at improva.dk  Thu Apr 23 12:34:48 2009
From: jh at improva.dk (Jacob Holm)
Date: Thu, 23 Apr 2009 12:34:48 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <aaec99390904230231t35772cf1w3c95090d0d903df1@mail.gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz>	
	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>	
	<49EF353A.1060204@improva.dk> <49EFC32E.3080406@canterbury.ac.nz>
	<aaec99390904230231t35772cf1w3c95090d0d903df1@mail.gmail.com>
Message-ID: <49F04448.7040807@improva.dk>

Hi Erik

Erik Groeneveld wrote:
> There is one problem left however.  The code dealing with
> GeneratorExit has to undo the work of close() a bit.  To account for:
> 
>       "If this call results in an exception, it is propagated to the delegating
>       generator. Otherwise, GeneratorExit is raised in the delegating
> generator."
> 
> and knowing that close() raises a RuntimeError when a generator
> ignores GeneratorExit, I have:
> 
>                     try:
>                         generator.close()
>                     except RuntimeError:
>                         pass
>                     raise GeneratorExit
> 
> But this code cannot tell if the generator intended to raise a
> RuntimeError. 

Why do you think you can't just do:

    generator.close()
    raise GeneratorExit

instead?

This does exactly what the quote says, and if you look at the expansion 
in the PEP that is exactly how it is defined there.
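
The two-line version behaves as the quote requires because close()
itself propagates any exception other than StopIteration/GeneratorExit.
A hand-rolled check (this mimics the delegation by hand; it is not an
actual yield-from):

```python
def inner():
    try:
        yield
    except GeneratorExit:
        raise ValueError('cleanup failed')   # close() will propagate this

gi = inner()
next(gi)

try:
    gi.close()           # the ValueError propagates out of close() ...
    raise GeneratorExit  # ... so this line is skipped entirely
except ValueError as e:
    print(e)             # -> cleanup failed
```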

HTH
-Jacob


From erik at cq2.org  Thu Apr 23 13:31:36 2009
From: erik at cq2.org (Erik Groeneveld)
Date: Thu, 23 Apr 2009 13:31:36 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49F04448.7040807@improva.dk>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
	<49EF353A.1060204@improva.dk> <49EFC32E.3080406@canterbury.ac.nz>
	<aaec99390904230231t35772cf1w3c95090d0d903df1@mail.gmail.com>
	<49F04448.7040807@improva.dk>
Message-ID: <aaec99390904230431s50108be7u7cabf64344c55f32@mail.gmail.com>

2009/4/23 Jacob Holm <jh at improva.dk>:

>>       "If this call results in an exception, it is propagated to the delegating
>>       generator. Otherwise, GeneratorExit is raised in the delegating
>> generator."
>>
>> and knowing that close() raises a RuntimeError when a generator
>> ignores GeneratorExit, I have:
>>
>>                     try:
>>                         generator.close()
>>                     except RuntimeError:
>>                         pass
>>                     raise GeneratorExit
>>
>> But this code cannot tell if the generator intended to raise a
>> RuntimeError.
>
> Why do you think you can't just do:
>
>     generator.close()
>     raise GeneratorExit
>
> instead?
>
> This does exactly what the quote says, and if you look at the expansion in the PEP that is exactly how it is defined there.

These two lines were exactly what I started out with, but I see that I
expanded it because I had a different interpretation of the PEP.  I
read it as saying that when the generator did not raise an exception,
but close() does, it is a different situation.  I'll think about it
more deeply.  Thanks btw!

Erik


>
> HTH
> -Jacob
>


From erik at cq2.org  Thu Apr 23 13:35:06 2009
From: erik at cq2.org (Erik Groeneveld)
Date: Thu, 23 Apr 2009 13:35:06 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EFBBEE.20401@canterbury.ac.nz>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220551g325c6905redd038324ccce648@mail.gmail.com>
	<49EFBBEE.20401@canterbury.ac.nz>
Message-ID: <aaec99390904230435g2787ddbuf621261957e96810@mail.gmail.com>

2009/4/23 Greg Ewing <greg.ewing at canterbury.ac.nz>:
> Erik Groeneveld wrote:
>
>> the readRe generator must be able to
>> indicate that it has superfluous data, and this data must be processed
>> by other generators.
>>
>> Have you thought about this?  How would you solve it?
>
> I think you're expecting a bit much from yield-from.

Well, you asked for practical applications, and here is one.  I hope
to be able to use yield-from in Weightless instead of its compose
function.  However, I do not see how a yield-from without support for
splitting boundaries would be combined with my own code to do the
latter.  If this combination is not possible, I would be forced to
keep using compose instead of yield-from, which I would very much regret.

So I am expecting at least a yield-from that can be combined
orthogonally with my boundary splitting code (and other things, see
below).  At present, it can't, because there is no way to detect or
intercept a yield-from.

> In a situation like this, I wouldn't use yield to
> receive the values. I'd read them from some kind of
> buffering object that allows peeking ahead however
> far is needed.

Well, the whole point of using coroutines is to avoid buffering.  I'll
try to elaborate on this point a bit, and I hope I can convince you
and others to investigate what the consequences of this type of
application could be for the usage or implementation of yield-from.


When generalized generators were introduced many people immediately
saw the advantage for using them for thread-less I/O: tie a generator
to a socket.  I took the challenge and found it to be extraordinary
complicated.  Back to that later, first a little background.

I started with Michael Jackson's now more than 30-year-old JSP theory
about structuring programs based on the input and output streams they
process, all based on coroutines.  His assumptions about the memory
and storage latency of mainframes are valid today for web servers.
The idea basically boils down to decomposing a data-processing program
into coroutines, as easily as you are used to doing with functions.  A
programmer would be able to 'call' sub-coroutines as if they were
functions, without needing to dive into subtle and hard-to-understand
differences or inconsistencies between the two.

It took me two years to get it right.  Every time I switched to the
role of 'a programmer', I got stuck with code not working as expected,
incomprehensible stack-traces, etc.  Others were even more puzzled.  It
was not transparent in its usage and I had to go back to the workbench.

But what a reward when it finally worked!  I have never seen such
simple, easy-to-read code for, say, an HTTP server.  Notoriously
difficult bugs in my callback-based HTTP server that I was not able to
solve just vanished.  I am still impressed by the cleanness of the
code and I keep wondering: 'can it really be that simple?'.  Was this
really conceived more than 30 years ago?  Jackson must have been a
genius!

Since then I have presented this on the British SPA conference and two
Dutch Pythonist groups.  I assembled a callback-vs-coroutine test case
which clearly demonstrates the differences in amount of code,
readability of code and locality of change when adding features.
People explicitly appreciated the intuitive behavior for a programmer.
(all documented at http://weightless.io and code fragments in svn)

Back to why it was so complicated.

First of all, as you already know, it is not possible to use just a
straightforward for-loop to delegate to another coroutine. The
yield-from proposal covers all of this, I believe.

Secondly, if something goes wrong and a stack-trace is printed, this
stack-trace would not reflect the proper sequence in which coroutines
were called (this really makes a programmer go mad!), at least not
without additional efforts to maintain an explicit callstack with each
generator on it, and using this to adjust the stack-trace when needed.
(This is why I asked if the coroutine will be on the call-stack and
hence be visible in a stack-trace).

Thirdly, there seems to be some sort of unspoken 'protocol' with
generators.  A next() is actually send(None) and vaguely means 'I want
data'.  In the same vein, 'x = yield' actually is 'x = yield None' and
also vaguely means 'I want data'.  So None seems to play a special
role.  I hesitated a lot, but I had to apply this 'protocol' to
coroutines, otherwise it was next to impossible to work with them as
'the programmer'; it required constant checking of what happened.
Funnily enough, it turned out to be a major breakthrough in getting it
transparent to a programmer.

Fourthly, there is the issue of boundary clashes.  These are common in
any data-processing problem.  The input data is simply not structured
or tokenized according to boundaries on a certain level of
abstraction.  This is the *very reason* to use coroutines and Jackson
describes elegant ways to solve the problem.  JSP requires a lookahead
and the coroutines must have some way to support this.  (Introducing a
stream or buffer would put us back to where we started of course).
After several tries I settled for a push-back mechanism as this was
the most elegant way (IMHO) to solve it.  (This is why I suggested
'return value, pushbackdata').


At this point I hope I have gained your interest in this kind of
data-processing application, and I hope that we can have a fruitful
discussion about it.

Also, I would like to hear what other kinds of applications you have in mind.

Best regards
Erik


From ncoghlan at gmail.com  Thu Apr 23 13:35:39 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 23 Apr 2009 21:35:39 +1000
Subject: [Python-ideas] PYTHONUSERBASES (plural!)
In-Reply-To: <49F01912.2070708@hastings.org>
References: <49EE3D19.7000603@hastings.org> <gso88g$hnv$1@ger.gmane.org>
	<49EFD388.7080002@hastings.org> <49EFD674.7090401@gmail.com>
	<49EFE366.9020109@hastings.org> <49F0028F.3020504@gmail.com>
	<49F01912.2070708@hastings.org>
Message-ID: <49F0528B.9000102@gmail.com>

Larry Hastings wrote:
> Even ignoring those details I still don't agree.  I think adding to a
> single environment variable is far cleaner than copying/linking files
> around on disks.  It can be done easily on a command-by-command basis,
> or stowed in a shell script or an alias, not to mention easily managed
> by a virtualized environment system like virtualenv.

If you're monkeying with an environment variable, then why not just
modify PYTHONPATH directly?
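
As a concrete illustration, PYTHONPATH already gives per-command
control, observable from a child interpreter (the directory below is
made up and need not exist):

```python
import os
import subprocess
import sys

# A directory named in PYTHONPATH lands on the child interpreter's
# sys.path, with no file copying or linking involved.
env = dict(os.environ, PYTHONPATH='/tmp/mylibs')
out = subprocess.check_output(
    [sys.executable, '-c', 'import sys; print("/tmp/mylibs" in sys.path)'],
    env=env,
)
on_path = out.decode().strip()
```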

Also, messing with environment variables on Windows is a real PITA
(largely because there is no env equivalent).

> Shall I put you down as a -1?  (I'm not having a lot of luck getting
> numeric votes outta folks.)

Yep, it currently gets a "-1, redundant" from me.

However, I think you would get much less push back if you:
1. Articulated clearly the specific scenarios you want to support beyond
the simple single user site-packages equivalent that PYTHONUSERBASE was
designed to support
2. Described why PYTHONUSERBASE + .pth files don't support them well
3. Described why traditional sys.path altering techniques (e.g. via
PYTHONPATH) or tools like virtualenv aren't adequate to address these
scenarios
4. Described precisely how your new mechanism improves support for the
identified scenarios.

So far it appears to me that you've picked up the shiny new
PYTHONUSERBASE hammer and are trying to hit a screw with it when there
are plenty of existing screwdrivers lying around. However, I don't know
if that is actually what is going on, or if I just haven't understood
the use cases you are wanting to support.

Cheers,
Nick.

P.S. I'm moving house tomorrow and don't know when I will get my net
connection back. So don't worry too much about trying to persuade me - I
probably won't be around until after the first 3.1 beta has already
happened.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Thu Apr 23 13:50:52 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 23 Apr 2009 21:50:52 +1000
Subject: [Python-ideas] keywording prohibited
In-Reply-To: <20090423094835.19fdf337@o>
References: <20090423094835.19fdf337@o>
Message-ID: <49F0561C.8050807@gmail.com>

spir wrote:
> Hello,
> 
> I have noticed a rather strong and nearly systematic opposition to
> (new) keywords. Cannot really figure out why. Would someone clearly
> expose the reasoning behind keeping the keyword set as small as
> possible? (I thought at not preventing users to use the same words as
> names, but this reason does not seem to hold. On the contrary:
> non-keyword builtins bite painfully!)

However, using builtin names as attribute or method names is often quite
useful and sensible. You can't do that with keywords - those are
disallowed everywhere other than in the syntactic constructs that rely
on them. You can see the ugly workarounds that are needed in the case of
a couple of existing keywords like 'class' (where people have to use
'cls', 'class_' or 'klass' instead) and 'assert' (generally replaced
with either 'assert_' or 'assertTrue').

In addition to the above and to what Ben said, every new keyword also
means going through the whole __future__ and DeprecationWarning dance in
order to warn people about the introduction of the new keyword without
breaking existing code. Any users whose code triggers those warnings
will have to expend additional effort to port their application to the
Python version where the new keyword is enabled by default.

That last point is enough of a reason to dislike the idea of new
keywords purely from the point of view of aggregate development effort
(i.e. not just effort from the core devs, but the porting effort from
all the affected third-party developers as well).

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From ncoghlan at gmail.com  Thu Apr 23 14:05:25 2009
From: ncoghlan at gmail.com (Nick Coghlan)
Date: Thu, 23 Apr 2009 22:05:25 +1000
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <aaec99390904230435g2787ddbuf621261957e96810@mail.gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz>	<aaec99390904220551g325c6905redd038324ccce648@mail.gmail.com>	<49EFBBEE.20401@canterbury.ac.nz>
	<aaec99390904230435g2787ddbuf621261957e96810@mail.gmail.com>
Message-ID: <49F05985.6050104@gmail.com>

Erik Groeneveld wrote:
> Fourthly, there is the issue of boundary clashes.  These are common in
> any data-processing problem.  The input data is simply not structured
> or tokenized according to boundaries on a certain level of
> abstraction.  This is the *very reason* to use coroutines and Jackson
> describes elegant ways to solve the problem.  JSP requires a lookahead
> and the coroutines must have some way to support this.  (Introducing a
> stream or buffer would put us back to where we started of course).
> After several tries I settled for a push-back mechanism as this was
> the most elegant way (IMHO) to solve it.  (This is why I suggested
> 'return value, pushbackdata').

Generators will be allowed to return tuples under the PEP, just like
normal functions. So what's wrong with doing something like the following?:

  def dummy_example():
    pushback = None
    while 1:
      item, pushback = yield from read_item(pushback)
      process_item(item)

  def read_item(init_data=None):
    if init_data is not None:
      # Initialise state based on init_data
    else:
      # Use default initial state
    # Read enough data to get a complete item
    # Since this is a coroutine, there will be at least 1 yield
    return item, excess_data

Cheers,
Nick.

-- 
Nick Coghlan   |   ncoghlan at gmail.com   |   Brisbane, Australia
---------------------------------------------------------------


From aahz at pythoncraft.com  Thu Apr 23 14:53:54 2009
From: aahz at pythoncraft.com (Aahz)
Date: Thu, 23 Apr 2009 05:53:54 -0700
Subject: [Python-ideas] keywording prohibited
In-Reply-To: <20090423094835.19fdf337@o>
References: <20090423094835.19fdf337@o>
Message-ID: <20090423125354.GA59@panix.com>

On Thu, Apr 23, 2009, spir wrote:
>
> I have noticed a rather strong and nearly systematic opposition to
> (new) keywords. Cannot really figure out why. Would someone clearly
> expose the reasoning behind keeping the keyword set as small as
> possible? (I thought at not preventing users to use the same words
> as names, but this reason does not seem to hold. On the contrary:
> non-keyword builtins bite painfully!)

How do non-keyword builtins bite?

Also, consider that every single non-Python syntax checking mechanism
must be updated (e.g. vim/emacs syntax highlighter).
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"If you think it's expensive to hire a professional to do the job, wait
until you hire an amateur."  --Red Adair


From seb.binet at gmail.com  Thu Apr 23 20:31:30 2009
From: seb.binet at gmail.com (Sebastien Binet)
Date: Thu, 23 Apr 2009 20:31:30 +0200
Subject: [Python-ideas] registery system in Python ?
In-Reply-To: <94bdd2610904071105p718c1f72md0fbb6e8e59d9663@mail.gmail.com>
References: <94bdd2610904071105p718c1f72md0fbb6e8e59d9663@mail.gmail.com>
Message-ID: <200904232031.30825.binet@cern.ch>

hi,

> I am working on a plugin system for Distutils, inspired from what
> setuptools provides (entry_points)
> so I am trying to describe how a generic registery could work.
>
> But, as discussed with some people at Pycon, this is a general need.
> What about adding a simple generic registery system in Python stdlib ?
>
> The APIs I was thinking about would register plugins under group names
> for an easy classification:
>
> - get_plugin(group, name) : returns an object for (group, name)
> - register_plugin(group, name, object): register an object, for (group,
> name) - unregister_plugin(group, name): removes an object for (group, name)
> - list_plugins(group=None, doc=False): returns a list of all objects for
> the given group.
> - list_groups(): return a list of all groups
>
> having groups makes it simpler to classify plugins. In my use case,
> group could be : 'distutils:filelist'
> to list all plugins that know how to build a file list. (see
> http://wiki.python.org/moin/Distutils/ManifestPluginSystem)

sounds nice but what about 'discovering' new plugins ?
writing a plugin system in python isn't super hard (it is merely a glorified
API around a dict plus some bells and whistles); the discovery part is a bit
trickier though:
 - populate a "well known" (set of) directory
 - naming conventions: all files *_plugin.{ini,py} are implicit plugin
registration files
 - ?

or is the underlying idea to write a 'filelist' plugin which will discover new 
plugins ?

cheers,
sebastien.


From benjamin at python.org  Thu Apr 23 23:46:36 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Thu, 23 Apr 2009 21:46:36 +0000 (UTC)
Subject: [Python-ideas] Heap data type
References: <e04bdf310904180521j76689f6j6cc7d207094b2d33@mail.gmail.com>
	<20090418124357.GA8506@panix.com>
	<3CDA63554E1546DEA84A696B56BB4876@RaymondLaptop1>
	<5d1a32000904200601q4e4cb41ewe89ea781d1bfabd2@mail.gmail.com>
	<gsic4v$sa8$1@ger.gmane.org>
	<5d1a32000904201216h6fa71153keb84036ad68d5bf3@mail.gmail.com>
	<gsjkkv$2t0$1@ger.gmane.org>
	<5d1a32000904210644g2d41581dvcbdc3f6fc29646bf@mail.gmail.com>
Message-ID: <loom.20090423T214553-636@post.gmane.org>

Gerald Britton <gerald.britton at ...> writes:

> 
> Ok thanks.  Any idea what the uptake is on Py3?  I know that the
> projects I work on are not even talking about moving in that direction
> due to the significant migration effort.  Also, I hear that
> performance is an issue, though that might improve over the longer
> term.

3.0, of course, doesn't yet have a huge following.

Most of the performance problems will be fixed in 3.1.






From greg.ewing at canterbury.ac.nz  Fri Apr 24 00:19:00 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 24 Apr 2009 10:19:00 +1200
Subject: [Python-ideas] keywording prohibited
In-Reply-To: <20090423094835.19fdf337@o>
References: <20090423094835.19fdf337@o>
Message-ID: <49F0E954.3060004@canterbury.ac.nz>

spir wrote:

> I have noticed a rather strong and nearly systematic opposition to (new)
> keywords. Cannot really figure out why.

Every time a new keyword is added, it has the potential
to break the code of someone who is using it as a
variable name.

Since we place a high value on not breaking existing
code, we are naturally reluctant to add new keywords.

We do sometimes, but there has to be an extremely good
reason for it.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Apr 24 00:25:27 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 24 Apr 2009 10:25:27 +1200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <aaec99390904230231t35772cf1w3c95090d0d903df1@mail.gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
	<49EF353A.1060204@improva.dk> <49EFC32E.3080406@canterbury.ac.nz>
	<aaec99390904230231t35772cf1w3c95090d0d903df1@mail.gmail.com>
Message-ID: <49F0EAD7.5010703@canterbury.ac.nz>

Erik Groeneveld wrote:

> and knowing that close() raises a RuntimeError when a generator
> ignores GeneratorExit, I have:
> 
>                     try:
>                         generator.close()
>                     except RuntimeError:
>                         pass
>                     raise GeneratorExit

No, you shouldn't be doing that -- just let the
RuntimeError propagate. The "otherwise" refers to the
case where the close() call completes normally
without raising any exception.

-- 
Greg


From michael.s.gilbert at gmail.com  Fri Apr 24 00:29:15 2009
From: michael.s.gilbert at gmail.com (Michael S. Gilbert)
Date: Thu, 23 Apr 2009 18:29:15 -0400
Subject: [Python-ideas] A conditional "for" statement
Message-ID: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>

Hello, 

I've recently been working on some code where I am processing a 
list, but excluding certain items.  The solution is to use a list 
comprehension in the "for" statement.  An example is:

  for m in [n for n in range( 0 , 5 ) if n != 2]

Determining what's going on here isn't immediately obvious (i.e.
what's this new variable n doing?).  It would be nice to have a more 
streamlined syntax such as:

  for m in range( 0 , 5 ) with m != 2

which is much cleaner and obvious.  It is also very much like the
list comprehension syntax (although I've changed "if" to "with", and
which of the two is better is a matter of personal opinion).  I
would hope that the statements following "with" could be any conditional
expression.

This is just a thought I had while working today.  Thank you for your
consideration.

Best Regards,
Mike


From pyideas at rebertia.com  Fri Apr 24 00:34:58 2009
From: pyideas at rebertia.com (Chris Rebert)
Date: Thu, 23 Apr 2009 15:34:58 -0700
Subject: [Python-ideas] A conditional "for" statement
In-Reply-To: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
References: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
Message-ID: <50697b2c0904231534r40d73263md478ddac6fef2e90@mail.gmail.com>

On Thu, Apr 23, 2009 at 3:29 PM, Michael S. Gilbert
<michael.s.gilbert at gmail.com> wrote:
> Hello,
>
> I've recently been working on some code where i am processing a
> list, but excluding certain items.  The solution is to use a list
> comprehension in the "for" statement.  An example is:
>
>   for m in [n for n in range( 0 , 5 ) if n != 2]
>
> Determining what's going on here isn't immediately obvious (i.e.
> what's this new variable n doing?).  It would be nice to have a more
> streamlined syntax such as:
>
>   for m in range( 0 , 5 ) with m != 2

I don't see how this is clearer than either of the obvious alternatives:

for m in range(0 , 5):
    if m == 2:
        continue
    #loop body

for m in range(0 , 5):
    if m != 2:
        #loop body

It's certainly /slightly/ shorter, but the problem is not severe
enough to warrant new syntax, imho.
Also, this uses the `with` keyword in a completely different way from
its existing use, which could be confusing.
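
Note also that when building the intermediate list is the concern, a
generator expression keeps the filter inline using existing syntax:

```python
# The filter stays inline, but no intermediate list is built.
filtered = (n for n in range(0, 5) if n != 2)
result = list(filtered)   # [0, 1, 3, 4]
```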

Cheers,
Chris
-- 
I have a blog:
http://blog.rebertia.com


From michael.s.gilbert at gmail.com  Fri Apr 24 00:49:29 2009
From: michael.s.gilbert at gmail.com (Michael S. Gilbert)
Date: Thu, 23 Apr 2009 18:49:29 -0400
Subject: [Python-ideas] A conditional "for" statement
In-Reply-To: <50697b2c0904231534r40d73263md478ddac6fef2e90@mail.gmail.com>
References: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
	<50697b2c0904231534r40d73263md478ddac6fef2e90@mail.gmail.com>
Message-ID: <20090423184929.40eff2e9.michael.s.gilbert@gmail.com>

On Thu, 23 Apr 2009 15:34:58 -0700, Chris Rebert wrote:
> I don't see how this is clearer than either of the obvious alternatives:

The purpose would also be efficiency.  It's less efficient to check the
condition m==2 during every loop iteration than it is to set up a list
that excludes m=2 to begin with.

> It's certainly /slightly/ shorter, but the problem is not severe
> enough to warrant new syntax, imho.
> Also, this uses the `with` keyword in a completely different way from
> its existing use, which could be confusing.

Right, like I said, that's debatable, and "if" probably makes more
sense since it mimics the list comprehension syntax anyway.  Example:

  for m in range( 0 , 5 ) if m != 2:
    print m

vs

  [m for m in range( 0 , 5 ) if m != 2]

Mike


From tjreedy at udel.edu  Fri Apr 24 00:56:28 2009
From: tjreedy at udel.edu (Terry Reedy)
Date: Thu, 23 Apr 2009 18:56:28 -0400
Subject: [Python-ideas] A conditional "for" statement
In-Reply-To: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
References: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
Message-ID: <gsqrmr$hic$1@ger.gmane.org>

Proposed and rejected.

The difference between

for i in seq:
   if p(i):
     do(i)

and

for i in seq if p(i):
   do(i)

is deletion of ':\n'.  Hardly worth the bother.



From michael.s.gilbert at gmail.com  Fri Apr 24 01:07:07 2009
From: michael.s.gilbert at gmail.com (Michael S. Gilbert)
Date: Thu, 23 Apr 2009 19:07:07 -0400
Subject: [Python-ideas] A conditional "for" statement
In-Reply-To: <gsqrmr$hic$1@ger.gmane.org>
References: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
	<gsqrmr$hic$1@ger.gmane.org>
Message-ID: <20090423190707.65ef4471.michael.s.gilbert@gmail.com>

On Thu, 23 Apr 2009 18:56:28 -0400, Terry Reedy wrote:
> Proposed and rejected.
> 
> The difference between
> 
> for i in seq:
>    if p(i):
>      do(i)
> 
> and
> 
> for i in seq if p(i):
>    do(i)
> 
> is deletion of ':\n'.  Hardly worth the bother.

What about the difference in efficiency?  For the second case, you've
reduced the number of iterations (by the number of items that your
conditional expression has excluded) and eliminated one operation per
iteration (evaluating the if statement).

Mike


From bruce at leapyear.org  Fri Apr 24 01:19:13 2009
From: bruce at leapyear.org (Bruce Leban)
Date: Thu, 23 Apr 2009 16:19:13 -0700
Subject: [Python-ideas] A conditional "for" statement
In-Reply-To: <20090423190707.65ef4471.michael.s.gilbert@gmail.com>
References: <20090423182915.7bda606c.michael.s.gilbert@gmail.com> 
	<gsqrmr$hic$1@ger.gmane.org>
	<20090423190707.65ef4471.michael.s.gilbert@gmail.com>
Message-ID: <cf5b87740904231619h11de93c2nec9d32a66064b66@mail.gmail.com>

what makes you think that this would be more efficient? If it was
significant, surely the compiler could detect the for/if idiom and optimize
it.

On Thu, Apr 23, 2009 at 4:07 PM, Michael S. Gilbert <
michael.s.gilbert at gmail.com> wrote:

> On Thu, 23 Apr 2009 18:56:28 -0400, Terry Reedy wrote:
> > The difference between
> >
> > for i in seq:
> >    if p(i):
> >      do(i)
> >
> > and
> >
> > for i in seq if p(i):
> >    do(i)
> >
> > is deletion of ':\n'.  Hardly worth the bother.
>
> What about the difference in efficiency?  For the second case, you've
> reduced the number of iterations (by the number of items that your
> conditional expression has excluded) and eliminated one operation per
> iteration (evaluation the if statement).
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090423/d763c185/attachment.html>

From michael.s.gilbert at gmail.com  Fri Apr 24 01:40:16 2009
From: michael.s.gilbert at gmail.com (Michael S. Gilbert)
Date: Thu, 23 Apr 2009 19:40:16 -0400
Subject: [Python-ideas] A conditional "for" statement
In-Reply-To: <20090423190914.2281b0d3@bhuda.mired.org>
References: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
	<50697b2c0904231534r40d73263md478ddac6fef2e90@mail.gmail.com>
	<20090423184929.40eff2e9.michael.s.gilbert@gmail.com>
	<20090423190914.2281b0d3@bhuda.mired.org>
Message-ID: <20090423194016.949f2f75.michael.s.gilbert@gmail.com>

On Thu, 23 Apr 2009 19:09:14 -0400, Mike Meyer wrote:
> If you believe this is still true (though as has been pointed out,
> this idea has already been considered and rejected), show us the
> timings. In particular, with a range of different values for the
> percentage of excluded elements: say 0, 10, 50, 90 and 100?

Ok, I'll admit when I'm wrong.  It actually takes longer to use the
list comprehension (for all exclusion percentages), which, for me at
least, is completely unintuitive.  For 1 million samples:

excluded: 0%, comprehension time: 0.840000
excluded: 0%, if time: 0.530000
excluded: 10%, comprehension time: 0.770000
excluded: 10%, if time: 0.520000
excluded: 20%, comprehension time: 0.710000
excluded: 20%, if time: 0.490000
excluded: 30%, comprehension time: 0.670000
excluded: 30%, if time: 0.460000
excluded: 40%, comprehension time: 0.610000
excluded: 40%, if time: 0.450000
excluded: 50%, comprehension time: 0.560000
excluded: 50%, if time: 0.420000
excluded: 60%, comprehension time: 0.510000
excluded: 60%, if time: 0.400000
excluded: 70%, comprehension time: 0.450000
excluded: 70%, if time: 0.380000
excluded: 80%, comprehension time: 0.410000
excluded: 80%, if time: 0.350000
excluded: 90%, comprehension time: 0.360000
excluded: 90%, if time: 0.330000
excluded: 100%, comprehension time: 0.310000
excluded: 100%, if time: 0.300000
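
For the record, here is roughly how such a comparison can be scripted
(a reconstruction, not the exact script used above; the timer call and
sample size are illustrative):

```python
import time

def bench(exclude_fraction, n=200000):
    # Hypothetical reconstruction: items below the cutoff are excluded.
    data = list(range(n))
    cutoff = int(n * exclude_fraction)
    start = time.perf_counter()
    for m in [x for x in data if x >= cutoff]:   # comprehension version
        pass
    comp_time = time.perf_counter() - start
    start = time.perf_counter()
    for m in data:                               # plain 'if' version
        if m >= cutoff:
            pass
    if_time = time.perf_counter() - start
    return comp_time, if_time

comp_time, if_time = bench(0.5)
```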

I hereby accept the rejection of this idea since it makes sense neither
from a syntax nor efficiency point of view.

Best Regards,
Mike


From greg.ewing at canterbury.ac.nz  Fri Apr 24 01:44:49 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 24 Apr 2009 11:44:49 +1200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <aaec99390904230431s50108be7u7cabf64344c55f32@mail.gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220751l2a51a056xf0cc86bd99504264@mail.gmail.com>
	<49EF353A.1060204@improva.dk> <49EFC32E.3080406@canterbury.ac.nz>
	<aaec99390904230231t35772cf1w3c95090d0d903df1@mail.gmail.com>
	<49F04448.7040807@improva.dk>
	<aaec99390904230431s50108be7u7cabf64344c55f32@mail.gmail.com>
Message-ID: <49F0FD71.3040207@canterbury.ac.nz>

Erik Groeneveld wrote:

> I interpreted it as when the generator did not raise an exception, but
> close does, it is a different situation.  I'll think about it more
> deeply.

Keep in mind that as far as the PEP is concerned,
it's not necessarily dealing with a generator, just
some object that happens to implement certain
methods. It has no idea what's going on inside the
close() method -- it can only go by the end result
of the call.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Apr 24 02:20:40 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 24 Apr 2009 12:20:40 +1200
Subject: [Python-ideas] A conditional "for" statement
In-Reply-To: <20090423190707.65ef4471.michael.s.gilbert@gmail.com>
References: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
	<gsqrmr$hic$1@ger.gmane.org>
	<20090423190707.65ef4471.michael.s.gilbert@gmail.com>
Message-ID: <49F105D8.8090501@canterbury.ac.nz>

Michael S. Gilbert wrote:

> What about the difference in efficiency?  For the second case, you've
> reduced the number of iterations (by the number of items that your
> conditional expression has excluded) and eliminated one operation per
> iteration (evaluation the if statement).

No, you haven't. You still have to perform the test
for every iteration. Your proposed syntax would do
*exactly* the same thing as the 3-line version.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Fri Apr 24 02:22:10 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 24 Apr 2009 12:22:10 +1200
Subject: [Python-ideas] A conditional "for" statement
In-Reply-To: <20090423194016.949f2f75.michael.s.gilbert@gmail.com>
References: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
	<50697b2c0904231534r40d73263md478ddac6fef2e90@mail.gmail.com>
	<20090423184929.40eff2e9.michael.s.gilbert@gmail.com>
	<20090423190914.2281b0d3@bhuda.mired.org>
	<20090423194016.949f2f75.michael.s.gilbert@gmail.com>
Message-ID: <49F10632.7020504@canterbury.ac.nz>

Michael S. Gilbert wrote:
> It actually takes longer to use the
> list comprehension (for all exclusion percentages), which, for me at
> least, is completely unintuitive.

The LC version is doing everything that the non-LC
version is doing, plus building a list. Doing more
work takes longer.

-- 
Greg


From ben+python at benfinney.id.au  Fri Apr 24 02:22:54 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Fri, 24 Apr 2009 10:22:54 +1000
Subject: [Python-ideas] keywording prohibited
References: <20090423094835.19fdf337@o> <49F0E954.3060004@canterbury.ac.nz>
Message-ID: <87r5zi3noh.fsf@benfinney.id.au>

Greg Ewing <greg.ewing at canterbury.ac.nz> writes:

> Since we place a high value on not breaking existing code, we are
> naturally reluctant to add new keywords.

This also works in the opposite direction.

Despite all the natural barriers to adding a new keyword, it is still
far easier to add a new keyword than to *remove* one from the language
if it later turns out to be problematic: some portion of working code
will be thereby broken, which is much more likely than the breakage
caused by adding it in the first place.

Hence, since it's far more difficult to go back (from complex to
simple), there must be a great burden of proof for the benefit of moving
from simple to complex.

-- 
 \           "Two hands working can do more than a thousand clasped in |
  `\                                              prayer." --Anonymous |
_o__)                                                                  |
Ben Finney



From ben+python at benfinney.id.au  Fri Apr 24 02:24:58 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Fri, 24 Apr 2009 10:24:58 +1000
Subject: [Python-ideas] A conditional "for" statement
References: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
	<50697b2c0904231534r40d73263md478ddac6fef2e90@mail.gmail.com>
	<20090423184929.40eff2e9.michael.s.gilbert@gmail.com>
Message-ID: <87mya63nl1.fsf@benfinney.id.au>

"Michael S. Gilbert"
<michael.s.gilbert at gmail.com> writes:

> The purpose would also be efficiency. It's less efficient to check the
> condition m==2 during every loop iteration than it is to set up a list
> that excludes m=2 to begin with.

Huh? "Set up the list" involves exactly the conditional check for each
item in the original sequence. So those condition checks for each item
in the original sequence are going to be paid anyway.

-- 
 \       "bash awk grep perl sed, df du, du-du du-du, vi troff su fsck |
  `\                    rm * halt LART LART LART!" --The Swedish BOFH, |
_o__)                                            alt.sysadmin.recovery |
Ben Finney



From greg.ewing at canterbury.ac.nz  Fri Apr 24 03:16:17 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Fri, 24 Apr 2009 13:16:17 +1200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <aaec99390904230422l4e7b2ccew20b46a55487523f2@mail.gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220551g325c6905redd038324ccce648@mail.gmail.com>
	<49EFBBEE.20401@canterbury.ac.nz>
	<aaec99390904230422l4e7b2ccew20b46a55487523f2@mail.gmail.com>
Message-ID: <49F112E1.2080903@canterbury.ac.nz>

Erik Groeneveld wrote:

> I do not see how a yield-from without support for
> splitting boundaries would be combined with my own code to do the
> latter.

Seems to me you could yield a sequence of values
to be pushed back, and use a top-level driver
something like this:

   def pushingback(g):
     queue = []
     try:
       while 1:
         queue.append(yield)
         while queue:
           pushback = g.send(queue.pop(0))
           if pushback:
             queue.extend(pushback)
     except StopIteration, e:
       return e.value
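
Here is a runnable sketch of that driver in use, transcribed into
Python 3 syntax, with an invented record_reader coroutine that pushes
back fields appearing after an end-of-record marker:

```python
def pushingback(g):
    # The driver sketched above, in Python 3 syntax.
    queue = []
    try:
        while True:
            queue.append((yield))
            while queue:
                pushback = g.send(queue.pop(0))
                if pushback:
                    queue.extend(pushback)
    except StopIteration as e:
        return e.value

def record_reader(records):
    # Invented consumer: fed lists of fields; a record ends at 'END'.
    # Fields after 'END' in the same chunk are yielded back so the
    # driver re-delivers them.
    partial = []
    pushback = None
    while True:
        fields = yield pushback
        pushback = None
        if 'END' in fields:
            i = fields.index('END')
            records.append(partial + fields[:i])
            partial = []
            rest = fields[i + 1:]
            if rest:
                pushback = [rest]
        else:
            partial.extend(fields)

records = []
reader = record_reader(records)
next(reader)               # advance the consumer to its first yield
driver = pushingback(reader)
next(driver)               # advance the driver likewise
driver.send(['a', 'b'])
driver.send(['c', 'END', 'x', 'y', 'END'])
# records is now [['a', 'b', 'c'], ['x', 'y']]
```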

> Well, the whole point of using coroutines is to avoid buffering.

Arranging processing as a pipeline avoids having to buffer
all of the intermediate data. But if lookahead is required,
then you need at least a small amount of buffering somewhere,
whether it's explicit or implicit.

You want to do the buffering implicitly, by reading ahead
and pushing back the stuff you don't want, making the
caller responsible for holding onto it.

While that's certainly possible, I'm far from convinced
it's the easiest or clearest way to go about it. I would
be more inclined to insert an explicit buffering object,
such as a queue, between the two stages of the pipeline.

> Thirdly, there seems to be some sort of unspoken 'protocol' with
> generators.  A next() is actually send(None) and vaguely means 'I want
> data'.  In the same vein 'x = yield' actually is 'x = yield None' and
> also vaguely means 'I want data'.  So the None seems to play a special
> role.

There is a one-to-one relationship between yields in
the generator and next/send calls by the caller. Each
yield provides an opportunity to transmit one value
and receive one value. If either of these opportunities
is unused, None is substituted. This is no more special
than a function returning None when it has no other value
of interest.
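
A tiny example makes the substitution visible (the echo coroutine is
invented for illustration):

```python
def echo():
    # Each yield both transmits one value and receives one; unused
    # slots on either side are filled with None.
    received = None
    while True:
        received = yield received

g = echo()
first = next(g)        # nothing transmitted into the first yield: None out
answer = g.send(42)    # 42 goes in, and comes straight back out
blank = next(g)        # next() sends None, so None comes back out
```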

This lockstep execution between producer and consumer
is a feature of any coroutine system in which one
coroutine sends values directly to another, without
any form of buffering between.

This may be an unfamiliar concept to programmers these
days, because the modern incarnations of coroutines
normally encountered (OS-provided processes and threads)
*never* send values to each other directly -- there's
always some kind of IPC object in between, such as a
pipe or queue.

So when seeing the generator send/yield mechanism for
the first time, it's tempting to think of it as being
like sending data to a process through a pipe. But
it's not really like that -- it's a lower-level
facility.

> JSP requires a lookahead
> and the coroutines must have some way to support this.  (Introducing a
> stream or buffer would put us back to where we started of course).

I don't see why. You don't have to buffer all the data,
only the minimum required to support the amount of
lookahead needed. And I can't see why a pair of
coroutines communicating via an IPC object that
supports lookahead can't be just as simple and clear
as using some kind of pushback mechanism.
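
For instance, a minimal such buffering object might look like this (a
sketch only; the class and its API are invented):

```python
from collections import deque

class PeekQueue:
    # A queue whose consumer can look ahead without consuming,
    # sketching the "IPC object between pipeline stages" idea.
    def __init__(self):
        self._buf = deque()

    def put(self, item):
        self._buf.append(item)

    def get(self):
        return self._buf.popleft()

    def peek(self, n=0):
        # Inspect the item n positions ahead without removing it.
        return self._buf[n]

q = PeekQueue()
for token in 'abc':
    q.put(token)
peeked = q.peek(1)   # look ahead: 'b', still unconsumed
got = q.get()        # consume: 'a'
```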

-- 
Greg


From josiah.carlson at gmail.com  Fri Apr 24 07:19:59 2009
From: josiah.carlson at gmail.com (Josiah Carlson)
Date: Thu, 23 Apr 2009 22:19:59 -0700
Subject: [Python-ideas] An idea for a new pickling tool
In-Reply-To: <4222a8490904220610i30ea7e27xc23181c7530c59e2@mail.gmail.com>
References: <6C9D8FA62D3C429C9D164BEE7E9D928E@RaymondLaptop1>
	<4222a8490904211741s5f31c733n55f2fda209f294e7@mail.gmail.com>
	<4222a8490904220610i30ea7e27xc23181c7530c59e2@mail.gmail.com>
Message-ID: <e6511dbf0904232219y2862fa0es5bada7b34bfb4619@mail.gmail.com>

On Wed, Apr 22, 2009 at 6:10 AM, Jesse Noller <jnoller at gmail.com> wrote:
> On Tue, Apr 21, 2009 at 8:41 PM, Jesse Noller <jnoller at gmail.com> wrote:
>> On Tue, Apr 21, 2009 at 6:02 PM, Raymond Hettinger <python at rcn.com> wrote:
>>> Motivation
>>> ----------
>>>
>>> Python's pickles use a custom format that has evolved over time
>>> but they have five significant disadvantages:
>>>
>>>   * it has lost its human readability and editability
>>>   * it doesn't compress well
>>>   * it isn't interoperable with other languages
>>>   * it doesn't have the ability to enforce a schema
>>>   * it is a major security risk for untrusted inputs
>>>
>>>
>>> New idea
>>> --------
>>>
>>> Develop a solution using a mix of PyYAML, a python coded version of
>>> Kwalify, optional compression using bz2, gzip, or zlib, and pretty
>>> printing using pygments.
>>>
>>> YAML ( http://yaml.org/spec/1.2/ ) is a language independent standard
>>> for data serialization.
>>>
>>> PyYAML ( http://pyyaml.org/wiki/PyYAML ) is a full implementation of
>>> the YAML standard.  It uses YAML's application-specific tags and
>>> Python's own copy/reduce logic to provide the same power as pickle itself.
>>>
>>> Kwalify ( http://www.kuwata-lab.com/kwalify/ruby/users-guide.01.html )
>>> is a schema validator written in Ruby and Java.  It defines a
>>> YAML/JSON based schema definition for enforcing tight constraints
>>> on incoming data.
>>>
>>> The bz2, gzip, and zlib compression libraries are already built into
>>> the language.
>>>
>>> Pygments ( http://pygments.org/ ) is a Python-based syntax highlighter
>>> with builtin support for YAML.
>>>
>>>
>>> Advantages
>>> ----------
>>>
>>> * The format is simple enough to hand edit or to have lightweight
>>>   applications emit valid pickles.  For example:
>>>
>>>     print('Todo: [go to bank, pick up food, write code]')   # valid pickle
>>>
>>> * To date, efforts to make pickles smaller have focused on creating new
>>>   codes for every data type.  Instead, we can use the simple text formatting
>>>   of YAML and let general purpose data compression utilities do their job
>>>   (letting the user control the trade-offs between speed, space, and human
>>>   readability):
>>>
>>>     yaml.dump(data, compressor=None)  # fast, human readable, no compression
>>>     yaml.dump(data, compressor=bz2)   # slowest, but best compression
>>>     yaml.dump(data, compressor=zlib)  # medium speed and medium compression
>>>
>>> * The current pickle tool makes it easy to exchange object trees between
>>>   two Python processes.  The new tool would make it equally easy to exchange
>>>   object trees between processes running any of Python, Ruby, Java, C/C++,
>>>   Perl, C#, PHP, OCaml, Javascript, ActionScript, and Haskell.
>>>
>>> * Ability to use a schema for enforcing a given object model and allowing
>>>   full security.  Which would you rather run on untrusted data:
>>>
>>>     data = yaml.load(myfile, schema=ListOfStrings)
>>>
>>>   or
>>>
>>>     data = pickle.load(myfile)
>>>
>>> * Specification of a schema using YAML itself
>>>
>>>   ListOfStrings (a schema written in yaml)
>>>   ........................................
>>>   type:   seq
>>>   sequence:
>>>     - type:   str
>>>
>>>   Sample of valid input
>>>   .....................
>>>   - foo
>>>   - bar
>>>   - baz
>>>
>>>   Note, schemas can be defined for very complex, nested object models and
>>>   allow many kinds of constraints (unique items, enumerated list of allowable
>>>   values, min/max allowable ranges for values, data type, maximum length,
>>>   and names of regular Python classes that can be constructed).
>>>
>>> * YAML is a superset of JSON, so the schema validation also works equally
>>>   well with JSON encoded data.
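
As a quick sanity check of the compression idea, the same serialize-then-compress flow can be tried today with stdlib pieces (json standing in for yaml here, since the compressor= argument above is part of the proposal, not an existing API):

```python
import bz2
import json
import zlib

# Serialize once, then let general-purpose compressors do their job.
data = {"todo": ["go to bank", "pick up food", "write code"] * 200}
raw = json.dumps(data).encode("utf-8")

sizes = {
    "none": len(raw),
    "zlib": len(zlib.compress(raw)),
    "bz2": len(bz2.compress(raw)),
}
print(sizes)  # highly repetitive data compresses dramatically
```

The trade-off between speed and size is then entirely in the caller's hands, exactly as the proposal suggests.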
>>>
>>> What needs to be done
>>> ---------------------
>>>
>>> * Combine the tools for a single, clean interface to C speed parsing
>>>   of a data serialization standard, with optional compression, schema
>>>   validation, and pretty printing.
>>
>> Just to add to this, I remembered someone recently did a simple
>> benchmark of thrift/JSON/YAML/Protocol Buffers, here are the links:
>>
>> http://www.bouncybouncy.net/ramblings/posts/thrift_and_protocol_buffers/
>> http://www.bouncybouncy.net/ramblings/posts/more_on_json_vs_thrift_and_protocol_buffers/
>> http://www.bouncybouncy.net/ramblings/posts/json_vs_thrift_and_protocol_buffers_round_2/
>>
>> Without digging into the numbers too much, it's worth noting that
>> PyYAML is written in pure Python but also has LibYAML
>> (http://pyyaml.org/wiki/LibYAML) bindings for speed. When I get a
>> chance, I can run the same test(s) with both the pure-Python
>> implementation and the libyaml one and see how much the speedup
>> is.
>>
>
> Speaking of benchmarks, last night I took the first benchmark cited in
> the links above, and with some work I ran the same benchmark with
> PyYAML (pure python) and PyYAML with libyaml (the C version). The
> PyYAML -> libyaml bindings require Cython right now, but here are the
> numbers.
>
> I removed thrift and proto buffers, as I wanted to focus on YAML/JSON right now:
>
> 5000 total records (0.510s)
>
> ser_json              (0.030s) 718147 bytes
> ser_cjson             (0.030s) 718147 bytes
> ser_yaml              (6.230s) 623147 bytes
> ser_cyaml             (2.040s) 623147 bytes
>
> ser_json_compressed   (0.100s) 292987 bytes
> ser_cjson_compressed  (0.110s) 292987 bytes
> ser_yaml_compressed   (6.310s) 291018 bytes
> ser_cyaml_compressed  (2.140s) 291018 bytes
>
> serde_json            (0.050s)
> serde_cjson           (0.050s)
> serde_yaml            (19.020s)
> serde_cyaml           (4.460s)
>
> Running the second benchmark (the integer one) I see:
>
> 10000 total records (0.130s)
>
> ser_json              (0.040s) 680749 bytes
> ser_cjson             (0.030s) 680749 bytes
> ser_yaml              (8.250s) 610749 bytes
> ser_cyaml             (3.040s) 610749 bytes
>
> ser_json_compressed   (0.100s) 124924 bytes
> ser_cjson_compressed  (0.090s) 124924 bytes
> ser_yaml_compressed   (8.320s) 121090 bytes
> ser_cyaml_compressed  (3.110s) 121090 bytes
>
> serde_json            (0.060s)
> serde_cjson           (0.070s)
> serde_yaml            (24.190s)
> serde_cyaml           (6.690s)
>
>
> So yes, the pure python numbers for yaml (_yaml) are pretty bad; the
> libyaml (_cyaml) numbers are significantly improved, but not as fast
> as JSON/CJSON.

Saying "not as fast" is a bit misleading.  Roughly 100x slower than
json is a more precise description, and is one of the major reasons
why a bunch of people stick with json rather than yaml.

> One thing to note in this discussion, as others have pointed out: while
> JSON itself is awfully fast/nice, it lacks some of the capabilities of
> YAML; for example, certain objects cannot be represented in JSON.
> Additionally, if we want to simply state "objects which you desire to
> be compatible with JSON have the following restrictions" we can - this
> means we can also leverage things within PyYAML which are also
> nice-to-haves, for example the !!python additions.

In fact, custom objects are generally not representable with
json... unless you use the custom encoders/decoders that simplejson
has.  However, in the times when I've used json, being able to store
arbitrary Python objects wasn't a huge chore.  I just threw a
'to_json()' method on every object that I wanted to serialize; each
object knew its own contents and would check for 'to_json()' methods as
necessary.  Deserialization just meant passing the lists,
dictionaries, etc., to a base '.from_json()' classmethod, which did
all of the right stuff.  It was trivial, it worked, and it was fast.
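
My actual code isn't shown here, but the pattern might be sketched like this (names are illustrative only):

```python
import json

class Point:
    """Toy serializable object following the to_json/from_json pattern."""

    def __init__(self, x, y):
        self.x, self.y = x, y

    def to_json(self):
        # The object knows its own contents; nested objects would be
        # handled by calling their to_json() methods as necessary.
        return {"x": self.x, "y": self.y}

    @classmethod
    def from_json(cls, data):
        return cls(data["x"], data["y"])

text = json.dumps(Point(1, 2).to_json())
restored = Point.from_json(json.loads(text))
print(restored.x, restored.y)  # 1 2
```

No metaclass tricks, no registry: the type to rebuild is known at the call site, which sidesteps the security problem pickle has.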

> Picking YAML in this case means we get all of the YAML syntax,
> objects, etc - and if consumers want to stick with JSON compatibility,
> we could add a dump(canonical=True, compatibility=JSON) or somesuch
> flag.

My vote is to keep it simple and fast.  JSON satisfies that.  YAML
doesn't.  While I appreciate the desire to be able to store recursive
references, I don't believe it's necessary in the general case.

 - Josiah


From erik at cq2.org  Fri Apr 24 09:27:35 2009
From: erik at cq2.org (Erik Groeneveld)
Date: Fri, 24 Apr 2009 09:27:35 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49F05985.6050104@gmail.com>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220551g325c6905redd038324ccce648@mail.gmail.com>
	<49EFBBEE.20401@canterbury.ac.nz>
	<aaec99390904230435g2787ddbuf621261957e96810@mail.gmail.com>
	<49F05985.6050104@gmail.com>
Message-ID: <aaec99390904240027q6871ab21w5ff9062450818942@mail.gmail.com>

Hi Nick,

2009/4/23 Nick Coghlan <ncoghlan at gmail.com>:
> So what's wrong with doing something like the following?:
>
>   def dummy_example():
>     pushback = None
>     while 1:
>       item, pushback = yield from read_item(pushback)
>       process_item(item)
>
>   def read_item(init_data=None):
>     if init_data is not None:
>       # Initialise state based on init_data
>     else:
>       # Use default initial state
>     # Read enough data to get a complete item
>     # Since this is a coroutine, there will be at least 1 yield
>     return item, excess_data

Well, there is nothing wrong with this code, but I don't want to
repeat it for every generator and every generator 'call', just because
one of them *might* have excess data.  I would like a generic solution
to have this code only once, but I can't see a solution yet.

Erik


From denis.spir at free.fr  Fri Apr 24 12:07:05 2009
From: denis.spir at free.fr (spir)
Date: Fri, 24 Apr 2009 12:07:05 +0200
Subject: [Python-ideas] A conditional "for" statement -- postfix
In-Reply-To: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
References: <20090423182915.7bda606c.michael.s.gilbert@gmail.com>
Message-ID: <20090424120705.4a195147@o>

On Thu, 23 Apr 2009 18:29:15 -0400,
"Michael S. Gilbert" <michael.s.gilbert at gmail.com> wrote:

> Hello, 
> 
> I've recently been working on some code where I am processing a
> list, but excluding certain items.  The solution is to use a list
> comprehension in the "for" statement.  An example is:
> 
>   for m in [n for n in range( 0 , 5 ) if n != 2]
> 
> Determining what's going on here isn't immediately obvious (i.e.
> what's this new variable n doing?).  It would be nice to have a more 
> streamlined syntax such as:
> 
>   for m in range( 0 , 5 ) with m != 2

Should be 'if'; certainly not 'with', anyway.

> which is much cleaner and obvious. 

Actually, I see the syntactic issue differently. It applies to lambdas as well as list comps (which, in fact, also implicitly define unnamed funcs).
Often, filtering or mapping expressions are straightforward operations on the items, so the whole construct needed seems heavily redundant:
   [n for n in numbers if n != 2]
   filter(lambda n: n != 2, numbers)

What we would like to express is only the "operational" part of the expression:
   [numbers if != 2]
   filter(!= 2, numbers)
This would be nice as well for a switch-like statement:
   switch on x:
      case == 0: ...
      case < 0:  ...
      case > 0:  ...
      else:      ;-)

This is an (imo elegant) postfix syntax that allows omitting parameters using an implicit value stack, but it does not fit in Python, I guess:
   def half: return 2 /
   def square: return dup *
   def sum: return +
The price to pay is difficult stack juggling as soon as operations are not trivial.

What I would certainly like is restricting the field of anonymous func expressions (in both lambdas and list comps) to these trivial cases -- so that we could then omit parameters/items using postfix syntax. Otherwise, use a defined func whose implicit parameter is the item.
Of course, the drawback is that expressions of a very low level of complication would have to be defined in a func:
   def hasValidValue(result):
      return result.value is not None
   validResults = [results if hasValidValue]
But we may allow attribute syntax:
   noneResults = [results if .value]
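
For what it's worth, something close to these operator-only predicates can already be emulated in today's Python with a small placeholder object (a sketch; the `P` helper is invented for illustration):

```python
class _Placeholder:
    # Comparisons return one-argument predicates instead of booleans,
    # so "P != 2" reads almost like the proposed operator-only filter.
    def __eq__(self, other):
        return lambda x: x == other

    def __ne__(self, other):
        return lambda x: x != other

    def __lt__(self, other):
        return lambda x: x < other

    def __gt__(self, other):
        return lambda x: x > other

P = _Placeholder()

numbers = range(0, 5)
print(list(filter(P != 2, numbers)))  # [0, 1, 3, 4]
```

It is only a toy, but it shows the idea does not strictly require new syntax.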

> Best Regards,
> Mike


Denis
------
la vita e estrany


From erik at cq2.org  Fri Apr 24 13:27:30 2009
From: erik at cq2.org (Erik Groeneveld)
Date: Fri, 24 Apr 2009 13:27:30 +0200
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49F112E1.2080903@canterbury.ac.nz>
References: <49EAC4F9.90107@canterbury.ac.nz>
	<aaec99390904220551g325c6905redd038324ccce648@mail.gmail.com>
	<49EFBBEE.20401@canterbury.ac.nz>
	<aaec99390904230422l4e7b2ccew20b46a55487523f2@mail.gmail.com>
	<49F112E1.2080903@canterbury.ac.nz>
Message-ID: <aaec99390904240427m5a6e5cf3p9dd38bf01b706b92@mail.gmail.com>

Greg,

2009/4/24 Greg Ewing <greg.ewing at canterbury.ac.nz>:
> Seems to me you could yield a sequence of values
> to be pushed back, and use a top-level driver
> something like this:
>
>  def pushingback(g):
>    queue = []
>    try:
>      while 1:
>        queue.append(yield)
>        while queue:
>          pushback = g.send(queue.pop(0))
>          if pushback:
>            queue.extend(pushback)
>    except StopIteration, e:
>      return e.value

Great.  I think this is gonna work as a top level driver and I'll work
it out. Thanks!
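
For my own notes, here is a rough Python 3 sketch of how such a driver and a pushing-back consumer could fit together (the collector below is invented for illustration, and your driver is itself a generator, which I've simplified here to take a plain iterable):

```python
def pushingback(g, source):
    # Deliver items from `source` to coroutine `g`; anything `g` yields
    # back is treated as pushed-back input and re-delivered first.
    queue = []
    next(g)  # prime the coroutine
    for item in source:
        queue.append(item)
        while queue:
            pushback = g.send(queue.pop(0))
            if pushback:
                queue[:0] = pushback  # pushed-back items go to the front

def collector(out):
    # Toy consumer: splits any positive even item into two halves by
    # pushing them back; collects odd items as they arrive.
    while True:
        item = yield
        while item > 0 and item % 2 == 0:
            item = yield [item // 2, item // 2]
        out.append(item)

out = []
pushingback(collector(out), [4, 5])
print(out)  # [1, 1, 1, 1, 5]
```

Every `yield` in the consumer both emits a pushback list (or None) and receives the next delivered item, which is exactly the protocol your driver assumes.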

> So when seeing the generator send/yield mechanism for
> the first time, it's tempting to think of it as being
> like sending data to a process through a pipe. But
> it's not really like that -- it's a lower-level
> facility.

I regret I apparently didn't raise your interest in mapping JSP onto
Python coroutines. If you ever get sick of queues, buffers and
callbacks, give it a try!

Thanks so far; I look forward to the definitive implementation of your PEP.

--Erik


From larry at hastings.org  Fri Apr 24 15:55:04 2009
From: larry at hastings.org (Larry Hastings)
Date: Fri, 24 Apr 2009 06:55:04 -0700
Subject: [Python-ideas] For CPython 3.1: make type Check and CheckExact
 macros consistent? universal?
Message-ID: <49F1C4B8.7040604@hastings.org>



There are a bunch of type-checking macros in Include/*.h.  Most classes 
only provide one, Check(), implemented as follows:

    #define PyWhatever_Check(ob) (Py_TYPE(ob) == &PyWhatever_Type)

Examples of this style include PyRange, PyMemoryView, PyFunction, and 
PySlice.

Then there are types with a more sophisticated view of type identity.  
For example, an object can qualify as a list even if its type is not 
PyList_Type.  For such types, there's a Check() macro that does a more 
sophisticated check.  They'll also have a CheckExact() macro doing an 
exact type check; that'll look like PyWhatever_Check above.

Then there's PyMethod.  PyMethod has a CheckExact macro but *doesn't* 
have a Check macro.  Hmm!

When I stumbled across that it got me to thinking.  Might it be a good 
idea for all Python types to have both Check and CheckExact macros?  The 
CheckExact form would be consistent, and for types that only need an 
"exact" check they could define Check in terms of CheckExact:

    #define PyWhatever_CheckExact(ob) (Py_TYPE(ob) == &PyWhatever_Type)
    #define PyWhatever_Check PyWhatever_CheckExact

This would let you express your intent.  If you were expecting a 
Whatever-like object, you always use Check.  If you require that exact 
object, you always use CheckExact.
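
The same intent distinction exists at the Python level, which may help frame it (a sketch, not the C API itself):

```python
class Whatever:
    pass

class SubWhatever(Whatever):
    pass

def check(ob):
    # Like a subclass-aware PyWhatever_Check
    return isinstance(ob, Whatever)

def check_exact(ob):
    # Like PyWhatever_CheckExact
    return type(ob) is Whatever

sub = SubWhatever()
print(check(sub), check_exact(sub))  # True False
```

Having both spellings available everywhere lets the caller say which of these two questions is actually being asked.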

On the other hand, maybe every type-checking macro that does 
(Py_TYPE(ob) == &PyWhatever_Type) should be called CheckExact, and for 
types that don't have an inexact check they don't have a Check.

Or maybe the existing subclass-style Check macros should be renamed 
CheckSubclass (or Implemented or something) and Check should always do a 
CheckExact-style check.


I admit this isn't a stupendous idea, but I thought it was worth typing 
up.  If there's interest I'd be happy to supply a patch for whichever 
form found favor.


Or we could forget the whole thing, you and I,


/larry/

From zooko at zooko.com  Fri Apr 24 21:04:32 2009
From: zooko at zooko.com (Zooko O'Whielacronx)
Date: Fri, 24 Apr 2009 13:04:32 -0600
Subject: [Python-ideas] For CPython 3.1: make type Check and CheckExact
	macros consistent? universal?
In-Reply-To: <49F1C4B8.7040604@hastings.org>
References: <49F1C4B8.7040604@hastings.org>
Message-ID: <71E1FD12-A915-4DEC-BA72-6979EBF02341@zooko.com>

Hi Larry.

It was fun meeting you at PyCon and seeing your minuteman demo.

I like it the way it is -- "Check" means "any satisfactory type" and
"CheckExact" means "that type".  For types where only that type is
satisfactory, their Check should be the same as CheckExact.  I
guess PyMethod not having a "Check" is perhaps an oversight.

Regards,

Zooko


From greg.ewing at canterbury.ac.nz  Sat Apr 25 04:51:39 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sat, 25 Apr 2009 14:51:39 +1200
Subject: [Python-ideas] For CPython 3.1: make type Check and CheckExact
 macros consistent? universal?
In-Reply-To: <71E1FD12-A915-4DEC-BA72-6979EBF02341@zooko.com>
References: <49F1C4B8.7040604@hastings.org>
	<71E1FD12-A915-4DEC-BA72-6979EBF02341@zooko.com>
Message-ID: <49F27ABB.6090400@canterbury.ac.nz>

Zooko O'Whielacronx wrote:

> I like it the way it is -- "Check" means "any satisfactory type" and  
> "CheckExact" means "that type".  For types where only that type is  
> satisfactory, then their Check should be the same as CheckExact.

But not having a CheckExact for all types means if you
want an exact check you have to look up the API docs
for the type concerned, instead of just writing
PyFoo_CheckExact.

More importantly, if the type is ever changed to be
subclassable, and its Check function updated accordingly,
existing calls to Check that were relying on an exact
check will become broken.

-- 
Greg


From denis.spir at free.fr  Sat Apr 25 10:32:07 2009
From: denis.spir at free.fr (spir)
Date: Sat, 25 Apr 2009 10:32:07 +0200
Subject: [Python-ideas] why not try without except?
Message-ID: <20090425103207.5e6479ab@o>

Hello,

In various cases, we need to do something (set or update a var or attribute, launch an action, compute data), or not, according to a condition that is a potential source of exceptions.
Python provides the try...except construct to allow straightforward expression of the non-exceptional case without overloading the code with explicit checks.

Still, in the common case above, the result is
   try:
      <do_something>
   except ErrorType:
      pass
or we have to fall back to a pre-checking construct
   if not <error_case>:
      <do_something>

For instance
   try:
      text += doc.footer
   except AttributeError:
      pass
or
   if hasattr(doc, 'footer'):
      text += doc.footer
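
(Note that for the attribute case specifically, getattr with a default already gives a one-liner; a sketch:)

```python
class Doc:
    pass

doc = Doc()   # no 'footer' attribute set
text = "body"

# getattr with a default collapses the try/except-pass pattern
text += getattr(doc, 'footer', '')
print(repr(text))  # 'body'
```

But this only covers attribute access, not the general case argued below.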


Actually, this shows that 'try' alone can be considered analogous to 'if' without an 'else' clause, needed when the condition may raise an exception. Both express an optional action. This could be written more directly, because we do not need the condition:
   ? <do_something>
   option <do_something>
with the meaning: "Try & do this, but if there's an exception just let it down."

The same syntax may be reused in other contexts such as for having optional parameters:
   def f(data, option param):
   def f(data, ? param):
Actually, the meaning is the same: "Try & read a second argument, but if you step on an exception just let it down." Moreover, the body of the func will probably use precisely the same construct to try and do something using the optional param, e.g.:
   class Circle:
      .......
      def draw(self, ? fill_color):
         <draw outline>
         ? self.fill(fill_color)

As a nice side-effect, this would also remove one of the common (mis?)uses of None -- which is often considered problematic because of its various conflicting uses.

I would then support the introduction of such a syntax, or of 'try' without 'except'. And I sincerely thank you for explaining why the latter was not allowed from the start.

Denis
------
la vita e estrany


From denis.spir at free.fr  Sat Apr 25 10:42:31 2009
From: denis.spir at free.fr (spir)
Date: Sat, 25 Apr 2009 10:42:31 +0200
Subject: [Python-ideas] why not try without except?
In-Reply-To: <20090425103207.5e6479ab@o>
References: <20090425103207.5e6479ab@o>
Message-ID: <20090425104231.67d6ffe8@o>

On Sat, 25 Apr 2009 10:32:07 +0200,
spir <denis.spir at free.fr> wrote:

> Hello,
> 
> In various cases, we need to do something (set or update a var or
> attribute, launch an action, compute data), or not, according to a
> condition that is a potential case of exception. 
[...]

Also note the following parallel:

if...    if...else...      if...elif...elif...elif...
N/A      try...except...   try...except...except...except...

Denis
------
la vita e estrany


From andreengels at gmail.com  Sat Apr 25 10:58:53 2009
From: andreengels at gmail.com (Andre Engels)
Date: Sat, 25 Apr 2009 10:58:53 +0200
Subject: [Python-ideas] why not try without except?
In-Reply-To: <20090425103207.5e6479ab@o>
References: <20090425103207.5e6479ab@o>
Message-ID: <6faf39c90904250158l1b819698g790514ff489c6147@mail.gmail.com>

On Sat, Apr 25, 2009 at 10:32 AM, spir <denis.spir at free.fr> wrote:
> Hello,
>
> In various cases, we need to do something (set or update a var or attribute, launch an action, compute data), or not, according to a condition that is a potential source of exceptions.
> Python provides the try...except construct to allow straightforward expression of the non-exceptional case without overloading the code with explicit checks.
>
> Still, in the common case above, the result is
>    try:
>       <do_something>
>    except ErrorType:
>       pass
> or we have to fall back to a pre-checking construct
>    if not <error_case>:
>       <do_something>
>
> For instance
>    try:
>       text += doc.footer
>    except AttributeError:
>       pass
> or
>    if hasattr(doc, 'footer'):
>       text += doc.footer
>
>
> Actually, this shows that 'try' alone can be considered analogous to 'if' without an 'else' clause, needed when the condition may raise an exception. Both express an optional action. This could be written more directly, because we do not need the condition:
>    ? <do_something>
>    option <do_something>
> with the meaning: "Try & do this, but if there's an exception just let it down."
>
> The same syntax may be reused in other contexts such as for having optional parameters:
>    def f(data, option param):
>    def f(data, ? param):
> Actually, the meaning is the same: "Try & read a second argument, but if you step on an exception just let it down." Moreover, the body of the func will probably use precisely the same construct to try and do something using the optional param, e.g.:
>    class Circle:
>       .......
>       def draw(self, ? fill_color):
>          <draw outline>
>          ? self.fill(fill_color)
>
> As a nice side-effect, this would also remove one of the common (mis?)uses of None -- which is often considered problematic because of its various conflicting uses.
>
> I would then support the introduction of such a syntax, or of 'try' without 'except'. And I sincerely thank you for explaining why the latter was not allowed from the start.

-1 for making bad programming easier, but not good programming. One
should only ignore exceptions if one is 100% sure why an error
occurred; if not, the fact that an error occurred should somehow be
noted.

That is, try without except would be equivalent to the following bad
programming:

try:
    do_something
except:
    pass

Good programming would be either:

try:
    do_something
except some_specific_error:
    pass

or

try:
    do_something
except:
    notify_that_error_occurred

neither of which would be made easier by try-without-except.
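
The specific-exception form can also be written once as a context manager, which keeps call sites compact without hiding unexpected errors (a sketch; later Python versions ship essentially this as contextlib.suppress):

```python
from contextlib import contextmanager

@contextmanager
def suppress(*exceptions):
    # Discard only the listed exception types; everything else propagates.
    try:
        yield
    except exceptions:
        pass

d = {}
with suppress(KeyError):
    d['missing']          # KeyError raised and silently discarded

with suppress(KeyError):
    d['x'] = 1            # runs normally

print(d)  # {'x': 1}
```

The exception types stay explicit at every use site, which is the whole point.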

-- 
Andr? Engels, andreengels at gmail.com


From ben+python at benfinney.id.au  Sat Apr 25 11:12:11 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Sat, 25 Apr 2009 19:12:11 +1000
Subject: [Python-ideas] why not try without except?
References: <20090425103207.5e6479ab@o>
Message-ID: <873abxyu50.fsf@benfinney.id.au>

spir <denis.spir at free.fr> writes:

> Actually, this shows that 'try' alone can be considered analogous to 'if'
> without an 'else' clause

And without a condition, or a branch. Remember that 'if' introduces a
branch point, whereas 'try' does not.

> needed when the condition may raise an exception.

But there *is* no condition in a 'try' statement.

If what you mean is "when the suite introduced by 'try:' may raise an
exception", that's true of just about every interesting statement in
Python: an exception might be raised at any point.

> Both express an optional action.

No. The suite that follows 'try' will be executed, just as surely as a
suite *not* enclosed in 'try'.

> This could be written more directly, because we do not need the condition:
>    ? <do_something>
>    option <do_something>
> with the meaning: "Try & do this, but if there's an exception just let
>    it down."

What does "let it down" mean?

If you mean "if there's an exception raised, just let it propagate up",
that's what happens *without* a 'try'. So I can only assume you mean
something different.

I don't understand the behaviour you describe. Perhaps if you would
explain what you think the difference should be between:

    wibble()
    try:
        foo()
        bar()
        baz()
    wobble()

versus this:

    wibble()
    foo()
    bar()
    baz()
    wobble()

How should the behaviour differ?

-- 
 \      "It takes a big man to cry, but it takes a bigger man to laugh |
  `\                                        at that man." --Jack Handey |
_o__)                                                                  |
Ben Finney



From denis.spir at free.fr  Sat Apr 25 12:23:22 2009
From: denis.spir at free.fr (spir)
Date: Sat, 25 Apr 2009 12:23:22 +0200
Subject: [Python-ideas] why not try without except?
In-Reply-To: <873abxyu50.fsf@benfinney.id.au>
References: <20090425103207.5e6479ab@o>
	<873abxyu50.fsf@benfinney.id.au>
Message-ID: <20090425122322.7f7da364@o>

On Sat, 25 Apr 2009 19:12:11 +1000,
Ben Finney <ben+python at benfinney.id.au> wrote:


> But there *is* no condition in a 'try' statement.
> 
> If what you mean is "when the suite introduced by 'try:' may raise an
> exception", that's true of just about every interesting statement in
> Python: an exception might be raised at any point.

You're right, that's what I mean.

> > Both express an optional action.
> 
> No. The suite that follows 'try' will be executed, just as surely as a
> suite *not* enclosed in 'try'.

It's a question of point of view. From the language side, you're certainly right. But from the programmer's side, I guess it's correct to say that "try... do_this... except ExceptionType... pass" expresses an optional action, precisely because there is no alternative. It's like an if without an else.

> > This could be written more directly, because we do not need the condition:
> >    ? <do_something>
> >    option <do_something>
> > with the meaning: "Try & do this, but if there's an exception just let
> >    it down."
> 
> What does "let it down" mean?

It means more or less:
   except Exception:
      pass

> If you mean "if there's an exception raised, just let it propagate up",
> that's what happens *without* a 'try'. So I can only assume you mean
> something different.

Yes, see above.

> I don't understand the behaviour you describe. Perhaps if you would
> explain what you think the difference should be between:
> 
>     wibble()
>     try:
>         foo()
>         bar()
>         baz()
>     wobble()
> 
> versus this:
> 
>     wibble()
>     foo()
>     bar()
>     baz()
>     wobble()
> 
> How should the behaviour differ?
> 
Well, I think my example circle-drawing func answers your question. It does not propagate the exception; instead it precisely avoids this.
    wibble()
    try:
        foo()
        bar()
        baz()
    wobble()
<==>
    wibble()
    try:
        foo()
        bar()
        baz()
    except Exception:
       pass
    wobble()

But I prefer the 'option' syntax, not only because it expresses the intention in a clearer and more direct way, but also because it applies to a single statement -- which avoids some potential misuse, as may happen in your example: a try over several statements can more easily catch unexpected errors (though this happens with standard try...except too).

Another example:
   result = ...
   option result.applyTransforms(transforms)
<==>
   result = ...
   try:
      result.applyTransforms(transforms)
   except Exception:
      pass

Now, I agree with the critique in another reply that
   except Exception:
      pass
is very different from
   except NameError:
      pass

We could probably find a way to specify the type of exception:
   option result.applyTransforms(transforms) except NameError
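
Pending any syntax, the same behaviour is available today as a plain helper (a sketch; the `option` function is invented for illustration):

```python
def option(action, *exceptions):
    # Call action(); silently discard only the listed exception types
    # (defaulting to Exception), returning None in that case.
    try:
        return action()
    except (exceptions or (Exception,)):
        return None

class Doc:
    pass

doc = Doc()  # no 'footer' attribute
print(option(lambda: doc.footer, AttributeError))  # None
print(option(lambda: 40 + 2))                      # 42
```

The lambda wrapper is the price paid for not having dedicated syntax.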

Denis
------
la vita e estrany


From ben+python at benfinney.id.au  Sat Apr 25 12:34:36 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Sat, 25 Apr 2009 20:34:36 +1000
Subject: [Python-ideas] why not try without except?
References: <20090425103207.5e6479ab@o> <873abxyu50.fsf@benfinney.id.au>
	<20090425122322.7f7da364@o>
Message-ID: <87tz4dxbr7.fsf@benfinney.id.au>

spir <denis.spir at free.fr> writes:

> Well, I think my example circle-drawing func answers your question. It
> does not propagate the exception; instead it precisely avoids this.

>     wibble()
>     try:
>         foo()
>         bar()
>         baz()
>     wobble()
> <==>
>     wibble()
>     try:
>         foo()
>         bar()
>         baz()
>     except Exception:
>        pass
>     wobble()

Thank you for explaining.

I am solidly against this proposal, as I think it's a pattern that
should be discouraged, not accommodated.

> Another example:
>    result = ...
>    option result.applyTransforms(transforms)
> <==>
>    result = ...
>    try:
>       result.applyTransforms(transforms)
>    except Exception:
>       pass

Can you show some real-world code that would be improved by this? I
don't know of any code where *all* exceptions, especially unexpected
ones, should simply be thrown away.

If there are specific exception types that should be discarded, it is
better to list them explicitly, and make sure your code is very clear on
*why* those specific exceptions are being discarded.

If you want exceptions to be handled in a generic way, it is better to
set up a custom exception handler.

-- 
 \       "If you are unable to leave your room, expose yourself in the |
  `\            window." --instructions in case of fire, hotel, Finland |
_o__)                                                                  |
Ben Finney



From g.brandl at gmx.net  Sat Apr 25 15:32:39 2009
From: g.brandl at gmx.net (Georg Brandl)
Date: Sat, 25 Apr 2009 13:32:39 +0000
Subject: [Python-ideas] why not try without except?
In-Reply-To: <20090425122322.7f7da364@o>
References: <20090425103207.5e6479ab@o>	<873abxyu50.fsf@benfinney.id.au>
	<20090425122322.7f7da364@o>
Message-ID: <gsv3dn$cjp$1@ger.gmane.org>

spir wrote:

> Well, I think my example circle-drawing func answers your question. It does not propagate the exception; instead it precisely avoids this.
>     wibble()
>     try:
>         foo()
>         bar()
>         baz()
>     wobble()
> <==>
>     wibble()
>     try:
>         foo()
>         bar()
>         baz()
>     except Exception:
>        pass
>     wobble()

And while we're at it, let's introduce

on error resume next:
    foo()
    bar()
    baz()

Georg

-- 
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.



From lists+python-ideas at jimpryor.net  Sat Apr 25 18:10:11 2009
From: lists+python-ideas at jimpryor.net (Jim Pryor)
Date: Sat, 25 Apr 2009 12:10:11 -0400
Subject: [Python-ideas] Revised**12 PEP on Yield-From
In-Reply-To: <49EAC4F9.90107@canterbury.ac.nz>
References: <49EAC4F9.90107@canterbury.ac.nz>
Message-ID: <20090425161011.GC11800@vaio.nyu.edu>

I created a pure Python decorator-based implementation of PEP 380, based
on the semantics in this latest draft (Revised**12).

There's a "simple" version and an "optimized" version: the simple
version is easier to follow; the optimized version does special handling
for nested yield from calls.

They're both now posted at ActiveState:

http://code.activestate.com/recipes/576727/
http://code.activestate.com/recipes/576728/

From the descriptions:

> <http://www.python.org/dev/peps/pep-0380/> proposes new syntax ("yield
> from") for generators to delegate control to a "subgenerator" (really to
> any iterator). Any send/next/throw/close calls to the delegating
> generator are forwarded to the delegee, until the delegee is exhausted.
> 
> This is being considered for inclusion in Python 2.7, but I wanted a
> way to play around with the design pattern now (and in case the PEP
> isn't soon accepted, and on older Python installations regardless of
> what happens with future versions of Python).
> 
> So I came up with this decorator-based solution. The "supergenerator"
> decorator wraps the delegating generator with a control handler that
> takes care of directing send/next/throw/close calls to the delegator or
> delegee, as appropriate.

Hope others find them a useful contribution. Of course, I welcome any
feedback.
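
Much simplified, the core trick can be sketched like this -- an
iteration-only toy with hypothetical names, not the recipes' actual
code; the posted versions also forward send/throw/close:

```python
class YieldFrom:
    """Marker object meaning 'please delegate to this iterable'."""
    def __init__(self, iterable):
        self.iterable = iterable

def supergenerator(genfunc):
    """Wrap a generator function so that yielding YieldFrom(it)
    flattens the items of `it` into the outer stream."""
    def wrapper(*args, **kwargs):
        for item in genfunc(*args, **kwargs):
            if isinstance(item, YieldFrom):
                # Delegate to the sub-iterable until it is exhausted.
                for sub in item.iterable:
                    yield sub
            else:
                yield item
    return wrapper

@supergenerator
def chained():
    yield 1
    yield YieldFrom([2, 3])
    yield 4

print(list(chained()))  # [1, 2, 3, 4]
```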


-- 
Jim Pryor
profjim at jimpryor.net


From python at rcn.com  Sun Apr 26 03:00:26 2009
From: python at rcn.com (Raymond Hettinger)
Date: Sat, 25 Apr 2009 18:00:26 -0700
Subject: [Python-ideas] Updating PEP 315:  do-while loops
Message-ID: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>

Am working on PEP 315 again, simplifying the proposal by focusing on a more standard do-while loop.

The two motivating cases are:

1) A pattern of repeated code blocks before and inside a traditional while-loop.  Here's an example from random.sample():

    j = _int(random() * n)
    while j in selected:
        j = _int(random() * n)


2) A pattern of ending a "while True:" loop with an "if not <cond>: break".  Here's an example from random.normalvariate():

    while 1:
        u1 = random()
        u2 = 1.0 - random()
        z = NV_MAGICCONST*(u1-0.5)/u2
        zz = z*z/4.0
        if zz <= -_log(u2):
            break


The challenge has been finding a syntax that fits well with the patterns in the rest of the language.
It seems that every approach has its own strengths and weaknesses.

1) One approach uses the standard do-while name but keeps usual python style formatting 
by putting the condition at the top instead of the bottom where it is in C and Java:

    do ... while j in selected:
        j = _int(random() * n)


    do ... while zz > -_log(u2):
        u1 = random()
        u2 = 1.0 - random()
        z = NV_MAGICCONST*(u1-0.5)/u2
        zz = z*z/4.0

This syntax uses the ellipsis for an additional mental cue reminding the reader that the
enclosed code block is executed before the condition is evaluated.  It is not unlike
the with-statement which has both enter and exit behaviors signaled by a single leading
keyword and an indented block.  Also, the condition-at-the-top approach fits well
with the way the compiler would generate code, with the condition test
preceding the loop body:

    0   SETUP_LOOP             30 (to 33)
    3   JUMP_ABSOLUTE          18
    6   LOAD_NAME               0 (x)
    9   LOAD_NAME               1 (y) 
    12  COMPARE_OP              0 (<)
    15  POP_JUMP_IF_FALSE     33
    18      <loop body>
              ...
    30  JUMP_ABSOLUTE           6
    33  LOAD_CONST               1 (None)
    36  RETURN_VALUE

I like the ellipsis form because of the mental cue it provides, but the proposal still 
works with any other spelling variant:

    do_while <cond>:
    do while <cond>:
    do_body_first_and_then_loop_if_the_test_condition_is_true <cond>:


2) Another approach is to put the test at the end.  Something like:

    do:
        j = _int(random() * n)
    :while j in selected

    do:
        j = _int(random() * n)
        while j in selected

These seem syntactically weird to me and feel more like typos than real python code.
I'm sure there are many ways to spell the last line, but in the two years since I first worked
on the PEP, I haven't found any condition-at-the-end syntax that FeelsRight(tm).

Probably everyone has an opinion on these, but I don't any think those could be called 
bike-shedding.  The spelling does matter and hopefully the cleanest solution can be found.
It would be really great if we were to get some decent spelling of do-while in the language.



Raymond


P.S.  I've punted on the more complex PEP-315 variants with repeated setup code:

    <setup code>
    while <condition>:
        <loop body>
        <setup code>

It was hard enough to get a decent spelling for do-while.

Also, I've punted on the while-cond-with-assignment case that is being handled by
other proposals such as:

    while f.read(20) as data != '':
        ...
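
For comparison, the shape such proposals are after -- binding the value
inside the condition so the setup appears only once -- can be sketched
with an assignment expression (a ":=" spelling from a much later Python;
illustrative only, not part of this PEP):

```python
import io

f = io.StringIO("abcdefgh")

# The repeated-setup pattern being targeted:
#     data = f.read(3)
#     while data != '':
#         ...
#         data = f.read(3)
#
# With an assignment expression the read appears exactly once:
chunks = []
while (data := f.read(3)) != '':
    chunks.append(data)

print(chunks)  # ['abc', 'def', 'gh']
```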


From steve at pearwood.info  Sun Apr 26 03:05:56 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Sun, 26 Apr 2009 11:05:56 +1000
Subject: [Python-ideas] why not try without except?
In-Reply-To: <gsv3dn$cjp$1@ger.gmane.org>
References: <20090425103207.5e6479ab@o> <20090425122322.7f7da364@o>
	<gsv3dn$cjp$1@ger.gmane.org>
Message-ID: <200904261105.57286.steve@pearwood.info>

On Sat, 25 Apr 2009 11:32:39 pm Georg Brandl wrote:

> And while we're at it, let's introduce
>
> on error resume next:
>     foo()
>     bar()
>     baz()

Is that meant to be a serious suggestion or a sarcastic rejoinder? 
Either way, I'm not sure what "on error resume next" is supposed to do, 
so I don't know whether it is a good idea or a bad idea.



-- 
Steven D'Aprano


From bruce at leapyear.org  Sun Apr 26 03:08:26 2009
From: bruce at leapyear.org (Bruce Leban)
Date: Sat, 25 Apr 2009 18:08:26 -0700
Subject: [Python-ideas] why not try without except?
In-Reply-To: <200904261105.57286.steve@pearwood.info>
References: <20090425103207.5e6479ab@o> <20090425122322.7f7da364@o> 
	<gsv3dn$cjp$1@ger.gmane.org> <200904261105.57286.steve@pearwood.info>
Message-ID: <cf5b87740904251808o7cfc4cadw6f2c30750c3f0a73@mail.gmail.com>

This should answer the question:
http://www.developerfusion.com/code/4325/on-error-resume-next-considered-harmful/

On Sat, Apr 25, 2009 at 6:05 PM, Steven D'Aprano <steve at pearwood.info>wrote:

> On Sat, 25 Apr 2009 11:32:39 pm Georg Brandl wrote:
>
> > And while we're at it, let's introduce
> >
> > on error resume next:
> >     foo()
> >     bar()
> >     baz()
>
> Is that meant to be a serious suggestion or a sarcastic rejoinder?
> Either way, I'm not sure what "on error resume next" is supposed to do,
> so I don't know whether it is a good idea or a bad idea.
>
>
>
> --
> Steven D'Aprano

From eric at trueblade.com  Sun Apr 26 03:31:17 2009
From: eric at trueblade.com (Eric Smith)
Date: Sat, 25 Apr 2009 21:31:17 -0400
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
Message-ID: <49F3B965.1090700@trueblade.com>

Raymond Hettinger wrote:
> Am working on PEP 315 again, simplifying the proposal by focusing on a 
> more standard do-while loop.

I posted this to python-dev before I saw your post over here. Sorry 
about that.

My message was:
You might want to note in the PEP that the problem that's being solved 
is known as the "loop and a half" problem, at least for your motivating 
case #2.

http://www.cs.duke.edu/~ola/patterns/plopd/loops.html#loop-and-a-half

> 2) A pattern of ending a "while True:" loop with an "if not <cond>: 
> break".  Here's an example from random.normalvariate():
> 
>    while 1:
>        u1 = random()
>        u2 = 1.0 - random()
>        z = NV_MAGICCONST*(u1-0.5)/u2
>        zz = z*z/4.0
>        if zz <= -_log(u2):
>            break
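
As an aside, one narrow variant of loop-and-a-half is already covered
today: when the test is "equals a sentinel", the two-argument form of
iter(callable, sentinel) folds the read and the test into a plain for
loop.  A quick sketch:

```python
import io

f = io.StringIO("alpha\nbeta\n")

# Classic loop-and-a-half:
#     while True:
#         line = f.readline()
#         if line == '':
#             break
#         process(line)
#
# Two-argument iter(callable, sentinel) folds the read and test together:
lines = []
for line in iter(f.readline, ''):
    lines.append(line.rstrip('\n'))

print(lines)  # ['alpha', 'beta']
```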



From steve at pearwood.info  Sun Apr 26 04:25:03 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Sun, 26 Apr 2009 12:25:03 +1000
Subject: [Python-ideas] why not try without except?
In-Reply-To: <20090425103207.5e6479ab@o>
References: <20090425103207.5e6479ab@o>
Message-ID: <200904261225.03948.steve@pearwood.info>

On Sat, 25 Apr 2009 06:32:07 pm spir wrote:
> Hello,
>
> In various cases, we need to do something (set or update a var or
> attribute, launch an action, compute data), or not, according to a
> condition that is a potential case of exception. Python provides the
> try...except construct to allow straightforward expression of the
> non-exceptional case without overloading the code with explicit
> checkings.
>
> Still, in the common case above, the result is
>    try:
>       <do_something>
>    except ErrorType:
>       pass


I've had a look in my code, and I very rarely have pass in the 
except-block. I don't think this is anywhere near as common as you 
think.


> For instance
>    try:
>       text += doc.footer
>    except ErrorType:
>       pass
> or
>    if hasattr(doc, 'footer')
>       text += doc.footer


What is ErrorType? If you mean AttributeError, say so :)

A third alternative is to make sure that doc.footer always exists, even 
if it is only the empty string, and then just write:

text += doc.footer

In many cases, I prefer that. I don't like attributes which sometimes 
exist and sometimes don't.


> Actually, this shows that 'try' alone can be considered analog to
> 'if' without 'else' clause, 

I think that is stretching the analogy past breaking point. In the 
if...else case, only one of the if-block or else-block are taken. But 
that's not true in the case of try...except: potentially both blocks 
are taken.

L = []
try:
    for x in ('1.2', '3.4', '5.6', '7..7'):
        print x
        L.append(float(x))
except ValueError:
    print '%s is not a valid float' % x
finally:
    print L


As you can see from running that code snippet, both the try and except 
blocks get run. And there is no analogy to finally in if...else.


> needed when the condition may raise an 
> exception.  Both express an optional action. This could be written 
> more directly, because we do not need the condition:
>    ? <do_something>
>    option <do_something>
> with the meaning: "Try & do this, but if there's an exception just
> let it down."


But what exception? You have to specify what sort of exception(s) you 
want to ignore, because not all exceptions mean the same thing. 
Consider this:

try:
    text += doc.footer

# just ignore exceptions

Do you really want to ignore ALL exceptions? I would say not! You want 
to ignore a *single* exception, AttributeError. You don't want to hide 
exceptions like NameError (text or doc don't exist) or TypeError (text 
and doc.footer can't be added), because they indicate bugs that should 
not be hidden. Nor do you want to catch exceptions like 
KeyboardInterrupt.

Since the Python compiler can't read the programmer's mind and know what 
exceptions to catch or not to catch, there are only three alternatives:

(1) The current situation: you *must* specify the exception-clause.

(2) Ignore *all* exceptions. This will hide bugs, causing them to crop 
up later when they will be harder to debug. It will also make the 
program un-interruptable by catching KeyboardInterrupt.

(3) Ignore some sub-set of exceptions, like Exception. But again, this 
will still catch exceptions you didn't intend, and possibly let through 
exceptions you wanted to catch.
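
If you really do want alternative (1) but dislike the four-line
try/except/pass block, the naming of the exception can be packaged once
in a small context manager.  A minimal sketch (the name `suppressed` is
made up for illustration):

```python
from contextlib import contextmanager

@contextmanager
def suppressed(*exceptions):
    """Run the managed block, silently ignoring only the named exceptions."""
    try:
        yield
    except exceptions:
        pass

class Doc:
    pass

doc = Doc()              # deliberately has no 'footer' attribute
text = "body"
with suppressed(AttributeError):
    text += doc.footer   # raises AttributeError, which is swallowed

print(text)  # body
```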


> The same syntax may be reused in other contexts such as for having
> optional parameters:
>    def f(data, option param):
>    def f(data, ? param):
> Actually, the meaning is the same: "Try & and read a second argument,
> but if you step on an exception just let it down." 

First of all, I think that this will allow a style of programming which 
is lazy and undisciplined and that will lead to more bugs, not fewer. 
If param is *really* optional, then you are probably dealing with 
something best written as two different functions:

def f_without_param(data):

def f_with_param(data, param):


But for the sake of the argument, let's accept that we should write the 
code with an optional argument in that way. Now what happens?

Firstly, every time you refer to param, you need to prefix the line 
with '?' because param may or may not exist. This is inconvenient and 
will lead to bugs when you forget to include a '?' at the start of the 
line. Worse, it will lead to more bugs:

def f(data, ? prefix):
    ? data = prefix + dat  # oops a typo
    print data

f("visible", "in")
=> prints "visible"


What is the interaction between "optional" param and namespaces? 
Consider:

def f(? param):
    ? print param

f()  # I expect this will do nothing, swallowing the NameError exception

param = "spam"
f(param)  # I expect this to print "spam"

f()  # What will this do?


The third case is interesting: should f() print "spam" because param 
exists in the global namespace, or should it do nothing because the 
local argument param doesn't exist? Whichever choice you make, it will 
make some people unhappy.



-- 
Steven D'Aprano


From larry at hastings.org  Sun Apr 26 04:35:15 2009
From: larry at hastings.org (Larry Hastings)
Date: Sat, 25 Apr 2009 19:35:15 -0700
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
Message-ID: <49F3C863.9040606@hastings.org>


Raymond Hettinger wrote:
> 1) One approach uses the standard do-while name but keeps usual python 
> style formatting by putting the condition at the top instead of the 
> bottom where it is in C and Java: 

I don't like the condition-at-the-top variants, with or without an 
ellipsis.  (At first glance anyway.)

> I haven't found any condition-at-the-end syntax that FeelsRight(tm).
>
> Probably everyone has an opinion on these, but I don't any think those 
> could be called bike-shedding.  The spelling does matter and hopefully 
> the cleanest solution can be found.

I couldn't quite parse that, but I sure hope it meant "I'm up for yet 
another round of discussion".

I propose we allow an "if" suffix to "break" and "continue".  So I'd 
spell your loop this way:

    do:
        j = _int(random() * n)
        continue if j in selected

This would require there be no implied "while True" for a "do" block.  
If you hit the end of the "do" block, without following a continue, 
you'd just leave the block.  So this program would be legal:

    do:
      print("wtf?")
    print("well, that was pointless")

It would print two strings then exit.  It would *not* loop.  A little 
unexpected I guess.

On the other hand this form gives a pleasing alternate spelling to 
"while True":

    do:
        continue

It also provides a dandy spelling for early-exit programming, without 
nesting your ifs, or early return, or throwing an exception, or using 
gotos which thankfully we don't have:

    def user_typed_1_then_2():
        do:
           print("enter 1, then 2")
           i = input()
           break if i != '1'
           i = input()
           break if i != '2'
           return True
        print("why do you mistreat me so? it was a simple request!")
        return False

Maybe it's just me, but I think that spelling and those semantics are 
charmingly Pythonic.

HTH,


/larry/


From python at rcn.com  Sun Apr 26 04:39:10 2009
From: python at rcn.com (Raymond Hettinger)
Date: Sat, 25 Apr 2009 19:39:10 -0700
Subject: [Python-ideas] Updating PEP 315:  do-while loops
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3B965.1090700@trueblade.com>
Message-ID: <6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>


[Eric Smith]
> You might want to note in the PEP that the problem that's being solved 
> is known as the "loop and a half" problem, at least for your motivating 
> case #2.

Thanks.  When I update the PEP, will add that link.
Wanted to get people's reaction here first.
Does the condition-at-the-top format work for everyone,
or can someone think up a condition-at-the-bottom
approach that doesn't suck with respect to existing
Python syntax?


Raymond



From python at rcn.com  Sun Apr 26 04:43:41 2009
From: python at rcn.com (Raymond Hettinger)
Date: Sat, 25 Apr 2009 19:43:41 -0700
Subject: [Python-ideas] Updating PEP 315:  do-while loops
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
Message-ID: <91F251EF5E864ED29BCA5182DE76F0F0@RaymondLaptop1>


[Larry Hastings]
>> Probably everyone has an opinion on these, but I don't any think those 
>> could be called bike-shedding.  The spelling does matter and hopefully 
>> the cleanest solution can be found.
> 
> I couldn't quite parse that, but I sure hope it meant "I'm up for yet 
> another round of discussion".

Yes. Absolutely.

Though I would like a fair hearing for the "do ... while <cond>:" proposal.
To my eyes, it has clear meaning and sucks less than every alternative
that I've seen so far.


Raymond


From pyideas at rebertia.com  Sun Apr 26 04:49:47 2009
From: pyideas at rebertia.com (Chris Rebert)
Date: Sat, 25 Apr 2009 19:49:47 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <49F3C863.9040606@hastings.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
Message-ID: <50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>

On Sat, Apr 25, 2009 at 7:35 PM, Larry Hastings <larry at hastings.org> wrote:
<snip>
> It also provides a dandy spelling for early-exit programming, without
> nesting your ifs, or early return, or throwing an exception, or using gotos
> which thankfully we don't have:
>
>    def user_typed_1_then_2():
>        do:
>           print("enter 1, then 2")
>           i = input()
>           break if i != '1'
>           i = input()
>           break if i != '2'
>           return True
>        print("why do you mistreat me so? it was a simple request!")
>        return False

I don't see why the if-as-suffix is needed when we already have a
one-liner for such situations (e.g.):

if i != '1': break

It's much more uniform and only one character longer.

Cheers,
Chris
-- 
I have a blog:
http://blog.rebertia.com


From rrr at ronadam.com  Sun Apr 26 04:50:52 2009
From: rrr at ronadam.com (Ron Adam)
Date: Sat, 25 Apr 2009 21:50:52 -0500
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
Message-ID: <49F3CC0C.8000807@ronadam.com>



Raymond Hettinger wrote:

> 2) Another approach is to put the test at the end.  Something like:
> 
>    do:
>        j = _int(random() * n)
>    :while j in selected
> 
>    do:
>        j = _int(random() * n)
>        while j in selected

> These seem syntactically weird to me and feel more like typos than real python code.
> I'm sure there are many ways to spell the last line, but in the two years since I first worked
> on the PEP, I haven't found any condition-at-the-end syntax that FeelsRight(tm). 

I think having the "while" start some blocks but end others is very weird 
indeed and makes it harder to keep stuff in my head.


I'd prefer a bare "while:" with a more visible "break if" and possibly 
"continue if" expressions.


Visualize the while, continue, and break with syntax highlighting.

      while:
          j = _int(random() * n)
          break if j not in selected

Or if the reverse comparison is preferred you could do...

      while:
          j = _int(random() * n)
          continue if j in selected
          break



Or we could allow break and continue in the if-else expression syntax ...

      while:
          ...
          continue if (j in selected) else break

or..

      while:
          ...
          break if (j not in selected) else continue


In this last case the final 'else continue' can be dropped, since it 
would be redundant:

      while:
          ...
          break if (j not in selected)


That would give us back the first example again.


Moving the break and continue out from under the if statement makes 
them more visible and follows a common pattern of having lines start with 
keywords, which is very readable with syntax highlighting.  It also still 
allows the break to be anywhere in the body, which is more flexible.

Because it is very close to the current while syntax, is there really any 
need to rename the bare 'while' to 'do'?   I just think of it as 'do for a 
while' or short for 'while True'.

Cheers,
    Ron



From larry at hastings.org  Sun Apr 26 05:39:05 2009
From: larry at hastings.org (Larry Hastings)
Date: Sat, 25 Apr 2009 20:39:05 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>	
	<49F3C863.9040606@hastings.org>
	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>
Message-ID: <49F3D759.7030605@hastings.org>

Chris Rebert wrote:
> I don't see why the if-as-suffix is needed when we already have a
> one-liner for such situations (e.g.):
>
> if i != '1': break
>
> It's much more uniform and only one character longer.
>   

I certainly see your point.  Let me take it a step further: the "do: ... 
while <condition>" construct isn't needed, given that it's already 
expressible with "while True: ... if not <condition>: break".

It's true, "break if <condition>" and "continue if <condition>" are 
redundant constructs.  But this debate is over refining our syntactic 
sugar--charting what is arguably a redundant construct.  Therefore 
proposing redundant constructs for the sake of clarity is on the table.  
I think "break if <condition>" and "continue if <condition>" enhance 
readability; they make the control flow pop out at you more than "if 
<condition>: break" and "if <condition>: continue" do.  "break if" and 
"continue if" have the advantage of following established Python 
syntactic precedent.


/larry/


From larry at hastings.org  Sun Apr 26 05:55:52 2009
From: larry at hastings.org (Larry Hastings)
Date: Sat, 25 Apr 2009 20:55:52 -0700
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <49F3CC0C.8000807@ronadam.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3CC0C.8000807@ronadam.com>
Message-ID: <49F3DB48.8050909@hastings.org>

Ron Adam wrote:
> I'd prefer a bare "while:" with a more visible "break if" and possibly 
> "continue if" expressions.

I do like "break if" and "continue if", unsurprisingly as I suggested 
them in parallel.  I'm not sure about the "while:"; your "while: ... 
continue if ; break" strikes down some of the clarity we're trying to 
achieve here.



/larry/


From pyideas at rebertia.com  Sun Apr 26 05:59:16 2009
From: pyideas at rebertia.com (Chris Rebert)
Date: Sat, 25 Apr 2009 20:59:16 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <49F3D759.7030605@hastings.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>
	<49F3D759.7030605@hastings.org>
Message-ID: <50697b2c0904252059j381cee39i65d5a9981ff739f4@mail.gmail.com>

On Sat, Apr 25, 2009 at 8:39 PM, Larry Hastings <larry at hastings.org> wrote:
> Chris Rebert wrote:
>>
>> I don't see why the if-as-suffix is needed when we already have a
>> one-liner for such situations (e.g.):
>>
>> if i != '1': break
>>
>> It's much more uniform and only one character longer.
>>
>
> I certainly see your point.  Let me take it a step further: the "do: ...
> while <condition>" construct isn't needed, given that it's already
> expressible with "while True: ... if not <condition>: break".
>
> It's true, "break if <condition>" and "continue if <condition>" are
> redundant constructs.  But this debate is over refining our syntactic
> sugar--charting what is arguably a redundant construct.  Therefore proposing
> redundant constructs for the sake of clarity is on the table.  I think
> "break if <condition>" and "continue if <condition>" enhance readability;
> they make the control flow pop out at you more than "if <condition>: break"
> and "if <condition>: continue" do.  "break if" and "continue if" have the
> advantage of following established Python syntactic precedent.

Except the ternary operator only allows for expressions, not
statements; IIRC, GvR was somewhat reluctant about having any sort of
ternary operator, so this further extension seems unlikely to me.
Allowing general if-suffixes on things also seems a bit Perlish, imho.
I find the current state of syntactic simplicity quite appealing.

Regarding visibility, how is the last word of the last line in a loop
body, further set off by the if's colon, not visible enough? I
personally haven't ever had problems spotting the breaks in
loop-and-a-half code.

On another note, I actually do like the bare-while-as-while-True idea
quite a bit. +1 on that.

Cheers,
Chris
-- 
http://blog.rebertia.com


From larry at hastings.org  Sun Apr 26 06:23:39 2009
From: larry at hastings.org (Larry Hastings)
Date: Sat, 25 Apr 2009 21:23:39 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <50697b2c0904252059j381cee39i65d5a9981ff739f4@mail.gmail.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>	
	<49F3C863.9040606@hastings.org>	
	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>	
	<49F3D759.7030605@hastings.org>
	<50697b2c0904252059j381cee39i65d5a9981ff739f4@mail.gmail.com>
Message-ID: <49F3E1CB.2070102@hastings.org>


Chris Rebert wrote:
> Except the ternary operator only allows for expressions, not
> statements; IIRC, GvR was somewhat reluctant about having any sort of
> ternary operator, so this further extension seems unlikely to me.
>   

The suffix form of "if" can also be found in generator expressions and 
list/set/dict comprehensions.  What all these constructs have in common 
is that they're expressions, which means they can be buried in the 
middle of other expressions.  And being able to bury an "if" in there 
too is a boon to the programmer--one that the "if" *statement* could not 
grant.

But this is a subtle point, easily lost on non-language-nerds.  I work 
with an excellent programmer who nevertheless didn't grasp the 
distinction, even as I attempted to explain it to him.  So, for most 
Python programmers, the language feels like "an if statement looks like 
this, and btw you can have ifs in the middles and ends of lines too 
sometimes".  I hardly think the suffix form of "if" to "break" and 
"continue" would be the conceptual straw that broke the back of the 
language's cognitive load.

As for GvR, I gave up trying to predict his reactions a long time ago.  
Though I admit I'd be very surprised if my proposal, or indeed any 
proposal, won favor; PEP 315 is long enough in the tooth I fear we are 
doomed to never find a majority.  (A majority being defined here as 
either 2/3 majority of python-dev voters, or indeed the lone BDFL.)  
Witness PEP 3103, the switch statement; I think we'd all like to have 
one, if we could only figure out how to spell it.

And I personally would value a switch statement more than do/while.


/larry/


From rrr at ronadam.com  Sun Apr 26 06:44:16 2009
From: rrr at ronadam.com (Ron Adam)
Date: Sat, 25 Apr 2009 23:44:16 -0500
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <49F3DB48.8050909@hastings.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>	<49F3CC0C.8000807@ronadam.com>
	<49F3DB48.8050909@hastings.org>
Message-ID: <49F3E6A0.70903@ronadam.com>



Larry Hastings wrote:
> Ron Adam wrote:
>> I'd prefer a bare "while:" with a more visible "break if" and possibly 
>> "continue if" expressions.
> 
> I do like "break if" and "continue if", unsurprisingly as I suggested 
> them in parallel.  I'm not sure about the "while:"; your "while: ... 
> continue if ; break" strikes down some of the clarity we're trying to 
> achieve here.

The bare "while:" can be a completely separate issue.


And...

     while True:
         ...
         continue if condition
         break

I'm not sure why you think that is less clear.  It's just something that 
becomes possible to do if you add 'continue if'.  <shrug>

The equivalent would be:

     while True:
         ...
         break if (not condition)

But this reverses the test and that could express what you are doing in a 
less clear way.  Being able to do it both ways is a good thing I think.



If you allow the if-else syntax to be used with flow control keywords then 
you have 6 possibilities.

       break if condition else pass
       break if condition else continue

       continue if condition else pass
       continue if condition else break

       pass if condition else break
       pass if condition else continue

It makes sense to shorten some of these..

       break if condition      # 'else pass' doesn't do anything
       continue if condition   # ""

I'm not sure the others are needed, though they may allow some problems to be 
expressed more explicitly.

The nice thing about extending python this way is these expressions may 
also work in for loops and possibly other places.

Because Python is based on practical need rather than just what is 
possible to do, we may only need the last two shorter versions above.

But I won't complain if all six of the longer versions above also work. ;-)

In any case, Raymond was asking for ideas that are consistent with Python's 
syntax, and I think these suggestions are.  The one inconsistent thing is 
that if-else expressions normally return a value, and in these cases they 
should either raise a syntax error or return None if they are used on the 
right-hand side of an expression.

Ron


From josiah.carlson at gmail.com  Sun Apr 26 07:06:14 2009
From: josiah.carlson at gmail.com (Josiah Carlson)
Date: Sat, 25 Apr 2009 22:06:14 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3B965.1090700@trueblade.com>
	<6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
Message-ID: <e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>

On Sat, Apr 25, 2009 at 7:39 PM, Raymond Hettinger <python at rcn.com> wrote:
>
> [Eric Smith]
>>
>> You might want to note in the PEP that the problem that's being solved is
>> known as the "loop and a half" problem, at least for your motivating case
>> #2.
>
> Thanks.  When I update the PEP, will add that link.
> Wanted to get people's reaction here first.
> Does the condition-at-the-top format work for everyone,
> or can someone think up a condition-at-the-bottom
> approach that doesn't suck with respect to existing
> Python syntax?

Condition at the bottom sucks for similar reasons as why ...
def foo(...):
    (many lines of code)
foo = classmethod(foo)

Having your condition, modification, etc., at the end is just plain
annoying.  I like my conditions up top.

I agree with you that the 3-suite do/while/else option is just plain
ugly.  I also agree that the "do while <condition>:" variants are
currently the prettiest, but it still doesn't feel quite right to me.

When confronted with these things, I usually go for the "while True"
or "while first or <condition>" versions, and throw a comment just
before the condition in the "while True" variant so my eye is drawn
there.
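
Spelled out with Raymond's sampling example (plain `random` here, purely
illustrative), the flag version relies on short-circuiting so `j` is
never read on the first pass:

```python
import random

random.seed(42)                    # make the example deterministic
selected = {0, 1, 2, 3, 4}
n = 10

first = True
while first or j in selected:      # short-circuit: j unread on pass one
    first = False
    j = int(random.random() * n)   # the "do" body, run at least once

print(j)   # some j that is not in selected
```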

 - Josiah


From steve at pearwood.info  Sun Apr 26 07:20:23 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Sun, 26 Apr 2009 15:20:23 +1000
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
	<e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
Message-ID: <200904261520.24408.steve@pearwood.info>

On Sun, 26 Apr 2009 03:06:14 pm Josiah Carlson wrote:
> Condition at the bottom sucks for similar reasons as why ...
> def foo(...):
>     (many lines of code)
> foo = classmethod(foo)
>
> Having your condition, modification, etc., at the end is just plain
> annoying.  I like my conditions up top.

But if the condition isn't tested until the bottom of the block, then 
putting it at the top is weird.


-- 
Steven D'Aprano


From josiah.carlson at gmail.com  Sun Apr 26 07:22:52 2009
From: josiah.carlson at gmail.com (Josiah Carlson)
Date: Sat, 25 Apr 2009 22:22:52 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <49F3D759.7030605@hastings.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>
	<49F3D759.7030605@hastings.org>
Message-ID: <e6511dbf0904252222i8f522ebh6a501d545dff65af@mail.gmail.com>

On Sat, Apr 25, 2009 at 8:39 PM, Larry Hastings <larry at hastings.org> wrote:
> Chris Rebert wrote:
>>
>> I don't see why the if-as-suffix is needed when we already have a
>> one-liner for such situations (e.g.):
>>
>> if i != '1': break
>>
>> It's much more uniform and only one character longer.
>>
>
> I certainly see your point.  Let me take it a step further: the "do: ...
> while <condition>" construct isn't needed, given that it's already
> expressible with "while True: ... if not <condition>: break".
>
> It's true, "break if <condition>" and "continue if <condition>" are
> redundant constructs.  But this debate is over refining our syntactic
> sugar--charting what is arguably a redundant construct.  Therefore proposing
> redundant constructs for the sake of clarity is on the table.  I think
> "break if <condition>" and "continue if <condition>" enhance readability;
> they make the control flow pop out at you more than "if <condition>: break"
> and "if <condition>: continue" do.  "break if" and "continue if" have the
> advantage of following established Python syntactic precedent.

FYI, this was proposed a few months ago (see the first post here:
http://mail.python.org/pipermail/python-ideas/2008-September/002083.html
).  I was and still am -1 on any variant of "continue if <condition>"
or "break if <condition>" for the same reasons I was then.  They are a
slippery slope (why not "(raise Exception) if <condition>"?), and they
hide the control flow to the left of the line where we are used to
seeing it on the right side of the line.

 - Josiah


From bruce at leapyear.org  Sun Apr 26 07:30:47 2009
From: bruce at leapyear.org (Bruce Leban)
Date: Sat, 25 Apr 2009 22:30:47 -0700
Subject: [Python-ideas] why not try without except?
In-Reply-To: <200904261225.03948.steve@pearwood.info>
References: <20090425103207.5e6479ab@o>
	<200904261225.03948.steve@pearwood.info>
Message-ID: <cf5b87740904252230y6f7ea81fhbf045fa30fbdc838@mail.gmail.com>

The idea of making it easy to blindly ignore all exceptions is not very
interesting.

On Sat, Apr 25, 2009 at 7:25 PM, Steven D'Aprano <steve at pearwood.info>wrote:

> On Sat, 25 Apr 2009 06:32:07 pm spir wrote:
>
> >    if hasattr(doc, 'footer')
> >       text += doc.footer
>
> A third alternative is to make sure that doc.footer always exists, even
> if it is only the empty string, and then just write:
>
> text += doc.footer
>
> In many cases, I prefer that. I don't like attributes which sometimes
> exist and sometimes don't.


It is a common case though, and convenient to have a simple way to resolve
it. Adding a statement-level ? operator would be a mistake in my opinion,
because it clobbers the entire statement rather than just the specific
operation. If only :-) there were a way to get an attribute while at the
same time specifying the default value if the attribute didn't exist, say
something like:

    getattr(var, attribute [, default] )

Of course, I think it would also make sense to have:

    getitem(var, index [, default] )
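As the smiley hints, the built-in getattr() already accepts exactly that optional third default argument; the item version doesn't exist as a built-in, but it can be sketched as a small helper (the name `getitem` here is hypothetical):

```python
class Doc:
    pass

doc = Doc()  # note: no 'footer' attribute

# getattr already supports a default as its third argument,
# so no AttributeError is raised here:
text = ""
text += getattr(doc, 'footer', '')

# A sketch of the proposed getitem() with a default value:
def getitem(container, key, default=None):
    try:
        return container[key]
    except (IndexError, KeyError):
        return default

assert getitem([10, 20], 5, 'missing') == 'missing'
assert getitem({'a': 1}, 'a') == 1
```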

--- Bruce
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090425/7ad4012d/attachment.html>

From george.sakkis at gmail.com  Sun Apr 26 08:22:31 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Sun, 26 Apr 2009 02:22:31 -0400
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <200904261520.24408.steve@pearwood.info>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
	<e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
	<200904261520.24408.steve@pearwood.info>
Message-ID: <91ad5bf80904252322p2ce89e25r1120294509bae442@mail.gmail.com>

On Sun, Apr 26, 2009 at 1:20 AM, Steven D'Aprano <steve at pearwood.info> wrote:
> On Sun, 26 Apr 2009 03:06:14 pm Josiah Carlson wrote:
>> Condition at the bottom sucks for similar reasons as why ...
>> def foo(...):
>>     (many lines of code)
>> foo = classmethod(foo)
>>
>> Having your condition, modification, etc., at the end is just plain
>> annoying.  I like my conditions up top.
>
> But if the condition isn't tested until the bottom of the block, then
> putting it at the top is weird.

"Weird" is an understatement; I'd rather say outright misleading.
There's nothing in "do ... while cond" that even hints that the
condition is not checked the first time.

George


From steve at pearwood.info  Sun Apr 26 08:42:36 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Sun, 26 Apr 2009 16:42:36 +1000
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
Message-ID: <200904261642.37008.steve@pearwood.info>

On Sun, 26 Apr 2009 11:00:26 am Raymond Hettinger wrote:

> Am working on PEP 315 again, simplifying the proposal by focusing on
> a more standard do-while loop.
>
> The two motivating cases are:
[snip]

A quick-and-dirty data point for you. Using Google code search, I found 
approx 80,000 hits for "lang:Pascal until" versus 129,000 hits 
for "lang:Pascal while". So the ratio of test-at-top to test-at-bottom 
loops is very roughly 3:2 in Pascal code indexed by Google.

(Disclaimer: both searches revealed a number of false positives. I've 
made no attempt to filter them out.)

For those who aren't familiar with Pascal, the test-at-top and 
test-at-bottom loops are spelled:

WHILE condition DO
  BEGIN
    suite;
  END;

and

REPEAT
  suite;
UNTIL condition;

WHILE exits the loop when condition is False, and REPEAT...UNTIL exits 
when condition is True.

So I think the first question we need to deal with is, should we be 
discussing do...while or do...until? Is there a consensus on the sense 
of the condition test?

while True:  # do...while, exit when condition is false
    suite
    if not condition: break

or 

while True:  # do...until, exit when condition is true
    suite
    if condition: break


Or do we doom this proposal by suggesting variant spellings that cover 
both cases?


> The challenge has been finding a syntax that fits well with the
> patterns in the rest of the language. It seems that every approach
> has its own strengths and weaknesses.
>
> 1) One approach uses the standard do-while name but keeps usual
> python style formatting by putting the condition at the top instead
> of the bottom where it is in C and Java:

I'm not entirely comfortable with writing the test at the top of the 
loop, but having it executed at the bottom of the loop. But I'd prefer 
it to no do-loop at all.

Oh, I should ask... will whitespace around the ellipsis be optional?


> 2) Another approach is to put the test at the end.  Something like:
>
>     do:
>         j = _int(random() * n)
>
>     :while j in selected
>
>     do:
>         j = _int(random() * n)
>         while j in selected
>
> These seem syntactically weird to me and feel more like typos than
> real python code. I'm sure there are many ways to spell the last
> line, but in the two years since I first worked on the PEP, I haven't
> found any condition-at-the-end syntax that FeelsRight(tm).

I don't like either of those variants. I agree, they feel strange. The 
only variant that I find natural is the Pascal-like:

do:
    suite
until condition

where the "until" is outdented to be level with the do. My vote goes for 
this, although I'm probably biased by my familiarity with Pascal. How 
does that feel to you?


I don't think I could live with

do:
    suite
while condition

even though it isn't strictly ambiguous, nor does it prevent nesting a 
while-loop inside the do-loop:

do:
    suite
    while flag:
        another_suite
while condition

But it looks disturbingly like an invalid line.


If outdenting is impossible, then I could live with it being indented:

do:
    suite
    until condition  # or while if you prefer


but what happens if there is indented code following the until/while? 
Syntax error, or does it just never get executed?


My preferences, in order of most-liked to least-liked, with my score:

(1) +1
do:
    suite
until condition # exit loop when condition is true

(2) +0.5
do...until condition: # exit loop when condition is true
    suite

(3) +0.5
do...while condition: # exit loop when condition is false
    suite

(4) +0
do:
    suite
    until/while condition
    mystery_suite  # what happens here?

(5) -0
The status quo, no do-loop at all.

(6) -0.5
do:
    suite
:while/until condition

(7) -1
do:
    suite
while condition




-- 
Steven D'Aprano


From steve at pearwood.info  Sun Apr 26 08:46:43 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Sun, 26 Apr 2009 16:46:43 +1000
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <91ad5bf80904252322p2ce89e25r1120294509bae442@mail.gmail.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<200904261520.24408.steve@pearwood.info>
	<91ad5bf80904252322p2ce89e25r1120294509bae442@mail.gmail.com>
Message-ID: <200904261646.44029.steve@pearwood.info>

On Sun, 26 Apr 2009 04:22:31 pm George Sakkis wrote:
> "Weird" is an understatement; I'd rather say outright misleading.
> There's nothing in "do ... while cond" that even hints that the
> condition is not checked the first time.

Well, not quite. The ellipsis is a hint that there's something unusual 
going on. I agree that the meaning of the ellipsis is not intuitive, 
but that's not important. There's nothing intuitive about loops in the 
first place :)

I could live with ellipsis, although it's not my first preference. I'd 
call it a syntactic wart, like writing a tuple of one item as 
(object,). 


-- 
Steven D'Aprano


From steve at pearwood.info  Sun Apr 26 09:10:25 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Sun, 26 Apr 2009 17:10:25 +1000
Subject: [Python-ideas] why not try without except?
In-Reply-To: <cf5b87740904252230y6f7ea81fhbf045fa30fbdc838@mail.gmail.com>
References: <20090425103207.5e6479ab@o>
	<200904261225.03948.steve@pearwood.info>
	<cf5b87740904252230y6f7ea81fhbf045fa30fbdc838@mail.gmail.com>
Message-ID: <200904261710.25703.steve@pearwood.info>

On Sun, 26 Apr 2009 03:30:47 pm Bruce Leban wrote:

> If only :-) there were a way to get an attribute while
> at the same time specifying the default value if the attribute didn't
> exist, say something like:
>
>     getattr(var, attribute [, default] )

But the OP isn't suggesting default values. See, for example, his 
discussion of ignoring any errors when the user doesn't supply a 
parameter to a function. If all he wanted was a default value, we can 
do that already.


> Of course, I'd also it would make sense to also have:
>
>     getitem(var, index [, default] )

Hmmm... interesting... 

Of course you can already do this:

    var[index:index+1] or [default]

if var is a list, and 

    var.get(key, default)

if it is a dictionary.
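Concretely, both of those existing spellings already behave like a default-valued lookup (a minimal sketch; the names are made up for the demo):

```python
var = [1, 2, 3]
index = 5
default = 0

# For a list, an out-of-range slice is simply empty, so the
# `or [default]` fallback kicks in:
item = (var[index:index + 1] or [default])[0]
assert item == 0

# For a dictionary, dict.get takes the default directly:
d = {'a': 1}
assert d.get('b', default) == 0
assert d.get('a', default) == 1
```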
 


-- 
Steven D'Aprano


From rhamph at gmail.com  Sun Apr 26 09:39:24 2009
From: rhamph at gmail.com (Adam Olsen)
Date: Sun, 26 Apr 2009 01:39:24 -0600
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
Message-ID: <aac2c7cb0904260039r4a5087bbpd9846bf72247234d@mail.gmail.com>

On Sat, Apr 25, 2009 at 7:00 PM, Raymond Hettinger <python at rcn.com> wrote:
> Also, I've punted on the while-cond-with-assignment case that is being
> handled by
> other proposals such as:
>
>    while f.read(20) as data != '':
> ? ? ? ...

We can actually already do this, via a little used option of iter():

for data in iter(lambda: f.read(20), ''):
    ....

The primary limitation is the need to use lambda (or
functools.partial, but it's no better).  The next limitation is that
even with lambda you can only use an expression, not a statement.

You can bite the bullet and define a function first, but that ends up
being better accomplished with a generator, and in these cases it's
often better still to use a while True: loop.  Back to square one.
I'm not sure all the reorderings are actually more *readable*, rather
than just more appealing to write.
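A self-contained illustration of the two-argument iter() form, using io.StringIO as a stand-in for a real file (assumed here just for the demo):

```python
import io

f = io.StringIO("abcdefghij" * 5)  # stand-in for an open file

chunks = []
# iter(callable, sentinel): calls f.read(20) repeatedly and yields
# each result, stopping when it returns the sentinel '' (end of file).
for data in iter(lambda: f.read(20), ''):
    chunks.append(data)

assert ''.join(chunks) == "abcdefghij" * 5
assert all(len(c) <= 20 for c in chunks)
```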


-- 
Adam Olsen, aka Rhamphoryncus


From stefan_ml at behnel.de  Sun Apr 26 09:45:08 2009
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Sun, 26 Apr 2009 09:45:08 +0200
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <49F3D759.7030605@hastings.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>		<49F3C863.9040606@hastings.org>	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>
	<49F3D759.7030605@hastings.org>
Message-ID: <gt13e4$jhf$1@ger.gmane.org>

Larry Hastings wrote:
> Chris Rebert wrote:
>> I don't see why the if-as-suffix is needed when we already have a
>> one-liner for such situations (e.g.):
>>
>> if i != '1': break
>>
>> It's much more uniform and only one character longer.
>>   
> 
> I certainly see your point.  Let me take it a step further: the "do: ...
> while <condition>" construct isn't needed, given that it's already
> expressible with "while True: ... if not <condition>: break".

Hmm, so why add another keyword like "do", instead of just giving "while" a
default condition "True", so that you could write

    while:
        state = do_stuff_here()
        if predicate(state): break

I think that the missing condition at the top makes it pretty clear that
the end condition must be inside the loop.

Stefan



From stefan_ml at behnel.de  Sun Apr 26 09:47:17 2009
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Sun, 26 Apr 2009 09:47:17 +0200
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <49F3CC0C.8000807@ronadam.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3CC0C.8000807@ronadam.com>
Message-ID: <gt13i5$jhf$2@ger.gmane.org>

Ron Adam wrote:
> I'd prefer a bare "while:" with a more visible "break if" and possibly
> "continue if" expressions.
> 
> Visualize the while, continue, and break with syntax highlighting.
> 
>      while:
>          j = _int(random() * n)
>          break if j not in selected

Yep, I should read threads to the end before bringing in 'new' ideas...

Stefan



From denis.spir at free.fr  Sun Apr 26 10:51:08 2009
From: denis.spir at free.fr (spir)
Date: Sun, 26 Apr 2009 10:51:08 +0200
Subject: [Python-ideas] why not try without except?
In-Reply-To: <200904261105.57286.steve@pearwood.info>
References: <20090425103207.5e6479ab@o> <20090425122322.7f7da364@o>
	<gsv3dn$cjp$1@ger.gmane.org>
	<200904261105.57286.steve@pearwood.info>
Message-ID: <20090426105108.254c2ee2@o>

Le Sun, 26 Apr 2009 11:05:56 +1000,
Steven D'Aprano <steve at pearwood.info> s'exprima ainsi:

> On Sat, 25 Apr 2009 11:32:39 pm Georg Brandl wrote:
> 
> > And while we're at it, let's introduce
> >
> > on error resume next:
> >     foo()
> >     bar()
> >     baz()
> 
> Is that meant to be a serious suggestion or a sarcastic rejoinder? 

It is certainly sarcastic -- never mind -;) Georg could have saved some typing by reading comments and examples...

What I had in mind (and commented), is precisely not a syntactic construct able to catch any exception for a whole block of statements. E.g. an example like
   option self.fill(self.fillColor)
can only catch AttributeError for 'fillColor' or for 'fill'.

Now I realise that what I miss is a kind of handy, clear, and not misusable idiom for coping with undefined variables/attributes/parameters.
Otherwise one needs to fall back to the reflexive-check-instead-of-try idiom:
   if hasattr(self,'fillColor'):
      self.fill(self.fillColor)
or use try... except ... pass.
'None' is also often (mis)used for this. People aware of the issues with None will rather have a custom undef-meaning object instead. Right?
   UNDEF = object()
But this also requires checking instead of trying, doesn't it? It also requires binding a default UNDEF value to names that, conceptually speaking, can well remain undefined (UNDEF is still a value, as well as None).

This is rather close to the use of Haskell's optional type (I guess it's called 'Maybe' -- checked, see e.g. http://en.wikibooks.org/wiki/Haskell/Hierarchical_libraries/Maybe). And I think Scala has something similar, too. But these special types fit well the semantics of statically-typed and compile-time-checked languages.

One issue with None is that it can be used as a valid value. A workaround may be a user-unsettable undef-meaning builtin value. Or a builtin isDef() function. Still, both of these methods require checking. While the 'option' (or '?') proposal allows trying-instead-of-checking. 
Why not restrict it to a set of sensible exception types -- for instance (this is what I would like):

   option foo
or
   ? foo
<==>
   try:
      foo
   except (NameError, AttributeError):
      pass

Moreover: Haskell's Maybe type can be used for return values. I really wonder about python action-functions (that do not produce a result) returning None instead of nothing at all. This is another story, indeed, but I see it closely related at the meaning level.
   searched = seq.find(....)	# may not return anything, not even None
   option print searched
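The proposed 'option' semantics could be approximated today with a small helper that takes a callable (a sketch only; the `option` function name is hypothetical, and the exception set is the one proposed above):

```python
def option(action, *args, **kwargs):
    """Run action, silently ignoring NameError and AttributeError --
    the exception set proposed for the 'option' statement."""
    try:
        return action(*args, **kwargs)
    except (NameError, AttributeError):
        return None

class Shape:
    pass  # note: no fillColor attribute and no fill method

s = Shape()
# Equivalent of the proposed `option self.fill(self.fillColor)`:
result = option(lambda: s.fill(s.fillColor))
assert result is None  # the AttributeError was swallowed

# A callable that succeeds passes its value through unchanged:
assert option(lambda: 42) == 42
```

The lambda is needed so the attribute lookup happens inside the try block rather than at call time, which is exactly the awkwardness the syntax proposal aims to remove.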

Denis
------
la vita e estrany


From steve at pearwood.info  Sun Apr 26 11:05:01 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Sun, 26 Apr 2009 19:05:01 +1000
Subject: [Python-ideas] why not try without except?
In-Reply-To: <20090426105108.254c2ee2@o>
References: <20090425103207.5e6479ab@o>
	<200904261105.57286.steve@pearwood.info>
	<20090426105108.254c2ee2@o>
Message-ID: <200904261905.02141.steve@pearwood.info>

On Sun, 26 Apr 2009 06:51:08 pm spir wrote:

> Now I realise that what I miss is a kind of handy, clear, and not
> misuseable idiom for coping with undefined
> variables/attributes/parameters. 

The best way to deal with undefined variables is to make sure that there 
never are any. In other words: an undefined variable is a bug. Fix the 
bug, don't hide it.



-- 
Steven D'Aprano


From denis.spir at free.fr  Sun Apr 26 11:52:40 2009
From: denis.spir at free.fr (spir)
Date: Sun, 26 Apr 2009 11:52:40 +0200
Subject: [Python-ideas] why not try without except?
In-Reply-To: <200904261905.02141.steve@pearwood.info>
References: <20090425103207.5e6479ab@o>
	<200904261105.57286.steve@pearwood.info>
	<20090426105108.254c2ee2@o>
	<200904261905.02141.steve@pearwood.info>
Message-ID: <20090426115240.07047ca8@o>

Le Sun, 26 Apr 2009 19:05:01 +1000,
Steven D'Aprano <steve at pearwood.info> s'exprima ainsi:

> On Sun, 26 Apr 2009 06:51:08 pm spir wrote:
> 
> > Now I realise that what I miss is a kind of handy, clear, and not
> > misuseable idiom for coping with undefined
> > variables/attributes/parameters. 
> 
> The best way to deal with undefined variables is to make sure that there 
> never are any. In other words: an undefined variable is a bug. Fix the 
> bug, don't hide it.
 
I understand your position, indeed.
Mine is: optional things and optional actions are basic and common notions, even more so in modeling / programming fields. Are they important enough (for you) to require a dedicated idiom in a (your favorite) PL?

class Shape(object):
   def __init__(self, ..., ?fill_color):
      .......
      ?self.fill_color = fill_color
   def show(self):
      .......
      ?self.fill(self.fill_color)

Moreover, imagine NoReturnValueError instead of None; raised by python when a func call is used as an expression.
   x = f()	# may raise NoReturnValueError
		# will never bind None except explicitely returned
This would first avoid numerous bugs, often difficult to catch because silent:
   y = x.sort()	# awa!
(*) (**)

Also one could then write, for instance:

   def show(self):
      .......
      ?self.fill(self.fill_color)
      ?self.showTransforms(user.getTransforms())

(if NoReturnValueError belongs to the set of error types caught by optional actions)

OK, won't fight for this anyway ;-)


denis


(*) I would also love a way to prevent the opposite bug:
   s.strip()   # awa!
(**) Actually, I must be a fan of Pascal's procedure vs function distinction. Find it sensible and meaningful.

------
la vita e estrany


From greg.ewing at canterbury.ac.nz  Sun Apr 26 12:18:16 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 26 Apr 2009 22:18:16 +1200
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <91F251EF5E864ED29BCA5182DE76F0F0@RaymondLaptop1>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
	<91F251EF5E864ED29BCA5182DE76F0F0@RaymondLaptop1>
Message-ID: <49F434E8.6040804@canterbury.ac.nz>

I like the structure of the exising PEP 315 proposal, but
not the selection of keywords. It seems confused -- what
exactly are you repeating? The part after the "do", the
part after the "while", or both?

My suggestion is

   while:
     <statements>
   gives <condition>:
     <more statements>

There could also be a degenerate form without
the second suite:

   while:
     <statements>
   gives <condition>

-- 
Greg



From greg.ewing at canterbury.ac.nz  Sun Apr 26 13:23:46 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 26 Apr 2009 23:23:46 +1200
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <91F251EF5E864ED29BCA5182DE76F0F0@RaymondLaptop1>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
	<91F251EF5E864ED29BCA5182DE76F0F0@RaymondLaptop1>
Message-ID: <49F44442.5050101@canterbury.ac.nz>

Raymond Hettinger wrote:

> Though I would like a fair hearing for the "do ... while <cond>:" proposal.
> To my eyes, it has clear meaning

I think that's only because you already know what it's
supposed to mean.

To me, it's very far from clear that the "..." means
"the expression after this really goes way down there
after the suite".

Also, anything which only addresses test-at-bottom and
not test-half-way-through only covers a very small
proportion of use cases, IMO.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Sun Apr 26 13:23:55 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 26 Apr 2009 23:23:55 +1200
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <49F3CC0C.8000807@ronadam.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3CC0C.8000807@ronadam.com>
Message-ID: <49F4444B.3070605@canterbury.ac.nz>

Ron Adam wrote:

> I'd prefer a bare "while:" with a more visible "break if" and possibly 
> "continue if" expressions.
> 
>      while:
>          j = _int(random() * n)
>          break if j not in selected

But putting "if" after "break" doesn't make the exit
point any more visible to my eyes. If anything, it
makes it *less* visible.

I think there's another thing at work here, too. Having
an outdented clause for the exit point makes it very
obvious that it is part of the syntax of the loop
structure, and consequently that there is only *one*
exit point from the loop. This is in contrast to
break statements, which are much less disciplined and
can occur more than once and inside nested statements.

-- 
Greg


From greg.ewing at canterbury.ac.nz  Sun Apr 26 13:38:07 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Sun, 26 Apr 2009 23:38:07 +1200
Subject: [Python-ideas] why not try without except?
In-Reply-To: <20090426105108.254c2ee2@o>
References: <20090425103207.5e6479ab@o> <20090425122322.7f7da364@o>
	<gsv3dn$cjp$1@ger.gmane.org> <200904261105.57286.steve@pearwood.info>
	<20090426105108.254c2ee2@o>
Message-ID: <49F4479F.6040307@canterbury.ac.nz>

spir wrote:

>    option self.fill(self.fillColor)
> can only catch AttributeError for 'fillColor' or for 'fill'.

You mean that 'option' *only* catches AttributeError?

What makes you think that AttributeError is the most
common exception to want to catch in these kinds of
situations? Seems to me you're just as likely to
want to catch one of ValueError, IndexError, KeyError,
etc.

Also, I don't find your fillColor example very
convincing. If I have a class that may or may not
have a fillColor, I set the fillColor to None when
it doesn't have one -- I don't just leave the
fillColor attribute undefined. The code then
becomes

   if self.fillColor:
     self.fill(self.fillColor)

or if I want to minimise attribute lookups,

   color = self.fillColor
   if color:
     self.fill(color)

-- 
Greg


From steve at pearwood.info  Sun Apr 26 13:38:00 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Sun, 26 Apr 2009 21:38:00 +1000
Subject: [Python-ideas] why not try without except?
In-Reply-To: <20090426115240.07047ca8@o>
References: <20090425103207.5e6479ab@o>
	<200904261905.02141.steve@pearwood.info>
	<20090426115240.07047ca8@o>
Message-ID: <200904262138.00705.steve@pearwood.info>

On Sun, 26 Apr 2009 07:52:40 pm spir wrote:

> Mine is: optional things and optional actions are basic and common
> notions. Even more in modelizing / programming fields. Are they
> important enough (for you) to require a dedicated idiom in a (your
> favorite) PL?
>
> class Shape(object):
>    def __init__(self, ..., ?fill_color)
>       .......
>       ?self.fill_color = fill_color
>    def show(self):
>       .......
>       ?self.fill(self.fill_color)

In this case, fill_colour shouldn't be optional. There are two better 
ways to deal with it. Here's one:

class UnfilledShape(object):  # no fill colour at all.
    ...

class FilledShape(UnfilledShape):  # has a fill colour
    ...



or if you prefer:

class Shape(object):
    def __init__(self, ..., fill_color=TRANSPARENT):
        ...


Either solution is better than writing code that is filled with checks 
for missing values. Whether you manually check it yourself, or have 
syntax support, you still make the code more complicated.



-- 
Steven D'Aprano


From denis.spir at free.fr  Sun Apr 26 14:17:20 2009
From: denis.spir at free.fr (spir)
Date: Sun, 26 Apr 2009 14:17:20 +0200
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <49F3CC0C.8000807@ronadam.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3CC0C.8000807@ronadam.com>
Message-ID: <20090426141720.230a9c85@o>

Le Sat, 25 Apr 2009 21:50:52 -0500,
Ron Adam <rrr at ronadam.com> s'exprima ainsi:

> Visualize the while, continue, and break with syntax highlighting.
> 
>       while:
>           j = _int(random() * n)
>           break if j not in selected
> 
> Or if the reverse comparison is preferred you could do...
> 
>       while:
>           j = _int(random() * n)
>           continue if j in selected
>           break

But

(1) This does not give any advantage compared to existing syntax
   if condition: break | continue

(2) It does not solve the problem which, I guess, is to express the loop's logic obviously -- meaning in the header (or footer) instead of hidden somewhere in the body -- and to nicely avoid numerous "while 1:" and "break" as a side-effect, which is what Raymond's proposals provide.

What is not obvious at all is that:
    do ... while j in selected:
        j = _int(random() * n)
means the assignment will be done at least once (and no NameError thrown). Where/how is it said? The above code is supposed to mean something like:

    do once then while j in selected:
        j = _int(random() * n)

But maybe it's only me.
Footer solutions (condition at end of loop) seem not to have Raymond's favor; still they are obvious in comparison:

  do:
        j = _int(random() * n)
  <your favored end condition syntax> j in selected

Denis
------
la vita e estrany


From denis.spir at free.fr  Sun Apr 26 14:30:26 2009
From: denis.spir at free.fr (spir)
Date: Sun, 26 Apr 2009 14:30:26 +0200
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <200904261642.37008.steve@pearwood.info>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<200904261642.37008.steve@pearwood.info>
Message-ID: <20090426143026.48f6a424@o>

Le Sun, 26 Apr 2009 16:42:36 +1000,
Steven D'Aprano <steve at pearwood.info> s'exprima ainsi:

> The only variant that I find natural is the Pascal-like:
> 
> do:
>     suite
> until condition
> 
> where the "until" is outdented to be level with the do. My vote goes for 
> this, although I'm probably biased by my familiarity with Pascal. How 
> does that feel to you?

Same for me.

> I don't think I could live with
> 
> do:
>     suite
> while condition

I could survive it because it is linked to the "do:" header. But the real drawback for me is that it should rather express the exit-condition, because it is placed at the end.
Anyway, I would support the "while" version if "until" gets too much opposition (for introducing 2 new keywords instead of only 'do').
Both are much better imo than the header versions.

Denis
------
la vita e estrany


From larry at hastings.org  Sun Apr 26 14:31:22 2009
From: larry at hastings.org (Larry Hastings)
Date: Sun, 26 Apr 2009 05:31:22 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <20090426011259.58c52c9e@bhuda.mired.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>	<49F3C863.9040606@hastings.org>	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>	<49F3D759.7030605@hastings.org>
	<20090426011259.58c52c9e@bhuda.mired.org>
Message-ID: <49F4541A.30704@hastings.org>


Mike Meyer wrote:
>> It's true, "break if <condition>" and "continue if <condition>" are 
>> redundant constructs.  But this debate is over refining our syntactic 
>> sugar--charting what is arguably a redundant construct.  
>>     
> No, it isn't. Well, it wasn't originally, but got hijacked into being
> that. More on the original in a minute.
>   
[...]
> Minor spelling variants (which is what these are)
> are pretty much poison as far as readability is concerned, and are
> unPythonic.
>
> The original proposal - while it was syntactic sugar of sorts - dealt
> with a real issue: Python doesn't have a natural loop construct for
> the common case of having a conditional at the bottom. The existing
> conditional loop can be coerced into that role in a couple of ways,
> but all of them have problems.
>   

I don't follow the distinction you draw.  In what way is do-while not 
straight syntactic sugar?  We can already express the construct with 
"while True: ... if not <expression>: break".  There's nothing unnatural 
about spelling it this way, and it's worked for longer than either of us 
have been using Python.

So, from my perspective, the proposal is *asking* for redundant syntax.
And therefore describing my suggestions as "unPythonic" and "pure
poison" because they're redundant is sophistry.

> The proposed solution puts the conditional at the top, which is pretty
> much mandated by existing python control flow syntax: control
> constructs appear at the top of blocks, control the code indented
> beneath them, and they are terminated by a dedent (yes, I'm ignoring
> statement stacking). Unfortunately, this solution feels strange, what
> with the conditional at the wrong end of the controlled block.
>
> A syntax that puts the conditional at the bottom end of the controlled
> block either uses a dedent to signal the end of the control block,
> which clashes with existing constructs, or doesn't use such a dedent,
> which means it's trivially turned into conditional in the middle
> construct. Which means - as we've just seen - that it's in danger of
> being a spelling variant of the existing solutions.
>
> What's really needed is a completely different looping construct.

I don't agree with this conclusion.  We're talking about "do/while", 
which is traditionally different from other looping structures.  You 
call that "clashing", I call that its "raison d'etre".

Having slept on it, I think we won't do any better than the usual 
approach for cooking up new Python syntax: view C syntax through 
Python-colored glasses.  I claim this transformation works as follows:
  * Start with the C form, in K&R spelling, with curly braces.
  * Replace the opening curly brace with a colon.
  * Remove the closing curly brace entirely, and any non-newline 
whitespace after it.
  * Remove the parentheses around the conditional expression.
  * Remove semicolons.

Applying that transformation to C's "if", "if/else", and "while" gives 
us the Python syntax.  And it shows us the way to write "for".

So let's apply it to the C do/while loop.  This:

    do {
        something();
    } while (condition);

As viewed through my "Python glasses" becomes this:

    do:
        something()
    while condition

Yes, it's surprising that there's no colon on the "while" line.  But we 
aren't indenting a new code block, so we shouldn't have a trailing 
colon.  And Python has never used a leading colon to mean "outdent but 
continue the construct"; if it did we'd spell if-else as "if 
<condition>: ... :else:".

If it looks weird to you to have a "while" not followed by a colon... 
good!  do/while is a weird construct.  We should *expect* it to look as 
weird in Python as it does in C.

I therefore assert this syntax is optimal; certainly I doubt anyone will 
suggest one I'd like more.


I still like "break if" and "continue if", but I now see those as 
separate.  And I should not be surprised if they are henceforth 
ignored--as you point out they are wholly redundant.  I think they're 
pleasing in their redundancy but I'm surely in the minority.


/larry/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090426/ee45fcbb/attachment.html>

From denis.spir at free.fr  Sun Apr 26 14:46:07 2009
From: denis.spir at free.fr (spir)
Date: Sun, 26 Apr 2009 14:46:07 +0200
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3B965.1090700@trueblade.com>
	<6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
	<e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
Message-ID: <20090426144607.1fee23de@o>

On Sat, 25 Apr 2009 22:06:14 -0700,
Josiah Carlson <josiah.carlson at gmail.com> wrote:

> Condition at the bottom sucks for similar reasons as why ...
> def foo(...):
>     (many lines of code)
> foo = classmethod(foo)
> 
> Having your condition, modification, etc., at the end is just plain
> annoying.  I like my conditions up top.

You're wrong:

> def foo(...):
>     (many lines of code)
> foo = classmethod(foo)

is bad syntax only because nothing tells you that, to understand foo's meaning properly, you first need to take into account something written after its def.

Whereas in the following

do:
   <suite>
until <condition>

(or any footer variant) the header "do:" warns you about the loop's logic: loop at least once, then check for an exit condition -- logically written at the loop's exit point ;-)

If the real issue you have in mind is that the loop's body may not fit inside the editor window, well...

Denis
------
la vita e estrany


From denis.spir at free.fr  Sun Apr 26 15:57:01 2009
From: denis.spir at free.fr (spir)
Date: Sun, 26 Apr 2009 15:57:01 +0200
Subject: [Python-ideas] why not try without except?
In-Reply-To: <200904262138.00705.steve@pearwood.info>
References: <20090425103207.5e6479ab@o>
	<200904261905.02141.steve@pearwood.info>
	<20090426115240.07047ca8@o>
	<200904262138.00705.steve@pearwood.info>
Message-ID: <20090426155701.16af7dfa@o>

On Sun, 26 Apr 2009 21:38:00 +1000,
Steven D'Aprano <steve at pearwood.info> wrote:

> On Sun, 26 Apr 2009 07:52:40 pm spir wrote:
> 
> > Mine is: optional things and optional actions are basic and common
> > notions. Even more in modelizing / programming fields. Are they
> > important enough (for you) to require a dedicated idiom in a (your
> > favorite) PL?
> >
> > class Shape(object):
> >    def __init__(self, ..., ?fill_color)
> >       .......
> >       ?self.fill_color = fill_color
> >    def show(self):
> >       .......
> >       ?self.fill(self.fill_color)
> 
> In this case, fill_colour shouldn't be optional. There are two better 
> ways to deal with it. Here's one:
> 
> class UnfilledShape(object):  # no fill colour at all.
>     ...
> 
> class FilledShape(UnfilledShape):  # has a fill colour
>     ...
> 
> 
> 
> or if you prefer:
> 
> class Shape(object):
>     def __init__(self, ..., fill_color=TRANSPARENT):
>         ...
> 
> 
> Either solution is better than writing code that is filled with checks 
> for missing values. Whether you manually check it yourself, or have 
> syntax support, you still make the code more complicated.

I like your solutions and I sincerely thank you for them.
Still, consider this:
There are numerous ways to cope with such "optional things": your subclassing above, using None, using a custom sentinel object, checking before trying, try..except..pass, and so on. But all of them are custom expressions of the "optional" concept, which is not a weird notion at all: it is omnipresent in everyday life as well as in many programs.
As Python and most PLs have no builtin idiom matching it, we are not really aware of it (myself first, before this thread made it clear). Especially because we can use different constructs for it depending on the specific case, it does not come up in a unique form.
But now I rather think that if we had learnt Python with such an idiom, we would use it regularly.

As for your remark "In this case, fill_colour shouldn't be optional.": yes, it should! It is precisely the point of an optional parameter (in the model) to be optional (in the code). There is no natural law of CS saying this is A-Bad-Thing. On the contrary, it is always better to have the program mirror what it is intended to express (*). Nothing will explode if I code the Shape class as above.
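For comparison, the default-value style proposed earlier in the thread can be sketched like this (TRANSPARENT and the list-returning show() are invented stand-ins for real drawing code):

```python
TRANSPARENT = object()  # sentinel default meaning "no fill requested"

class Shape:
    def __init__(self, fill_color=TRANSPARENT):
        self.fill_color = fill_color

    def show(self):
        """Stand-in for drawing: returns the list of fills actually applied."""
        painted = []
        if self.fill_color is not TRANSPARENT:
            painted.append(self.fill_color)  # would call self.fill(...) here
        return painted
```

With this, `Shape()` and `Shape("red")` both work with no per-call-site checks for a missing value.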

Denis
------

(*) This my personal Zeroest Law Of Programming ;-)

la vita e estrany


From stephen at xemacs.org  Sun Apr 26 18:36:29 2009
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Mon, 27 Apr 2009 01:36:29 +0900
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <20090426144607.1fee23de@o>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3B965.1090700@trueblade.com>
	<6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
	<e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
	<20090426144607.1fee23de@o>
Message-ID: <87hc0bl6cy.fsf@uwakimon.sk.tsukuba.ac.jp>

spir writes:

 > do:
 >    <suite>
 > until <condition>
 > 
 > (or any footer variant) the header "do:" warns you about the loop's
 > logic: loop at least once, then check for an exit condition --
 > logically written at the loop's exit point ;-)
 > 
 > If the real issue you have in mind is that the loop's body may not
 > fit inside the editor window, well...

Not for me.  The real issue is that people think top-down, which has a
double meaning.  First, we write sequences of statements from
top-to-bottom on the page, and read them the same way.  Second, we
construct applications by starting with a high-level abstraction, and
gradually add complexity by implementing lower and lower level
functionality.[1]

Note that C, and many other languages, implement some kind of include
functionality in large part to allow you to declare many identifiers
*early* but without interfering with the correspondence of top-down
*reading habits* to top-down *design*.

So as I'm reading down the page, it's natural for the structured
concept (high level decision of what to do) to be above the detailed
implementation (lower level decisions about how to do it).  Having the
complete condition available at the top of a structure is something
I've always wanted from do ... while or repeat ... until (but didn't
know that until Raymond pointed it out<wink>), because it's not enough
to know that this is a structure.  Rather, having the condition
available tells me (in many cases) whether this is a normally-once,
occasionally many, "retry" kind of function, or a typically
multi-iteration task that just happens to need to always be done at
least once.  That often affects my perception of the code enough to
demand rereading if the condition is of the "wrong" kind based on my
prior expectation.

YMMV of course.

Footnotes: 
[1]  Yes, I'm aware that time-and-motion studies of programmers show
that few programmers actually write programs that way.  In fact most
programmers, including excellent ones, switch back and forth between
top-down, bottom-up, and other modes (inside-out?).  What's important
is that in many cases it makes it much easier to read and understand
somebody else's program.



From solipsis at pitrou.net  Sun Apr 26 19:26:31 2009
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Sun, 26 Apr 2009 17:26:31 +0000 (UTC)
Subject: [Python-ideas] Updating PEP 315:  do-while loops
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
	<91F251EF5E864ED29BCA5182DE76F0F0@RaymondLaptop1>
Message-ID: <loom.20090426T171915-61@post.gmane.org>

Raymond Hettinger <python at ...> writes:
> 
> Yes. Absolutely.
> 
> Though I would like a fair hearing for the "do ... while <cond>:" proposal.

I find it disturbing. First, the condition appears on top but is only evaluated
at the end of the loop. Second, it's not usual for syntactical tokens to be
composed of several words ("do ... while" versus "while", "if", "for", etc.).
Third, it introduces a semi-keyword ("do") which won't be used anywhere else,
and abuses an existing object ("...") as syntactical glue.

I'm in favor of not trying to solve this particular "problem". Keeping the
grammar simple is valuable in itself (look at Javascript, C++...).

Regards

Antoine.




From grosser.meister.morti at gmx.net  Sun Apr 26 19:44:17 2009
From: grosser.meister.morti at gmx.net (=?ISO-8859-1?Q?Mathias_Panzenb=F6ck?=)
Date: Sun, 26 Apr 2009 19:44:17 +0200
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
Message-ID: <49F49D71.8020803@gmx.net>

Why not:

loop:
	...
	if cond: break


where "loop:" is short for "while True:".


From denis.spir at free.fr  Sun Apr 26 20:04:37 2009
From: denis.spir at free.fr (spir)
Date: Sun, 26 Apr 2009 20:04:37 +0200
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <87hc0bl6cy.fsf@uwakimon.sk.tsukuba.ac.jp>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3B965.1090700@trueblade.com>
	<6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
	<e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
	<20090426144607.1fee23de@o>
	<87hc0bl6cy.fsf@uwakimon.sk.tsukuba.ac.jp>
Message-ID: <20090426200437.4c15dd77@o>

On Mon, 27 Apr 2009 01:36:29 +0900,
"Stephen J. Turnbull" <stephen at xemacs.org> wrote:

I do not understand the resistance to the "footer" options; there must be something I am not seeing. So, in case a header option wins the game, my favored pattern is:

until condition:
   <suite>

I think
* 'until' expresses an exit condition far better than 'while' (even if this advantage is largely undermined by the fact that it is placed on top)
* in this case(*) we don't need any weird device like '...' (I say 'weird' because the sign itself has nothing to do with the meaning we want it to carry).

(*) First because 'until' is new in Python. Second because it carries a sense of 'end'.

> So as I'm reading down the page, it's natural for the structured
> concept (high level decision of what to do) to be above the detailed
> implementation (lower level decisions about how to do it).  Having the
> complete condition available at the top of a structure is something
> I've always wanted from do ... while or repeat ... until (but didn't
> know that until Raymond pointed it out<wink>), because it's not enough
> to know that this is a structure. 

Sure, though it certainly depends on individual (largely unconscious) strategies of programming and parsing. Anyway.

How important is this really in practice?
* Consider that to properly understand if..(else), if...elif...elif, try...except, for...(else) blocks, etc... we need to do the same kind of mental job. Who complains about it?
* Worse: Many of these blocks have _optional_ clauses. Even worse: you're not even warned about it by the header. This does not trouble me so much...
* Usually the loop fits in a page, so that your eyes can jump forward. (Otherwise there are other means, including folding and refactoring ;-)

I am not joking.  I do not find

do:
   ...
   ...
   ...
until cond

more difficult to parse than for instance

if cond:
   ...
   ...
   ...
elif cond:
   ...
   ...
   ...
elif cond:
   ...
   ...
   ...

Anyway. 
Denis
------
la vita e estrany


From larry at hastings.org  Sun Apr 26 22:20:25 2009
From: larry at hastings.org (Larry Hastings)
Date: Sun, 26 Apr 2009 13:20:25 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <20090426144255.21223d2a@bhuda.mired.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>	<49F3C863.9040606@hastings.org>	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>	<49F3D759.7030605@hastings.org>	<20090426011259.58c52c9e@bhuda.mired.org>	<49F4541A.30704@hastings.org>
	<20090426144255.21223d2a@bhuda.mired.org>
Message-ID: <49F4C209.8090804@hastings.org>


Mike Meyer wrote:
> This is just arguing about the meaning of "syntactic sugar". In some
> sense, almost all control constructs are syntactic sugar, as all you
> really need is lambda and if
> (http://en.wikipedia.org/wiki/Lambda_Papers). However, anything that
> lets me reduce the line count by removing boilerplate or - even
> better, duplicate code - moves out of that category for me. Please
> feel free to disagree.

Then by your own definition do/while has yet to move out of the category 
of "syntactic sugar".  do/while does not reduce line count, and if it 
could be said to remove boilerplate or duplicate code it does so to a 
vanishingly small degree.

So what's left?  My working definition of "syntactic sugar" is: 
"likely-redundant syntax that makes expressing a program more appealing 
somehow".  I think do/while is syntactic sugar.  Is it worth making the 
language bigger to add it?  Particularly when we can't agree on what 
would be good syntax?  The fact that we're still having the debate 
implies... well, that people like arguing on the internet I guess.  But 
do/while can be found in Pascal, and C / C++, and all the C clones 
(Java, C#) cloned that too, so perhaps it has merit.

Anyway, this is one reason why I like the "do:" statement I proposed.  
(The other one--the one where I also proposed "continue if" and "break 
if".  But given your reaction to those let's ignore 'em for now.)  This 
isn't *pure* sugar--it's a mildly novel flow control construct for 
structured programming.  Certainly Python doesn't have anything like it 
right now; "while True:" isn't an exact match (and "while False:" is 
equivalent to "if 0:").  My "do:" would allow spelling do/while as:

    do:
        something()
        if condition: continue

I think that's a reasonable do/while, and so economical: we only added 
one new keyword.  This form of "do:" has other uses too, as per my 
previous email.

Not that I expect this syntax to take the world by storm.
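Under the proposed semantics, "do:" runs its suite once and "continue" restarts it; a sketch of the equivalent behaviour in today's syntax (the counting logic is invented purely for illustration):

```python
# Equivalent of the proposed
#     do:
#         something()
#         if condition: continue
# using existing syntax.
def run_do_while(limit):
    count = 0
    while True:
        count += 1           # the "something()" of the body
        if count < limit:    # "if condition: continue" repeats the suite
            continue
        break                # falling off the end of "do:" exits the loop
    return count

# run_do_while(3) executes the body exactly three times;
# run_do_while(0) still executes it once, the defining do/while property.
```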

>>     do:
>>         something()
>>     while condition
>>     
> Except Guido has already rejected this - or at least stated that he doesn't really like it.

I can believe that.  I still think it's as good as we're going to do for 
a literal translation of do/while.

If there was a syntax everyone liked we'd already be using it.  The fact 
that we haven't hit on one by now--recall that the PEP is six years 
old--implies that we're not likely to find one.  We've added lots of 
syntax to Python over the last six years, stuff that has made the 
language more powerful, more flexible, and even faster in some places.  
do/while doesn't give us any new power, flexibility, or speed--I daresay 
if we never added do/while we wouldn't really miss it.

If you have it handy, can you cite where he said he didn't like that?  
Not that I don't believe you; I'd just like to read it.

Has Guido cited a syntax that he *did* like?


/larry/


From g.brandl at gmx.net  Sun Apr 26 22:36:57 2009
From: g.brandl at gmx.net (Georg Brandl)
Date: Sun, 26 Apr 2009 22:36:57 +0200
Subject: [Python-ideas] why not try without except?
In-Reply-To: <20090426105108.254c2ee2@o>
References: <20090425103207.5e6479ab@o>
	<20090425122322.7f7da364@o>	<gsv3dn$cjp$1@ger.gmane.org>	<200904261105.57286.steve@pearwood.info>
	<20090426105108.254c2ee2@o>
Message-ID: <gt2glc$lkt$1@ger.gmane.org>

spir wrote:
> On Sun, 26 Apr 2009 11:05:56 +1000, Steven D'Aprano <steve at pearwood.info> 
> wrote:
> 
>> On Sat, 25 Apr 2009 11:32:39 pm Georg Brandl wrote:
>> 
>>> And while we're at it, let's introduce
>>> 
>>> on error resume next: foo() bar() baz()
>> 
>> Is that meant to be a serious suggestion or a sarcastic rejoinder?
> 
> It is certainly sarcastic -- never mind ;-) Georg could have saved some 
> typing by reading comments and examples...
> 
> What I had in mind (and commented), is precisely not a syntactic construct 
> able to catch any exception for a whole block of statements.

It certainly sounded that way in the original post.

> E.g. an example like option self.fill(self.fillColor) can only catch 
> AttributeError for 'fillColor' or for 'fill'.

Or AttributeErrors raised inside of fill().

> Now I realise that what I miss is a kind of handy, clear, and not misusable
>  idiom for coping with undefined variables/attributes/parameters. Otherwise 
> one needs to fall back to the reflexive-check-instead-of-try idiom: if 
> hasattr(self,'fillColor'): self.fill(self.fillColor) or use try... except ...
>  pass.

That depends.  Usually you would always make "fillColor" an attribute of the
object, and either let fill() accept None or not call it if fillColor is None.

> 'None' is also often (mis)used for this. People aware of the issues with None
> will rather have a custom undef-meaning object instead. Right? UNDEF =
> object()

No.  None is exactly meant to indicate the absence of a meaningful value.

> This is rather close to the use of Haskell's optional type (I guess it's 
> called 'Maybe' -- checked, see e.g. 
> http://en.wikibooks.org/wiki/Haskell/Hierarchical_libraries/Maybe). And I 
> think Scala has something similar, too. But these special types fit well the
>  semantics of statically-typed and compile-time-checked languages.

And in Python, None fills the role of Haskell's "Nothing" constructor.  There
is no need for a "Just" constructor, since we don't have static types.

> One issue with None is that it can be used as a valid value.

Of course.  Every object is a "valid value".

> A workaround may be a user-unsettable undef-meaning builtin value. Or a
> builtin isDef() function.

It seems you should make sure you know how Python handles namespaces.

> Still, both of these methods require checking. While the 'option' (or '?')
> proposal allows trying-instead-of-checking. Why not restrict it to a set of
> sensible exception types? for instance (this is what I would like):
> 
> option foo or ? foo <==> try: foo except (NameError, AttributeError): pass

It is almost always an error if a local name is not defined.  I've already
covered the case of AttributeErrors above.
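For what it's worth, the effect spir asks for can be packaged today without new syntax as a small context manager (a hypothetical sketch; "option" is an invented name, and later Python versions ship contextlib.suppress with the same effect):

```python
from contextlib import contextmanager

@contextmanager
def option(*exceptions):
    """Run a block, silently skipping the rest of it if one of `exceptions` is raised."""
    try:
        yield
    except exceptions:
        pass

class Shape:
    pass  # deliberately has no fill_color attribute

results = []
shape = Shape()
with option(AttributeError):
    results.append(shape.fill_color)  # raises AttributeError: block skipped

shape.fill_color = "red"
with option(AttributeError):
    results.append(shape.fill_color)  # attribute exists now: appended
```

Whether such silent skipping is wise is exactly the disagreement in this thread; the point is only that it needs no new syntax.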

> Moreover: Haskell's Maybe type can be used for return values. I really wonder
>  about python action-functions (that do not produce a result) returning None
>  instead of nothing at all.

Functions implicitly returning None is just a convention.  If you don't like
the ambiguity between "procedures" returning None and "functions" returning
None, define your own "not-valid" value and return it from functions.  It won't
result in code that is easier to read though.
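That suggestion, as a minimal sketch (MISSING and find_first are invented names, not an existing API):

```python
MISSING = object()  # module-level "no result" sentinel

def find_first(seq, predicate):
    """Return the first matching item, or MISSING if there is none."""
    for item in seq:
        if predicate(item):
            return item
    return MISSING

# None can now be a legitimate hit, distinct from "no result":
hit = find_first([1, None, 3], lambda x: x is None)
found = hit is not MISSING  # True: we really did find None
```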

> This is another story, indeed, but I see it 
> closely related at the meaning level.
>     searched = seq.find(....)	# may not return anything, not even None

What is "searched" supposed to be, then?  "Not anything" is not an option.

Georg

-- 
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.



From arnodel at googlemail.com  Sun Apr 26 22:58:40 2009
From: arnodel at googlemail.com (Arnaud Delobelle)
Date: Sun, 26 Apr 2009 21:58:40 +0100
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <49F4C209.8090804@hastings.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>	<49F3C863.9040606@hastings.org>	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>	<49F3D759.7030605@hastings.org>	<20090426011259.58c52c9e@bhuda.mired.org>	<49F4541A.30704@hastings.org>
	<20090426144255.21223d2a@bhuda.mired.org>
	<49F4C209.8090804@hastings.org>
Message-ID: <B9AF8247-6F39-49D8-97E4-2511B9504EA9@googlemail.com>


On 26 Apr 2009, at 21:20, Larry Hastings wrote:

>
> Anyway, this is one reason why I like the "do:" statement I  
> proposed.  (The other one--the one where I also proposed "continue  
> if" and "break if".  But given your reaction to those let's ignore  
> 'em for now.)  This isn't *pure* sugar--it's a mildly novel flow  
> control construct for structured programming.  Certainly Python  
> doesn't have anything like it right now; "while True:" isn't an  
> exact match (and "while False:" is equivalent to "if 0:").  My "do:"  
> would allow spelling do/while as:
>
>   do:
>       something()
>       if condition: continue
>
> I think that's a reasonable do/while, and so economical: we only  
> added one new keyword.  This form of "do:" has other uses too, as  
> per my previous email.
>
> Not that I expect this syntax to take the world by storm.

It's funny because I had something like this in mind when I myself  
suggested 'break if ...' and 'continue if ...' a little while ago  
(although I would go for a 'loop:' that doesn't need to be continue'd  
explicitely).  I had come to the point of view that the restrictions  
of the looping construct had in fact somehow liberated it and made it  
more versatile by making

    if condition: break

and

    if condition: continue

common control flow constructs inside a while loop, that in particular  
encompass the do ... while / repeat ... until constructs, etc seen in  
other languages (I may have expressed myself better in the original  
thread).  My motivation for 'break/continue if ...' was that it would  
make those constructs stand out more (and could be highlighted clearly  
as such in editors) - I was a bit surprised when I was told it would  
make code less readable and that it was a slippery slope, but then I  
have lots of ideas, most of which don't turn out to be very good :)

-- 
Arnaud



From arnodel at googlemail.com  Sun Apr 26 23:03:46 2009
From: arnodel at googlemail.com (Arnaud Delobelle)
Date: Sun, 26 Apr 2009 22:03:46 +0100
Subject: [Python-ideas] why not try without except?
In-Reply-To: <gt2glc$lkt$1@ger.gmane.org>
References: <20090425103207.5e6479ab@o>
	<20090425122322.7f7da364@o>	<gsv3dn$cjp$1@ger.gmane.org>	<200904261105.57286.steve@pearwood.info>
	<20090426105108.254c2ee2@o> <gt2glc$lkt$1@ger.gmane.org>
Message-ID: <D9A5616B-819F-4B03-B5BE-6C19EDB6F289@googlemail.com>


On 26 Apr 2009, at 21:36, Georg Brandl wrote:

> None is exactly meant to indicate the absence of a meaningful value.

... thus making it a meaningful value :)

-- 
Arnaud



From aahz at pythoncraft.com  Mon Apr 27 03:15:31 2009
From: aahz at pythoncraft.com (Aahz)
Date: Sun, 26 Apr 2009 18:15:31 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <B9AF8247-6F39-49D8-97E4-2511B9504EA9@googlemail.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>
	<49F3D759.7030605@hastings.org>
	<20090426011259.58c52c9e@bhuda.mired.org>
	<49F4541A.30704@hastings.org>
	<20090426144255.21223d2a@bhuda.mired.org>
	<49F4C209.8090804@hastings.org>
	<B9AF8247-6F39-49D8-97E4-2511B9504EA9@googlemail.com>
Message-ID: <20090427011531.GA21741@panix.com>

Because I can't resist:

do-while diddy diddy dum diddy do
-- 
Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/

"If you think it's expensive to hire a professional to do the job, wait
until you hire an amateur."  --Red Adair


From python at rcn.com  Mon Apr 27 03:20:10 2009
From: python at rcn.com (Raymond Hettinger)
Date: Sun, 26 Apr 2009 18:20:10 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1><49F3C863.9040606@hastings.org><50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com><49F3D759.7030605@hastings.org><20090426011259.58c52c9e@bhuda.mired.org><49F4541A.30704@hastings.org><20090426144255.21223d2a@bhuda.mired.org><49F4C209.8090804@hastings.org><B9AF8247-6F39-49D8-97E4-2511B9504EA9@googlemail.com>
	<20090427011531.GA21741@panix.com>
Message-ID: <C604EFB2CD1246F79BF8C4C89DA52B04@RaymondLaptop1>


[Aahz]
> Because I can't resist:
> 
> do-while diddy diddy dum diddy do

:-)


From pyideas at rebertia.com  Mon Apr 27 03:32:01 2009
From: pyideas at rebertia.com (Chris Rebert)
Date: Sun, 26 Apr 2009 18:32:01 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <20090427011531.GA21741@panix.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>
	<49F3D759.7030605@hastings.org>
	<20090426011259.58c52c9e@bhuda.mired.org>
	<49F4541A.30704@hastings.org>
	<20090426144255.21223d2a@bhuda.mired.org>
	<49F4C209.8090804@hastings.org>
	<B9AF8247-6F39-49D8-97E4-2511B9504EA9@googlemail.com>
	<20090427011531.GA21741@panix.com>
Message-ID: <50697b2c0904261832y1264e61fgf411da1f988c96b3@mail.gmail.com>

On Sun, Apr 26, 2009 at 6:15 PM, Aahz <aahz at pythoncraft.com> wrote:
> Because I can't resist:
>
> do-while diddy diddy dum diddy do

+1 QOTW

- Chris

> --
> Aahz (aahz at pythoncraft.com)           <*>         http://www.pythoncraft.com/
>
> "If you think it's expensive to hire a professional to do the job, wait
> until you hire an amateur."  --Red Adair


From ben+python at benfinney.id.au  Mon Apr 27 04:40:42 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Mon, 27 Apr 2009 12:40:42 +1000
Subject: [Python-ideas] Updating PEP 315: do-while loops
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>
	<49F3D759.7030605@hastings.org> <gt13e4$jhf$1@ger.gmane.org>
Message-ID: <87ws96x1hx.fsf@benfinney.id.au>

Stefan Behnel <stefan_ml at behnel.de>
writes:

> Hmm, so why add another keyword like "do", instead of just giving
> "while" a default condition "True", so that you could write
> 
>     while:
>         state = do_stuff_here()
>         if predicate(state): break
> 
> I think that the missing condition at the top makes it pretty clear
> that the end condition must be inside the loop.

Or that there isn't an end condition at all.

I don't see how this indicator is any less clear than the current
spelling::

    while True:
        state = do_stuff_here()
        if predicate(state): break

-- 
 \      "I would rather be exposed to the inconveniences attending too |
  `\      much liberty than those attending too small a degree of it." |
_o__)                                              --Thomas Jefferson |
Ben Finney



From stephen at xemacs.org  Mon Apr 27 06:48:31 2009
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Mon, 27 Apr 2009 13:48:31 +0900
Subject: [Python-ideas] Updating PEP 315:  do-while loops
In-Reply-To: <49F49D71.8020803@gmx.net>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F49D71.8020803@gmx.net>
Message-ID: <87eiveln1c.fsf@uwakimon.sk.tsukuba.ac.jp>

Mathias Panzenböck writes:
 > Why not:
 > 
 > loop:
 > 	...
 > 	if cond: break
 > 
 > 
 > where "loop:" is short for "while True:".

Er, maybe because not every idiom involving a constant needs to be
replaced by new syntax?



From stephen at xemacs.org  Mon Apr 27 06:58:39 2009
From: stephen at xemacs.org (Stephen J. Turnbull)
Date: Mon, 27 Apr 2009 13:58:39 +0900
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <20090426200437.4c15dd77@o>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3B965.1090700@trueblade.com>
	<6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
	<e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
	<20090426144607.1fee23de@o>
	<87hc0bl6cy.fsf@uwakimon.sk.tsukuba.ac.jp>
	<20090426200437.4c15dd77@o>
Message-ID: <87d4aylmkg.fsf@uwakimon.sk.tsukuba.ac.jp>

spir writes:
 > On Mon, 27 Apr 2009 01:36:29 +0900,
 > "Stephen J. Turnbull" <stephen at xemacs.org> wrote:
 > 
 > I do not understand the resistance against the "footer" options,

I have no objection to "footer options", unless they involve new
syntax.  We already have

    while True:
        # suite
        if not condition: break

and none of the proposals for footers improve readability over that,
IMO.

The argument that a do-while with the terminal condition at the head
is a new structure and has some possible advantages in readability is
correct; that's all I had to say.  Let me now add "but it's weak,
IMO."  So I'm with Larry; we don't need this PEP.  There may be some
way to do it, but the current suggestions aren't attractive enough.

Your suggestion of "until <condition>:" may be the best of the lot,
but it's ambiguous for the same reasons "do ... while <condition>:"
is.  Many people will read that as "while not <condition>:".




From bruce at leapyear.org  Mon Apr 27 06:52:30 2009
From: bruce at leapyear.org (Bruce Leban)
Date: Sun, 26 Apr 2009 21:52:30 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <87ws96x1hx.fsf@benfinney.id.au>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1> 
	<49F3C863.9040606@hastings.org>
	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com> 
	<49F3D759.7030605@hastings.org> <gt13e4$jhf$1@ger.gmane.org> 
	<87ws96x1hx.fsf@benfinney.id.au>
Message-ID: <cf5b87740904262152r1469a98eh39829cb4ebd3ade5@mail.gmail.com>

I think a legitimate issue here is that break and continue can be hard to
see sometimes when they're nested several layers deep. That makes while
True: ... if x: break less clear than it could be. So let me throw out two
straw proposals:

(1) Allow break and continue to be spelled BREAK and CONTINUE which makes
them stand out a bit more.

    while True:
        floob
        yuzz
        if wum:
            BREAK
        thnad

(2) Allow a line containing break or continue to start with a -> token
outside the normal indentation.

    while True:
        floob
        yuzz
        if wum:
    ->      break
        thnad

I don't expect either of these proposals to attract support. I present them
to put a small focus on the issue of making break/continue more visible.

--- Bruce
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090426/f6c9d528/attachment.html>

From rrr at ronadam.com  Mon Apr 27 07:09:35 2009
From: rrr at ronadam.com (Ron Adam)
Date: Mon, 27 Apr 2009 00:09:35 -0500
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <B9AF8247-6F39-49D8-97E4-2511B9504EA9@googlemail.com>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>	<49F3C863.9040606@hastings.org>	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>	<49F3D759.7030605@hastings.org>	<20090426011259.58c52c9e@bhuda.mired.org>	<49F4541A.30704@hastings.org>	<20090426144255.21223d2a@bhuda.mired.org>	<49F4C209.8090804@hastings.org>
	<B9AF8247-6F39-49D8-97E4-2511B9504EA9@googlemail.com>
Message-ID: <49F53E0F.5010604@ronadam.com>



Arnaud Delobelle wrote:
> 
> On 26 Apr 2009, at 21:20, Larry Hastings wrote:
> 
>>
>> Anyway, this is one reason why I like the "do:" statement I proposed.  
>> (The other one--the one where I also proposed "continue if" and "break 
>> if".  But given your reaction to those let's ignore 'em for now.)  
>> This isn't *pure* sugar--it's a mildly novel flow control construct 
>> for structured programming.  Certainly Python doesn't have anything 
>> like it right now; "while True:" isn't an exact match (and "while 
>> False:" is equivalent to "if 0:").  My "do:" would allow spelling 
>> do/while as:
>>
>>   do:
>>       something()
>>       if condition: continue
>>
>> I think that's a reasonable do/while, and so economical: we only added 
>> one new keyword.  This form of "do:" has other uses too, as per my 
>> previous email.
>>
>> Not that I expect this syntax to take the world by storm.
> 
> It's funny because I had something like this in mind when I myself 
> suggested 'break if ...' and 'continue if ...' a little while ago 
> (although I would go for a 'loop:' that doesn't need to be continue'd 
> explicitly).  I had come to the point of view that the restrictions of 
> the looping construct had in fact somehow liberated it and made it more 
> versatile by making
> 
>    if condition: break
> 
> and
> 
>    if condition: continue
> 
> common control flow constructs inside a while loop, that in particular 
> encompass the do ... while / repeat ... until constructs, etc seen in 
> other languages (I may have expressed myself better in the original 
> thread).  My motivation for 'break/continue if ...' was that it would 
> make those constructs stand out more (and could be highlighted clearly as 
> such in editors) - I was a bit surprised when I was told it would make 
> code less readable and that it was a slippery slope, but then I have 
> lots of ideas, most of which don't turn out to be very good :)


I think it would help it to stand out.

     while 1:
        ...
        if a == 2: b = finished
        if b == 3: breakit()
        if c == 4: break
        if d == 5: baker = name
        if e == 6: steak = done

     while 1:
        ...
        if a == 2: b = finished
        if b == 3: breakit()
        break if c == 4
        if d == 5: baker = name
        if e == 6: steak = done

Ron


From denis.spir at free.fr  Mon Apr 27 11:51:47 2009
From: denis.spir at free.fr (spir)
Date: Mon, 27 Apr 2009 11:51:47 +0200
Subject: [Python-ideas] why not try without except?
In-Reply-To: <gt2glc$lkt$1@ger.gmane.org>
References: <20090425103207.5e6479ab@o> <20090425122322.7f7da364@o>
	<gsv3dn$cjp$1@ger.gmane.org>
	<200904261105.57286.steve@pearwood.info>
	<20090426105108.254c2ee2@o> <gt2glc$lkt$1@ger.gmane.org>
Message-ID: <20090427115147.577b86a9@o>

[off list]

On Sun, 26 Apr 2009 22:36:57 +0200,
Georg Brandl <g.brandl at gmx.net> wrote:

> > This is another story, indeed, but I see it 
> > closely related at the meaning level.
> >     searched = seq.find(....)	# may not return anything, not even
> > None  
> 
> What is "searched" supposed to be, then?  "Not anything" is not an option.

Raise an exception, obviously (like I showed before). 

Either None [or any word you like (nil?)] is, say, "non-assignable", or rather non-usable in any kind of expression, and you get a NoneError because you tried to use it.
Or the func does not return anything, literally (like a Pascal procedure), and you get a NoReturnValueError for trying to use its result.

The only safe alternative is to have 2 kinds of funcs: action-funcs and production-funcs. My workaround is to always use verbs for actions and nouns (naming the result) for productions. The issue is that in English the same word is often both a noun and a verb ;-) (moral: flexibility is not only an advantage)

There is a side topic you raise in your reply:
<< If you don't like the ambiguity between "procedures" returning None and "functions" returning None, define your own "not-valid" value and return it from functions. >>

Actually, that was not my point, as you can see above; still, this issue also exists. But we have the proper solution for it: a production-func is intended to normally return a result, and if it cannot, we face an "exceptional case", so just throw an exception to handle it.
This is indeed very different from the issue caused by the fact that we have no means to distinguish action-funcs. Code using or assigning the None result will run on freely and often fail somewhere else, causing hard-to-find bugs. You know it, for sure.
The reason is that None is assignable. Consequently, a func should not return it unless explicitly told to.
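A minimal sketch of this convention (the find helper below is hypothetical, not a real library API): a production-func raises when it has no result, instead of returning an assignable None.

```python
def find(seq, pred):
    """Production-func: always produces a result, or raises.

    It never returns None, so a caller can never silently assign a
    'no result' value and have it fail somewhere else later.
    """
    for item in seq:
        if pred(item):
            return item
    raise LookupError("no item matched the predicate")
```

The failure then surfaces at the call site, as a LookupError, rather than as a hard-to-find bug downstream.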

Denis
------
la vita e estrany


From denis.spir at free.fr  Mon Apr 27 12:19:10 2009
From: denis.spir at free.fr (spir)
Date: Mon, 27 Apr 2009 12:19:10 +0200
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <87d4aylmkg.fsf@uwakimon.sk.tsukuba.ac.jp>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3B965.1090700@trueblade.com>
	<6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
	<e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
	<20090426144607.1fee23de@o>
	<87hc0bl6cy.fsf@uwakimon.sk.tsukuba.ac.jp>
	<20090426200437.4c15dd77@o>
	<87d4aylmkg.fsf@uwakimon.sk.tsukuba.ac.jp>
Message-ID: <20090427121910.33f4ef7c@o>

[off list -- because it's a side-issue]

On Mon, 27 Apr 2009 13:58:39 +0900,
"Stephen J. Turnbull" <stephen at xemacs.org> wrote:

Thanks for your comments.

> Your suggestion of "until <condition>:" may be the best of the lot,

Certainly not the best for all readers, but the simplest as of now ;-)


> but it's ambiguous for the same reasons "do ... while <condition>:"
> is.

The source of ambiguity, and even of misreading, is that the condition is placed at the top while it is evaluated at the end of the loop. In any case, it's an exit condition, so IMO it fits better down there, wherever it may be evaluated.
That is the reason why I favor 'footer' options.

> Many people will read that as "while not <condition>:".

My opinion about that is the following:
* A new keyword is enough to force the reader to consider the code with attention and to expect something different.
* Obviously 'until' cannot have the same meaning as 'while' -- rather the opposite, in fact. Don't you think so? (English is not my mother tongue.)
* The issues with '...' are that (1) it obfuscates the code and (2) there is no link between the sense of the ellipsis itself and the proposed semantics (*)

Denis

(*) It would be better used in ranges (Python ':') or as line-continuation code (Python '\').
------
la vita e estrany


From pyideas at rebertia.com  Mon Apr 27 12:45:34 2009
From: pyideas at rebertia.com (Chris Rebert)
Date: Mon, 27 Apr 2009 03:45:34 -0700
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <20090427121910.33f4ef7c@o>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3B965.1090700@trueblade.com>
	<6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
	<e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
	<20090426144607.1fee23de@o> <87hc0bl6cy.fsf@uwakimon.sk.tsukuba.ac.jp>
	<20090426200437.4c15dd77@o> <87d4aylmkg.fsf@uwakimon.sk.tsukuba.ac.jp>
	<20090427121910.33f4ef7c@o>
Message-ID: <50697b2c0904270345y7358a962m5dbbf14c541d56f3@mail.gmail.com>

On Mon, Apr 27, 2009 at 3:19 AM, spir <denis.spir at free.fr> wrote:
> [off list -- because it's a side-issue]
>
> On Mon, 27 Apr 2009 13:58:39 +0900,
> "Stephen J. Turnbull" <stephen at xemacs.org> wrote:
>
> Thanks for your comments.
>
>> Your suggestion of "until <condition>:" may be the best of the lot,
>
> Certainly not the best for all readers, but the simplest as of now ;-)
>
>
>> but it's ambiguous for the same reasons "do ... while <condition>:"
>> is.
>
> The source of ambiguity and even mislead is that it is placed on top while the condition is evaluated at end of loop. Anyway, it's an exit condition so it fits better down there IMO where ever it may be evaluated.
> The reason why I favor 'footer' options.
>
>> Many people will read that as "while not <condition>:".
>
> My opinion about that is the following:
> * A new keyword is enough to force the reader to consider the code with attention and expect something different.
> * Obviously 'until' cannot have the same meaning as 'while' -- rather the opposite in fact. Don't you think so? (english is not my mother tongue)

As someone whose native language is English, I can say that Stephen's
reading would indeed intuitively be considered the obvious logical
opposite of a "while" loop; also, his interpretation matches the
behavior of "until" loops in both Perl and Ruby.

Cheers,
Chris
-- 
http://blog.rebertia.com


From tjreedy at udel.edu  Mon Apr 27 19:49:37 2009
From: tjreedy at udel.edu (Terry Reedy)
Date: Mon, 27 Apr 2009 13:49:37 -0400
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <49F4C209.8090804@hastings.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>	<49F3C863.9040606@hastings.org>	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>	<49F3D759.7030605@hastings.org>	<20090426011259.58c52c9e@bhuda.mired.org>	<49F4541A.30704@hastings.org>	<20090426144255.21223d2a@bhuda.mired.org>
	<49F4C209.8090804@hastings.org>
Message-ID: <gt4r7g$s4m$1@ger.gmane.org>

A quick summary of my views:

1. When I programmed in C, I rarely if ever used do...while.  I have 
read that this is true of other C coders also.  So I see no need for a 
Python equivalent.

2. On the other hand, loop and a half use is a regular occurrence. 
Anything that does not arguably improve

while True:
   part1()
   if cond: break
   part2()

such as

loop
   part1()
while not cond:
   part2()

seems pretty useless.  But even the above does not work great for 
multiple "if... break"s and not at all for "if...continue".

3. I do not see 'if cond: break' as that much of a problem.  One can 
emphasize with extra whitespace indent or comment.
   if cond:                         break #EXIT#

In any case, I see it as analogous to early return

def f():
   part1()
   if cond: return
   part2()

and while there are purists who object to *that*, I have not seen any 
proposals to officially support better emphasis (other than the existing 
whitespace or comment mechanisms).

In fact, it is not uncommon to use 'return expr' instead of 'ret=expr; 
break' when a loop (for or while) is the last part of a def.

4. "do ... while cond:" strikes me as ugly.  Aside from that, I do not 
like and would not use anything that puts the condition out of place, 
other than where it is executed.  I would hate to read such code and 
expect it would confuse many others.

5. We will never reach consensus.  PEP 315 might as well be withdrawn.

Terry Jan Reedy



From thebrasse at brasse.org  Mon Apr 27 21:33:53 2009
From: thebrasse at brasse.org (Mattias Brändström)
Date: Mon, 27 Apr 2009 21:33:53 +0200
Subject: [Python-ideas] Nested with statements
Message-ID: <c23311650904271233k7671e988m35d1bf1bd8fd048b@mail.gmail.com>

Hi!

Using contextlib.nested() often gives me headaches... (This is my first time
posting, so please bear with me...) :-)

I often want to write code similar to this:

with contextlib.nested(A(), B()) as (a, b):
    # Code that uses a and b.

My problem is that if B() throws an exception, then no context manager
will ever be created to release the resources that A() acquired. This
isn't really contextlib.nested's fault, since B() is evaluated (and
fails) before nested() ever comes into existence.

It would be really useful to have a shorthand for creating truly
nested with statements. My idea then is this: couldn't the language be
tweaked to handle this? It might look something like this:

with A(), B() as a,b:
    # Code that uses a and b.

This would translate directly to:

with A() as a:
    with B() as b:
        # Code that uses a and b.
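An editorial sketch of the difference (the toy Resource class below is assumed, not from the original mail): evaluating B() before nested() even runs leaks A, while true nesting releases it.

```python
released = []

class Resource:
    """Toy context manager whose constructor can be made to fail."""
    def __init__(self, name, fail=False):
        if fail:
            raise RuntimeError("cannot acquire " + name)
        self.name = name
    def __enter__(self):
        return self
    def __exit__(self, *exc):
        released.append(self.name)
        return False

# contextlib.nested(A(), B()) evaluates both constructors before
# nested() runs; simulating that argument evaluation shows the leak:
# A is constructed, B() raises, and A's __exit__ is never called.
try:
    args = (Resource("A"), Resource("B", fail=True))
except RuntimeError:
    pass
assert released == []          # A leaked

# The truly nested form releases A even though B() raises.
try:
    with Resource("A"):
        with Resource("B", fail=True):
            pass
except RuntimeError:
    pass
assert released == ["A"]       # A was properly exited
```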

I really think that this approach to a shorthand for nested with
statements would be better than using contextlib.nested (and perhaps
this is what contextlib.nested intended to do from the beginning).

Regards,
Mattias


From arnodel at googlemail.com  Mon Apr 27 21:40:41 2009
From: arnodel at googlemail.com (Arnaud Delobelle)
Date: Mon, 27 Apr 2009 20:40:41 +0100
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <gt4r7g$s4m$1@ger.gmane.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>	<49F3C863.9040606@hastings.org>	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>	<49F3D759.7030605@hastings.org>	<20090426011259.58c52c9e@bhuda.mired.org>	<49F4541A.30704@hastings.org>	<20090426144255.21223d2a@bhuda.mired.org>
	<49F4C209.8090804@hastings.org> <gt4r7g$s4m$1@ger.gmane.org>
Message-ID: <CF2790EF-21E6-430B-B826-A2448B1C9E80@googlemail.com>


On 27 Apr 2009, at 18:49, Terry Reedy wrote:

> A quick summary of my views:
>
> 1. When I programmed in C, I hardly if ever used do...while.  I have  
> read that this is true of other C coders also.  So I see no need for  
> a Python equivalent.
>

I haven't got much C that I wrote myself, but on my most recent C  
project, out of 100 loops:

* 65 are while loops
* 31 are for loops
* 4 are do-while loops

In the py3k C source, out of 100 loops I see:

* 60 for loops
* 32 while loops
* 8 do-while loops

> 2. On the other hand, loop and a half use is a regular occurrence.  
> Anything that does not arguably improve
>
> while True:
>  part1()
>  if cond: break
>  part2()
>
> such as
>
> loop
>  part1()
> whhile not cond:
>  part2()
>
> seems pretty useless.  But even the above does not work great for  
> multiple "if... break"s and not at all for "if...continue".
>

This is my opinion also.

> 3. I do not see 'if cond: break' as that much of a problem.  One can  
> emphasize with extra whitespace indent or comment.
>  if cond:                         break #EXIT#
>
> In any case, I see it as analogous to early return
>
> def f():
>  part1()
>  if cond: return
>  part2()
>
> and while there are purists who object to *that*, I have not seen  
> any proposals to officially support better emphasis (other than the  
> existing whitespace or comment mechanisms).
>

My idea with 'break/continue if' was, in my mind, such a proposal.  I  
thought that it would enshrine what I perceived as a very common  
construct ('if ...: break/continue') into the language and make it  
easier to spot when scanning code.

> In fact, it is not uncomment to use 'return expr' instead of  
> 'ret=expr; break' when a loop (for or while) is the last part of a  
> def.
>
> 4. "do ... while cond:" strikes me as ugly.  Aside from that, I do  
> not like and would not use anything that puts the condition out of  
> place, other than where it is executed.  I would hate to read such  
> code and expect it would confuse many others.
>

I definitely agree with this.

-- 
Arnaud



From arnodel at googlemail.com  Mon Apr 27 21:54:58 2009
From: arnodel at googlemail.com (Arnaud Delobelle)
Date: Mon, 27 Apr 2009 20:54:58 +0100
Subject: [Python-ideas] Nested with statements
In-Reply-To: <c23311650904271233k7671e988m35d1bf1bd8fd048b@mail.gmail.com>
References: <c23311650904271233k7671e988m35d1bf1bd8fd048b@mail.gmail.com>
Message-ID: <47C04CDA-8B21-495B-B1FA-E74AD3050B05@googlemail.com>


On 27 Apr 2009, at 20:33, Mattias Brändström wrote:
>
[...]
> It would be really useful to have a shorthand for creating truly
> nested with statements. My idea then is this: couldn't the language be
> tweaked to handle this? It might look something like this:
>
> with A(), B() as a,b:
>    # Code that uses a and b.
>
> This would translate directly to:
>
> with A() as a:
>    with B() as b:
>        # Code that uses a and b.

There was a discussion about this on this list:

http://mail.python.org/pipermail/python-ideas/2009-March/003188.html

I can't remember the outcome.

-- 
Arnaud



From thebrasse at brasse.org  Mon Apr 27 22:28:39 2009
From: thebrasse at brasse.org (Mattias Brändström)
Date: Mon, 27 Apr 2009 22:28:39 +0200
Subject: [Python-ideas] Nested with statements
In-Reply-To: <47C04CDA-8B21-495B-B1FA-E74AD3050B05@googlemail.com>
References: <c23311650904271233k7671e988m35d1bf1bd8fd048b@mail.gmail.com>
	<47C04CDA-8B21-495B-B1FA-E74AD3050B05@googlemail.com>
Message-ID: <c23311650904271328hc511346x151a105bcf5b8ed6@mail.gmail.com>

2009/4/27 Arnaud Delobelle <arnodel at googlemail.com>:
>
> On 27 Apr 2009, at 20:33, Mattias Brändström wrote:
>>
> [...]
>>
>> It would be really useful to have a shorthand for creating truly
>> nested with statements. My idea then is this: couldn't the language be
>> tweaked to handle this? It might look something like this:
>>
>> with A(), B() as a,b:
>> ? # Code that uses a and b.
>>
>> This would translate directly to:
>>
>> with A() as a:
>> ? with B() as b:
>> ? ? ? # Code that uses a and b.
>
> There was a discussion about this on this list:
>
> http://mail.python.org/pipermail/python-ideas/2009-March/003188.html
>
> I can't remember the outcome.
>

Oh. Thanks for pointing that out. I'll try and contribute to that
thread instead.

:.:: mattias


From ben+python at benfinney.id.au  Tue Apr 28 01:46:48 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Tue, 28 Apr 2009 09:46:48 +1000
Subject: [Python-ideas] Updating PEP 315: do-while loops
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3B965.1090700@trueblade.com>
	<6097EC22D9B448C7857BB8A0247DA9E1@RaymondLaptop1>
	<e6511dbf0904252206u38cd65a6sd1707658e3242732@mail.gmail.com>
	<20090426144607.1fee23de@o>
	<87hc0bl6cy.fsf@uwakimon.sk.tsukuba.ac.jp>
	<20090426200437.4c15dd77@o>
	<87d4aylmkg.fsf@uwakimon.sk.tsukuba.ac.jp>
	<20090427121910.33f4ef7c@o>
Message-ID: <87skjtwtg7.fsf@benfinney.id.au>

spir <denis.spir at free.fr> writes:

> [off list -- because it's a side-issue]

Actually, you replied on-list, so I'll feel free to do the same :-)

> On Mon, 27 Apr 2009 13:58:39 +0900,
> "Stephen J. Turnbull" <stephen at xemacs.org> wrote:
> 
> > Your suggestion of "until <condition>:" may be the best of the lot,
> > but it's ambiguous for the same reasons "do ... while <condition>:"
> > is.
> > Many people will read that as "while not <condition>:".
> 
> My opinion about that is the following:
> * A new keyword is enough to force the reader to consider the code
> with attention and expect something different.

You can't have it both ways, though: the keyword "until" is already
strongly associated with certain behaviour in other programming
languages, which is presumably a strong reason to choose it for this
instance. But it also has a strong English-language relationship with
"while".

> * Obviously 'until' cannot have the same meaning as 'while' -- rather
> the opposite in fact. Don't you think so? (english is not my mother
> tongue)

That's exactly what Stephen's saying: "until <condition>" and "while
not <condition>" are fungible in English.

> Denis

Denis, could you please change your From field so that it shows the name
"Denis Spir", which you're clearly happy to be known as?

-- 
 \        "It is the responsibility of intellectuals to tell the truth |
  `\                    and to expose lies." --Noam Chomsky, 1967-02-23 |
_o__)                                                                  |
Ben Finney



From greg.ewing at canterbury.ac.nz  Tue Apr 28 02:49:41 2009
From: greg.ewing at canterbury.ac.nz (Greg Ewing)
Date: Tue, 28 Apr 2009 12:49:41 +1200
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <gt4r7g$s4m$1@ger.gmane.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F3C863.9040606@hastings.org>
	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>
	<49F3D759.7030605@hastings.org>
	<20090426011259.58c52c9e@bhuda.mired.org>
	<49F4541A.30704@hastings.org> <20090426144255.21223d2a@bhuda.mired.org>
	<49F4C209.8090804@hastings.org> <gt4r7g$s4m$1@ger.gmane.org>
Message-ID: <49F652A5.9050809@canterbury.ac.nz>

Terry Reedy wrote:
> But even the above does not work great for 
> multiple "if... break"s and not at all for "if...continue".

Personally I don't care at all about that. This is about
providing a clearly structured way to write the most
common case.

As soon as you have multiple breaks you've already got
one foot in spaghetti-land and any notion of clean
structure has gone out the window.

-- 
Greg


From ben+python at benfinney.id.au  Tue Apr 28 01:50:10 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Tue, 28 Apr 2009 09:50:10 +1000
Subject: [Python-ideas] Updating PEP 315:  do-while loops
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>
	<49F49D71.8020803@gmx.net>
Message-ID: <87ocuhwtal.fsf@benfinney.id.au>

Mathias Panzenböck <grosser.meister.morti at gmx.net>
writes:

> Why not:
> 
> loop:
> 	...
> 	if cond: break
> 
> 
> where "loop:" is short for "while True:".

Already addressed. We already have a way of spelling that, which is
"while True:". There needs to be some significant gain to introduce a new
spelling for the same thing.

-- 
 \        "Good morning, Pooh Bear", said Eeyore gloomily. "If it is a |
  `\   good morning", he said. "Which I doubt", said he. --A. A. Milne, |
_o__)                                                _Winnie-the-Pooh_ |
Ben Finney



From rrr at ronadam.com  Tue Apr 28 08:58:50 2009
From: rrr at ronadam.com (Ron Adam)
Date: Tue, 28 Apr 2009 01:58:50 -0500
Subject: [Python-ideas] Updating PEP 315: do-while loops
In-Reply-To: <gt4r7g$s4m$1@ger.gmane.org>
References: <74E2C336A61B4EE68C5AF37959C73174@RaymondLaptop1>	<49F3C863.9040606@hastings.org>	<50697b2c0904251949x5a1442apa35157e11a7e3c9e@mail.gmail.com>	<49F3D759.7030605@hastings.org>	<20090426011259.58c52c9e@bhuda.mired.org>	<49F4541A.30704@hastings.org>	<20090426144255.21223d2a@bhuda.mired.org>	<49F4C209.8090804@hastings.org>
	<gt4r7g$s4m$1@ger.gmane.org>
Message-ID: <49F6A92A.40900@ronadam.com>



Terry Reedy wrote:
> A quick summary of my views:
> 
> 1. When I programmed in C, I hardly if ever used do...while.  I have 
> read that this is true of other C coders also.  So I see no need for a 
> Python equivalent.
> 
> 2. On the other hand, loop and a half use is a regular occurrence. 
> Anything that does not arguably improve
> 
> while True:
>   part1()
>   if cond: break
>   part2()
> 
> such as
> 
> loop
>   part1()
> whhile not cond:
>   part2()
> 
> seems pretty useless.  But even the above does not work great for 
> multiple "if... break"s and not at all for "if...continue".

I agree. I don't see the lack of do-while as a wart.


> 3. I do not see 'if cond: break' as that much of a problem.  One can 
> emphasize with extra whitespace indent or comment.
>   if cond:                         break #EXIT#
> 
> In any case, I see it as analogous to early return
> 
> def f():
>   part1()
>   if cond: return
>   part2()
> 
> and while there are purists who object to *that*, I have not seen any 
> proposals to officially support better emphasis (other than the existing 
> whitespace or comment mechanisms).
> 
> In fact, it is not uncomment to use 'return expr' instead of 'ret=expr; 
> break' when a loop (for or while) is the last part of a def.

The 'break if'/'continue if' would make the break and continue more visible 
in some cases, but I now realize it is also limited to a single line rather 
than a block.

    ...
    if exit_cond:
        ...
        ...
        break
    ...

Break and continue are pretty visible as long as they are on a line by 
themselves, so maybe we should just stick with what we have.


> 4. "do ... while cond:" strikes me as ugly.  Aside from that, I do not 
> like and would not use anything that puts the condition out of place, 
> other than where it is executed.  I would hate to read such code and 
> expect it would confuse many others.

I agree.


> 5. We well never reach consensus.  PEP 315 might as well be withdrawn.

Unless there are some very nice examples or performance reasons to show 
otherwise, I just don't see enough need for it to be a convincing change.

Ron


From denis.spir at free.fr  Tue Apr 28 11:04:25 2009
From: denis.spir at free.fr (spir)
Date: Tue, 28 Apr 2009 11:04:25 +0200
Subject: [Python-ideas] Nested with statements
In-Reply-To: <47C04CDA-8B21-495B-B1FA-E74AD3050B05@googlemail.com>
References: <c23311650904271233k7671e988m35d1bf1bd8fd048b@mail.gmail.com>
	<47C04CDA-8B21-495B-B1FA-E74AD3050B05@googlemail.com>
Message-ID: <20090428110425.1eea5299@o>

On Mon, 27 Apr 2009 20:54:58 +0100,
Arnaud Delobelle <arnodel at googlemail.com> wrote:

> > with A(), B() as a,b:
> >    # Code that uses a and b.
> >
> > This would translate directly to:
> >
> > with A() as a:
> >    with B() as b:
> >        # Code that uses a and b.  
> 
> There was a discussion about this on this list:
> 
> http://mail.python.org/pipermail/python-ideas/2009-March/003188.html
> 
> I can't remember the outcome.

There was no clear outcome, for sure ;-)

Except maybe it was stated that 
   with A(), B() as a,b:
should rather be spelled
   with A() as a, B() as b:
to reuse the syntax of imports.

Denis

------
la vita e estrany


From chambon.pascal at wanadoo.fr  Tue Apr 28 21:59:25 2009
From: chambon.pascal at wanadoo.fr (Pascal Chambon)
Date: Tue, 28 Apr 2009 21:59:25 +0200
Subject: [Python-ideas] Nested with statements
In-Reply-To: <49F751FB.9040102@wanadoo.fr>
References: <c23311650904271233k7671e988m35d1bf1bd8fd048b@mail.gmail.com>	<47C04CDA-8B21-495B-B1FA-E74AD3050B05@googlemail.com>
	<20090428110425.1eea5299@o> <49F751FB.9040102@wanadoo.fr>
Message-ID: <49F7601D.8020605@wanadoo.fr>


> spir wrote:
>> On Mon, 27 Apr 2009 20:54:58 +0100,
>> Arnaud Delobelle <arnodel at googlemail.com> wrote:
>>
>>   
>>>> with A(), B() as a,b:
>>>>    # Code that uses a and b.
>>>>
>>>> This would translate directly to:
>>>>
>>>> with A() as a:
>>>>    with B() as b:
>>>>        # Code that uses a and b.  
>>>>       
>>> There was a discussion about this on this list:
>>>
>>> http://mail.python.org/pipermail/python-ideas/2009-March/003188.html
>>>
>>> I can't remember the outcome.
>>>     
>>
>> There was no clear outcome, for sure ;-)
>>
>> Except maybe it was stated that 
>>    with A(), B() as a,b:
>> should rather be spelled
>>    with A() as a, B() as b:
>> to reuse the syntax of imports.
>>
>> Denis
>>
>> ------
>> la vita e estrany
>> _______________________________________________
>> Python-ideas mailing list
>> Python-ideas at python.org
>> http://mail.python.org/mailman/listinfo/python-ideas
>>
>>
>>     
I agree with the idea of auto-nesting "with"; however, in the case you 
pointed out, the main problem was the early evaluation of the context 
managers.
Maybe a solution would be to delay the creation of the context managers, 
with something like partial application (cf. functools).

Roughly, we'd need a "delayedNested" function, which takes zero-argument 
callables as parameters, and calls/instantiates them inside itself.

Then just call delayedNested(partial(A, ...arguments...), partial(B, 
...arguments...)) to have what you want.

Yes, I know, it's not pretty (mainly because of the lack of partial 
application syntax in Python), but it's just a placeholder ^^
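As an editorial aside, one hedged sketch of what such a delayedNested might look like (the class form, name, and simplified error handling are all assumptions, not Pascal's actual proposal):

```python
class DelayedNested:
    """Sketch of the delayedNested idea: takes zero-argument callables
    (e.g. functools.partial objects) and only creates and enters each
    context manager inside __enter__.  A failure while creating or
    entering the Nth manager therefore still exits the N-1 managers
    already entered, unlike contextlib.nested."""

    def __init__(self, *factories):
        self._factories = factories
        self._entered = []

    def __enter__(self):
        values = []
        try:
            for factory in self._factories:
                manager = factory()  # creation is delayed until now
                values.append(manager.__enter__())
                self._entered.append(manager)
        except BaseException:
            self.__exit__(None, None, None)  # unwind what was entered
            raise
        return tuple(values)

    def __exit__(self, exc_type, exc_value, traceback):
        # Simplified: ignores __exit__ return values (no suppression).
        while self._entered:
            self._entered.pop().__exit__(exc_type, exc_value, traceback)
        return False
```

Usage would then be roughly `with DelayedNested(partial(A, ...), partial(B, ...)) as (a, b): ...`.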

Regards,
Pascal
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://mail.python.org/pipermail/python-ideas/attachments/20090428/e1256609/attachment.html>

From grosser.meister.morti at gmx.net  Tue Apr 28 22:09:55 2009
From: grosser.meister.morti at gmx.net (Mathias Panzenböck)
Date: Tue, 28 Apr 2009 22:09:55 +0200
Subject: [Python-ideas] Nested with statements
In-Reply-To: <49F7601D.8020605@wanadoo.fr>
References: <c23311650904271233k7671e988m35d1bf1bd8fd048b@mail.gmail.com>	<47C04CDA-8B21-495B-B1FA-E74AD3050B05@googlemail.com>	<20090428110425.1eea5299@o>
	<49F751FB.9040102@wanadoo.fr> <49F7601D.8020605@wanadoo.fr>
Message-ID: <49F76293.7050402@gmx.net>

Pascal Chambon wrote:
 >
 > I agree with the idea of auto-nesting "with", however in the case you
 > pointed out, the main problem was the early evaluation of context
 > managers ;
 > maybe a solution would be to delay the creation of context managers,
 > with something like a partial application (cf functools).
 >
 > Roughly, we'd need a "delayedNested" function, which takes zero-argument
 > callables as parameters, and calls/instanciates them inside itself.
 >
 > Then just call* delayedNested(partial(A,...arguments...), partial(B,
 > ...arguments...))*/ /to have what you want.
 >

It would be much shorter and more readable to manually nest the with statements.

	-panzi


From chambon.pascal at wanadoo.fr  Tue Apr 28 22:45:56 2009
From: chambon.pascal at wanadoo.fr (Pascal Chambon)
Date: Tue, 28 Apr 2009 22:45:56 +0200
Subject: [Python-ideas] Re : Undelivered Mail Returned to Sender
Message-ID: <49F76B04.1010805@wanadoo.fr>

 >Mathias Panzenböck wrote:
 >
 >Pascal Chambon wrote:
 >>
 >> I agree with the idea of auto-nesting "with", however in the case you
 >> pointed out, the main problem was the early evaluation of context
 >> managers ;
 >> maybe a solution would be to delay the creation of context managers,
 >> with something like a partial application (cf functools).
 >>
 >> Roughly, we'd need a "delayedNested" function, which takes 
zero-argument
 >> callables as parameters, and calls/instanciates them inside itself.
 >>
 >> Then just call* delayedNested(partial(A,...arguments...), partial(B,
 >> ...arguments...))*/ /to have what you want.
 >>
 >
 >
 >It would be much shorter and more readable to manually nest the with 
statements.
 >
 >  -panzi



Indeed, but the constructs we are talking about would allow nesting dozens 
of context managers without any problem.
Well then, you'll ask me "what kind of pervert would need to nest dozens 
of context managers???"; sincerely, I don't know ^^

But since "flat is better than nested", even for just 2 or 3 context 
managers, I feel a construct like "with A() as a, B() as b, C() as c:" 
is better than 3 nested with statements anyway.

Regards,
pascal



From grosser.meister.morti at gmx.net  Tue Apr 28 23:46:44 2009
From: grosser.meister.morti at gmx.net (Mathias Panzenböck)
Date: Tue, 28 Apr 2009 23:46:44 +0200
Subject: [Python-ideas] Re : Undelivered Mail Returned to Sender
In-Reply-To: <49F76B04.1010805@wanadoo.fr>
References: <49F76B04.1010805@wanadoo.fr>
Message-ID: <49F77944.1040803@gmx.net>

Pascal Chambon wrote:
>  >Mathias Panzenböck wrote:
>  >
>  >Pascal Chambon wrote:
>  >>
>  >> I agree with the idea of auto-nesting "with", however in the case you
>  >> pointed out, the main problem was the early evaluation of context
>  >> managers ;
>  >> maybe a solution would be to delay the creation of context managers,
>  >> with something like a partial application (cf functools).
>  >>
>  >> Roughly, we'd need a "delayedNested" function, which takes 
> zero-argument
>  >> callables as parameters, and calls/instanciates them inside itself.
>  >>
>  >> Then just call* delayedNested(partial(A,...arguments...), partial(B,
>  >> ...arguments...))*/ /to have what you want.
>  >>
>  >
>  >
>  >It would be much shorter and more readable to manually nest the with 
> statements.
>  >
>  >  -panzi
> 
> 
> 
> Indeed, but the constructions we talk about would allow nesting dozens 
> of context managers without any problem ;
> Well then, you'll ask me "what kind of a perverse would need to 
> imbricate dozens of context managers ???" ; sincerely I don't know ^^
> 
> But since "flat is better than nested", even just for 2 or 3 context 
> managers, I feel a construct like "with A() as a, B() as b, C() as c:" 
> is anyway better than 3 nested with statements
> 

Yes, I'm +1 for "with A() as a, B() as b, C() as c:", too.
For the delayedNested(...) thing, however, I'm -1.


From g.brandl at gmx.net  Wed Apr 29 00:18:49 2009
From: g.brandl at gmx.net (Georg Brandl)
Date: Wed, 29 Apr 2009 00:18:49 +0200
Subject: [Python-ideas] Re : Undelivered Mail Returned to Sender
In-Reply-To: <49F77944.1040803@gmx.net>
References: <49F76B04.1010805@wanadoo.fr> <49F77944.1040803@gmx.net>
Message-ID: <gt7vc2$qeb$1@ger.gmane.org>

Mathias Panzenböck schrieb:

>> Indeed, but the constructions we talk about would allow nesting dozens 
>> of context managers without any problem ;
>> Well then, you'll ask me "what kind of a perverse would need to 
>> imbricate dozens of context managers ???" ; sincerely I don't know ^^
>> 
>> But since "flat is better than nested", even just for 2 or 3 context 
>> managers, I feel a construct like "with A() as a, B() as b, C() as c:" 
>> is anyway better than 3 nested with statements
>> 
> 
> Yes I'm +1 for "with A() as a, B() as b, C() as c:", too.
> For the deleyedNested(...) thing I'm -1 however.

I'm also +1 (explicitly on this syntax variant, not "with A(),B() as a,b",
since it seems a more logical step from the nested with statements).

Mathias and I have a patch in the works; I expect it will be fine to amend
PEP 343 for this when it's ready.

Georg



From cmjohnson.mailinglist at gmail.com  Wed Apr 29 06:25:44 2009
From: cmjohnson.mailinglist at gmail.com (Carl Johnson)
Date: Tue, 28 Apr 2009 18:25:44 -1000
Subject: [Python-ideas] Nested with statements
In-Reply-To: <49F76293.7050402@gmx.net>
References: <c23311650904271233k7671e988m35d1bf1bd8fd048b@mail.gmail.com>
	<47C04CDA-8B21-495B-B1FA-E74AD3050B05@googlemail.com>
	<20090428110425.1eea5299@o> <49F751FB.9040102@wanadoo.fr>
	<49F7601D.8020605@wanadoo.fr> <49F76293.7050402@gmx.net>
Message-ID: <3bdda690904282125j1e2ce550rf0f174deb9e12fe6@mail.gmail.com>

Mathias Panzenböck wrote:
> Pascal Chambon wrote:
>>
>> Then just call* delayedNested(partial(A,...arguments...), partial(B,
>> ...arguments...))*/ /to have what you want.
>>
>
> It would be much shorter and more readable to manually nest the with
> statements.

You could improve the look a little by changing the expected input. Say:

with delaynested( [A, B], [(A_arg1, A_arg2), B_args]):
    do_stuff()

A realistic example:

# Notice that the trailing comma is important -- ("file.txt",) is a one-tuple!
with delaynested([getlock, open], [None, ("file.txt",)]):
    do_stuff()

Is that so bad? I think it looks OK.

That said, it would be nice if there were a better way to do partial
functions than either lambda or functools.partial.

-- Carl


From grosser.meister.morti at gmx.net  Wed Apr 29 13:07:52 2009
From: grosser.meister.morti at gmx.net (=?ISO-8859-1?Q?Mathias_Panzenb=F6ck?=)
Date: Wed, 29 Apr 2009 13:07:52 +0200
Subject: [Python-ideas] Nested with statements
In-Reply-To: <3bdda690904282125j1e2ce550rf0f174deb9e12fe6@mail.gmail.com>
References: <c23311650904271233k7671e988m35d1bf1bd8fd048b@mail.gmail.com>	<47C04CDA-8B21-495B-B1FA-E74AD3050B05@googlemail.com>	<20090428110425.1eea5299@o>
	<49F751FB.9040102@wanadoo.fr>	<49F7601D.8020605@wanadoo.fr>
	<49F76293.7050402@gmx.net>
	<3bdda690904282125j1e2ce550rf0f174deb9e12fe6@mail.gmail.com>
Message-ID: <49F83508.8090509@gmx.net>

Carl Johnson wrote:
 > Mathias Panzenb?ck wrote:
 >> Pascal Chambon wrote:
 >>> Then just call* delayedNested(partial(A,...arguments...), partial(B,
 >>> ...arguments...))*/ /to have what you want.
 >>>
 >> It would be much shorter and more readable to manually nest the with
 >> statements.
 >
 > You could improve the look a little by changing the input excepted. Say:
 >
 > with delaynested( [A, B], [(A_arg1, A_arg2), B_args]):
 >     do_stuff()
 >
 > A realistic example:
 >
 > with delaynested([getlock, open], [None, ("file.txt",)]): #Notice that
 > the comma is important!
 >     do_stuff()
 >
 > Is that so bad? I think it looks OK.
 >
 > That said, it would be nice if there were a better way to do partial
 > functions than either lambda or functools.partial.
 >
 > -- Carl

Or:
with delaynested((getlock,), (open, "file.txt")) as (lock, fp):
    do_stuff()

But what about keyword parameters? The clean solution is the new syntax.
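For what it's worth, keyword parameters *can* be threaded through such a helper. Here is a rough sketch (hypothetical helper name; each manager is given as a (callable, args, kwargs) triple, created lazily and properly nested) -- which arguably just demonstrates why the dedicated syntax reads better:

```python
from contextlib import contextmanager

@contextmanager
def delaynested(*factories):
    # Each factory is a (callable, args, kwargs) triple; the context
    # managers are instantiated one at a time, inside the nesting, so
    # a failure constructing a later one still cleans up earlier ones.
    if not factories:
        yield ()
        return
    func, args, kwargs = factories[0]
    with func(*args, **kwargs) as first:
        with delaynested(*factories[1:]) as rest:
            yield (first,) + rest

# Usage -- compare with the proposed
#     with getlock() as lock, open("file.txt") as fp: ...
# with delaynested((getlock, (), {}),
#                  (open, ("file.txt",), {"mode": "r"})) as (lock, fp):
#     do_stuff()
```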

	-panzi


From ziade.tarek at gmail.com  Wed Apr 29 20:47:45 2009
From: ziade.tarek at gmail.com (=?ISO-8859-1?Q?Tarek_Ziad=E9?=)
Date: Wed, 29 Apr 2009 20:47:45 +0200
Subject: [Python-ideas] having a "iterable" built-in
Message-ID: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>

Hello

unless I missed it, I couldn't find a built-in to check if an object
is iterable,

so I wrote this function:

def iterable(ob):
    try:
        iter(ob)
    except TypeError:
        return False
    return True


What about having such a built-in in Python? (with the proper
implementation, of course)

Regards
Tarek

-- 
Tarek Ziadé | http://ziade.org


From chambon.pascal at wanadoo.fr  Wed Apr 29 21:02:34 2009
From: chambon.pascal at wanadoo.fr (Pascal Chambon)
Date: Wed, 29 Apr 2009 21:02:34 +0200
Subject: [Python-ideas] having a "iterable" built-in
In-Reply-To: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>
References: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>
Message-ID: <49F8A44A.5030607@wanadoo.fr>


Tarek Ziadé wrote:
> Hello
>
> unless I missed it, I couldn't find a built-in to check if an object
> is iterable,
>
> so I wrote this function :
>
> def iterable(ob):
>     try:
>         iter(ob)
>     except TypeError:
>         return False
>     return True
>
>
> What about having such a built-in in Python ? (with the proper
> implementation if course)
>
> Regards
> Tarek
>
>   
Well, I guess the new abstract base class machinery in 2.6 
allows this: http://docs.python.org/library/collections.html

You could call something like isinstance(obj, Iterable), which is less 
cute than isiterable(obj), but far more flexible ^^
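A quick sketch of that check, with one caveat worth knowing: the ABC only looks for __iter__, so a class that is iterable only via the legacy __getitem__ protocol reports False even though iter() would accept it. (In 2.6 the ABC lived directly in collections; it later moved to collections.abc.)

```python
try:
    from collections.abc import Iterable   # Python 3.3+
except ImportError:
    from collections import Iterable       # Python 2.6 / early 3.x

assert isinstance([1, 2, 3], Iterable)
assert not isinstance(42, Iterable)

class Seq(object):
    # Iterable only through the legacy __getitem__ protocol
    def __getitem__(self, i):
        if i < 3:
            return i
        raise IndexError(i)

assert not isinstance(Seq(), Iterable)  # the ABC check misses it...
assert list(iter(Seq())) == [0, 1, 2]   # ...but iter() handles it fine
```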

Regards,
Pascal






From ziade.tarek at gmail.com  Wed Apr 29 23:34:16 2009
From: ziade.tarek at gmail.com (=?ISO-8859-1?Q?Tarek_Ziad=E9?=)
Date: Wed, 29 Apr 2009 23:34:16 +0200
Subject: [Python-ideas] having a "iterable" built-in
In-Reply-To: <49F8A44A.5030607@wanadoo.fr>
References: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>
	<49F8A44A.5030607@wanadoo.fr>
Message-ID: <94bdd2610904291434s76a9065ch2dd9cd3c6446fff0@mail.gmail.com>

On Wed, Apr 29, 2009 at 9:02 PM, Pascal Chambon
<chambon.pascal at wanadoo.fr> wrote:

> Well, I guess the new abilities of abstract base classes and stuffs, in 2.6,
> allow this : http://docs.python.org/library/collections.html
>
> You should call something like isinstance(ovj, Iterable), which is less cute
> than isiterable(obj), but far more flexible ^^

Right that's it, thanks :)


From cs at zip.com.au  Wed Apr 29 23:42:15 2009
From: cs at zip.com.au (Cameron Simpson)
Date: Thu, 30 Apr 2009 07:42:15 +1000
Subject: [Python-ideas] having a "iterable" built-in
In-Reply-To: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>
Message-ID: <20090429214215.GA694@cskk.homeip.net>

On 29Apr2009 20:47, Tarek Ziadé <ziade.tarek at gmail.com> wrote:
| unless I missed it, I couldn't find a built-in to check if an object
| is iterable,
| 
| so I wrote this function :
| 
| def iterable(ob):
|     try:
|         iter(ob)
|     except TypeError:
|         return False
|     return True

This is actually a bad way of doing it. Suppose using the iterator has
side effects? For example, "ob" might be a store-of-recent-messages,
which empties after they have been collected. Or a file() attached to a
pipe.
-- 
Cameron Simpson <cs at zip.com.au> DoD#743
http://www.cskk.ezoshosting.com/cs/

Morning people may be respected, but night people are feared.
        - Bob Barker <Bob.Barker at MCI.Com>


From adam at atlas.st  Thu Apr 30 00:08:33 2009
From: adam at atlas.st (Adam Atlas)
Date: Wed, 29 Apr 2009 18:08:33 -0400
Subject: [Python-ideas] having a "iterable" built-in
In-Reply-To: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>
References: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>
Message-ID: <EC0246BD-8AF4-4B4C-82DF-F2B13DD38839@atlas.st>

I don't think that's going to happen. The similar builtin "callable"  
has been removed as of Python 3.0 in favour of isinstance(x,  
collections.Callable), or in versions < 2.6, hasattr(x, '__call__'),  
and the convention is similar for iterables ("isinstance(x,  
collections.Iterable)" or "hasattr(x, '__iter__')").


On 29 Apr 2009, at 14:47, Tarek Ziad? wrote:
> Hello
>
> unless I missed it, I couldn't find a built-in to check if an object
> is iterable,
>
> so I wrote this function :
>
> def iterable(ob):
>    try:
>        iter(ob)
>    except TypeError:
>        return False
>    return True
>
>
> What about having such a built-in in Python ? (with the proper
> implementation if course)
>
> Regards
> Tarek
>
> -- 
> Tarek Ziad? | http://ziade.org
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas



From tjreedy at udel.edu  Thu Apr 30 00:20:05 2009
From: tjreedy at udel.edu (Terry Reedy)
Date: Wed, 29 Apr 2009 18:20:05 -0400
Subject: [Python-ideas] having a "iterable" built-in
In-Reply-To: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>
References: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>
Message-ID: <gtajql$476$1@ger.gmane.org>

Tarek Ziadé wrote:
> Hello
> 
> unless I missed it, I couldn't find a built-in to check if an object
> is iterable,
> 
> so I wrote this function :
> 
> def iterable(ob):
>     try:
>         iter(ob)
>     except TypeError:
>         return False
>     return True
> 
> 
> What about having such a built-in in Python ? (with the proper
> implementation if course)

hasattr(ob, '__iter__')



From benjamin at python.org  Thu Apr 30 00:33:27 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Wed, 29 Apr 2009 22:33:27 +0000 (UTC)
Subject: [Python-ideas] caching properties
Message-ID: <loom.20090429T222818-244@post.gmane.org>

I think it would be nice to add a "cache" argument to the property()
constructor. When "cache" was True, property would only ask the getter function
once for the result. This would simplify properties that require expensive
operations to compute.



From grosser.meister.morti at gmx.net  Thu Apr 30 01:03:35 2009
From: grosser.meister.morti at gmx.net (=?ISO-8859-1?Q?Mathias_Panzenb=F6ck?=)
Date: Thu, 30 Apr 2009 01:03:35 +0200
Subject: [Python-ideas] caching properties
In-Reply-To: <loom.20090429T222818-244@post.gmane.org>
References: <loom.20090429T222818-244@post.gmane.org>
Message-ID: <49F8DCC7.1020505@gmx.net>

Benjamin Peterson wrote:
> I think it would be nice to add a "cache" argument to the property()
> constructor. When "cache" was True, property would only ask the getter function
> once for the result. This would simplify properties that require expensive
> operations to compute.
> 

You mean like the once routines in Eiffel?


From phd at phd.pp.ru  Thu Apr 30 01:06:20 2009
From: phd at phd.pp.ru (Oleg Broytmann)
Date: Thu, 30 Apr 2009 03:06:20 +0400
Subject: [Python-ideas] caching properties
In-Reply-To: <loom.20090429T222818-244@post.gmane.org>
References: <loom.20090429T222818-244@post.gmane.org>
Message-ID: <20090429230620.GA23182@phd.pp.ru>

On Wed, Apr 29, 2009 at 10:33:27PM +0000, Benjamin Peterson wrote:
> I think it would be nice to add a "cache" argument to the property()
> constructor. When "cache" was True, property would only ask the getter function
> once for the result. This would simplify properties that require expensive
> operations to compute.

http://ppa.cvs.sourceforge.net/viewvc/*checkout*/ppa/QPS/qps/qUtils.py

   See class CachedAttribute. Just use @CachedAttribute instead of
@property.

Oleg.
-- 
     Oleg Broytmann            http://phd.pp.ru/            phd at phd.pp.ru
           Programmers don't die, they just GOSUB without RETURN.


From jared.grubb at gmail.com  Thu Apr 30 01:09:31 2009
From: jared.grubb at gmail.com (Jared Grubb)
Date: Wed, 29 Apr 2009 16:09:31 -0700
Subject: [Python-ideas] caching properties
In-Reply-To: <loom.20090429T222818-244@post.gmane.org>
References: <loom.20090429T222818-244@post.gmane.org>
Message-ID: <22FA1753-62CA-430F-968D-41CB30F8B4DE@gmail.com>


On 29 Apr 2009, at 15:33, Benjamin Peterson wrote:
> I think it would be nice to add a "cache" argument to the property()
> constructor. When "cache" was True, property would only ask the  
> getter function
> once for the result. This would simplify properties that require  
> expensive
> operations to compute.

You know, I have been wishing for something like this as well. I  
contribute from time to time to the SCons project (a Python-based  
build system), and caching values of things is crucial to its  
performance.

I had hoped to use a pattern like this (which doesn't work):

class Foo(object):
    def _get_path(self):
        # Compute it
        path = do_something_to_compute_path()
        # Replace the property with the value itself; all future
        # lookups get this value immediately
        # (note: the next line is illegal and doesn't actually work)
        object.__setattr__(self, 'path', path)
        return path
    path = property(_get_path)

    def bar(self):
        # For whatever reason, this operation invalidated the cached
        # value, so we reset it. If and only if it is queried again
        # is the new value created.
        object.__setattr__(self, 'path', property(_get_path))

This would replace the way that SCons currently does it (doing this  
from memory, but the gist is right):

    def _old_way_get_path(self):
        try:
            cache = self._cache
            return cache['path']
        except KeyError:
            path = cache['path'] = do_something_to_compute_path()
            return path

    def old_bar(self):
        # invalidate the value:
        del self._cache['path']

The old way works, but every cached call "obj.path" requires three  
dict lookups (someobj.path, self._cache, and cache['path']), whereas  
the first pattern requires exactly one dict lookup for the cached  
value (someobj.path).
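For what it's worth, the one-lookup behaviour above can be had with a non-data descriptor (a sketch, not the actual SCons code, and the helper name is made up): because the descriptor defines no __set__, writing the computed value into the instance __dict__ makes every later access a plain attribute lookup.

```python
class cached_attribute(object):
    """Non-data descriptor: compute once, then get out of the way."""
    def __init__(self, func):
        self.func = func
        self.name = func.__name__

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        value = self.func(obj)
        obj.__dict__[self.name] = value   # shadows the descriptor
        return value

def do_something_to_compute_path():
    return "/tmp/example"   # stand-in for the expensive computation

class Foo(object):
    @cached_attribute
    def path(self):
        return do_something_to_compute_path()
```

A first access to f.path computes and caches; "del f.path" invalidates the cache, so the next access recomputes. No extra dicts needed.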

I would definitely like to see some mechanism (not necessarily how I  
imagined it) for supporting cached value lookups efficiently. I am  
curious to see how this discussion goes.

Jared


From benjamin at python.org  Thu Apr 30 01:23:44 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Wed, 29 Apr 2009 23:23:44 +0000 (UTC)
Subject: [Python-ideas] caching properties
References: <loom.20090429T222818-244@post.gmane.org>
	<49F8DCC7.1020505@gmx.net>
Message-ID: <loom.20090429T232313-128@post.gmane.org>

Mathias Panzenböck <grosser.meister.morti at ...> writes:
> 
> You mean like the once methods in eiffel?

I'm not sure. I've never used Eiffel.






From pyideas at rebertia.com  Thu Apr 30 02:31:53 2009
From: pyideas at rebertia.com (Chris Rebert)
Date: Wed, 29 Apr 2009 17:31:53 -0700
Subject: [Python-ideas] caching properties
In-Reply-To: <loom.20090429T232313-128@post.gmane.org>
References: <loom.20090429T222818-244@post.gmane.org>
	<49F8DCC7.1020505@gmx.net> <loom.20090429T232313-128@post.gmane.org>
Message-ID: <50697b2c0904291731q1313d2fep6a526666bf2e52e3@mail.gmail.com>

On Wed, Apr 29, 2009 at 4:23 PM, Benjamin Peterson <benjamin at python.org> wrote:
> Mathias Panzenb?ck <grosser.meister.morti at ...> writes:
>>
>> You mean like the once methods in eiffel?
>
> I'm not sure. I've never used Eiffel.

http://en.wikipedia.org/wiki/Eiffel_(programming_language)#Once_routines

Sounds pretty much like what you're suggesting.

Cheers,
Chris
-- 
http://blog.rebertia.com


From ben+python at benfinney.id.au  Thu Apr 30 02:57:58 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Thu, 30 Apr 2009 10:57:58 +1000
Subject: [Python-ideas] having a "iterable" built-in
References: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>
	<20090429214215.GA694@cskk.homeip.net>
Message-ID: <878wljq7op.fsf@benfinney.id.au>

Cameron Simpson <cs at zip.com.au> writes:

> On 29Apr2009 20:47, Tarek Ziad? <ziade.tarek at gmail.com> wrote:
> | def iterable(ob):
> |     try:
> |         iter(ob)
> |     except TypeError:
> |         return False
> |     return True
> 
> This is actually a bad way of doing it. Suppose using the iterator has
> side effects?

If getting an iterator of an object has side effects, I'd say that
object's implementation is buggy.

> For example, "ob" might be a store-of-recent-messages, which empties
> after they have been collected. Or a file() attached to a pipe.

Neither of which should lose any items merely by getting an iterable.

The only thing that should cause the iterable to "lose" items is
calling its "next" method, which never happens with Tarek's code above.

-- 
 \        "If you go parachuting, and your parachute doesn't open, and |
  `\        you friends are all watching you fall, I think a funny gag |
_o__)             would be to pretend you were swimming." --Jack Handey |
Ben Finney



From george.sakkis at gmail.com  Thu Apr 30 03:21:56 2009
From: george.sakkis at gmail.com (George Sakkis)
Date: Wed, 29 Apr 2009 21:21:56 -0400
Subject: [Python-ideas] caching properties
In-Reply-To: <50697b2c0904291731q1313d2fep6a526666bf2e52e3@mail.gmail.com>
References: <loom.20090429T222818-244@post.gmane.org>
	<49F8DCC7.1020505@gmx.net> <loom.20090429T232313-128@post.gmane.org>
	<50697b2c0904291731q1313d2fep6a526666bf2e52e3@mail.gmail.com>
Message-ID: <91ad5bf80904291821y311e1fddib269f3d338f5426f@mail.gmail.com>

On Wed, Apr 29, 2009 at 8:31 PM, Chris Rebert <pyideas at rebertia.com> wrote:

> On Wed, Apr 29, 2009 at 4:23 PM, Benjamin Peterson <benjamin at python.org> wrote:
>> Mathias Panzenb?ck <grosser.meister.morti at ...> writes:
>>>
>>> You mean like the once methods in eiffel?
>>
>> I'm not sure. I've never used Eiffel.
>
> http://en.wikipedia.org/wiki/Eiffel_(programming_language)#Once_routines
>
> Sounds pretty much like what you're suggesting.

Properties are not plain methods though, they also support setters and
deleters. Should the cache be cleared after a set/delete ? Or does
cache=True implies fset=fdel=None ?

George


From benjamin at python.org  Thu Apr 30 04:45:52 2009
From: benjamin at python.org (Benjamin Peterson)
Date: Thu, 30 Apr 2009 02:45:52 +0000 (UTC)
Subject: [Python-ideas] caching properties
References: <loom.20090429T222818-244@post.gmane.org>
	<49F8DCC7.1020505@gmx.net>
	<loom.20090429T232313-128@post.gmane.org>
	<50697b2c0904291731q1313d2fep6a526666bf2e52e3@mail.gmail.com>
	<91ad5bf80904291821y311e1fddib269f3d338f5426f@mail.gmail.com>
Message-ID: <loom.20090430T024440-424@post.gmane.org>

George Sakkis <george.sakkis at ...> writes:
> 
> Properties are not plain methods though, they also support setters and
> deleters. Should the cache be cleared after a set/delete ? Or does
> cache=True implies fset=fdel=None ?

Actually, I was thinking that using caching with fset or fdel would not
be allowed.






From steve at pearwood.info  Thu Apr 30 05:57:07 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Thu, 30 Apr 2009 13:57:07 +1000
Subject: [Python-ideas] caching properties
In-Reply-To: <loom.20090429T222818-244@post.gmane.org>
References: <loom.20090429T222818-244@post.gmane.org>
Message-ID: <200904301357.08083.steve@pearwood.info>

On Thu, 30 Apr 2009 08:33:27 am Benjamin Peterson wrote:
> I think it would be nice to add a "cache" argument to the
> property() constructor. When "cache" was True, property would only
> ask the getter function once for the result. This would simplify
> properties that require expensive operations to compute.

-1 on an extra parameter to property.

+1 on a cache decorator.

It is clear and simple enough to write something like:

@property
@cache  # or "once" if you prefer the Eiffel name
def value(self):
    pass

Such a decorator could then be used on any appropriate function, not 
just for properties.



-- 
Steven D'Aprano 


From ben+python at benfinney.id.au  Thu Apr 30 07:24:26 2009
From: ben+python at benfinney.id.au (Ben Finney)
Date: Thu, 30 Apr 2009 15:24:26 +1000
Subject: [Python-ideas] caching properties
References: <loom.20090429T222818-244@post.gmane.org>
Message-ID: <87skjqpvcl.fsf@benfinney.id.au>

Benjamin Peterson <benjamin at python.org>
writes:

> I think it would be nice to add a "cache" argument to the property()
> constructor. When "cache" was True, property would only ask the getter
> function once for the result. This would simplify properties that
> require expensive operations to compute.

I would prefer this as a decorator (not affecting the function
signature), and applicable to any function (not just a property).

This is the "memoize" pattern, implemented as a decorator in
<URL:http://code.activestate.com/recipes/496879/>.

-- 
 \     Lucifer: "Just sign the Contract, sir, and the Piano is yours." |
  `\     Ray: "Sheesh! This is long! Mind if I sign it now and read it |
_o__)                               later?" --http://www.achewood.com/ |
Ben Finney



From cs at zip.com.au  Thu Apr 30 08:08:16 2009
From: cs at zip.com.au (Cameron Simpson)
Date: Thu, 30 Apr 2009 16:08:16 +1000
Subject: [Python-ideas] having a "iterable" built-in
In-Reply-To: <878wljq7op.fsf@benfinney.id.au>
Message-ID: <20090430060816.GA3430@cskk.homeip.net>

On 30Apr2009 10:57, Ben Finney <ben+python at benfinney.id.au> wrote:
| Cameron Simpson <cs at zip.com.au> writes:
| > On 29Apr2009 20:47, Tarek Ziad? <ziade.tarek at gmail.com> wrote:
| > | def iterable(ob):
| > |     try:
| > |         iter(ob)
| > |     except TypeError:
| > |         return False
| > |     return True
| > 
| > This is actually a bad way of doing it. Suppose using the iterator has
| > side effects?
| 
| If getting an iterator of an object has side effects, I'd say that
| object's implementation is buggy.
|
| > For example, "ob" might be a store-of-recent-messages, which empties
| > after they have been collected. Or a file() attached to a pipe.
| 
| Neither of which should lose any items merely by getting an iterable.
| 
| The only thing that should cause the iterable to ?lose? items is
| calling its ?next? method, which never happens with Tarek's code above.

Oh. Sorry, my bad. You're absolutely right. I was confusing getting the
iterator with using it. Thanks!
-- 
Cameron Simpson <cs at zip.com.au> DoD#743
http://www.cskk.ezoshosting.com/cs/

...if you don't get the finger several times a day [while commuting in
Boston], you're not driving correctly.  - Mark Parrenteau, DJ, WBCN Boston


From denis.spir at free.fr  Thu Apr 30 09:00:04 2009
From: denis.spir at free.fr (spir)
Date: Thu, 30 Apr 2009 09:00:04 +0200
Subject: [Python-ideas] caching properties
In-Reply-To: <200904301357.08083.steve@pearwood.info>
References: <loom.20090429T222818-244@post.gmane.org>
	<200904301357.08083.steve@pearwood.info>
Message-ID: <20090430090004.57c2e51c@o>

On Thu, 30 Apr 2009 13:57:07 +1000,
Steven D'Aprano <steve at pearwood.info> wrote:

> -1 on an extra parameter to property.
> 
> +1 on a cache decorator.
> 
> It is clear and simple enough to write something like:
> 
> @property
> @cache  # or "once" if you prefer the Eiffel name
> def value(self):
>     pass

I vote +1 for this version.
Cache and property are distinct features.

Denis
------
la vita e estrany


From ziade.tarek at gmail.com  Thu Apr 30 09:00:35 2009
From: ziade.tarek at gmail.com (=?ISO-8859-1?Q?Tarek_Ziad=E9?=)
Date: Thu, 30 Apr 2009 09:00:35 +0200
Subject: [Python-ideas] having a "iterable" built-in
In-Reply-To: <gtajql$476$1@ger.gmane.org>
References: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>
	<gtajql$476$1@ger.gmane.org>
Message-ID: <94bdd2610904300000m58ba33c4wfce4cc69d431879c@mail.gmail.com>

On Thu, Apr 30, 2009 at 12:20 AM, Terry Reedy <tjreedy at udel.edu> wrote:
> hasattr(ob, '__iter__')
>

Right, that works under Python 3. (It won't cover str under Python 2,
though, where str is iterable but has no __iter__.)


Tarek
-- 
Tarek Ziadé | http://ziade.org


From solipsis at pitrou.net  Thu Apr 30 13:06:25 2009
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 30 Apr 2009 11:06:25 +0000 (UTC)
Subject: [Python-ideas] caching properties
References: <loom.20090429T222818-244@post.gmane.org>
	<200904301357.08083.steve@pearwood.info>
Message-ID: <loom.20090430T110239-547@post.gmane.org>

Steven D'Aprano <steve at ...> writes:
> 
> -1 on an extra parameter to property.
> 
> +1 on a cache decorator.

+1 for calling it "cached" or "memoize". I don't know if it has its place in the
builtin namespace, though. Putting it in functools would be fine.

(it may need a way to flush/invalidate the cache, too:

@cached
def foo(l=[4]):
    l[0] += 1
    return l[0]

print foo() # 5
print foo() # 5
foo.flush()
print foo() # 6

)
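A minimal sketch of such a decorator with a flush() method (hypothetical: nothing like this exists in functools; it only handles zero-argument callables, and a real memoizer would key the cache on the arguments):

```python
import functools

def cached(func):
    sentinel = object()        # distinguishes "no value yet" from None
    state = [sentinel]

    @functools.wraps(func)
    def wrapper():
        if state[0] is sentinel:
            state[0] = func()  # runs at most once between flushes
        return state[0]

    def flush():
        state[0] = sentinel
    wrapper.flush = flush
    return wrapper

@cached
def foo(l=[4]):
    l[0] += 1
    return l[0]

assert foo() == 5
assert foo() == 5   # cached: the side effect did not run again
foo.flush()
assert foo() == 6   # recomputed after the flush
```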




From jeremy at jeremybanks.ca  Thu Apr 30 14:36:17 2009
From: jeremy at jeremybanks.ca (Jeremy Banks)
Date: Thu, 30 Apr 2009 09:36:17 -0300
Subject: [Python-ideas] caching properties
In-Reply-To: <20090430090004.57c2e51c@o>
References: <loom.20090429T222818-244@post.gmane.org>
	<200904301357.08083.steve@pearwood.info> <20090430090004.57c2e51c@o>
Message-ID: <9e754ef50904300536n35009806w4509e4f17da0ebc3@mail.gmail.com>

I think one thing that should be addressed is how to change the cached
value, for example after the setter is called. Having multiple
decorators would seem to make this slightly more complicated. If we
were using only one, it would be easy to just add a cache property to
the decorated function, which the setter could easily deal with.

  @property(cache=True)
  def foo(self):
    return(self.bar * 99)

  @foo.setter
  def foo(self, bar):
    self.bar = bar / 99
    del foo.cache # or foo.cache = bar

I guess this doesn't actually work anyway because of the way setters
are specified, since it would refer to the setter instead of the
property. Hmm. Perhaps, since the return value of setters is currently
meaningless, it could be written into the cache, or used to specify if
the cache should be cleared? Or just clear the cache automatically any
time the setter is called.

I realize that these ideas would probably introduce excessive overhead
and have some other problems, but I think this is something that needs
to be figured out before too long, so having some more ideas out there
probably won't hurt.

- JB

On 2009-04-30, spir <denis.spir at free.fr> wrote:
> Le Thu, 30 Apr 2009 13:57:07 +1000,
> Steven D'Aprano <steve at pearwood.info> s'exprima ainsi:
>
>> -1 on an extra parameter to property.
>>
>> +1 on a cache decorator.
>>
>> It is clear and simple enough to write something like:
>>
>> @property
>> @cache  # or "once" if you prefer the Eiffel name
>> def value(self):
>>     pass
>
> I vote +1 for this version.
> Cache and property are distinct features.
>
> Denis
> ------
> la vita e estrany
> _______________________________________________
> Python-ideas mailing list
> Python-ideas at python.org
> http://mail.python.org/mailman/listinfo/python-ideas
>


From fetchinson at googlemail.com  Thu Apr 30 14:55:41 2009
From: fetchinson at googlemail.com (Daniel Fetchinson)
Date: Thu, 30 Apr 2009 05:55:41 -0700
Subject: [Python-ideas] caching properties
In-Reply-To: <200904301357.08083.steve@pearwood.info>
References: <loom.20090429T222818-244@post.gmane.org>
	<200904301357.08083.steve@pearwood.info>
Message-ID: <fbe2e2100904300555w44a31e2coeea50a66c0ccf63@mail.gmail.com>

>> I think it would be nice to add a "cache" argument to the
>> property() constructor. When "cache" was True, property would only
>> ask the getter function once for the result. This would simplify
>> properties that require expensive operations to compute.
>
> -1 on an extra parameter to property.
>
> +1 on a cache decorator.
>
> It is clear and simple enough to write something like:
>
> @property
> @cache  # or "once" if you prefer the Eiffel name
> def value(self):
>     pass
>
> Such a decorator could then be used on any appropriate function, not
> just for properties.

I agree, -1 on modifying @property and +1 on adding a new decorator to
functools or some other module, maybe a collection of useful
decorators.

Cheers,
Daniel


-- 
Psss, psss, put it down! - http://www.cafepress.com/putitdown


From steve at pearwood.info  Thu Apr 30 15:56:48 2009
From: steve at pearwood.info (Steven D'Aprano)
Date: Thu, 30 Apr 2009 23:56:48 +1000
Subject: [Python-ideas] caching properties
In-Reply-To: <9e754ef50904300536n35009806w4509e4f17da0ebc3@mail.gmail.com>
References: <loom.20090429T222818-244@post.gmane.org>
	<20090430090004.57c2e51c@o>
	<9e754ef50904300536n35009806w4509e4f17da0ebc3@mail.gmail.com>
Message-ID: <200904302356.48819.steve@pearwood.info>

On Thu, 30 Apr 2009 10:36:17 pm Jeremy Banks wrote:
> I think one thing that should be addressed is how to change the
> cached value, for example after the setter is called. 

For the use-case being discussed, you generally don't want to cache a 
value that changes. You generally want to cache a value which is 
expensive to calculate, but never changes.

However, there are uses for caches that expire after some time. I just 
can't think of any where I'd want them to be accessed via a property 
instead of a function call.


> Having multiple 
> decorators would seem to make this slightly more complicated.

I don't see why. The cache decorator could expose an interface to expire 
the cached value.

Here's a quick&dirty but working solution. (Tested under Python 2.5.)

def once(func):
    class Cache(object):
        def __call__(self, *args, **kwargs):
            try:
                return self._value
            except AttributeError:
                result = func(*args, **kwargs)
                self._value = result
                return result
        def expire(self):
            del self._value
    return Cache()


class Test(object):
    @property
    @once
    def expensive(self):
        import time
        time.sleep(20)
        return 1


Works like a charm :) (At least for a solution I knocked up in 30 
seconds.) The only downside is that to expire the cache, you need the 
not-very-obvious call:

Test.expensive.fget.expire()

rather than t.expensive.expire(), which can't work for obvious reasons.
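One caveat worth noting when adapting this (a sketch using the same decorator, reduced to essentials): the Cache object hangs off the *class* attribute, so the memoized value is shared by every instance -- fine for class-level constants, surprising for per-instance data.

```python
def once(func):
    class Cache(object):
        def __call__(self, *args, **kwargs):
            try:
                return self._value
            except AttributeError:
                self._value = func(*args, **kwargs)
                return self._value
        def expire(self):
            del self._value
    return Cache()   # one Cache per decorated function, not per instance

class Test(object):
    def __init__(self, n):
        self.n = n

    @property
    @once
    def expensive(self):
        return self.n * 2

a, b = Test(1), Test(2)
assert a.expensive == 2
assert b.expensive == 2   # b sees a's cached value, not 4
```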


> If we 
> were using only one, it would be easy to just add a cache property to
> the decorated function, which the setter could easily deal with.
...
> I guess this doesn't actually work anyway because of the way setters
> are specified, since it would refer to the setter instead of the
> property. Hmm. 

I'm not exactly sure that a solution that doesn't work can be said 
to "easily deal with" anything :)


-- 
Steven D'Aprano


From denis.spir at free.fr  Thu Apr 30 16:53:09 2009
From: denis.spir at free.fr (spir)
Date: Thu, 30 Apr 2009 16:53:09 +0200
Subject: [Python-ideas] caching properties
In-Reply-To: <fbe2e2100904300555w44a31e2coeea50a66c0ccf63@mail.gmail.com>
References: <loom.20090429T222818-244@post.gmane.org>
	<200904301357.08083.steve@pearwood.info>
	<fbe2e2100904300555w44a31e2coeea50a66c0ccf63@mail.gmail.com>
Message-ID: <20090430165309.4404f5d7@o>

On Thu, 30 Apr 2009 05:55:41 -0700,
Daniel Fetchinson <fetchinson at googlemail.com> wrote:

> > It is clear and simple enough to write something like:
> >
> > @property
> > @cache  # or "once" if you prefer the Eiffel name
> > def value(self):
> >     pass
> >
> > Such a decorator could then be used on any appropriate function, not
> > just for properties.  
> 
> I agree, -1 on modifying @property and +1 on adding a new decorator to
> functools or some other module, maybe a collection of useful
> decorators.

While I regularly use caching/memoizing (e.g. for packrat parsing) and I like the idea of having it predefined in the language, I now wonder whether it's really worth it. Usually it's a very simple thing to implement -- often two obvious lines of code -- and easy to understand, even for someone who does not yet know the principle.
E.g. for a property getter:

def _getThing(self):
    if self._thing is None:
        # _computeThing is a hypothetical helper standing in for the real work
        self._thing = self._computeThing()
    return self._thing

(Also note that for a func you can cache on the func itself: func.cache -- but the 2-3 cases when I thought at that, it was rather a design issue with global vars.)
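
(A minimal sketch of that func-attribute style, with a toy recursive function for illustration; the attribute name "cache" is just a convention here:)

```python
def fib(n):
    # The cache lives on the function object itself.
    if n not in fib.cache:
        fib.cache[n] = n if n < 2 else fib(n - 1) + fib(n - 2)
    return fib.cache[n]

fib.cache = {}
```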

Maybe I'm missing the point?

Denis
------
la vita e estrany


From solipsis at pitrou.net  Thu Apr 30 17:18:56 2009
From: solipsis at pitrou.net (Antoine Pitrou)
Date: Thu, 30 Apr 2009 15:18:56 +0000 (UTC)
Subject: [Python-ideas] caching properties
References: <loom.20090429T222818-244@post.gmane.org>
	<200904301357.08083.steve@pearwood.info>
	<fbe2e2100904300555w44a31e2coeea50a66c0ccf63@mail.gmail.com>
	<20090430165309.4404f5d7@o>
Message-ID: <loom.20090430T151548-25@post.gmane.org>

spir <denis.spir at ...> writes:
> 
> now I wonder whether it's really worth it. Because usually it's a very simple
> thing to implement -- often two obvious lines of code -- and easy to
> understand, even for someone who does
> not know the principle yet.

Well, for one, a generic implementation may have to be thread-safe.
Also, while it's easy to implement, it's the kind of useful primitive - like
enumerate() and others - which is nice to have builtin in the language or the
standard library.
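
For example, a thread-safe variant of the "once" idea might look like this (just a sketch; the one-element list stands in for "nonlocal", which 2.x doesn't have):

```python
import threading

def once(func):
    lock = threading.Lock()
    sentinel = object()
    cell = [sentinel]  # holds the computed value once it exists

    def wrapper(*args, **kwargs):
        # Fast path: already computed, no lock needed.
        if cell[0] is not sentinel:
            return cell[0]
        with lock:
            # Re-check under the lock so func runs at most once
            # even if two threads raced past the fast path.
            if cell[0] is sentinel:
                cell[0] = func(*args, **kwargs)
        return cell[0]
    return wrapper
```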

Regards

Antoine.





From Scott.Daniels at Acm.Org  Thu Apr 30 21:22:48 2009
From: Scott.Daniels at Acm.Org (Scott David Daniels)
Date: Thu, 30 Apr 2009 12:22:48 -0700
Subject: [Python-ideas] caching properties
In-Reply-To: <200904302356.48819.steve@pearwood.info>
References: <loom.20090429T222818-244@post.gmane.org>	<20090430090004.57c2e51c@o>	<9e754ef50904300536n35009806w4509e4f17da0ebc3@mail.gmail.com>
	<200904302356.48819.steve@pearwood.info>
Message-ID: <gtcth7$iof$1@ger.gmane.org>

Steven D'Aprano wrote:
> Here's a quick&dirty but working solution. (Tested under Python 2.5.)
> 
> def once(func):
>     class Cache(object):
>         def __call__(self, *args, **kwargs):
>             try:
>                 return self._value
>             except AttributeError:
>                 result = func(*args, **kwargs)
>                 self._value = result
>                 return result
>         def expire(self):
>             del self._value
>     return Cache()

This is slightly better (name change as in Antoine Pitrou's comment):

     class cached(object):

         def __init__(self, function):
             self._function = function
             self._cache = {}

         def __call__(self, *args):
             try:
                 return self._cache[args]
             except KeyError:
                 self._cache[args] = self._function(*args)
                 return self._cache[args]

         def expire(self, *args):
             del self._cache[args]


> 
> class Test(object):
>     @property
>     @once
>     def expensive(self):
>         import time
>         time.sleep(20)
>         return 1

Similarly, I can use:

     class Test(object):
         @property
         @cached
         def expensive(self):
             import time
             time.sleep(20)
             return 1

     t = Test()

and use:

     t.expensive

> Works like a charm :) (At least for a solution I knocked up in 30 
> seconds.) The only downside is that to expire the cache, you need the 
> not-very-obvious call:
> 
> Test.expensive.fget.expire()

I'll bet you cannot quite do that; I need to do:
     Test.expensive.fget.expire(t)

Note that 'cached' can do multi-arg functions, though it
doesn't handle kwargs, since there is some ambiguity
between calling keyword-provided args and the args vector.
The change to cache redundant uses would be something like:

-        def __call__(self, *args):
+        def __call__(self, *args, **kwargs):
+            args = args, tuple(sorted(kwargs.items()))

-        def expire(self, *args):
+        def expire(self, *args, **kwargs):
+            args = args, tuple(sorted(kwargs.items()))

However, this might be fairly counter-intuitive (specifically
when expiring some values, and thinking you have them all).
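
Spelled out, the kwargs-aware version might read as follows (same names as above; note that f(1, 2) and f(1, b=2) still cache under different keys, which is exactly the ambiguity mentioned):

```python
class cached(object):

    def __init__(self, function):
        self._function = function
        self._cache = {}

    def _key(self, args, kwargs):
        # Normalize keyword arguments into a hashable, order-independent form.
        return args, tuple(sorted(kwargs.items()))

    def __call__(self, *args, **kwargs):
        key = self._key(args, kwargs)
        try:
            return self._cache[key]
        except KeyError:
            self._cache[key] = self._function(*args, **kwargs)
            return self._cache[key]

    def expire(self, *args, **kwargs):
        del self._cache[self._key(args, kwargs)]
```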

--Scott David Daniels
Scott.Daniels at Acm.Org



From arnodel at googlemail.com  Thu Apr 30 21:26:44 2009
From: arnodel at googlemail.com (Arnaud Delobelle)
Date: Thu, 30 Apr 2009 20:26:44 +0100
Subject: [Python-ideas] caching properties
In-Reply-To: <gtcth7$iof$1@ger.gmane.org>
References: <loom.20090429T222818-244@post.gmane.org>	<20090430090004.57c2e51c@o>	<9e754ef50904300536n35009806w4509e4f17da0ebc3@mail.gmail.com>
	<200904302356.48819.steve@pearwood.info>
	<gtcth7$iof$1@ger.gmane.org>
Message-ID: <20868A0D-0618-4CBA-9767-F57554895EF4@googlemail.com>


On 30 Apr 2009, at 20:22, Scott David Daniels wrote:

> Steven D'Aprano wrote:
>> Here's a quick&dirty but working solution. (Tested under Python 2.5.)
>> def once(func):
>>    class Cache(object):
>>        def __call__(self, *args, **kwargs):
>>            try:
>>                return self._value
>>            except AttributeError:
>>                result = func(*args, **kwargs)
>>                self._value = result
>>                return result
>>        def expire(self):
>>            del self._value
>>    return Cache()
>
> This is slightly better (name change as in Antoine Pitrou's comment):
>
>    class cached(object):
>
>        def __init__(self, function):
>            self._function = function
>            self._cache = {}
>
>        def __call__(self, *args):
>            try:
>                return self._cache[args]
>            except KeyError:
>                self._cache[args] = self._function(*args)
>                return self._cache[args]
>
>        def expire(self, *args):
>            del self._cache[args]

You're going to run into trouble with unhashable args.
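
For example, reusing the cached class from above in a small demonstration:

```python
class cached(object):
    def __init__(self, function):
        self._function = function
        self._cache = {}

    def __call__(self, *args):
        try:
            return self._cache[args]
        except KeyError:
            self._cache[args] = self._function(*args)
            return self._cache[args]

@cached
def total(numbers):
    return sum(numbers)

try:
    total([1, 2, 3])  # a list makes the key unhashable: TypeError
except TypeError:
    pass
```

Passing a tuple instead works fine, since tuples of hashables are hashable.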

-- 
Arnaud



From Scott.Daniels at Acm.Org  Thu Apr 30 21:53:56 2009
From: Scott.Daniels at Acm.Org (Scott David Daniels)
Date: Thu, 30 Apr 2009 12:53:56 -0700
Subject: [Python-ideas] caching properties
In-Reply-To: <20868A0D-0618-4CBA-9767-F57554895EF4@googlemail.com>
References: <loom.20090429T222818-244@post.gmane.org>	<20090430090004.57c2e51c@o>	<9e754ef50904300536n35009806w4509e4f17da0ebc3@mail.gmail.com>	<200904302356.48819.steve@pearwood.info>	<gtcth7$iof$1@ger.gmane.org>
	<20868A0D-0618-4CBA-9767-F57554895EF4@googlemail.com>
Message-ID: <gtcvbi$nq4$1@ger.gmane.org>

Arnaud Delobelle wrote:
> 
> On 30 Apr 2009, at 20:22, Scott David Daniels wrote:
>> This is slightly better (name change as in Antoine Pitrou's comment):
>>
>>    class cached(object): ...
> You're going to run into trouble with unhashable args.

As is entirely appropriate.  The impossible doesn't become possible.
Of course, we'd have to explain that in the docs for cached.

--Scott David Daniels
Scott.Daniels at Acm.Org



From denis.spir at free.fr  Thu Apr 30 22:19:00 2009
From: denis.spir at free.fr (spir)
Date: Thu, 30 Apr 2009 22:19:00 +0200
Subject: [Python-ideas] caching properties
In-Reply-To: <gtcth7$iof$1@ger.gmane.org>
References: <loom.20090429T222818-244@post.gmane.org>
	<20090430090004.57c2e51c@o>
	<9e754ef50904300536n35009806w4509e4f17da0ebc3@mail.gmail.com>
	<200904302356.48819.steve@pearwood.info>
	<gtcth7$iof$1@ger.gmane.org>
Message-ID: <20090430221900.6e4c0181@o>

Le Thu, 30 Apr 2009 12:22:48 -0700,
Scott David Daniels <Scott.Daniels at Acm.Org> s'exprima ainsi:

> This is slightly better (name change as in Antoine Pitrou's comment):
> 
>      class cached(object):
> 
>          def __init__(self, function):
>              self._function = function
>              self._cache = {}
> 
>          def __call__(self, *args):
>              try:
>                  return self._cache[args]
>              except KeyError:
>                  self._cache[args] = self._function(*args)
>                  return self._cache[args]
> 
>          def expire(self, *args):
>              del self._cache[args]

(Aside from the hashable issue pointed out by Arnaud:)
I wonder about using the whole parameter tuple as the cache key.
In packrat parsing, for instance, you may have more than one parameter (including the source itself), but only one is relevant for memoizing (the position). The cache has to be reset anyway when starting a new parse, so having the source in the keys is pointless.
Typically, when there is more than a single value, the saved results form a simple array indexed by an ordinal.

Denis
------
la vita e estrany


From jared.grubb at gmail.com  Thu Apr 30 22:59:55 2009
From: jared.grubb at gmail.com (Jared Grubb)
Date: Thu, 30 Apr 2009 13:59:55 -0700
Subject: [Python-ideas] caching properties
In-Reply-To: <gtcth7$iof$1@ger.gmane.org>
References: <loom.20090429T222818-244@post.gmane.org>	<20090430090004.57c2e51c@o>	<9e754ef50904300536n35009806w4509e4f17da0ebc3@mail.gmail.com>
	<200904302356.48819.steve@pearwood.info>
	<gtcth7$iof$1@ger.gmane.org>
Message-ID: <E0784098-6646-4698-A411-D4CFEE022245@gmail.com>


On 30 Apr 2009, at 12:22, Scott David Daniels wrote:
>
> This is slightly better (name change as in Antoine Pitrou's comment):
>
>    class cached(object):
>
>        def __init__(self, function):
>            self._function = function
>            self._cache = {}
>
>        def __call__(self, *args):
>            try:
>                return self._cache[args]
>            except KeyError:
>                self._cache[args] = self._function(*args)
>                return self._cache[args]
>
>        def expire(self, *args):
>            del self._cache[args]
>

The only thing I dislike is how many dictionary lookups are required  
in order to return the value after it's been cached. I count 4 lookups  
(object.prop, prop.__call__, self._cache, and self._cache[args]).  
These add up, especially if object.prop could have returned the value  
immediately without having to go through so much indirection (but this  
is not currently possible).
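
For what it's worth, a non-data descriptor can collapse those lookups after the first access, by writing the computed value straight into the instance dict. This is a sketch of the pattern, not a standard-library facility:

```python
class cached_property(object):
    """Non-data descriptor: compute once, then store the result in the
    instance's __dict__ so later attribute lookups never reach us again."""

    def __init__(self, func):
        self.func = func
        self.__name__ = func.__name__

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        value = self.func(obj)
        # Because we define no __set__, the instance attribute now
        # shadows this descriptor: the next lookup is a plain dict hit.
        obj.__dict__[self.__name__] = value
        return value
```

Expiring is then just "del t.attr", which removes the instance attribute and re-exposes the descriptor for the next access.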

Jared


From g.brandl at gmx.net  Thu Apr 30 23:46:37 2009
From: g.brandl at gmx.net (Georg Brandl)
Date: Thu, 30 Apr 2009 23:46:37 +0200
Subject: [Python-ideas] having a "iterable" built-in
In-Reply-To: <94bdd2610904300000m58ba33c4wfce4cc69d431879c@mail.gmail.com>
References: <94bdd2610904291147v2e578463k20178e3b27cdc000@mail.gmail.com>	<gtajql$476$1@ger.gmane.org>
	<94bdd2610904300000m58ba33c4wfce4cc69d431879c@mail.gmail.com>
Message-ID: <gtd68e$ebq$1@ger.gmane.org>

Tarek Ziadé schrieb:
> On Thu, Apr 30, 2009 at 12:20 AM, Terry Reedy <tjreedy-noVtnmiAkQo at public.gmane.org> wrote:
>> hasattr(ob, '__iter__')
>>
> 
> Right, that works under Python 3. (It won't cover str under Python 2, though.)

The old-style iteration protocol (sequences implementing __getitem__) has not
disappeared in Python 3.  It's just that str got its own iterator type.
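
So a robust test is to ask iter() itself, e.g.:

```python
def is_iterable(obj):
    # iter() accepts both objects with __iter__ and old-style
    # sequences that only define __getitem__.
    try:
        iter(obj)
        return True
    except TypeError:
        return False
```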

Georg

-- 
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.